Science.gov

Sample records for model-based therapeutic correction

  1. Correction of placement error in EBL using model based method

    NASA Astrophysics Data System (ADS)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-10-01

    The main source of placement error in maskmaking using electron beam lithography is charging. DISPLACE software provides a method to correct placement errors for any layout, based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effect corrections. The output of the software is the data for placement correction. Unknown physical parameters such as fogging can be found from calibration experiments. A test layout on a single calibration mask was used to calibrate the physical parameters used in the correction model, and the extracted model parameters were used to verify the correction. As an ultimate test, a sophisticated layout very different from the calibration mask was used for verification. The placement correction results were predicted by DISPLACE, and the mask was fabricated and measured. A good correlation between the measured and predicted values of the correction over the entire mask with the complex pattern confirmed the high accuracy of the charging placement error correction.

  2. Ratio-based vs. model-based methods to correct for urinary creatinine concentrations.

    PubMed

    Jain, Ram B

    2016-08-01

    Creatinine-corrected urinary analyte concentration is usually computed as the ratio of the observed analyte concentration to the observed urinary creatinine concentration (UCR). This ratio-based method is flawed, since it implicitly assumes that hydration is the only factor that affects urinary creatinine concentrations. On the contrary, it has been shown in the literature that age, gender, race/ethnicity, and other factors also affect UCR. Consequently, an optimal method to correct for UCR should correct for hydration as well as other factors, like age, gender, and race/ethnicity, that affect UCR. Model-based creatinine correction, in which observed UCRs are used as an independent variable in regression models, has been proposed. This study was conducted to evaluate the performance of ratio-based and model-based creatinine correction methods when the effects of gender, age, and race/ethnicity are evaluated one factor at a time for selected urinary analytes and metabolites. It was observed that the ratio-based method leads to statistically significant pairwise differences, for example, between males and females or between non-Hispanic whites (NHW) and non-Hispanic blacks (NHB), more often than the model-based method, although, depending upon the analyte of interest, the reverse is also possible. The estimated ratios of geometric means (GM), for example, male to female or NHW to NHB, were also compared for the two methods. When estimated UCRs were higher for the group in the numerator of this ratio (for example, males), these ratios were higher for the model-based method; when estimated UCRs were lower for the group in the numerator (for example, NHW), these ratios were higher for the ratio-based method. The model-based method is the method of choice if all factors that affect UCR are to be accounted for.
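
    A minimal numerical sketch of the two approaches, using synthetic data (all variable names, covariates, and effect sizes below are illustrative, not taken from the study):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        creatinine = rng.lognormal(mean=0.0, sigma=0.4, size=n)  # urinary creatinine
        male = rng.integers(0, 2, size=n)          # a covariate that also affects UCR
        analyte = np.exp(0.8 * np.log(creatinine) + 0.3 * male
                         + rng.normal(0.0, 0.3, n))

        # Ratio-based correction: divide by creatinine, implicitly assuming
        # hydration is the only factor driving UCR.
        ratio_corrected = analyte / creatinine

        # Model-based correction: creatinine enters a regression as an
        # independent variable alongside demographic covariates.
        X = np.column_stack([np.ones(n), np.log(creatinine), male])
        beta, *_ = np.linalg.lstsq(X, np.log(analyte), rcond=None)
        model_corrected = np.log(analyte) - beta[1] * (np.log(creatinine)
                                                       - np.log(creatinine).mean())
        print(beta)  # intercept, creatinine slope, gender effect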

  3. Model-based correction of the influence of body position on continuous segmental and hand-to-foot bioimpedance measurements.

    PubMed

    Medrano, Guillermo; Eitner, Frank; Walter, Marian; Leonhardt, Steffen

    2010-06-01

    Bioimpedance spectroscopy (BIS) is suitable for continuous monitoring of body water content. The combination of body posture and time is a well-known source of error, which limits the accuracy and therapeutic validity of BIS measurements. This study evaluates a model-based correction as a possible solution. For this purpose, an 11-cylinder model representing body impedance distribution is used. Each cylinder contains a nonlinear two-pool model to describe fluid redistribution due to changing body position and its influence on segmental and hand-to-foot (HF) bioimpedance measurements. A model-based correction of segmental (thigh) and HF measurements (Xitron Hydra 4200) in nine healthy human subjects (following a sequence of 7 min supine, 20 min standing, 40 min supine) has been evaluated. The model-based compensation algorithm represents a compromise between accuracy and simplicity, and reduces the influence of changes in body position on the measured extracellular resistance and extracellular fluid by up to 75 and 70%, respectively.

  4. Model-based motion correction of reduced field of view diffusion MRI data

    NASA Astrophysics Data System (ADS)

    Hering, Jan; Wolf, Ivo; Meinzer, Hans-Peter; Maier-Hein, Klaus H.

    2014-03-01

    In clinical settings, application of the most recent modelling techniques is usually infeasible due to the limited acquisition time. Localised acquisitions that enclose only the object of interest by reducing the field-of-view (FOV) counteract the time limitation but pose new challenges to subsequent processing steps such as motion correction. We use datasets from the Human Connectome Project (HCP) to simulate head-motion-distorted reduced-FOV acquisitions and present an evaluation of head motion correction approaches: the commonly used affine registration onto an unweighted reference image guided by the mutual information (MI) metric, and a model-based approach that uses reference images computed from approximated tensor data to improve the performance of the MI metric. While the standard approach using the MI metric yields up to 15% outliers (error > 5 mm) and a mean spatial error above 1.5 mm, the model-based approach reduces the number of outliers (1%) and the spatial error significantly (p < 0.01). The behavior is also reflected in the visual analysis of the MI metric. The evaluation shows that the MI metric is of very limited use for reduced-FOV data post-processing. The model-based approach has proven more suitable in this context.
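
    The mutual information metric at the core of both registration approaches can be sketched as a joint-histogram estimate; this is a generic formulation, not the authors' implementation:

        import numpy as np

        def mutual_information(a, b, bins=32):
            """Histogram-based MI between two images, in nats."""
            hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
            pxy = hist / hist.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

        # Sanity check: an image shares more information with itself than with noise.
        rng = np.random.default_rng(1)
        img, noise = rng.random((64, 64)), rng.random((64, 64))
        print(mutual_information(img, img), mutual_information(img, noise))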

  5. Physics Model Based Scatter Correction in Multi-source Interior Computed Tomography.

    PubMed

    Gong, Hao; Li, Bin; Jia, Xun; Gao, Guohua

    2017-08-17

    Multi-source interior computed tomography (CT) has great potential to provide ultra-fast and organ-oriented imaging at low radiation dose. However, X-ray cross scattering from multiple simultaneously activated X-ray imaging chains compromises imaging quality. Previously, we published two hardware-based scatter correction methods for multi-source interior CT. Here, we propose a software-based scatter correction method, with the benefit of requiring no hardware modifications. The new method is based on a physics model and an iterative framework. The physics model was derived analytically and used to calculate X-ray scattering signals in both the forward direction and cross directions in multi-source interior CT. The physics model was integrated into an iterative scatter correction framework to reduce scatter artifacts. The method was applied to phantom data from both Monte Carlo simulations and physical experiments designed to emulate image acquisition in a multi-source interior CT architecture recently proposed by our team. The proposed scatter correction method reduced scatter artifacts significantly, even with only one iteration. Within a few iterations, the reconstructed images converged quickly toward the "scatter-free" reference images. After applying the scatter correction method, the maximum CT number error at the regions of interest (ROIs) was reduced to 46 HU in the numerical phantom dataset and 48 HU in the physical phantom dataset, and the contrast-to-noise ratio (CNR) at those ROIs increased by up to 44.3% and 19.7%, respectively. The proposed physics-model-based iterative scatter correction method could be useful for scatter correction in dual-source or multi-source CT.
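
    The iterative structure of such a correction can be sketched generically; scatter_model and reconstruct below are placeholders for the paper's analytic scatter model and reconstruction step, not its actual implementation:

        import numpy as np

        def correct_scatter(raw_projections, scatter_model, reconstruct, n_iters=3):
            """Estimate scatter from the current image with a physics model,
            subtract it from the raw projections, and re-reconstruct."""
            image = reconstruct(raw_projections)   # initial, scatter-contaminated
            for _ in range(n_iters):
                scatter = scatter_model(image)     # forward + cross-scatter estimate
                primary = np.clip(raw_projections - scatter, 1e-6, None)
                image = reconstruct(primary)
            return image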

  6. Dynamic aberration correction for conformal optics using model-based wavefront sensorless adaptive optics

    NASA Astrophysics Data System (ADS)

    Han, Xinli; Dong, Bing; Li, Yan; Wang, Rui; Hu, Bin

    2016-10-01

    For missiles and airplanes flying at high Mach numbers, a traditional spherical or flat window causes considerable air drag. A conformal window that follows the general contour of the surrounding surface can substantially decrease air drag and extend operational range. However, the local shape of a conformal window changes across the Field Of Regard (FOR), leading to time-varying, FOR-dependent wavefront aberration and degraded images, so correction of the dynamic aberration is necessary. In this paper, a model-based Wavefront Sensorless Adaptive Optics (WSAO) algorithm is investigated both by simulation and by experiment for a centrally obscured pupil. The algorithm proves effective, and the correction accuracy achieved with DM modes is higher than with Lukosz modes. For the dynamic aberration in our system, the Strehl ratio (SR) can be better than 0.8 when the change of looking angle is less than 2° after t seconds, where t is the time delay of the control system.

  7. Feasibility Study of Respiratory Motion Modeling Based Correction for MRI-Guided Intracardiac Interventional Procedures.

    PubMed

    Xu, Robert; Athavale, Prashant; Krahn, Philippa; Anderson, Kevan; Barry, Jennifer; Biswas, Labonny; Ramanan, Venkat; Yak, Nicolas; Pop, Mihaela; Wright, Graham A

    2015-12-01

    The purpose of this study is to improve the accuracy of interventional catheter guidance during intracardiac procedures. Specifically, the use of preprocedural magnetic resonance roadmap images for interventional guidance has limited anatomical accuracy due to intraprocedural respiratory motion of the heart. Therefore, we propose to build a novel respiratory motion model to compensate for this motion-induced error during magnetic resonance imaging (MRI)-guided procedures. We acquire 2-D real-time free-breathing images to characterize the respiratory motion, and build a smooth motion model via registration of 3-D prior roadmap images to the real-time images within a novel principal axes frame of reference. The model is subsequently used to correct the interventional catheter positions with respect to the anatomy of the heart. We demonstrate that the proposed modeling framework can lead to smoother motion models, and potentially lead to more accurate motion estimates. Specifically, MRI-guided intracardiac ablations were performed in six preclinical animal experiments. Then, from retrospective analysis, the proposed motion modeling technique showed the potential to achieve a 27% improvement in ablation targeting accuracy. The feasibility of a respiratory motion model-based correction framework has been successfully demonstrated. The improvement in ablation accuracy may lead to significant improvements in success rate and patient outcomes for MRI-guided intracardiac procedures.

  8. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics.

    PubMed

    Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin

    2016-09-02

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberration varying with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by some low-order Lukosz modes. To optimize the dynamic correction, we can correct only the dominant Lukosz modes, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window at a scanning rate of 10 degrees per second, and a 52-channel DM is used for correction. For a 128 × 128 image, the mean image sharpness during dynamic correction is 1.436 × 10−5 with optimized correction and 1.427 × 10−5 with un-optimized correction. We also demonstrate that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method.

  9. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics

    PubMed Central

    Dong, Bing; Li, Yan; Han, Xin-li; Hu, Bin

    2016-01-01

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberration varying with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by some low-order Lukosz modes. To optimize the dynamic correction, we can correct only the dominant Lukosz modes, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window at a scanning rate of 10 degrees per second, and a 52-channel DM is used for correction. For a 128 × 128 image, the mean image sharpness during dynamic correction is 1.436 × 10−5 with optimized correction and 1.427 × 10−5 with un-optimized correction. We also demonstrate that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method. PMID:27598161
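
    A sketch of the kind of metric involved: the low-spatial-frequency content of the image power spectrum, which for small aberrations peaks at the aberration-free setting. The disk radius, normalization, and parabolic update are assumptions of this sketch, not the authors' exact definitions:

        import numpy as np

        def low_freq_metric(img, radius=8):
            """Fraction of image spectral density inside a low-frequency disk."""
            psd = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
            cy, cx = img.shape[0] // 2, img.shape[1] // 2
            y, x = np.ogrid[:img.shape[0], :img.shape[1]]
            mask = (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2
            return psd[mask].sum() / psd.sum()

        def estimate_mode(apply_and_measure, bias):
            """Parabolic estimate of the optimal coefficient of one mode from
            metric readings at -bias, 0, +bias (standard model-based WSAO step)."""
            m_minus, m_zero, m_plus = (apply_and_measure(b) for b in (-bias, 0.0, bias))
            denom = m_plus - 2.0 * m_zero + m_minus
            return 0.5 * bias * (m_minus - m_plus) / denom if denom != 0 else 0.0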

  10. An efficient method for transfer cross coefficient approximation in model based optical proximity correction

    NASA Astrophysics Data System (ADS)

    Sabatier, Romuald; Fossati, Caroline; Bourennane, Salah; Di Giacomo, Antonio

    2008-10-01

    Model-Based Optical Proximity Correction (MBOPC) has for a decade been a widely used technique that permits resolutions on silicon smaller than the wavelength used in commercially available photolithography tools. This is an important point, because mask dimensions are continuously shrinking. For current masks, several billion segments have to be moved, and several iterations are needed to reach convergence. Therefore, fast and accurate algorithms are mandatory to perform OPC on a mask in a reasonably short time for industrial purposes. As imaging with an optical lithography system is similar to microscopy, the theory used in MBOPC is drawn from work originally conducted on the theory of microscopy. Fourier optics was first developed by Abbe to describe the image formed by a microscope and is often referred to as the Abbe formulation. It is one of the best methods for optimizing illumination and is used in most commercially available lithography simulation packages. The Hopkins method, developed later in 1951, is the best method for mask optimization. Consequently, the Hopkins formulation, widely used for partially coherent illumination and thus for lithography, is present in most commercially available OPC tools. This formulation has the advantage of a four-way transmission function independent of the mask layout. The values of this function, called Transfer Cross Coefficients (TCC), describe the illumination and projection pupils. Commonly used algorithms that involve the TCC of the Hopkins formulation to compute aerial images during MBOPC are based on decomposing the TCC into its eigenvectors using matricization and the well-known Singular Value Decomposition (SVD). These techniques, which rely on numerical approximation and empirical determination of the number of eigenvectors retained, may not match reality and lead to information loss. They also remain highly runtime-consuming. We propose an
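
    The decomposition the abstract refers to can be sketched on a toy Hermitian matrix standing in for a matricized TCC (a real TCC would be built from the illumination and projection pupils, not random data):

        import numpy as np

        n = 16
        rng = np.random.default_rng(0)
        A = rng.normal(size=(n * n, n * n))
        tcc = A @ A.T                      # Hermitian PSD stand-in for TCC(f, f')

        # Eigen-decompose; each leading eigenvector becomes a convolution kernel,
        # and the aerial image is approximated by a truncated sum of
        # eigval_k * |mask (*) kernel_k|^2 terms.
        eigvals, eigvecs = np.linalg.eigh(tcc)
        order = np.argsort(eigvals)[::-1]
        k = 10                             # empirically chosen truncation rank
        kernels = [eigvecs[:, i].reshape(n, n) for i in order[:k]]
        energy = eigvals[order[:k]].sum() / eigvals.clip(min=0).sum()
        print(f"energy retained with {k} kernels: {energy:.3f}")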

  11. Model based correction of placement error in EBL and its verification

    NASA Astrophysics Data System (ADS)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-05-01

    In maskmaking, the main source of error contributing to placement error is charging. DISPLACE software corrects the placement error for any layout, based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effect corrections. The output of the software is the data for placement correction. One important step is the calibration of the physical model; a test layout on a single calibration mask was used for calibration, and the extracted model parameters were used to verify the correction. As an ultimate test, a sophisticated layout very different from the calibration mask was used for the verification. The placement correction results were predicted by DISPLACE. A good correlation between the measured and predicted values of the correction confirmed the high accuracy of the charging placement error correction.

  12. Evaluation of metal artifacts in MVCT systems using a model based correction method.

    PubMed

    Paudel, M R; Mackenzie, M; Fallone, B G; Rathee, S

    2012-10-01

    To evaluate the performance of a model-based image reconstruction method in reducing metal artifacts in the megavoltage computed tomography (MVCT) images of a phantom representing bilateral hip prostheses, and to compare it with the filtered-backprojection (FBP) technique. An iterative maximum likelihood polychromatic algorithm for CT (IMPACT) is used with an additional model for the pair/triplet production process and the energy-dependent response of the detectors. The beam spectra for an in-house bench-top MVCT and the TomoTherapy™ MVCT are modeled for use in IMPACT. The empirical energy-dependent response of the detectors is calculated using a constrained optimization technique that predicts the measured attenuation of the beam by various thicknesses (0-24 cm) of solid water slabs. A cylindrical (19.1 cm diameter) plexiglass phantom containing various cylindrical inserts of relative electron densities 0.295-1.695, positioned between two steel rods (2.7 cm diameter), is scanned in the bench-top MVCT, which utilizes the bremsstrahlung radiation from a 6 MeV electron beam passed through 4 cm of solid water on the Varian Clinac 2300C, and in the imaging beam of the TomoTherapy™ MVCT. The FBP technique in the bench-top MVCT reconstructs images from the raw signal normalized to an air scan and corrected for beam hardening using a uniform plexiglass cylinder (20 cm diameter). IMPACT starts with an FBP-reconstructed seed image and reconstructs the final image in 150 iterations. In both MVCTs, FBP produces visible dark shading in the image connecting the steel rods. In the IMPACT-reconstructed images this shading is nearly removed and the uniform background is restored. The average attenuation coefficients of the inserts and the background are very close to the corresponding values in the absence of the steel inserts. In the FBP images of the bench-top MVCT, the shading causes 4%-9.5% underestimation of electron density at the central inserts, with an average of (6.3 ± 1.8)% for the range of

  13. Sandmeier model based topographic correction to lunar spectral profiler (SP) data from KAGUYA satellite.

    PubMed

    Chen, Sheng-Bo; Wang, Jing-Ran; Guo, Peng-Ju; Wang, Ming-Chang

    2014-09-01

    The Moon may be considered the frontier base for deep space exploration, and spectral analysis is one of the key techniques for determining the rock and mineral compositions of the lunar surface. However, the lunar topographic relief is more pronounced than that of the Earth, so it is necessary to apply a topographic correction to lunar spectral data before they are used to retrieve compositions. In the present paper, a lunar Sandmeier model was proposed by considering the radiance effect of the macro and ambient topographic relief, and the reflectance correction model was also derived from the Sandmeier model. Spectral Profiler (SP) data from the KAGUYA satellite over the Sinus Iridum quadrangle were taken as an example, and digital elevation data from the Lunar Orbiter Laser Altimeter were used to calculate the slope, aspect, incidence and emergence angles, and terrain-viewing factor for the topographic correction. The lunar surface reflectance from the SP data was then corrected by the proposed model after the direct component of irradiance on a horizontal surface was derived. As a result, high spectral reflectance on slopes facing the sun is decreased and low spectral reflectance on slopes facing away from the sun is compensated. The statistical histogram of corrected-reflectance pixel counts presents a Gaussian distribution. The model is therefore robust for correcting the lunar topographic effect and estimating lunar surface reflectance.
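
    The geometric core of such a correction: normalize reflectance by the cosine of the local solar incidence angle computed from slope and aspect. This is a simplified cosine-style sketch; the full Sandmeier model adds diffuse-irradiance and terrain-view terms:

        import numpy as np

        def local_incidence_cos(slope, aspect, sun_zenith, sun_azimuth):
            """Cosine of the solar incidence angle on a tilted facet (radians)."""
            return (np.cos(sun_zenith) * np.cos(slope)
                    + np.sin(sun_zenith) * np.sin(slope)
                    * np.cos(sun_azimuth - aspect))

        def topo_correct(reflectance, slope, aspect, sun_zenith, sun_azimuth,
                         eps=1e-3):
            """Rescale to an equivalent horizontal surface, clamping grazing
            geometries to avoid division blow-up."""
            cos_i = np.clip(local_incidence_cos(slope, aspect, sun_zenith,
                                                sun_azimuth), eps, 1.0)
            return reflectance * np.cos(sun_zenith) / cos_i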

  14. Retrospective study comparing model-based deformation correction to intraoperative magnetic resonance imaging for image-guided neurosurgery.

    PubMed

    Luo, Ma; Frisken, Sarah F; Weis, Jared A; Clements, Logan W; Unadkat, Prashin; Thompson, Reid C; Golby, Alexandra J; Miga, Michael I

    2017-07-01

    Brain shift during tumor resection compromises the spatial validity of registered preoperative imaging data that is critical to image-guided procedures. One current clinical solution to mitigate the effects is to reimage using intraoperative magnetic resonance (iMR) imaging. Although iMR has demonstrated benefits in accounting for preoperative-to-intraoperative tissue changes, its cost and encumbrance have limited its widespread adoption. While iMR will likely continue to be employed for challenging cases, a cost-effective model-based brain shift compensation strategy is desirable as a complementary technology for standard resections. We performed a retrospective study of [Formula: see text] tumor resection cases, comparing iMR measurements with intraoperative brain shift compensation predicted by our model-based strategy, driven by sparse intraoperative cortical surface data. For quantitative assessment, homologous subsurface targets near the tumors were selected on preoperative MR and iMR images. Once rigidly registered, intraoperative shift measurements were determined and subsequently compared to model-predicted counterparts as estimated by the brain shift correction framework. When considering moderate and high shift ([Formula: see text], [Formula: see text] measurements per case), the alignment error due to brain shift reduced from [Formula: see text] to [Formula: see text], representing [Formula: see text] correction. These first steps toward validation are promising for model-based strategies.

  15. Shape-dependent dose margin correction using model-based mask data preparation

    NASA Astrophysics Data System (ADS)

    Kimura, Yasuki; Yamamoto, Ryuuji; Kubota, Takao; Kouno, Kenji; Matsushita, Shohei; Hagiwara, Kazuyuki; Hara, Daisuke

    2012-11-01

    Dose margin has long been known to be a critical factor in mask making. This paper describes why the issue is far more critical than ever before for the 20-nm logic node and beyond using ArF immersion lithography. Model-Based Mask Data Preparation (MB-MDP) has previously been presented [references] as a way to reduce shot count for these complex masks. This paper shows that MB-MDP also improves the dose margin. The improvement predicted by theoretical simulation with D2S is confirmed by results from a real mask written by HOYA on a JEOL JBX-3200MV.

  16. Correcting encoder interpolation error on the Green Bank Telescope using an iterative model based identification algorithm

    NASA Astrophysics Data System (ADS)

    Franke, Timothy; Weadon, Tim; Ford, John; Garcia-Sanz, Mario

    2015-10-01

    Various forms of measurement error limit telescope tracking performance in practice. A new method for identifying the correcting coefficients for encoder interpolation error is developed. The algorithm corrects the encoder measurement by identifying a harmonic model of the system and using that model to compute the necessary correction parameters. The approach improves upon others by explicitly modeling the unknown dynamics of the structure and controller and by not requiring a separate system identification to be performed. Experience gained from pinpointing the source of encoder error on the Green Bank Radio Telescope (GBT) is presented, and several tell-tale indicators of encoder error are discussed. Experimental data from the telescope, tested with two different encoders, are presented. Demonstration of the identification methodology on the GBT, as well as details of its implementation, are discussed. Root mean square tracking error was reduced from 0.68 to 0.21 arc seconds by changing encoders, and further to 0.10 arc seconds with the calibration algorithm. In particular, the ubiquity of this error source is shown, as is how careful correction makes it possible to go beyond the advertised accuracy of an encoder.
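
    The identification step amounts to fitting a harmonic series to the periodic encoder error and subtracting it from subsequent readings; a sketch assuming a simple least-squares fit (the harmonic orders are illustrative):

        import numpy as np

        def fit_interpolation_correction(phase, error, harmonics=(1, 2, 4)):
            """Fit error(phase) as a sum of sin/cos terms periodic in the encoder
            line pitch; return a function that corrects raw phase readings."""
            def basis(p):
                cols = [np.ones_like(p)]
                for h in harmonics:
                    cols += [np.sin(h * p), np.cos(h * p)]
                return np.column_stack(cols)

            coef, *_ = np.linalg.lstsq(basis(phase), error, rcond=None)
            return lambda raw: raw - basis(raw) @ coef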

  17. Model-based correction of velocity measurements in navigated 3-D ultrasound imaging during neurosurgical interventions.

    PubMed

    Iversen, Daniel Hoyer; Lindseth, Frank; Unsgaard, Geirmund; Torp, Hans; Lovstakken, Lasse

    2013-09-01

    In neurosurgery, information on blood flow is important to identify and avoid damage to important vessels. Three-dimensional intraoperative ultrasound color-Doppler imaging has proven useful in this respect. However, due to Doppler angle dependencies and the complexity of the vascular architecture, clinically valuable 3-D information on flow direction and velocity is currently not available. In this work, we aim to correct for angle dependencies in 3-D flow images based on a geometric model of the neurovascular tree generated on-the-fly from free-hand 2-D imaging and an accurate position sensor system. The 3-D vessel model acts as a priori information on vessel orientation used to angle-correct the Doppler measurements, as well as to provide an estimate of the average flow direction. Based on the flow direction we were also able to perform aliasing correction, approximately doubling the measurable velocity range. In vitro experiments revealed high accuracy and robustness in estimating the mean direction of flow. Accurate angle correction of axial velocities was possible given a sufficient beam-to-flow angle for at least parts of a vessel segment. In vitro experiments showed an absolute relative bias of 9.5% for a challenging low-flow scenario. The method also showed promising results in vivo, improving the depiction of flow in the distal branches of intracranial aneurysms and the feeding arteries of an arteriovenous malformation. Careful inspection by an experienced surgeon confirmed the correct flow direction for all in vivo examples.
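
    The angle-correction step itself is simple once the 3-D vessel model supplies a local flow direction; a sketch, with the angle cutoff as an assumed parameter:

        import numpy as np

        def angle_correct(v_axial, beam_dir, vessel_dir, max_angle_deg=70.0):
            """Divide the axial Doppler velocity by cos(beam-to-flow angle),
            rejecting near-perpendicular geometries where correction blows up."""
            b = beam_dir / np.linalg.norm(beam_dir)
            v = vessel_dir / np.linalg.norm(vessel_dir)
            cos_theta = float(np.dot(b, v))
            if np.degrees(np.arccos(abs(cos_theta))) > max_angle_deg:
                return np.nan              # insufficient beam-to-flow angle
            return v_axial / cos_theta

        print(angle_correct(0.30, np.array([0.0, 0.0, 1.0]),
                            np.array([0.0, 0.5, 0.8])))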

  18. A model-based scatter artifacts correction for cone beam CT

    SciTech Connect

    Zhao, Wei; Zhu, Jun; Wang, Luyao; Vernekohl, Don; Xing, Lei

    2016-04-15

    Purpose: Due to the increased axial coverage of multislice computed tomography (CT) and the introduction of flat detectors, the size of x-ray illumination fields has grown dramatically, causing an increase in scatter radiation. For CT imaging, scatter is a significant issue that introduces shading artifact, streaks, as well as reduced contrast and Hounsfield Units (HU) accuracy. The purpose of this work is to provide a fast and accurate scatter artifacts correction algorithm for cone beam CT (CBCT) imaging. Methods: The method starts with an estimation of coarse scatter profiles for a set of CBCT data in either image domain or projection domain. A denoising algorithm designed specifically for Poisson signals is then applied to derive the final scatter distribution. Qualitative and quantitative evaluations using thorax and abdomen phantoms with Monte Carlo (MC) simulations, experimental Catphan phantom data, and in vivo human data acquired for a clinical image guided radiation therapy were performed. Scatter correction in both projection domain and image domain was conducted and the influences of segmentation method, mismatched attenuation coefficients, and spectrum model as well as parameter selection were also investigated. Results: Results show that the proposed algorithm can significantly reduce scatter artifacts and recover the correct HU in either projection domain or image domain. For the MC thorax phantom study, four-components segmentation yields the best results, while the results of three-components segmentation are still acceptable. The parameters (iteration number K and weight β) affect the accuracy of the scatter correction and the results get improved as K and β increase. It was found that variations in attenuation coefficient accuracies only slightly impact the performance of the proposed processing. For the Catphan phantom data, the mean value over all pixels in the residual image is reduced from −21.8 to −0.2 HU and 0.7 HU for projection

  19. A model-based scatter artifacts correction for cone beam CT

    PubMed Central

    Zhao, Wei; Vernekohl, Don; Zhu, Jun; Wang, Luyao; Xing, Lei

    2016-01-01

    Purpose: Due to the increased axial coverage of multislice computed tomography (CT) and the introduction of flat detectors, the size of x-ray illumination fields has grown dramatically, causing an increase in scatter radiation. For CT imaging, scatter is a significant issue that introduces shading artifact, streaks, as well as reduced contrast and Hounsfield Units (HU) accuracy. The purpose of this work is to provide a fast and accurate scatter artifacts correction algorithm for cone beam CT (CBCT) imaging. Methods: The method starts with an estimation of coarse scatter profiles for a set of CBCT data in either image domain or projection domain. A denoising algorithm designed specifically for Poisson signals is then applied to derive the final scatter distribution. Qualitative and quantitative evaluations using thorax and abdomen phantoms with Monte Carlo (MC) simulations, experimental Catphan phantom data, and in vivo human data acquired for a clinical image guided radiation therapy were performed. Scatter correction in both projection domain and image domain was conducted and the influences of segmentation method, mismatched attenuation coefficients, and spectrum model as well as parameter selection were also investigated. Results: Results show that the proposed algorithm can significantly reduce scatter artifacts and recover the correct HU in either projection domain or image domain. For the MC thorax phantom study, four-components segmentation yields the best results, while the results of three-components segmentation are still acceptable. The parameters (iteration number K and weight β) affect the accuracy of the scatter correction and the results get improved as K and β increase. It was found that variations in attenuation coefficient accuracies only slightly impact the performance of the proposed processing. For the Catphan phantom data, the mean value over all pixels in the residual image is reduced from −21.8 to −0.2 HU and 0.7 HU for projection

  20. Effects of model-based physiological noise correction on default mode network anti-correlations and correlations.

    PubMed

    Chang, Catie; Glover, Gary H

    2009-10-01

    Previous studies have reported that the spontaneous, resting-state time course of the default-mode network is negatively correlated with that of the "task-positive network", a collection of regions commonly recruited in demanding cognitive tasks. However, all studies of negative correlations between the default-mode and task-positive networks have employed some form of normalization or regression of the whole-brain average signal ("global signal"); these processing steps alter the time series of voxels in an uninterpretable manner and introduce spurious negative correlations. Thus, the extent of negative correlations with the default-mode network without global signal removal has not been well characterized, and it has recently been hypothesized that the apparent negative correlations in many of the task-positive regions could be artifactually induced by global-signal pre-processing. The present study aimed to examine negative and positive correlations with the default-mode network when model-based corrections for respiratory and cardiac noise are applied in lieu of global signal removal. Physiological noise correction consisted of (1) removal of time-locked cardiac and respiratory artifacts using RETROICOR (Glover, G.H., Li, T.Q., Ress, D., 2000. Image-based method for retrospective correction of physiological motion effects in fMRI: RETROICOR. Magn. Reson. Med. 44, 162-167), and (2) removal of low-frequency respiratory and heart rate variations by convolving these waveforms with pre-determined transfer functions (Birn et al., 2008; Chang et al., 2009) and projecting the resulting two signals out of the data. It is demonstrated that negative correlations between the default-mode network and regions of the task-positive network are present in the majority of individual subjects both with and without physiological noise correction. Physiological noise correction increased the spatial extent and magnitude of negative correlations, yielding negative
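
    The "projecting out" step is ordinary linear regression of each voxel time series on the physiological waveforms; a generic sketch (not the RETROICOR basis construction itself):

        import numpy as np

        def project_out(data, regressors):
            """Remove nuisance time courses from voxel data by regression.
            data: (time, voxels); regressors: (time, k) physiological signals."""
            X = np.column_stack([np.ones(len(data)), regressors])
            beta, *_ = np.linalg.lstsq(X, data, rcond=None)
            return data - X[:, 1:] @ beta[1:]   # keep the mean, drop nuisance fits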

  1. Model-based Corrections to Observed Azimuth and Slowness Deviations from a Dipping Mohorovicic Discontinuity

    NASA Astrophysics Data System (ADS)

    Flanagan, M. P.; Myers, S.; Simmons, N. A.

    2012-12-01

    Back azimuth and slowness anomalies observed at seismic arrays can be used to constrain local and distant structural and propagation effects in the Earth. Observations of large systematic deviations in both azimuth and slowness measured for several P phases (i.e., Pg, Pn, P, PKP) recorded at several IMS arrays show a characteristic sinusoidal pattern when plotted as a function of theoretical back azimuth. These deviations are often interpreted as the effect of the wavefield being systematically bent by refraction from a dipping velocity structure beneath the array, most likely a dipping Moho. We develop a model-based technique that simultaneously fits back azimuth and slowness observations with a ray-based prediction that incorporates a dipping layer defined by its strike and dip. Because the azimuth and slowness deviations both vary as a function of true azimuth, fitting both residuals jointly gives a more consistent calibration for the array. The technique is used to fit over 9900 observations at CMAR from a global distribution of well-located seismic events. Under the assumption that the dipping layer is the Moho, with a mantle velocity of 8.04 km/sec and a crustal velocity of 6.2 km/sec, we estimate that the Moho strike and dip under the CMAR array are 192.6° and 18.3°, respectively. When the trend of the Moho is removed from the back azimuth and slowness residuals, both the sinusoidal trend and the variations with predicted slowness are mitigated. While a dipping interface model does not account for all of the discrepancy between observed and predicted back azimuth and slowness anomalies, and additional calibration, whether empirical or model-based, should be pursued, this technique is a good first step in the calibration procedure for arrays exhibiting sinusoidal residual trends.
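
    The first stage of such a calibration can be sketched as a least-squares sinusoid fit to the residuals; relating the fitted amplitude and phase to dip and strike then requires the ray-based prediction described above (this toy fit is an assumption, not the authors' code):

        import numpy as np

        def fit_sinusoid(back_azimuth_deg, deviations):
            """Fit dev(az) = a*sin(az) + b*cos(az) + c; the fitted phase relates
            to the interface strike and the amplitude to its dip."""
            az = np.radians(back_azimuth_deg)
            A = np.column_stack([np.sin(az), np.cos(az), np.ones_like(az)])
            (a, b, c), *_ = np.linalg.lstsq(A, deviations, rcond=None)
            return np.hypot(a, b), np.degrees(np.arctan2(b, a)) % 360.0, c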

  2. Probabilistic model based error correction in a set of various mutant sequences analyzed by next-generation sequencing.

    PubMed

    Aita, Takuyo; Ichihashi, Norikazu; Yomo, Tetsuya

    2013-12-01

    To analyze the evolutionary dynamics of a mutant population in an evolutionary experiment, it is necessary to sequence a vast number of mutants by high-throughput (next-generation) sequencing technologies, which enable rapid and parallel analysis of multikilobase sequences. However, the observed sequences include many base-call errors. Therefore, if next-generation sequencing is applied to the analysis of a heterogeneous population of various mutant sequences, it is necessary to discriminate between true bases representing point mutations and base-call errors in the observed sequences, and to subject the sequences to error-correction processes. To address this issue, we have developed a novel method of error correction based on the Potts model and a maximum a posteriori probability (MAP) estimate of its parameters corresponding to the "true sequences". Our method of error correction utilizes (1) the "quality scores" assigned to individual bases in the observed sequences and (2) the neighborhood relationships among the observed sequences mapped in sequence space. Computer experiments on error correction of artificially generated sequences supported the effectiveness of our method, showing that 50-90% of errors were removed. Interestingly, this method is analogous to a probabilistic model-based method of image restoration developed in the field of information engineering.
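
    As a much-simplified stand-in for the Potts-model MAP estimate, one can re-call each base by quality-weighted majority vote over a read's nearest neighbors in sequence space; this toy illustrates only the role of the two ingredients (quality scores and neighborhood structure), not the paper's probabilistic formulation:

        import numpy as np

        def consensus_correct(reads, quals, neighbor_idx):
            """reads: (n, L) integer base codes 0-3; quals: matching quality
            scores; neighbor_idx[i]: indices of read i's neighbors."""
            reads = np.asarray(reads)
            corrected = reads.copy()
            for i, nbrs in enumerate(neighbor_idx):
                group = np.concatenate(([i], nbrs))
                for j in range(reads.shape[1]):
                    votes = np.zeros(4)
                    for g in group:
                        votes[reads[g, j]] += quals[g, j]
                    corrected[i, j] = int(votes.argmax())
            return corrected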

  3. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror

    PubMed Central

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-01-01

    In this paper, the actuation behavior of a two-axis electrothermal MEMS (microelectromechanical systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that a difference in actuation characteristics of as much as 4.0% exists among the four actuators due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open-loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror depends solely on the model prediction and does not need real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system. PMID:26690432

  4. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror.

    PubMed

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-12-10

    In this paper, the actuation behavior of a two-axis electrothermal MEMS (microelectromechanical systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that a difference in actuation characteristics of as much as 4.0% exists among the four actuators due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open-loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror depends solely on the model prediction and does not need real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system.
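
    If the identified model is linearized about an operating point, the open-loop control reduces to inverting a 4x4 gain matrix; the numbers below are invented to mimic the reported few-percent actuator mismatch, and the real model is thermal and generally nonlinear:

        import numpy as np

        # Illustrative gain matrix: actuator-end displacement per unit input,
        # with slight asymmetries standing in for process variations.
        G = np.array([[1.00, 0.05, 0.02, 0.05],
                      [0.05, 0.97, 0.05, 0.02],
                      [0.02, 0.05, 1.03, 0.05],
                      [0.05, 0.02, 0.05, 0.99]])

        def open_loop_inputs(z_target):
            """Solve the 4-input/4-output model for inputs that produce the
            desired vertical positions of the four actuator ends."""
            return np.linalg.solve(G, z_target)

        print(open_loop_inputs(np.array([1.0, 1.0, -1.0, -1.0])))  # tilt command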

  5. A fully model-based MPC solution including VSB shot dose assignment and shape correction

    NASA Astrophysics Data System (ADS)

    Bork, Ingo; Buck, Peter; Reddy, Murali; Durvasula, Bhardwaj

    2015-10-01

    The value of using multiple dose levels for individual shots on VSB (Variable Shaped Beam) mask writers has been demonstrated earlier [1][2]. The main advantage of modulating dose on a per shot basis is the fact that higher dose levels can be used selectively for critical features while other areas of the mask with non-critical feature types can be exposed at lower dose levels. This reduces the amount of backscattering and mask write time penalty compared to a global overdose-undersize approach. While dose assignment to certain polygons or parts of polygons (VSB shots) can easily be accomplished via DRC rules on layers with limited shape variations like contact or VIA layers, it can be challenging to come up with consistent rules for layers consisting of a very broad range of shapes, generally found on metal layers. This work introduces a method for fully model-based modulation of shot dose for VSB machines supporting between two and eight dose levels and demonstrates results achieved with this method.

  6. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking

    PubMed Central

    Lee, Min Su; Ju, Hojin; Song, Jin Woo; Park, Chan Gook

    2015-01-01

    In this paper, we present a method for finding the enhanced heading and position of pedestrians by fusing Zero velocity UPdaTe (ZUPT)-based pedestrian dead reckoning (PDR) with the kinematic constraints of the lower human body. ZUPT is a well-known algorithm for PDR and provides a sufficiently accurate position solution over short periods, but it cannot guarantee a stable and reliable heading because it suffers from magnetic disturbance in determining heading angles, which degrades the overall position accuracy as time passes. The basic idea of the proposed algorithm is to integrate the left and right foot positions obtained by ZUPTs with the heading and position information from an IMU mounted on the waist. To integrate this information, a kinematic model of the lower human body, calculated using orientation sensors mounted on both thighs and calves, is adopted. We note that the positions of the left and right feet cannot be far apart because of the kinematic constraints of the body, so the kinematic model generates new measurements for the waist position. An Extended Kalman Filter (EKF) on the waist data estimates and corrects error states using these measurements together with magnetic heading measurements, which enhances the heading accuracy. The updated position information is fed into the foot-mounted sensors, and re-update processes are performed to correct the position error of each foot. The proposed update-reupdate technique consequently ensures improved observability of error states and position accuracy. Moreover, the proposed method provides all the information about the lower human body, so that it can be applied more effectively to motion tracking. The effectiveness of the proposed algorithm is verified via experimental results, which show that a 1.25% Return Position Error (RPE) with respect to walking distance is achieved. PMID:26561814

  7. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking.

    PubMed

    Lee, Min Su; Ju, Hojin; Song, Jin Woo; Park, Chan Gook

    2015-11-06

    In this paper, we present a method for finding the enhanced heading and position of pedestrians by fusing Zero velocity UPdaTe (ZUPT)-based pedestrian dead reckoning (PDR) with the kinematic constraints of the lower human body. ZUPT is a well-known algorithm for PDR and provides a sufficiently accurate position solution over short periods, but it cannot guarantee a stable and reliable heading because it suffers from magnetic disturbance in determining heading angles, which degrades the overall position accuracy as time passes. The basic idea of the proposed algorithm is to integrate the left and right foot positions obtained by ZUPTs with the heading and position information from an IMU mounted on the waist. To integrate this information, a kinematic model of the lower human body, calculated using orientation sensors mounted on both thighs and calves, is adopted. We note that the positions of the left and right feet cannot be far apart because of the kinematic constraints of the body, so the kinematic model generates new measurements for the waist position. An Extended Kalman Filter (EKF) on the waist data estimates and corrects error states using these measurements together with magnetic heading measurements, which enhances the heading accuracy. The updated position information is fed into the foot-mounted sensors, and re-update processes are performed to correct the position error of each foot. The proposed update-reupdate technique consequently ensures improved observability of error states and position accuracy. Moreover, the proposed method provides all the information about the lower human body, so that it can be applied more effectively to motion tracking. The effectiveness of the proposed algorithm is verified via experimental results, which show that a 1.25% Return Position Error (RPE) with respect to walking distance is achieved.
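
    The ZUPT half of the pipeline hinges on detecting stance phases from inertial data; a common thresholding sketch (the thresholds are illustrative and in practice tuned per sensor and gait):

        import numpy as np

        def zupt_mask(accel, gyro, g=9.81, acc_tol=0.3, gyro_tol=0.2, win=5):
            """Flag samples where specific force is near gravity and angular
            rate is low for a whole window: candidate zero-velocity updates."""
            acc_ok = np.abs(np.linalg.norm(accel, axis=1) - g) < acc_tol
            gyro_ok = np.linalg.norm(gyro, axis=1) < gyro_tol
            raw = (acc_ok & gyro_ok).astype(float)
            # require the full window to agree, suppressing one-sample hits
            return np.convolve(raw, np.ones(win), mode="same") >= win - 0.5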

  8. A systematic review of model-based economic evaluations of diagnostic and therapeutic strategies for lower extremity artery disease.

    PubMed

    Vaidya, Anil; Joore, Manuela A; ten Cate-Hoek, Arina J; Kleinegris, Marie-Claire; ten Cate, Hugo; Severens, Johan L

    2014-01-01

    Lower extremity artery disease (LEAD) is a sign of widespread atherosclerosis that also affects the coronary, cerebral, and renal arteries and is associated with increased risk of cardiovascular events. Many economic evaluations have been published for LEAD due to its clinical, social, and economic importance. The aim of this systematic review was to assess the modelling methods used in published economic evaluations in the field of LEAD. Our review appraised and compared the general characteristics, model structure, and methodological quality of published models. The electronic databases MEDLINE and EMBASE were searched until February 2013 via the OVID interface. The Cochrane Database of Systematic Reviews, the Health Technology Assessment database hosted by the National Institute for Health Research, and the National Health Service Economic Evaluation Database (NHS EED) were also searched. The methodological quality of the included studies was assessed using the Philips checklist. Sixteen model-based economic evaluations were identified and included. Eleven models compared therapeutic health technologies, three compared diagnostic tests, and two compared a combination of diagnostic and therapeutic options for LEAD. The results of this systematic review revealed an acceptable to low methodological quality of the included studies. Methodological diversity and insufficient information posed a challenge for valid comparison of the included studies. In conclusion, there is a need for transparent, methodologically comparable, and scientifically credible model-based economic evaluations in the field of LEAD. Future modelling studies should include clinically and economically important cardiovascular outcomes to reflect the wider impact of LEAD on individual patients and on society.

  9. A model-based correction for outcome reporting bias in meta-analysis.

    PubMed

    Copas, John; Dwan, Kerry; Kirkham, Jamie; Williamson, Paula

    2014-04-01

    It is often suspected (or known) that outcomes published in medical trials are selectively reported. A systematic review for a particular outcome of interest can only include studies where that outcome was reported, and so may omit, for example, a study that has considered several outcome measures but only reports those giving significant results. Using the methodology of the Outcome Reporting Bias (ORB) in Trials study (Kirkham and others, 2010, The impact of outcome reporting bias in randomised controlled trials on a cohort of systematic reviews, British Medical Journal 340, c365), we suggest a likelihood-based model for estimating the effect of ORB on confidence intervals and p-values in meta-analysis. Correcting for bias has the effect of moving estimated treatment effects toward the null and hence gives more cautious assessments of significance. The bias can be very substantial, sometimes sufficient to completely overturn previous claims of significance. We re-analyze two contrasting examples and derive a simple fixed-effects approximation that can be used to give an initial estimate of the effect of ORB in practice.

  10. A photon counting detector model based on increment matrices to simulate statistically correct detector signals

    NASA Astrophysics Data System (ADS)

    Faby, Sebastian; Maier, Joscha; Simons, David; Schlemmer, Heinz-Peter; Lell, Michael; Kachelrieß, Marc

    2015-03-01

    We present a novel increment matrix concept to simulate the correlations in an energy-selective photon counting detector. Correlations between the energy bins of neighboring detector pixels are introduced by scattered and fluorescence photons, together with the broadening of the induced charge clouds as they travel towards the electrodes, leading to charge sharing. It is important to generate statistically correct detector signals for the different energy bins to be able to realistically assess the detector's performance in various tasks, e.g., material decomposition. Our increment matrix concept describes the counter increases in neighboring pixels at the single-event level. Advantages of our model are that far fewer random numbers are required than when simulating single photons, and that the increment matrices together with their probabilities have to be generated only once and can be stored for later use. The occurring increment matrix sets and their corresponding probabilities are simulated using an analytic model of the photon-matter interactions, based on the photoelectric effect and Compton scattering, and of the charge cloud drift, featuring thermal diffusion and Coulomb expansion of the charge cloud. The results obtained with this model are evaluated in terms of the spectral response for different detector geometries and the resulting energy bin sensitivity. Comparisons with published measured data and a parameterized detector model show good qualitative and quantitative agreement. We also studied the resulting covariance of reconstructed energy bin images.
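
    A toy version of the sampling idea: draw one precomputed increment matrix per incident photon and accumulate counters, rather than simulating the photon physics per event. The event types and probabilities below are invented placeholders, not values from the paper's analytic model:

        import numpy as np

        rng = np.random.default_rng(0)

        # Each increment matrix gives counter increases over a 3x3 pixel
        # neighborhood and 2 energy bins for one photon event type.
        inc = np.zeros((3, 3, 3, 2), dtype=int)
        inc[0, 1, 1, 1] = 1          # full absorption -> one high-bin count
        inc[1, 1, 1, 0] = 1          # charge loss -> counted in the low bin
        inc[2, 1, 1, 0] = 1          # charge sharing -> two low-bin counts
        inc[2, 1, 2, 0] = 1          #   split across neighboring pixels
        probs = np.array([0.6, 0.25, 0.15])   # stand-in event probabilities

        counts = np.zeros((3, 3, 2), dtype=int)
        for _ in range(10000):       # one cheap draw per incident photon
            counts += inc[rng.choice(3, p=probs)]
        print(counts[..., 0])        # low-bin counters, correlated across pixels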

  11. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities

    ERIC Educational Resources Information Center

    Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining high quality social climates in prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms, and peer correction of behavior contrary to TC norms, will lead to…

  12. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    PubMed

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  13. Efficient model-based dummy-fill OPC correction flow for deep sub-micron technology nodes

    NASA Astrophysics Data System (ADS)

    Hamouda, Ayman; Salama, Mohamed

    2014-09-01

    Dummy fill insertion is a necessary step in modern semiconductor technologies to achieve homogeneous pattern density per layer. This benefits several fabrication process steps, including but not limited to chemical mechanical polishing (CMP), etching, and packaging. As the technology keeps shrinking, fill shapes become more challenging to pattern and require aggressive model-based optical proximity correction (MBOPC) to achieve better design fidelity. MBOPC on fill is a challenge for mask data prep runtime and final mask shot count, which affect the total turnaround time (TAT) and mask cost. In our work, we introduce a novel flow that achieves a robust and computationally efficient fill-handling methodology during mask data prep, keeping both the runtime and shot count within acceptable levels. In this flow, fill shapes undergo a smart MBOPC step which improves the final wafer printing quality and topography uniformity without degrading the final shot count or the OPC cycle runtime. This flow is tested on both front-end-of-line (FEOL) and back-end-of-line (BEOL) layers, and results in improved final printing of the fill patterns while consuming less than 2% of the full MBOPC flow runtime.

  14. Validation of model-based brain shift correction in neurosurgery via intraoperative magnetic resonance imaging: preliminary results

    NASA Astrophysics Data System (ADS)

    Luo, Ma; Frisken, Sarah F.; Weis, Jared A.; Clements, Logan W.; Unadkat, Prashin; Thompson, Reid C.; Golby, Alexandra J.; Miga, Michael I.

    2017-03-01

    The quality of brain tumor resection surgery depends on the spatial agreement between the preoperative image and the intraoperative anatomy. However, brain shift compromises this alignment. Currently, the clinical standard for monitoring brain shift is intraoperative magnetic resonance (iMR). While iMR provides a better understanding of brain shift, its cost and encumbrance are considerations for medical centers. Hence, we are developing a model-based method that can be a complementary technology to address brain shift in standard resections, with resource-intensive cases as referrals for iMR facilities. Our strategy constructs a deformation 'atlas' containing potential deformation solutions derived from a biomechanical model that accounts for variables such as cerebrospinal fluid drainage and mannitol effects. Volumetric deformation is estimated with an inverse approach that determines the optimal combination of atlas solutions to best match measured surface deformation. Accordingly, the preoperative image is updated based on the computed deformation field. This study is the latest development in validating our methodology against iMR. Briefly, preoperative and intraoperative MR images of two patients were acquired. Homologous surface points were selected on preoperative and intraoperative scans as measurements of surface deformation and used to drive the inverse problem. To assess model accuracy, subsurface shift of targets between the preoperative and intraoperative states was measured and compared to the model prediction. Considering subsurface shift above 3 mm, the proposed strategy provides an average shift correction of 59% across the two cases. While further improvements in both the model and the ability to validate with iMR are desired, the reported results are encouraging.
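
    The inverse step can be sketched as a constrained least-squares fit of precomputed atlas columns to the sparse surface measurements; the matrix sizes and data below are synthetic placeholders, not the study's model:

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)
        # M: each column is one atlas deformation solution sampled at the
        # measured surface points (x/y/z stacked); d: measured displacements.
        M = rng.normal(size=(300, 12))   # 100 points x 3 components, 12 solutions
        d = M @ np.abs(rng.normal(size=12)) + rng.normal(scale=0.01, size=300)

        # Nonnegative weights keep the combination physically plausible; the
        # same weights then deform the full preoperative image volume.
        w, residual = nnls(M, d)
        print(w.round(2), residual)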

  16. Isolating Source Information in mb and Ms with Model-Based Corrections: New mb Versus Ms Discriminant Formulations

    DTIC Science & Technology

    2010-09-01

September 2010, Orlando, FL. Volume I, pp. 361-369. ABSTRACT: Path and signal processing corrections made to amplitudes give magnitudes mb and Ms... source correction in addition to path and signal processing corrections under the null (H0) hypothesis that a seismic event is a single-point fully... mathematically included in amplitude corrections. We develop a mathematical model to capture these near-source effects as random (unknown) giving an

  17. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities

    PubMed Central

    Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh

    2013-01-01

Therapeutic communities (TCs) have a strong record of maintaining a high-quality social climate on prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms and peer correction of behavior contrary to TC norms will increase residents' prosocial behavior. Laboratory experiments have demonstrated that such peer monitoring can lead to cooperation, but there has been no quantitative test of this hypothesis in an actual TC. In this article we test this assumption by using the affirmations that residents of three different TCs send, following receipt of peer affirmations and corrections, as a measure of prosocial behavior. At all three facilities residents send more affirmations after receiving both affirmations and corrections, with this relationship being stronger and longer lasting after receiving affirmations. No other variable consistently predicts the number of affirmations that residents send to peers. These findings imply that mutual monitoring among TC residents can increase prosocial behavior within the facility, and that prosocial behavior in response to peer affirmations plays a key role. PMID:23935258

  18. Evaluation of model-based deformation correction in image-guided liver surgery via tracked intraoperative ultrasound

    PubMed Central

    Clements, Logan W.; Collins, Jarrod A.; Weis, Jared A.; Simpson, Amber L.; Adams, Lauryn B.; Jarnagin, William R.; Miga, Michael I.

    2016-01-01

Abstract. Soft-tissue deformation represents a significant error source in current surgical navigation systems used for open hepatic procedures. While numerous algorithms have been proposed to rectify the tissue deformation encountered during open liver surgery, clinical validation of the proposed methods has been limited to surface-based metrics, and subsurface validation has largely been performed via phantom experiments. The proposed method analyzes two deformation-correction algorithms for open hepatic image-guided surgery systems via subsurface targets digitized with tracked intraoperative ultrasound (iUS). Intraoperative surface digitizations were acquired via a laser range scanner and an optically tracked stylus for the purposes of computing the physical-to-image space registration and for use in retrospective deformation-correction algorithms. Upon completion of surface digitization, the organ was interrogated with a tracked iUS transducer, and the iUS images and corresponding tracked locations were recorded. Mean closest-point distances between the feature contours delineated in the iUS images and the corresponding three-dimensional anatomical model generated from preoperative tomograms were computed to quantify the extent to which the deformation-correction algorithms improved registration accuracy. The results for six patients, comprising eight anatomical targets, indicate that deformation correction can reduce target error by ∼52%. PMID:27081664
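The mean closest-point distance used as the accuracy metric here is straightforward to compute once the iUS contours and the model surface are in a common space. A minimal sketch with hypothetical point arrays:

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_closest_point_distance(contour_pts, model_pts):
    """Mean distance (mm) from iUS feature-contour points to their nearest
    vertices on the preoperative anatomical model surface."""
    tree = cKDTree(model_pts)          # model surface vertices, (m, 3)
    d, _ = tree.query(contour_pts)     # nearest model point per contour point
    return d.mean()

# Computed before and after deformation correction, the drop in this
# distance quantifies the registration improvement.
```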

  19. Model-based mask data preparation (MB-MDP) for ArF and EUV mask process correction

    NASA Astrophysics Data System (ADS)

    Hagiwara, Kazuyuki; Bork, Ingo; Fujimura, Aki

    2011-05-01

Using model-based mask data preparation (MB-MDP), complex masks with complex sub-resolution assist features (SRAFs) can be written in practical write times on today's leading-edge production VSB machines by allowing overlapping VSB shots. This simulation-based approach reduces shot count by taking advantage of the added flexibility of overlapping shots. The freedom to overlap shots also increases mask fidelity, CDU on the mask, and CDU on the wafer by writing sub-100nm mask features more accurately and with better dose margin. This paper describes how overlapping shots enhance mask and wafer quality for various sub-100nm features on ArF masks. In addition, it describes how EUV mask accuracy can be uniquely enhanced by allowing overlapping shots.

  20. Better numerical model for shape-dependent dose margin correction using model-based mask data preparation

    NASA Astrophysics Data System (ADS)

    Kimura, Yasuki; Kubota, Takao; Kouno, Kenji; Hagiwara, Kazuyuki; Matsushita, Shohei; Hara, Daisuke

    2013-06-01

For the mask-making community, maintaining an acceptable dose margin has been recognized as a critical factor in the mask-making process, and this is expected to become more critical for 20nm logic node masks and beyond. To deal with this issue, model-based mask data preparation (MB-MDP) has been presented as a useful method to obtain sufficient dose margin for these complex masks, in addition to reducing shot count. When the MB-MDP approach is applied in actual mask production, prediction of the dose margin and the CD of the finished mask is essential. This paper describes an improved mask-process model which predicts dose margin and CD in finished masks better than the single-Gaussian model presented in previous work. The improved predictions of this simple numerical model are confirmed by D2S simulation and by an actual mask written by HOYA using a JEOL JBX-3200MV.
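Dose margin is essentially the edge slope of the simulated dose profile: the shallower the slope, the larger the CD change per unit dose error. A toy one-dimensional sketch (the definitions and the 0.5 threshold are illustrative assumptions, not the paper's model):

```python
import numpy as np

def edge_dose_margin(x_nm, dose, threshold=0.5, cd_tol_nm=1.0):
    """Fractional dose change that moves the printed edge by cd_tol_nm,
    estimated from the dose-profile slope where it crosses the threshold."""
    i = np.argmin(np.abs(dose - threshold))     # edge location
    slope = np.gradient(dose, x_nm)[i]          # normalized dose per nm
    return abs(slope) * cd_tol_nm / threshold

# A double-Gaussian point spread (short-range plus long-range blur) is one
# simple way such a numerical model can improve on a single Gaussian.
```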

  1. Mapping hydrological environments in central Amazonia: ground validation and surface model based on SRTM DEM data corrected for deforestation

    NASA Astrophysics Data System (ADS)

    Moulatlet, G. M.; Rennó, C. D.; Costa, F. R. C.; Emilio, T.; Schietti, J.

    2014-07-01

One of the most important freely available digital elevation models (DEMs) for Amazonia is the one obtained by the Shuttle Radar Topography Mission (SRTM). However, since SRTM tends to represent the vegetation surface instead of the ground surface, the broad use of the SRTM DEM as a framework for terrain description in Amazonia is hampered by the presence of deforested areas. We present here two datasets: (1) a deforestation-corrected SRTM DEM for the interfluve between the Purus and Madeira rivers, in central Amazonia, which underwent careful identification of different environments and has deforestation features corrected by a new method of increasing pixel values of the DEM; and (2) a set of eighteen hydrological-topographic descriptors based on the corrected SRTM DEM. The hydrological-topographic description was generated by the Height Above the Nearest Drainage (HAND) algorithm, which normalizes the terrain elevation (a.s.l.) by the elevation of the nearest hydrologically connected drainage. The HAND dataset was validated by in situ hydrological description of 110 km of walking trails, also available in this dataset. The new SRTM DEM expands the applicability of SRTM data for landscape modelling, and the datasets of hydrological features based on topographic modelling are well suited for ecological modelling and an important contribution to environmental mapping of Amazonia. The deforestation-corrected SRTM DEM is available at http://ppbio.inpa.gov.br/knb/metacat/naman.318.3/ppbio; the polygons selected for deforestation correction are available at http://ppbio.inpa.gov.br/knb/metacat/naman.317.3/ppbio; the set of hydrological-topographic descriptors is available at
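The HAND normalization itself is simple once flow directions and a drainage map exist: walk each cell's flow path to the first drainage cell and subtract elevations. A minimal sketch on flattened arrays, assuming an acyclic D8 flow grid:

```python
import numpy as np

def hand(dem, flow_to, drainage):
    """Height Above the Nearest Drainage.

    dem:      (n,) cell elevations (a.s.l.)
    flow_to:  (n,) index of the downstream cell each cell drains to (D8)
    drainage: (n,) True where the cell belongs to the mapped drainage
    """
    out = np.full(dem.size, np.nan)
    for i in range(dem.size):
        j = i
        for _ in range(dem.size):          # bounded walk down the flow path
            if drainage[j]:
                out[i] = dem[i] - dem[j]   # elevation above nearest drainage
                break
            if flow_to[j] == j:            # pit that never reaches drainage
                break
            j = flow_to[j]
    return out
```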

  2. SU-E-T-226: Correction of a Standard Model-Based Dose Calculator Using Measurement Data

    SciTech Connect

    Chen, M; Jiang, S; Lu, W

    2015-06-15

Purpose: To propose a hybrid method that combines advantages of the model-based and measurement-based methods for independent dose calculation. Model-based dose calculation, such as collapsed-cone convolution/superposition (CCCS) or the Monte Carlo method, models dose deposition in the patient body accurately; however, due to lack of detailed knowledge about the linear accelerator (LINAC) head, commissioning for an arbitrary machine is tedious and challenging in case of hardware changes. On the contrary, the measurement-based method characterizes the beam properties accurately but lacks the capability of modeling dose deposition in heterogeneous media. Methods: We used a standard CCCS calculator, commissioned with published data, as the standard model calculator. For a given machine, water phantom measurements were acquired. A set of dose distributions was also calculated using the CCCS for the same setup. The differences between the measurements and the CCCS results were tabulated and used as the commissioning data for a measurement-based calculator, here a direct-ray-tracing calculator (ΔDRT). The proposed independent dose calculation consists of the following steps: (1) calculate D_model using CCCS; (2) calculate D_ΔDRT using ΔDRT; (3) combine them as D = D_model + D_ΔDRT. Results: The hybrid dose calculation was tested on digital phantoms and patient CT data for standard fields and an IMRT plan. The results were compared to doses calculated by the treatment planning system (TPS). The agreement between the hybrid method and the TPS was within 3%/3 mm for over 98% of the volume in the phantom studies and lung patients. Conclusion: The proposed hybrid method uses the same commissioning data as the measurement-based method and can be easily extended to any non-standard LINAC. The results met the accuracy, independence, and simple-commissioning criteria for an independent dose calculator.
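Step (3) is a plain grid addition once both terms exist; the measurement-based term is commissioned from the tabulated water-phantom differences. A minimal sketch with hypothetical dose grids:

```python
import numpy as np

def commission_residual(measured_water, cccs_water):
    """Difference table used to commission the delta direct-ray-tracing
    (ΔDRT) term from water-phantom measurements."""
    return np.asarray(measured_water) - np.asarray(cccs_water)

def hybrid_dose(d_model, d_delta_drt):
    """Hybrid independent dose: CCCS model term plus the
    measurement-driven residual, on a common dose grid."""
    return d_model + d_delta_drt
```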

  3. Mapping hydrological environments in central Amazonia: ground validation and surface model based on SRTM DEM data corrected for deforestation

    NASA Astrophysics Data System (ADS)

    Moulatlet, G. M.; Rennó, C. D.; Costa, F. R. C.; Emilio, T.; Schietti, J.

    2015-03-01

One of the most important freely available digital elevation models (DEMs) for Amazonia is the one obtained by the Shuttle Radar Topography Mission (SRTM). However, since SRTM tends to represent the vegetation surface instead of the ground surface, the broad use of the SRTM DEM as a framework for terrain description in Amazonia is hampered by the presence of deforested areas. We present here two data sets: (1) a deforestation-corrected SRTM DEM for the interfluve between the Purus and Madeira rivers, in central Amazonia, which underwent careful identification of different environments and has deforestation features corrected by a new method of increasing pixel values of the DEM (Rennó, 2009); and (2) a set of 18 hydrological-topographic descriptors based on the corrected SRTM DEM. Deforestation features are related to the opening of an 800 km road in the central part of the interfluve and the occupancy of its vicinity. We used topographic profiles from the pristine forest to the deforested features to evaluate the recovery of the original canopy coverage by minimizing canopy height variation (corrections ranged from 1 to 38 m). The hydrological-topographic description was obtained by the Height Above the Nearest Drainage (HAND) algorithm, which normalizes the terrain elevation (above sea level) by the elevation of the nearest hydrologically connected drainage. The HAND data set was validated by in situ hydrological description of 110 km of walking trails, also available in this data set. The new SRTM DEM expands the applicability of SRTM data for landscape modelling; the data sets of hydrological features based on topographic modelling are well suited for ecological modelling and an important contribution to environmental mapping of Amazonia. The deforestation-corrected SRTM DEM is available at http://ppbio.inpa.gov.br/knb/metacat/naman.318.3/ppbio; the

  4. Correction for scatter and septal penetration using convolution subtraction methods and model-based compensation in 123I brain SPECT imaging-a Monte Carlo study.

    PubMed

    Larsson, Anne; Ljungberg, Michael; Mo, Susanna Jakobson; Riklund, Katrine; Johansson, Lennart

    2006-11-21

Scatter and septal penetration deteriorate contrast and quantitative accuracy in single photon emission computed tomography (SPECT). In this study four different correction techniques for scatter and septal penetration are evaluated for 123I brain SPECT. One of the methods is a form of model-based compensation which uses the effective source scatter estimation (ESSE) for modelling scatter, and a collimator-detector response (CDR) including both geometric and penetration components. The other methods, which operate on the 2D projection images, are convolution scatter subtraction (CSS) and two versions of transmission-dependent convolution subtraction (TDCS), one of them proposed by us. This method uses CSS with a separate kernel to correct for septal penetration, and TDCS for scatter correction. The corrections are evaluated for a dopamine transporter (DAT) study and a study of regional cerebral blood flow (rCBF), both performed with 123I. The images are produced using a recently developed Monte Carlo collimator routine added to the program SIMIND, which can include interactions in the collimator. The results show that the method included in the iterative reconstruction is preferable to the other methods and that the new TDCS version gives better results than the other 2D methods.
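Both 2D methods subtract a blurred copy of the projection as the scatter estimate; TDCS additionally lets the scatter fraction vary per pixel with measured transmission. A minimal sketch; the kernel shapes and the transmission-to-fraction form are illustrative assumptions, not the paper's fitted parameters:

```python
import numpy as np
from scipy.signal import fftconvolve

def css_correct(proj, kernel, k=0.5):
    """Convolution scatter subtraction: subtract a kernel-blurred copy of
    the projection, scaled by a global scatter fraction k."""
    scatter = k * fftconvolve(proj, kernel, mode="same")
    return np.clip(proj - scatter, 0.0, None)

def tdcs_correct(proj, kernel, transmission, a=0.9, b=0.5):
    """Transmission-dependent convolution subtraction: as CSS, but with a
    per-pixel scatter fraction derived from the transmission map."""
    k_map = 1.0 - a * transmission ** b          # assumed parametric form
    scatter = k_map * fftconvolve(proj, kernel, mode="same")
    return np.clip(proj - scatter, 0.0, None)
```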

  5. Correction for scatter and septal penetration using convolution subtraction methods and model-based compensation in 123I brain SPECT imaging—a Monte Carlo study

    NASA Astrophysics Data System (ADS)

    Larsson, Anne; Ljungberg, Michael; Jakobson Mo, Susanna; Riklund, Katrine; Johansson, Lennart

    2006-11-01

Scatter and septal penetration deteriorate contrast and quantitative accuracy in single photon emission computed tomography (SPECT). In this study four different correction techniques for scatter and septal penetration are evaluated for 123I brain SPECT. One of the methods is a form of model-based compensation which uses the effective source scatter estimation (ESSE) for modelling scatter, and a collimator-detector response (CDR) including both geometric and penetration components. The other methods, which operate on the 2D projection images, are convolution scatter subtraction (CSS) and two versions of transmission-dependent convolution subtraction (TDCS), one of them proposed by us. This method uses CSS with a separate kernel to correct for septal penetration, and TDCS for scatter correction. The corrections are evaluated for a dopamine transporter (DAT) study and a study of regional cerebral blood flow (rCBF), both performed with 123I. The images are produced using a recently developed Monte Carlo collimator routine added to the program SIMIND, which can include interactions in the collimator. The results show that the method included in the iterative reconstruction is preferable to the other methods and that the new TDCS version gives better results than the other 2D methods.

  6. Model-based correction of tissue compression for tracked ultrasound in soft tissue image-guided surgery.

    PubMed

    Pheiffer, Thomas S; Thompson, Reid C; Rucker, Daniel C; Simpson, Amber L; Miga, Michael I

    2014-04-01

    Acquisition of ultrasound data negatively affects image registration accuracy during image-guided therapy because of tissue compression by the probe. We present a novel compression correction method that models sub-surface tissue displacement resulting from application of a tracked probe to the tissue surface. Patient landmarks are first used to register the probe pose to pre-operative imaging. The ultrasound probe geometry is used to provide boundary conditions to a biomechanical model of the tissue. The deformation field solution of the model is inverted to non-rigidly transform the ultrasound images to an estimation of the tissue geometry before compression. Experimental results with gel phantoms indicated that the proposed method reduced the tumor margin modified Hausdorff distance (MHD) from 5.0 ± 1.6 to 1.9 ± 0.6 mm, and reduced tumor centroid alignment error from 7.6 ± 2.6 to 2.0 ± 0.9 mm. The method was applied to a clinical case and reduced the tumor margin MHD error from 5.4 ± 0.1 to 2.6 ± 0.1 mm and the centroid alignment error from 7.2 ± 0.2 to 3.5 ± 0.4 mm.
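The final step, resampling the ultrasound image through the inverted deformation field, can be sketched in a few lines. A minimal 2D version, assuming the model's displacement field has already been expressed in image pixels:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def decompress_image(us_image, disp_ij):
    """Warp a 2D ultrasound image through a model-predicted displacement
    field (pixels) to approximate the pre-compression tissue state.

    disp_ij: (2, H, W) row/column displacement sampled at each output pixel.
    """
    H, W = us_image.shape
    ii, jj = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    coords = np.stack([ii + disp_ij[0], jj + disp_ij[1]])
    return map_coordinates(us_image, coords, order=1, mode="nearest")
```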

  7. A Correction for the IRI Topside Electron Density Model Based on Alouette/ISIS Topside Sounder Data

    NASA Technical Reports Server (NTRS)

    Bilitza, D.

    2004-01-01

The topside segment of the International Reference Ionosphere (IRI) electron density model (and also of the Bent model) is based on the limited amount of topside data available at the time (40,000 Alouette 1 profiles). Established from such a small database, the models unsurprisingly have well-known shortcomings, for example, at high solar activities. Meanwhile a large database of close to 200,000 topside profiles from Alouette 1 and 2 and ISIS 1 and 2 has become available online. A program of automated scaling and inversion of a large volume of digitized ionograms adds continuously to this data pool. We have used the currently available ISIS/Alouette topside profiles to evaluate the IRI topside model and to investigate ways of improving it. The IRI model performs generally well at middle latitudes but shows discrepancies at low and high latitudes, and these discrepancies are largest during high solar activity. In the upper topside, IRI consistently overestimates the measurements. Based on averages of the data-model ratios we have established correction factors for the IRI model. These factors vary with altitude, modified dip latitude, and local time.
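The correction factors are just binned averages of the data-to-model ratios. A minimal sketch, with bin edges as hypothetical inputs:

```python
import numpy as np

def correction_factors(ratio, alt, mlat, lt, alt_bins, mlat_bins, lt_bins):
    """Average data/model electron-density ratios in (altitude, modified dip
    latitude, local time) bins to form multiplicative correction factors."""
    idx = (np.digitize(alt, alt_bins),
           np.digitize(mlat, mlat_bins),
           np.digitize(lt, lt_bins))
    shape = (len(alt_bins) + 1, len(mlat_bins) + 1, len(lt_bins) + 1)
    s, n = np.zeros(shape), np.zeros(shape)
    np.add.at(s, idx, ratio)
    np.add.at(n, idx, 1)
    with np.errstate(divide="ignore", invalid="ignore"):
        return s / n            # NaN where a bin contains no profiles
```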

  8. MODEL-BASED CORRECTION OF TISSUE COMPRESSION FOR TRACKED ULTRASOUND IN SOFT TISSUE IMAGE-GUIDED SURGERY

    PubMed Central

    Pheiffer, Thomas S.; Thompson, Reid C.; Rucker, Daniel C.; Simpson, Amber L.; Miga, Michael I.

    2014-01-01

    Acquisition of ultrasound data negatively affects image registration accuracy during image-guided therapy because of tissue compression by the probe. We present a novel compression correction method that models sub-surface tissue displacement resulting from application of a tracked probe to the tissue surface. Patient landmarks are first used to register the probe pose to pre-operative imaging. The ultrasound probe geometry is used to provide boundary conditions to a biomechanical model of the tissue. The deformation field solution of the model is inverted to non-rigidly transform the ultrasound images to an estimation of the tissue geometry before compression. Experimental results with gel phantoms indicated that the proposed method reduced the tumor margin modified Hausdorff distance (MHD) from 5.0 ± 1.6 to 1.9 ± 0.6 mm, and reduced tumor centroid alignment error from 7.6 ± 2.6 to 2.0 ± 0.9 mm. The method was applied to a clinical case and reduced the tumor margin MHD error from 5.4 ± 0.1 to 2.6 ± 0.1 mm and the centroid alignment error from 7.2 ± 0.2 to 3.5 ± 0.4 mm. PMID:24412172

  9. A Correction for the IRI Topside Electron Density Model Based on Alouette/ISIS Topside Sounder Data

    NASA Technical Reports Server (NTRS)

    Bilitza, D.

    2004-01-01

The topside segment of the International Reference Ionosphere (IRI) electron density model (and also of the Bent model) is based on the limited amount of topside data available at the time (40,000 Alouette 1 profiles). Established from such a small database, the models unsurprisingly have well-known shortcomings, for example, at high solar activities. Meanwhile a large database of close to 200,000 topside profiles from Alouette 1 and 2 and ISIS 1 and 2 has become available online. A program of automated scaling and inversion of a large volume of digitized ionograms adds continuously to this data pool. We have used the currently available ISIS/Alouette topside profiles to evaluate the IRI topside model and to investigate ways of improving it. The IRI model performs generally well at middle latitudes but shows discrepancies at low and high latitudes, and these discrepancies are largest during high solar activity. In the upper topside, IRI consistently overestimates the measurements. Based on averages of the data-model ratios we have established correction factors for the IRI model. These factors vary with altitude, modified dip latitude, and local time.

  10. Kidney Stone Volume Estimation from Computerized Tomography Images Using a Model Based Method of Correcting for the Point Spread Function

    PubMed Central

    Duan, Xinhui; Wang, Jia; Qu, Mingliang; Leng, Shuai; Liu, Yu; Krambeck, Amy; McCollough, Cynthia

    2014-01-01

Purpose: We propose a method to improve the accuracy of kidney stone volume estimation from computerized tomography images. Materials and Methods: The proposed method consisted of 2 steps. A threshold equal to the average of the computerized tomography numbers of the object and the background was first applied to determine the full width at half maximum volume. Correction factors were then applied, which were precalculated based on a model of a sphere and a 3-dimensional Gaussian point spread function. The point spread function was measured in a computerized tomography scanner to represent the response of the scanner to a point-like object. Method accuracy was validated using 6 small cylindrical phantoms (2 volumes, 21.87 and 99.9 mm3, at 3 attenuations each) and 76 kidney stones with a volume range of 6.3 to 317.4 mm3. Volumes estimated by the proposed method were compared with full width at half maximum volumes. Results: The proposed method was significantly more accurate than full width at half maximum volume estimation (p <0.0001). The magnitude of improvement depended on stone volume, with smaller stones benefiting more from the method. For kidney stones 10 to 20 mm3 in volume the average improvement in accuracy was the greatest, at 19.6%. Conclusions: The proposed method achieved significantly improved accuracy compared with threshold methods. This may lead to more accurate stone management. PMID:22819107
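The two steps translate directly into code. A minimal sketch; the correction lookup table (stone size versus factor, precalculated from the blurred-sphere model) is a hypothetical input here:

```python
import numpy as np

def fwhm_volume(ct_roi, background_hu, voxel_mm3):
    """Full-width-at-half-maximum volume: threshold halfway between the
    object and background CT numbers, then count voxels."""
    thresh = 0.5 * (ct_roi.max() + background_hu)
    return np.count_nonzero(ct_roi > thresh) * voxel_mm3

def corrected_volume(v_fwhm, sizes_mm3, factors):
    """Apply the precalculated sphere/Gaussian-PSF correction factor,
    interpolated at the FWHM volume estimate."""
    return v_fwhm * np.interp(v_fwhm, sizes_mm3, factors)
```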

  11. Therapeutic strategies to correct proteostasis-imbalance in chronic obstructive lung diseases.

    PubMed

    Bodas, M; Tran, I; Vij, N

    2012-08-01

Proteostasis is a critical cellular homeostasis mechanism that regulates the concentration of all cellular proteins by controlling protein synthesis, processing, and degradation. This includes protein conformation, binding interactions, and sub-cellular localization. Environmental, genetic, or age-related pathogenetic factors can modulate proteostasis (proteostasis-imbalance) through transcriptional, translational, and post-translational changes that trigger the development of several complex diseases. Although these factors are known to be involved in the pathogenesis of chronic obstructive pulmonary disease (COPD), the role of proteostasis mechanisms in COPD has scarcely been investigated. As a proof of concept, our recent data reveal a novel role of proteostasis-imbalance in COPD pathogenesis. Briefly, cigarette- and biomass-smoke-induced proteostasis-imbalance may aggravate chronic inflammatory-oxidative stress and/or protease-antiprotease imbalance, resulting in the pathogenesis of severe emphysema. In contrast, the pathogenesis of other chronic lung diseases like ΔF508-cystic fibrosis (CF), α1-antitrypsin deficiency (α1-ATD), and pulmonary fibrosis (PF) is regulated by other proteostatic mechanisms, involving the degradation of misfolded proteins (ΔF508-CFTR/α1-AT Z variant) or regulation of the concentration of signaling proteins (such as TGF-β1) by the ubiquitin-proteasome system (UPS). The therapeutic strategies to correct proteostasis-imbalance in misfolded protein disorders such as ΔF508-CF have been relatively well studied and involve strategies that rescue functional CFTR protein to treat the underlying cause of the disease, while in the case of COPD-emphysema and/or PF, identification of novel proteostasis-regulators that can control inflammatory-oxidative stress and/or protease-antiprotease balance is warranted.

  12. Therapeutic NOTCH3 cysteine correction in CADASIL using exon skipping: in vitro proof of concept.

    PubMed

    Rutten, Julie W; Dauwerse, Hans G; Peters, Dorien J M; Goldfarb, Andrew; Venselaar, Hanka; Haffner, Christof; van Ommen, Gert-Jan B; Aartsma-Rus, Annemieke M; Lesnik Oberstein, Saskia A J

    2016-04-01

Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy, or CADASIL, is a hereditary cerebral small vessel disease caused by characteristic cysteine-altering missense mutations in the NOTCH3 gene. NOTCH3 mutations in CADASIL result in an uneven number of cysteine residues in one of the 34 epidermal growth factor-like repeat (EGFr) domains of the NOTCH3 protein. The consequence of an unpaired cysteine residue in an EGFr domain is an increased multimerization tendency of mutant NOTCH3, leading to toxic accumulation of the protein in the (cerebro)vasculature, and ultimately reduced cerebral blood flow, recurrent stroke and vascular dementia. There is no therapy to delay or alleviate symptoms in CADASIL. We hypothesized that exclusion of the mutant EGFr domain from NOTCH3 would abolish the detrimental effect of the unpaired cysteine and thus prevent toxic NOTCH3 accumulation and the negative cascade of events leading to CADASIL. To accomplish this NOTCH3 cysteine correction by EGFr domain exclusion, we used pre-mRNA antisense-mediated skipping of specific NOTCH3 exons. Exon selection was based on in silico studies and the criterion that skipping of a particular exon or exon pair would modulate the protein in such a way that the mutant EGFr domain is eliminated, without otherwise corrupting NOTCH3 structure and function. Remarkably, we found that this strategy closely mimics evolutionary events, where the elimination and fusion of NOTCH EGFr domains led to the generation of four functional NOTCH homologues. We modelled a selection of exon skip strategies using cDNA constructs and show that the skip proteins retain normal protein processing, can bind ligand and be activated by ligand. We then determined the technical feasibility of targeted NOTCH3 exon skipping, by designing antisense oligonucleotides targeting exons 2-3, 4-5 and 6, which together harbour the majority of distinct CADASIL-causing mutations

  13. C60 Fullerene as Promising Therapeutic Agent for the Prevention and Correction of Skeletal Muscle Functioning at Ischemic Injury

    NASA Astrophysics Data System (ADS)

    Nozdrenko, D. M.; Zavodovskyi, D. O.; Matvienko, T. Yu.; Zay, S. Yu.; Bogutska, K. I.; Prylutskyy, Yu. I.; Ritter, U.; Scharff, P.

    2017-02-01

The therapeutic effect of pristine C60 fullerene aqueous colloid solution (C60FAS) on the functioning of the rat soleus muscle after ischemic injury was investigated as a function of the stage of pathogenesis of the muscular system and the method of C60FAS administration in vivo. Intravenous administration of C60FAS was found to be optimal for correcting the speed parameters of contraction after ischemic muscle damage. At the same time, intramuscular administration of C60FAS shows a pronounced protective effect in movements associated with the generation of maximum force responses or prolonged contractions, which increase the level of muscle fatigue. Analysis of the concentrations of the enzymes creatine phosphokinase and lactate dehydrogenase in the blood of the experimental animals directly indicates that C60FAS may be a promising therapeutic agent for the prevention and correction of ischemia-damaged skeletal muscle function.

  14. Optimal Model-Based Fault Estimation and Correction for Particle Accelerators and Industrial Plants Using Combined Support Vector Machines and First Principles Models

    SciTech Connect

    Sayyar-Rodsari, Bijan; Schweiger, Carl; /SLAC /Pavilion Technologies, Inc., Austin, TX

    2010-08-25

Timely estimation of deviations from optimal performance in complex systems, and the ability to identify corrective measures in response to the estimated parameter deviations, has been the subject of extensive research over the past four decades. The implications in terms of lost revenue from costly industrial processes and the operation of large-scale public works projects, and the volume of the published literature on this topic, clearly indicate the significance of the problem. Applications range from manufacturing industries (integrated circuits, automotive, etc.) to large-scale chemical plants, pharmaceutical production, power distribution grids, and avionics. In this project we investigated a new framework for building parsimonious models that are suited for diagnosis and fault estimation of complex technical systems. We used Support Vector Machines (SVMs) to model potentially time-varying parameters of a First-Principles (FP) description of the process. The combined SVM & FP model was built (i.e., model parameters were trained) using constrained optimization techniques. We used the trained models to estimate faults affecting simulated beam lifetime. In the case where a large number of process inputs are required for model-based fault estimation, the proposed framework performs an optimal nonlinear principal component analysis of the large-scale input space and creates a lower-dimension feature space in which fault estimation results can be effectively presented to the operations personnel. To fulfill the main technical objectives of the Phase I research, our Phase I efforts focused on: (1) SVM Training in a Combined Model Structure - We developed the software for the constrained training of the SVMs in a combined model structure, and successfully modeled the parameters of a first-principles model for beam lifetime with support vectors. (2) Higher-order Fidelity of the Combined Model - We used constrained training to ensure that the output of the SVM (i.e. the
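A toy version of the combined structure: a support vector regressor stands in for the time-varying parameter of a first-principles output law. The FP form, signals, and direct training target below are illustrative simplifications; the project instead trains the SVM through the FP model with constrained optimization:

```python
import numpy as np
from sklearn.svm import SVR

def fp_model(u, theta):
    """Placeholder first-principles law y = f(u, theta)."""
    return theta * np.exp(-u)

rng = np.random.default_rng(0)
X = rng.random((200, 3))                 # measured process inputs
theta_true = 1.0 + 0.5 * X[:, 0]         # hidden, drifting FP parameter
y = fp_model(X[:, 1], theta_true)        # observed process output

svm = SVR(kernel="rbf").fit(X, theta_true)   # stand-in for constrained training
theta_hat = svm.predict(X)
fault = theta_hat - 1.0                      # deviation from nominal parameter
```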

  15. Correction

    NASA Astrophysics Data System (ADS)

    1995-04-01

Seismic images of the Brooks Range, Arctic Alaska, reveal crustal-scale duplexing: Correction. Geology, v. 23, p. 65-68 (January 1995). The correct Figure 4A, for the loose insert, is given here. Corrected inserts will be available to those requesting copies of the article from the senior author, Gary S. Fuis, U.S. Geological Survey, 345 Middlefield Road, Menlo Park, CA 94025. Figure 4A. P-wave velocity model of the Brooks Range region (thin gray contours) with migrated wide-angle reflections (heavy red lines) and migrated vertical-incidence reflections (short black lines) superimposed. Velocity contour interval is 0.25 km/s; the 4, 5, and 6 km/s contours are labeled. Estimated error in velocities is one contour interval. Symbols on faults shown at top are as in the Figure 2 caption.

  16. Model-based correction for scatter and tailing effects in simultaneous 99mTc and 123I imaging for a CdZnTe cardiac SPECT camera.

    PubMed

    Holstensson, M; Erlandsson, K; Poludniowski, G; Ben-Haim, S; Hutton, B F

    2015-04-21

An advantage of semiconductor-based dedicated cardiac single photon emission computed tomography (SPECT) cameras over conventional Anger cameras is superior energy resolution. This provides the potential for improved separation of the photopeaks in dual-radionuclide imaging, such as the combined use of (99m)Tc and (123)I. There is, however, the added complexity of tailing effects in the detectors that must be accounted for. In this paper we present a model-based correction algorithm which extracts the useful primary counts of (99m)Tc and (123)I from projection data. Equations describing the in-patient scatter and tailing effects in the detectors are iteratively solved for both radionuclides simultaneously using a maximum a posteriori probability algorithm with one-step-late evaluation. Energy-window-dependent parameters for the equations describing in-patient scatter are estimated using Monte Carlo simulations. Parameters for the equations describing tailing effects are estimated using virtually scatter-free experimental measurements on a dedicated cardiac SPECT camera with CdZnTe detectors. When applied to a phantom study with both (99m)Tc and (123)I, results show that the estimated spatial distribution of events from (99m)Tc in the (99m)Tc photopeak energy window is very similar to that measured in a single-(99m)Tc phantom study. The extracted images of primary events display increased cold lesion contrast for both (99m)Tc and (123)I.
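A greatly simplified stand-in for the cross-talk problem: if the window sensitivities to each radionuclide's primaries were known and fixed, the separation would be a per-pixel linear solve. The mixing matrix below is a made-up illustration; the paper instead solves coupled scatter and tail equations with a one-step-late MAP algorithm:

```python
import numpy as np

# Assumed fractions of each radionuclide's primaries detected per window.
M = np.array([[0.85, 0.20],    # window 1: Tc-99m photopeak
              [0.10, 0.80]])   # window 2: I-123 photopeak

def unmix(counts_w1, counts_w2):
    """Per-pixel 2x2 solve for the primary counts of each radionuclide."""
    obs = np.stack([counts_w1.ravel(), counts_w2.ravel()])
    primaries = np.linalg.solve(M, obs)
    return (primaries[0].reshape(counts_w1.shape),
            primaries[1].reshape(counts_w2.shape))
```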

  17. Model-based correction for scatter and tailing effects in simultaneous 99mTc and 123I imaging for a CdZnTe cardiac SPECT camera

    NASA Astrophysics Data System (ADS)

    Holstensson, M.; Erlandsson, K.; Poludniowski, G.; Ben-Haim, S.; Hutton, B. F.

    2015-04-01

An advantage of semiconductor-based dedicated cardiac single photon emission computed tomography (SPECT) cameras over conventional Anger cameras is superior energy resolution. This provides the potential for improved separation of the photopeaks in dual-radionuclide imaging, such as the combined use of 99mTc and 123I. There is, however, the added complexity of tailing effects in the detectors that must be accounted for. In this paper we present a model-based correction algorithm which extracts the useful primary counts of 99mTc and 123I from projection data. Equations describing the in-patient scatter and tailing effects in the detectors are iteratively solved for both radionuclides simultaneously using a maximum a posteriori probability algorithm with one-step-late evaluation. Energy-window-dependent parameters for the equations describing in-patient scatter are estimated using Monte Carlo simulations. Parameters for the equations describing tailing effects are estimated using virtually scatter-free experimental measurements on a dedicated cardiac SPECT camera with CdZnTe detectors. When applied to a phantom study with both 99mTc and 123I, results show that the estimated spatial distribution of events from 99mTc in the 99mTc photopeak energy window is very similar to that measured in a single-99mTc phantom study. The extracted images of primary events display increased cold lesion contrast for both 99mTc and 123I.

  18. Corrections.

    PubMed

    2015-07-01

Lai Y-S, Biedermann P, Ekpo UF, et al. Spatial distribution of schistosomiasis and treatment needs in sub-Saharan Africa: a systematic review and geostatistical analysis. Lancet Infect Dis 2015; published online May 22. http://dx.doi.org/10.1016/S1473-3099(15)00066-3. Figure 1 of this Article should have contained a box stating ‘100 references added’ with an arrow pointing inwards, rather than a box stating ‘199 records excluded’, and an asterisk should have been added after ‘1473 records extracted into GNTD’. Additionally, the positioning of the ‘§’ and ‘†’ footnotes has been corrected in table 1. These corrections have been made to the online version as of June 4, 2015.

  19. Correction.

    PubMed

    2016-02-01

In the article by Guessous et al (Guessous I, Pruijm M, Ponte B, Ackermann D, Ehret G, Ansermot N, Vuistiner P, Staessen J, Gu Y, Paccaud F, Mohaupt M, Vogt B, Pechère-Bertschi A, Martin PY, Burnier M, Eap CB, Bochud M. Associations of ambulatory blood pressure with urinary caffeine and caffeine metabolite excretions. Hypertension. 2015;65:691–696. doi: 10.1161/HYPERTENSIONAHA.114.04512), which published online ahead of print December 8, 2014, and appeared in the March 2015 issue of the journal, a correction was needed. One of the author surnames was misspelled: Antoinette Pechère-Berstchi has been corrected to read Antoinette Pechère-Bertschi. The authors apologize for this error.

  20. Correction

    NASA Astrophysics Data System (ADS)

    1998-12-01

Alleged mosasaur bite marks on Late Cretaceous ammonites are limpet (patellogastropod) home scars: Correction. Geology, v. 26, p. 947-950 (October 1998). This article had the following printing errors: p. 947, Abstract, line 11, “sepia” should be “septa”; p. 947, 1st paragraph under Introduction, line 2, “creep” should be “deep”; p. 948, column 1, 2nd paragraph, line 7, “creep” should be “deep”; p. 949, column 1, 1st paragraph, line 1, “creep” should be “deep”; p. 949, column 1, 1st paragraph, line 5, “19774” should be “1977)”; p. 949, column 1, 4th paragraph, line 7, “in particular” should be “In particular”. CORRECTION: Mammalian community response to the latest Paleocene thermal maximum: An isotaphonomic study in the northern Bighorn Basin, Wyoming. Geology, v. 26, p. 1011-1014 (November 1998). An error appeared in the References Cited. The correct reference appears below: Fricke, H. C., Clyde, W. C., O'Neil, J. R., and Gingerich, P. D., 1998, Evidence for rapid climate change in North America during the latest Paleocene thermal maximum: Oxygen isotope compositions of biogenic phosphate from the Bighorn Basin (Wyoming): Earth and Planetary Science Letters, v. 160, p. 193-208.

  1. [Beat therapeutic inertia in dyslipidemic patient management: A challenge in daily clinical practice] [corrected].

    PubMed

    Morales, Clotilde; Mauri, Marta; Vila, Lluís

    2014-01-01

Beating therapeutic inertia in dyslipidemic patient management: a challenge in daily clinical practice. In patients with dyslipidemia, therapeutic goals need to be reached in order to obtain the maximum benefit in reducing the risk of cardiovascular events, especially myocardial infarction. Even with guidelines and powerful lipid-lowering drugs available, the low-density lipoprotein cholesterol (LDL-c) goals are often not reached, particularly in patients at high cardiovascular risk. One of the causes is therapeutic inertia. There are tools to plan the treatment and ease decision-making. One of the challenges in everyday clinical practice is knowing the required percentage reduction in LDL-c; it is also difficult to know which treatment to start with, and what to do when the desired objective is not reached. This article proposes a practical method that can help answer these questions. Copyright © 2013 Sociedad Española de Arteriosclerosis. Published by Elsevier España. All rights reserved.
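The needed LDL-c reduction itself is a one-line computation; the example values are illustrative:

```python
def ldl_reduction_needed(ldl_baseline, ldl_target):
    """Percentage LDL-c reduction required to reach the therapeutic goal."""
    return 100.0 * (ldl_baseline - ldl_target) / ldl_baseline

# e.g. a high-risk patient at 160 mg/dL with a 70 mg/dL goal:
# ldl_reduction_needed(160, 70) -> 56.25, pointing to high-intensity therapy.
```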

  2. Correcting Effect of Therapeutic Doses of Optical Radiation on Hematological Parameters of Blood Irradiated In Vivo

    NASA Astrophysics Data System (ADS)

    Zalesskaya, G. A.; Laskina, O. V.

    2017-07-01

We studied the effect of therapeutic doses of optical radiation on the hematological parameters of blood irradiated in vivo: hemoglobin concentration, hematocrit, and the number of erythrocytes in the peripheral blood of patients during courses of extracorporeal, supravenous, and intravenous blood irradiation and after treatment. The reversible changes during the procedures were found to differ from the changes observed after treatment completion. At the end of the treatment course, the hematological parameters had changed in different directions, becoming higher than, the same as, or lower than the initial values, depending on those initial values and on photoinduced changes in blood oxygenation. Photohemotherapy was found to have a compensatory effect on oxygen-dependent processes, altering the oxygen inflow into cells as well as the generation of reactive oxygen species and their inhibition by antioxidant systems.

  3. Gene Transfer Corrects Acute GM2 Gangliosidosis—Potential Therapeutic Contribution of Perivascular Enzyme Flow

    PubMed Central

    Cachón-González, M Begoña; Wang, Susan Z; McNair, Rosamund; Bradley, Josephine; Lunn, David; Ziegler, Robin; Cheng, Seng H; Cox, Timothy M

    2012-01-01

    The GM2 gangliosidoses are fatal lysosomal storage diseases principally affecting the brain. Absence of β-hexosaminidase A and B activities in the Sandhoff mouse causes neurological dysfunction and recapitulates the acute Tay–Sachs (TSD) and Sandhoff diseases (SD) in infants. Intracranial coinjection of recombinant adeno-associated viral vectors (rAAV), serotype 2/1, expressing human β-hexosaminidase α (HEXA) and β (HEXB) subunits into 1-month-old Sandhoff mice gave unprecedented survival to 2 years and prevented disease throughout the brain and spinal cord. Classical manifestations of disease, including spasticity—as opposed to tremor-ataxia—were resolved by localized gene transfer to the striatum or cerebellum, respectively. Abundant biosynthesis of β-hexosaminidase isozymes and their global distribution via axonal, perivascular, and cerebrospinal fluid (CSF) spaces, as well as diffusion, account for the sustained phenotypic rescue—long-term protein expression by transduced brain parenchyma, choroid plexus epithelium, and dorsal root ganglia neurons supplies the corrective enzyme. Prolonged survival permitted expression of cryptic disease in organs not accessed by intracranial vector delivery. We contend that infusion of rAAV into CSF space and intraparenchymal administration by convection-enhanced delivery at a few strategic sites will optimally treat neurodegeneration in many diseases affecting the nervous system. PMID:22453766

  4. Gene transfer corrects acute GM2 gangliosidosis--potential therapeutic contribution of perivascular enzyme flow.

    PubMed

    Cachón-González, M Begoña; Wang, Susan Z; McNair, Rosamund; Bradley, Josephine; Lunn, David; Ziegler, Robin; Cheng, Seng H; Cox, Timothy M

    2012-08-01

    The GM2 gangliosidoses are fatal lysosomal storage diseases principally affecting the brain. Absence of β-hexosaminidase A and B activities in the Sandhoff mouse causes neurological dysfunction and recapitulates the acute Tay-Sachs (TSD) and Sandhoff diseases (SD) in infants. Intracranial coinjection of recombinant adeno-associated viral vectors (rAAV), serotype 2/1, expressing human β-hexosaminidase α (HEXA) and β (HEXB) subunits into 1-month-old Sandhoff mice gave unprecedented survival to 2 years and prevented disease throughout the brain and spinal cord. Classical manifestations of disease, including spasticity-as opposed to tremor-ataxia-were resolved by localized gene transfer to the striatum or cerebellum, respectively. Abundant biosynthesis of β-hexosaminidase isozymes and their global distribution via axonal, perivascular, and cerebrospinal fluid (CSF) spaces, as well as diffusion, account for the sustained phenotypic rescue-long-term protein expression by transduced brain parenchyma, choroid plexus epithelium, and dorsal root ganglia neurons supplies the corrective enzyme. Prolonged survival permitted expression of cryptic disease in organs not accessed by intracranial vector delivery. We contend that infusion of rAAV into CSF space and intraparenchymal administration by convection-enhanced delivery at a few strategic sites will optimally treat neurodegeneration in many diseases affecting the nervous system.

  5. Tafenoquine at therapeutic concentrations does not prolong Fridericia-corrected QT interval in healthy subjects.

    PubMed

    Green, Justin A; Patel, Apurva K; Patel, Bela R; Hussaini, Azra; Harrell, Emma J; McDonald, Mirna J; Carter, Nick; Mohamed, Khadeeja; Duparc, Stephan; Miller, Ann K

    2014-09-01

Tafenoquine is being developed for relapse prevention in Plasmodium vivax malaria. This Phase I, single-blind, randomized, placebo- and active-controlled parallel-group study investigated whether tafenoquine at supratherapeutic and therapeutic concentrations prolonged cardiac repolarization in healthy volunteers. Subjects aged 18-65 years were randomized to one of five treatment groups (n = 52 per group) to receive placebo, tafenoquine 300, 600, or 1200 mg, or moxifloxacin 400 mg (positive control). Lack of effect was demonstrated if the upper 90% CI of the change from baseline in QTcF following supratherapeutic tafenoquine 1200 mg versus placebo (ΔΔQTcF) was <10 milliseconds for all pre-defined time points. The maximum ΔΔQTcF with tafenoquine 1200 mg (n = 50) was 6.39 milliseconds (90% CI 2.85, 9.94) at 72 hours post-final dose; that is, lack of effect for prolongation of cardiac repolarization was demonstrated. Tafenoquine 300 mg (n = 48) or 600 mg (n = 52) had no effect on ΔΔQTcF. Pharmacokinetic/pharmacodynamic modeling of the tafenoquine-QTcF concentration-effect relationship demonstrated a shallow slope (0.5 ms per μg/mL) over a wide concentration range. For moxifloxacin (n = 51), maximum ΔΔQTcF was 8.52 milliseconds (90% CI 5.00, 12.04), demonstrating assay sensitivity. In this thorough QT/QTc study, tafenoquine did not have a clinically meaningful effect on cardiac repolarization. © 2014, The American College of Clinical Pharmacology.
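For reference, the Fridericia correction divides the measured QT by the cube root of the RR interval:

```python
def qtcf_ms(qt_ms, rr_s):
    """Fridericia-corrected QT: QTcF = QT / RR**(1/3) (QT in ms, RR in s)."""
    return qt_ms / rr_s ** (1.0 / 3.0)

# e.g. QT = 400 ms at 75 bpm (RR = 0.8 s): qtcf_ms(400, 0.8) ~= 430.9 ms
```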

  6. Tafenoquine at therapeutic concentrations does not prolong fridericia-corrected QT interval in healthy subjects

    PubMed Central

    Green, Justin A; Patel, Apurva K; Patel, Bela R; Hussaini, Azra; Harrell, Emma J; McDonald, Mirna J; Carter, Nick; Mohamed, Khadeeja; Duparc, Stephan; Miller, Ann K

    2014-01-01

Tafenoquine is being developed for relapse prevention in Plasmodium vivax malaria. This Phase I, single-blind, randomized, placebo- and active-controlled parallel-group study investigated whether tafenoquine at supratherapeutic and therapeutic concentrations prolonged cardiac repolarization in healthy volunteers. Subjects aged 18–65 years were randomized to one of five treatment groups (n = 52 per group) to receive placebo, tafenoquine 300, 600, or 1200 mg, or moxifloxacin 400 mg (positive control). Lack of effect was demonstrated if the upper 90% CI of the change from baseline in QTcF following supratherapeutic tafenoquine 1200 mg versus placebo (ΔΔQTcF) was <10 milliseconds for all pre-defined time points. The maximum ΔΔQTcF with tafenoquine 1200 mg (n = 50) was 6.39 milliseconds (90% CI 2.85, 9.94) at 72 hours post-final dose; that is, lack of effect for prolongation of cardiac repolarization was demonstrated. Tafenoquine 300 mg (n = 48) or 600 mg (n = 52) had no effect on ΔΔQTcF. Pharmacokinetic/pharmacodynamic modeling of the tafenoquine–QTcF concentration–effect relationship demonstrated a shallow slope (0.5 ms per μg/mL) over a wide concentration range. For moxifloxacin (n = 51), maximum ΔΔQTcF was 8.52 milliseconds (90% CI 5.00, 12.04), demonstrating assay sensitivity. In this thorough QT/QTc study, tafenoquine did not have a clinically meaningful effect on cardiac repolarization. PMID:24700490

  7. Comparison of the therapeutic effects of epoetin zeta and epoetin alpha in the correction of renal anaemia.

    PubMed

    Krivoshiev, Stefan; Todorov, Vasil V; Manitius, Jacek; Czekalski, Stanislaw; Scigalla, Paul; Koytchev, Rossen

    2008-05-01

    To assess the therapeutic equivalence of epoetin zeta and epoetin alpha for correction of haemoglobin (Hb) concentration in patients with anaemia and chronic kidney disease (CKD) stage 5 maintained on haemodialysis. In total, 609 patients with CKD and anaemia (Hb < 9 g/dL) were randomly assigned to receive either epoetin zeta or epoetin alpha intravenously, one to three times per week for 24 weeks. Dosing was titrated individually to achieve a stable, target Hb concentration of 11-12 g/dL. Primary endpoints were the mean weekly dose of epoetin per kilogram of body weight and mean Hb concentration during the last 4 weeks of treatment. Safety endpoints were the occurrence of anti-erythropoietin antibodies, ratings of tolerability and adverse events (AEs). Mean (+/- standard deviation [SD]) Hb concentration over the last 4 weeks of treatment was 11.61 +/- 1.27 g/dL for patients receiving epoetin zeta, compared with 11.63 +/- 1.37 g/dL for patients receiving epoetin alpha (95% confidence interval [CI]: -0.25 to 0.20 g/dL). Mean (+/- SD) epoetin zeta weekly dose over the last 4 weeks of treatment was 182.20 +/- 118.11 IU/kg/wk, compared with 166.14 +/- 109.85 IU/kg/wk for epoetin alpha (95% CI: -3.21 to 35.34 IU/kg/wk). The most commonly reported AEs (> 5% of patients) were infections and infestations (12.5% and 12.8% of patients treated with epoetin zeta and epoetin alpha, respectively) and vascular disorders (8.5% and 8.9%, respectively). No patients developed neutralizing anti-erythropoietin antibodies. Epoetin zeta, administered intravenously, is therapeutically equivalent to epoetin alpha in the correction of low Hb concentration in patients with CKD undergoing haemodialysis. No unexpected AEs were seen and both epoetin zeta and epoetin alpha were well tolerated.

  8. Travel cost demand model based river recreation benefit estimates with on-site and household surveys: Comparative results and a correction procedure

    NASA Astrophysics Data System (ADS)

    Loomis, John

    2003-04-01

Past recreation studies have noted that on-site or visitor-intercept surveys over-sample avid users (i.e., endogenous stratification) and have offered econometric solutions to correct for this. However, past papers neither estimate the empirical magnitude of the bias in benefit estimates with a real data set nor compare the corrected estimates to benefit estimates derived from a population sample. This paper empirically examines the magnitude of the bias in recreation benefits per trip by comparing estimates from an on-site river visitor-intercept survey to a household survey. The difference in average benefits is quite large, with the on-site visitor survey yielding $24 per day trip while the household survey yields $9.67 per day trip. A simple econometric correction for endogenous stratification in our count data model lowers the benefit estimate to $9.60 per day trip, a mean value nearly identical to, and not statistically different from, the household survey estimate.
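For a Poisson trip-frequency model, a standard correction for on-site sampling (truncation plus endogenous stratification) is to fit the model to trips minus one; consumer surplus per trip is then the negative reciprocal of the travel-cost coefficient. A minimal sketch, with hypothetical data arrays:

```python
import numpy as np
import statsmodels.api as sm

def onsite_corrected_poisson(trips, covariates):
    """Fit a Poisson trip model on (trips - 1), the textbook fix for
    truncated, endogenously stratified on-site count data."""
    X = sm.add_constant(covariates)          # includes the travel-cost column
    return sm.GLM(trips - 1, X, family=sm.families.Poisson()).fit()

# Per-trip consumer surplus in this model is -1 / beta_travel_cost, which
# is how the on-site estimate falls toward the household-survey value.
```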

  9. Fiducial marker-based correction for involuntary motion in weight-bearing C-arm CT scanning of knees. Part I. Numerical model-based optimization

    PubMed Central

    Choi, Jang-Hwan; Fahrig, Rebecca; Keil, Andreas; Besier, Thor F.; Pal, Saikat; McWalter, Emily J.; Beaupré, Gary S.; Maier, Andreas

    2013-01-01

    Purpose: Human subjects in standing positions are apt to show much more involuntary motion than in supine positions. The authors aimed to simulate a complicated realistic lower body movement using the four-dimensional (4D) digital extended cardiac-torso (XCAT) phantom. The authors also investigated fiducial marker-based motion compensation methods in two-dimensional (2D) and three-dimensional (3D) space. The level of involuntary movement-induced artifacts and image quality improvement were investigated after applying each method. Methods: An optical tracking system with eight cameras and seven retroreflective markers enabled us to track involuntary motion of the lower body of nine healthy subjects holding a squat position at 60° of flexion. The XCAT-based knee model was developed using the 4D XCAT phantom and the optical tracking data acquired at 120 Hz. The authors divided the lower body in the XCAT into six parts and applied unique affine transforms to each so that the motion (6 degrees of freedom) could be synchronized with the optical markers’ location at each time frame. The control points of the XCAT were tessellated into triangles and 248 projection images were created based on intersections of each ray and monochromatic absorption. The tracking data sets with the largest motion (Subject 2) and the smallest motion (Subject 5) among the nine data sets were used to animate the XCAT knee model. The authors defined eight skin control points well distributed around the knees as pseudo-fiducial markers which functioned as a reference in motion correction. Motion compensation was done in the following ways: (1) simple projection shifting in 2D, (2) deformable projection warping in 2D, and (3) rigid body warping in 3D. Graphics hardware accelerated filtered backprojection was implemented and combined with the three correction methods in order to speed up the simulation process. Correction fidelity was evaluated as a function of number of markers used (4–12) and
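Of the three compensation schemes, simple 2D projection shifting is the easiest to sketch: translate each projection by the mean offset of the tracked fiducials from their reference positions. Marker coordinates here are hypothetical (u, v) detector positions:

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def shift_correct(projection, marker_uv, marker_uv_ref):
    """2D motion compensation by rigid projection shifting.

    marker_uv:     (n_markers, 2) tracked (u, v) positions in this frame.
    marker_uv_ref: (n_markers, 2) reference positions (e.g., first frame).
    """
    du, dv = (marker_uv_ref - marker_uv).mean(axis=0)
    return nd_shift(projection, shift=(dv, du), order=1)  # rows = v, cols = u
```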

  10. Winner's Curse Correction and Variable Thresholding Improve Performance of Polygenic Risk Modeling Based on Genome-Wide Association Study Summary-Level Data

    PubMed Central

    Shi, Jianxin; Duan, Jubao; Berndt, Sonja T.; Moy, Winton; Yu, Kai; Song, Lei; Wheeler, William; Hua, Xing; Silverman, Debra; Garcia-Closas, Montserrat; Hsiung, Chao Agnes; Figueroa, Jonine D.; Cortessis, Victoria K.; Malats, Núria; Karagas, Margaret R.; Vineis, Paolo; Chang, I-Shou; Lin, Dongxin; Zhou, Baosen; Seow, Adeline; Hong, Yun-Chul; Caporaso, Neil E.; Wolpin, Brian; Jacobs, Eric; Petersen, Gloria M.; Klein, Alison P.; Li, Donghui; Risch, Harvey; Sanders, Alan R.; Hsu, Li; Schoen, Robert E.; Brenner, Hermann; Stolzenberg-Solomon, Rachael; Gejman, Pablo; Lan, Qing; Rothman, Nathaniel; Amundadottir, Laufey T.; Landi, Maria Teresa; Levinson, Douglas F.; Chanock, Stephen J.; Chatterjee, Nilanjan

    2016-01-01

    Recent heritability analyses have indicated that genome-wide association studies (GWAS) have the potential to improve genetic risk prediction for complex diseases based on polygenic risk score (PRS), a simple modelling technique that can be implemented using summary-level data from the discovery samples. We herein propose modifications to improve the performance of PRS. We introduce threshold-dependent winner’s-curse adjustments for marginal association coefficients that are used to weight the single-nucleotide polymorphisms (SNPs) in PRS. Further, as a way to incorporate external functional/annotation knowledge that could identify subsets of SNPs highly enriched for associations, we propose variable thresholds for SNPs selection. We applied our methods to GWAS summary-level data of 14 complex diseases. Across all diseases, a simple winner’s curse correction uniformly led to enhancement of performance of the models, whereas incorporation of functional SNPs was beneficial only for selected diseases. Compared to the standard PRS algorithm, the proposed methods in combination led to notable gain in efficiency (25–50% increase in the prediction R2) for 5 of 14 diseases. As an example, for GWAS of type 2 diabetes, winner’s curse correction improved prediction R2 from 2.29% based on the standard PRS to 3.10% (P = 0.0017) and incorporating functional annotation data further improved R2 to 3.53% (P = 2×10−5). Our simulation studies illustrate why differential treatment of certain categories of functional SNPs, even when shown to be highly enriched for GWAS-heritability, does not lead to proportionate improvement in genetic risk-prediction because of non-uniform linkage disequilibrium structure. PMID:28036406

  11. Winner's Curse Correction and Variable Thresholding Improve Performance of Polygenic Risk Modeling Based on Genome-Wide Association Study Summary-Level Data.

    PubMed

    Shi, Jianxin; Park, Ju-Hyun; Duan, Jubao; Berndt, Sonja T; Moy, Winton; Yu, Kai; Song, Lei; Wheeler, William; Hua, Xing; Silverman, Debra; Garcia-Closas, Montserrat; Hsiung, Chao Agnes; Figueroa, Jonine D; Cortessis, Victoria K; Malats, Núria; Karagas, Margaret R; Vineis, Paolo; Chang, I-Shou; Lin, Dongxin; Zhou, Baosen; Seow, Adeline; Matsuo, Keitaro; Hong, Yun-Chul; Caporaso, Neil E; Wolpin, Brian; Jacobs, Eric; Petersen, Gloria M; Klein, Alison P; Li, Donghui; Risch, Harvey; Sanders, Alan R; Hsu, Li; Schoen, Robert E; Brenner, Hermann; Stolzenberg-Solomon, Rachael; Gejman, Pablo; Lan, Qing; Rothman, Nathaniel; Amundadottir, Laufey T; Landi, Maria Teresa; Levinson, Douglas F; Chanock, Stephen J; Chatterjee, Nilanjan

    2016-12-01

    Recent heritability analyses have indicated that genome-wide association studies (GWAS) have the potential to improve genetic risk prediction for complex diseases based on polygenic risk score (PRS), a simple modelling technique that can be implemented using summary-level data from the discovery samples. We herein propose modifications to improve the performance of PRS. We introduce threshold-dependent winner's-curse adjustments for marginal association coefficients that are used to weight the single-nucleotide polymorphisms (SNPs) in PRS. Further, as a way to incorporate external functional/annotation knowledge that could identify subsets of SNPs highly enriched for associations, we propose variable thresholds for SNPs selection. We applied our methods to GWAS summary-level data of 14 complex diseases. Across all diseases, a simple winner's curse correction uniformly led to enhancement of performance of the models, whereas incorporation of functional SNPs was beneficial only for selected diseases. Compared to the standard PRS algorithm, the proposed methods in combination led to notable gain in efficiency (25-50% increase in the prediction R2) for 5 of 14 diseases. As an example, for GWAS of type 2 diabetes, winner's curse correction improved prediction R2 from 2.29% based on the standard PRS to 3.10% (P = 0.0017) and incorporating functional annotation data further improved R2 to 3.53% (P = 2×10-5). Our simulation studies illustrate why differential treatment of certain categories of functional SNPs, even when shown to be highly enriched for GWAS-heritability, does not lead to proportionate improvement in genetic risk-prediction because of non-uniform linkage disequilibrium structure.
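One common form of the winner's-curse adjustment replaces each selected coefficient by the mean of a normal distribution truncated at the selection threshold; a root-find inverts the conditional expectation. This sketch is one variant of the idea, not necessarily the authors' exact estimator:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def winners_curse_adjust(beta_hat, se, p_threshold):
    """Adjust a GWAS coefficient selected because |z| exceeded the
    threshold implied by p_threshold (two-sided)."""
    c = norm.isf(p_threshold / 2.0)
    z = beta_hat / se

    def conditional_mean(mu):        # E[z | |z| > c] for true mean mu
        p = norm.cdf(mu - c) + norm.cdf(-mu - c)
        return mu + (norm.pdf(c - mu) - norm.pdf(c + mu)) / p

    mu = brentq(lambda m: conditional_mean(m) - z, -abs(z) - 10, abs(z) + 10)
    return mu * se                   # de-biased coefficient for the PRS
```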

  12. Influence of the partial volume correction method on 18F-fluorodeoxyglucose brain kinetic modelling from dynamic PET images reconstructed with resolution model based OSEM

    PubMed Central

    Bowen, Spencer L.; Byars, Larry G.; Michel, Christian J.; Chonde, Daniel B.; Catana, Ciprian

    2014-01-01

    Kinetic parameters estimated from dynamic 18F-fluorodeoxyglucose PET acquisitions have been used frequently to assess brain function in humans. Neglecting partial volume correction (PVC) for a dynamic series has been shown to produce significant bias in model estimates. Accurate PVC requires a space-variant model describing the reconstructed image spatial point spread function (PSF) that accounts for resolution limitations, including non-uniformities across the field of view due to the parallax effect. For OSEM, image resolution convergence is local and influenced significantly by the number of iterations, the count density, and the background-to-target ratio. As both count density and background-to-target values for a brain structure can change during a dynamic scan, the local image resolution may also concurrently vary. When PVC is applied post-reconstruction, the kinetic parameter estimates may be biased if the frame-dependent resolution is neglected. We explored the influence of the PVC method and implementation on kinetic parameters estimated by fitting 18F-fluorodeoxyglucose dynamic data acquired on a dedicated brain PET scanner and reconstructed with and without PSF modelling in the OSEM algorithm. The performance of several PVC algorithms was quantified with a phantom experiment, an anthropomorphic Monte Carlo simulation, and a patient scan. Using the last-frame reconstructed image only for regional spread function (RSF) generation, as opposed to computing RSFs for each frame independently, and applying perturbation geometric transfer matrix (GTM) PVC with PSF-based OSEM produced kinetic parameter estimates with the lowest magnitude of bias in most instances, although at the cost of increased noise compared to the PVC methods utilizing conventional OSEM. Use of the last-frame RSFs for PVC with no PSF modelling in the OSEM algorithm produced the lowest bias in cerebral metabolic rate of glucose (CMRGlc) estimates, although by less than 5% in most cases compared to the other PVC methods. The results indicate that the PVC implementation
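
    The geometric transfer matrix (GTM) step at the heart of these PVC variants reduces to building a small region-to-region cross-contamination matrix from the RSFs and inverting it. Below is a minimal sketch of a conventional GTM solve; the frame-dependent RSF choices compared in the paper are not shown, and all names are illustrative.

        import numpy as np

        def gtm_pvc(observed_means, rsfs, region_masks):
            """Geometric transfer matrix partial volume correction.

            rsfs[j]        : RSF image of region j (its mask convolved with the PSF)
            region_masks[i]: boolean mask image of region i
            observed_means : measured mean activity in each region
            """
            n = len(region_masks)
            W = np.empty((n, n))
            for i in range(n):
                for j in range(n):
                    # Average fraction of region j's signal seen inside region i.
                    W[i, j] = rsfs[j][region_masks[i]].mean()
            # Undo the cross-contamination to recover the true regional means.
            return np.linalg.solve(W, np.asarray(observed_means, dtype=float))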

  13. Influence of the partial volume correction method on 18F-fluorodeoxyglucose brain kinetic modelling from dynamic PET images reconstructed with resolution model based OSEM

    NASA Astrophysics Data System (ADS)

    Bowen, Spencer L.; Byars, Larry G.; Michel, Christian J.; Chonde, Daniel B.; Catana, Ciprian

    2013-10-01

    Kinetic parameters estimated from dynamic 18F-fluorodeoxyglucose (18F-FDG) PET acquisitions have been used frequently to assess brain function in humans. Neglecting partial volume correction (PVC) for a dynamic series has been shown to produce significant bias in model estimates. Accurate PVC requires a space-variant model describing the reconstructed image spatial point spread function (PSF) that accounts for resolution limitations, including non-uniformities across the field of view due to the parallax effect. For ordered subsets expectation maximization (OSEM), image resolution convergence is local and influenced significantly by the number of iterations, the count density, and the background-to-target ratio. As both count density and background-to-target values for a brain structure can change during a dynamic scan, the local image resolution may also concurrently vary. When PVC is applied post-reconstruction, the kinetic parameter estimates may be biased if the frame-dependent resolution is neglected. We explored the influence of the PVC method and implementation on kinetic parameters estimated by fitting 18F-FDG dynamic data acquired on a dedicated brain PET scanner and reconstructed with and without PSF modelling in the OSEM algorithm. The performance of several PVC algorithms was quantified with a phantom experiment, an anthropomorphic Monte Carlo simulation, and a patient scan. Using the last-frame reconstructed image only for regional spread function (RSF) generation, as opposed to computing RSFs for each frame independently, and applying perturbation geometric transfer matrix PVC with PSF-based OSEM produced kinetic parameter estimates with the lowest magnitude of bias in most instances, although at the cost of increased noise compared to the PVC methods utilizing conventional OSEM. Use of the last-frame RSFs for PVC with no PSF modelling in the OSEM algorithm produced the lowest bias in cerebral metabolic rate of glucose estimates, although by less than 5% in most

  14. A Budget Impact Analysis of Newly Available Hepatitis C Therapeutics and the Financial Burden on a State Correctional System.

    PubMed

    Nguyen, John T; Rich, Josiah D; Brockmann, Bradley W; Vohr, Fred; Spaulding, Anne; Montague, Brian T

    2015-08-01

    Hepatitis C virus (HCV) infection continues to disproportionately affect incarcerated populations. New HCV drugs present opportunities and challenges to address HCV in corrections. The goal of this study was to evaluate the impact of the treatment costs for HCV infection in a state correctional population through a budget impact analysis comparing differing treatment strategies. Electronic and paper medical records were reviewed to estimate the prevalence of hepatitis C within the Rhode Island Department of Corrections. Three treatment strategies were evaluated as follows: (1) treating all chronically infected persons, (2) treating only patients with demonstrated fibrosis, and (3) treating only patients with advanced fibrosis. Budget impact was computed as the percentage of pharmacy and overall healthcare expenditures accrued by total drug costs assuming entirely interferon-free therapy. Sensitivity analyses assessed potential variance in costs related to variability in HCV prevalence, genotype, estimated variation in market pricing, length of stay for the sentenced population, and uptake of newly available regimens. Chronic HCV prevalence was estimated at 17% of the total population. Treating all sentenced inmates with at least 6 months remaining of their sentence would cost about $34 million, 13 times the pharmacy budget and almost twice the overall healthcare budget. Treating only inmates with advanced fibrosis would cost about $15 million. Even with a hypothetical 50% reduction in total drug costs for future therapies, treating all eligible inmates would cost $17 million. With the immense costs projected for the new treatments, it is unlikely that correctional facilities will have the capacity to treat all those afflicted with HCV. Alternative payment strategies in collaboration with outside programs may be necessary to curb this epidemic. In order to improve care and treatment delivery, drug costs also need to be seriously reevaluated to be more accessible and equitable now that HCV
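
    The budget-impact figures above are simple arithmetic: the total drug cost of a strategy expressed as a multiple of the existing budgets. The sketch below shows the computation; the input numbers are hypothetical placeholders of the same order as the abstract's quoted figures, not the study's actual counts or prices.

        def budget_impact(n_treated, cost_per_course, pharmacy_budget, healthcare_budget):
            """Express total drug cost as multiples of pharmacy and healthcare budgets."""
            total = n_treated * cost_per_course
            return total, total / pharmacy_budget, total / healthcare_budget

        # Hypothetical inputs, chosen only to be of the same order as the abstract.
        total, x_pharm, x_health = budget_impact(
            n_treated=400, cost_per_course=85_000,
            pharmacy_budget=2_600_000, healthcare_budget=18_000_000)
        print(f"${total / 1e6:.0f}M total; {x_pharm:.0f}x pharmacy; {x_health:.1f}x healthcare")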

  15. Determination of the quenching correction factors for plastic scintillation detectors in therapeutic high-energy proton beams

    PubMed Central

    Wang, L L W; Perles, L A; Archambault, L; Sahoo, N; Mirkovic, D; Beddar, S

    2013-01-01

    Plastic scintillation detectors (PSDs) have many advantages over other detectors in small-field dosimetry due to their high spatial resolution, excellent water equivalence and instantaneous readout. However, in proton beams, PSDs undergo a quenching effect that significantly reduces the signal level when the detector is close to the Bragg peak, where the linear energy transfer (LET) of protons is very high. This study measures the quenching correction factor (QCF) for a PSD in clinical passive-scattering proton beams and investigates the feasibility of using PSDs in depth-dose measurements in proton beams. A polystyrene-based PSD (BCF-12, ϕ0.5mm×4mm) was used to measure the depth-dose curves in a water phantom for monoenergetic unmodulated proton beams of nominal energies 100, 180 and 250 MeV. A Markus plane-parallel ion chamber was also used to obtain the dose distributions for the same proton beams. From these results, the QCF as a function of depth was derived for these proton beams. Next, the LET depth distributions for these proton beams were calculated using the MCNPX Monte Carlo code, based on the experimentally validated nozzle models for these passive-scattering proton beams. The relationship between the QCF and the proton LET could then be derived as an empirical formula. Finally, the obtained empirical formula was applied to the PSD measurements to obtain the corrected depth-dose curves, which were compared to the ion chamber measurements. A linear relationship between QCF and LET, i.e. Birks' formula, was obtained for the proton beams studied. The result is in agreement with the literature. The PSD measurements after the quenching corrections agree with ion chamber measurements within 5%. PSDs are good dosimeters for proton beam measurement if the quenching effect is corrected appropriately. PMID:23128412
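
    Given the reported linear (Birks-type) dependence of the QCF on LET, applying the correction is a one-line rescaling once LET versus depth is known from Monte Carlo. The sketch below assumes a normalized QCF of the form a + kB·LET; the calibrated constants from the paper are not reproduced, and the names are illustrative.

        import numpy as np

        def quenching_correct(d_scint, let, kB, a=1.0):
            """Correct PSD depth-dose readings with a Birks-type linear QCF.

            d_scint: scintillator-measured dose at each depth
            let    : proton LET at the same depths (e.g., from a Monte Carlo model)
            QCF(LET) = a + kB * LET, normalized to the low-LET entrance region.
            """
            qcf = a + kB * np.asarray(let, dtype=float)
            return np.asarray(d_scint, dtype=float) * qcf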

  16. Concurrent progress of reprogramming and gene correction to overcome therapeutic limitation of mutant ALK2-iPSC

    PubMed Central

    Kim, Bu-Yeo; Jeong, SangKyun; Lee, Seo-Young; Lee, So Min; Gweon, Eun Jeong; Ahn, Hyunjun; Kim, Janghwan; Chung, Sun-Ku

    2016-01-01

    Fibrodysplasia ossificans progressiva (FOP) syndrome is caused by mutation of the gene ACVR1, which encodes a constitutively active bone morphogenetic protein type I receptor (also called ALK2) that induces heterotopic ossification in the patient. To genetically correct it, we attempted to generate mutant ALK2-iPSCs (mALK2-iPSCs) from FOP human dermal fibroblasts. However, mALK2 inhibits pluripotency maintenance and impairs clonogenic potential after single-cell dissociation, an inevitable step when applying gene-correction tools to induced pluripotent stem cells (iPSCs). The current iPSC-based gene therapy approach thus has a limitation: it is not readily applicable to iPSCs with the ALK2 mutation. Here we developed a simplified one-step procedure that simultaneously introduces reprogramming and gene-editing components into human fibroblasts derived from a patient with FOP syndrome, correcting the mutation genetically in a single step. The mixture of reprogramming and gene-editing components is composed of reprogramming episomal vectors, CRISPR/Cas9-expressing vectors and a single-stranded oligodeoxynucleotide harboring the normal base to correct ALK2 c.617G>A. The one-step-mediated ALK2 gene-corrected iPSCs restored the global gene expression pattern, as well as mineralization, to the extent of normal iPSCs. This procedure not only saves time, labor and costs but also opens up a new paradigm beyond the current application of gene-editing methodologies, which is hampered by inhibitory pluripotency-maintenance requirements or the vulnerability of single-cell-dissociated iPSCs. PMID:27256111

  17. Predictive factors for obtaining a correct therapeutic range using antivitamin K anticoagulants: a tertiary center experience of patient adherence to anticoagulant therapy

    PubMed Central

    Jurcuţ, Ruxandra; Militaru, Sebastian; Geavlete, Oliviana; Drăgotoiu, Nic; Sipoş, Sergiu; Roşulescu, Răzvan; Ginghină, Carmen; Jurcuţ, Ciprian

    2015-01-01

    Background Patient adherence is an essential factor in obtaining efficient oral anticoagulation using vitamin K antagonists (VKAs), a therapy with a narrow therapeutic window. Therefore, patient education and awareness are crucial for good management. Auditing the current situation would help to identify the magnitude of the problem and to build tailored education programs for these patients. Methods This study included 68 hospitalized chronically anticoagulated patients (mean age 62.6±13.1 years; males, 46%) who responded to a 26-item questionnaire to assess their knowledge on VKA therapy management. Laboratory and clinical data were used to determine the international normalized ratio (INR) at admission, as well as to calculate CHA2DS2-VASC and HAS-BLED scores for patients with atrial fibrillation. Results The majority of patients (62%) were receiving VKA for atrial fibrillation, the others for a mechanical prosthesis and previous thromboembolic disease or stroke. In the atrial fibrillation group, the mean CHA2DS2-VASC score was 3.1±1.5, while the average HAS-BLED score was 1.8±1.2. More than half of the patients (53%) had an INR outside of the therapeutic range at admission, most of them (43% of all patients) having a low INR. A correct INR value was predicted by education level (higher education) and the diagnostic indication (patients with mechanical prosthesis being best managed). Patients presenting with a therapeutic INR had a trend toward longer treatment duration than those outside the therapeutic range (62±72 months versus 36±35 months, respectively, P=0.06). There was no correlation between INR at admission and the patient’s living conditions, INR monitoring frequency, and bleeding history. Conclusion In a tertiary cardiology center, more than half of patients receiving VKAs are admitted with an INR falling outside the therapeutic range, irrespective of the bleeding or embolic risk. Patients with a mechanical prosthesis and complex antithrombotic regimens

  18. Increasing the Endoplasmic Reticulum Pool of the F508del Allele of the Cystic Fibrosis Transmembrane Conductance Regulator Leads to Greater Folding Correction by Small Molecule Therapeutics

    PubMed Central

    Chung, W. Joon; Goeckeler-Fried, Jennifer L.; Havasi, Viktoria; Chiang, Annette; Rowe, Steven M.; Plyler, Zackery E.; Hong, Jeong S.; Mazur, Marina; Piazza, Gary A.; Keeton, Adam B.; White, E. Lucile; Rasmussen, Lynn; Weissman, Allan M.; Denny, R. Aldrin; Brodsky, Jeffrey L.; Sorscher, Eric J.

    2016-01-01

    Small molecules that correct the folding defects and enhance surface localization of the F508del mutation in the Cystic Fibrosis Transmembrane conductance Regulator (CFTR) comprise an important therapeutic strategy for cystic fibrosis lung disease. However, compounds that rescue the F508del mutant protein to wild type (WT) levels have not been identified. In this report, we consider obstacles to obtaining robust and therapeutically relevant levels of F508del CFTR. For example, markedly diminished steady state amounts of F508del CFTR compared to WT CFTR are present in recombinant bronchial epithelial cell lines, even when much higher levels of mutant transcript are present. In human primary airway cells, the paucity of Band B F508del is even more pronounced, although F508del and WT mRNA concentrations are comparable. Therefore, to augment levels of “repairable” F508del CFTR and identify small molecules that then correct this pool, we developed compound library screening protocols based on automated protein detection. First, cell-based imaging measurements were used to semi-quantitatively estimate distribution of F508del CFTR by high content analysis of two-dimensional images. We evaluated ~2,000 known bioactive compounds from the NIH Roadmap Molecular Libraries Small Molecule Repository in a pilot screen and identified agents that increase the F508del protein pool. Second, we analyzed ~10,000 compounds representing diverse chemical scaffolds for effects on total CFTR expression using a multi-plate fluorescence protocol and describe compounds that promote F508del maturation. Together, our findings demonstrate proof of principle that agents identified in this fashion can augment the level of endoplasmic reticulum (ER) resident “Band B” F508del CFTR suitable for pharmacologic correction. As further evidence in support of this strategy, PYR-41—a compound that inhibits the E1 ubiquitin activating enzyme—was shown to synergistically enhance F508del rescue by C

  19. Increasing the Endoplasmic Reticulum Pool of the F508del Allele of the Cystic Fibrosis Transmembrane Conductance Regulator Leads to Greater Folding Correction by Small Molecule Therapeutics.

    PubMed

    Chung, W Joon; Goeckeler-Fried, Jennifer L; Havasi, Viktoria; Chiang, Annette; Rowe, Steven M; Plyler, Zackery E; Hong, Jeong S; Mazur, Marina; Piazza, Gary A; Keeton, Adam B; White, E Lucile; Rasmussen, Lynn; Weissman, Allan M; Denny, R Aldrin; Brodsky, Jeffrey L; Sorscher, Eric J

    2016-01-01

    Small molecules that correct the folding defects and enhance surface localization of the F508del mutation in the Cystic Fibrosis Transmembrane conductance Regulator (CFTR) comprise an important therapeutic strategy for cystic fibrosis lung disease. However, compounds that rescue the F508del mutant protein to wild type (WT) levels have not been identified. In this report, we consider obstacles to obtaining robust and therapeutically relevant levels of F508del CFTR. For example, markedly diminished steady state amounts of F508del CFTR compared to WT CFTR are present in recombinant bronchial epithelial cell lines, even when much higher levels of mutant transcript are present. In human primary airway cells, the paucity of Band B F508del is even more pronounced, although F508del and WT mRNA concentrations are comparable. Therefore, to augment levels of "repairable" F508del CFTR and identify small molecules that then correct this pool, we developed compound library screening protocols based on automated protein detection. First, cell-based imaging measurements were used to semi-quantitatively estimate distribution of F508del CFTR by high content analysis of two-dimensional images. We evaluated ~2,000 known bioactive compounds from the NIH Roadmap Molecular Libraries Small Molecule Repository in a pilot screen and identified agents that increase the F508del protein pool. Second, we analyzed ~10,000 compounds representing diverse chemical scaffolds for effects on total CFTR expression using a multi-plate fluorescence protocol and describe compounds that promote F508del maturation. Together, our findings demonstrate proof of principle that agents identified in this fashion can augment the level of endoplasmic reticulum (ER) resident "Band B" F508del CFTR suitable for pharmacologic correction. As further evidence in support of this strategy, PYR-41, a compound that inhibits the E1 ubiquitin activating enzyme, was shown to synergistically enhance F508del rescue by C18, a small

  20. CORRECTED ERROR VIDEO VERSUS A PHYSICAL THERAPIST INSTRUCTED HOME EXERCISE PROGRAM: ACCURACY OF PERFORMING THERAPEUTIC SHOULDER EXERCISES

    PubMed Central

    Krishnamurthy, Kamesh; Hopp, Jennifer; Stanley, Laura; Spores, Ken; Braunreiter, David

    2016-01-01

    Background and Purpose The accurate performance of physical therapy exercises can be difficult. In this evolving healthcare climate it is important to continually look for better methods to educate patients. The use of handouts, in-person demonstration, and video instruction are all potential avenues used to teach proper exercise form. The purpose of this study was to examine if a corrected error video (CEV) would be as effective as a single visit with a physical therapist (PT) to teach healthy subjects how to properly perform four different shoulder rehabilitation exercises. Study Design This was a prospective, single-blinded interventional trial. Methods Fifty-eight subjects with no shoulder complaints were recruited from two institutions and randomized into one of two groups: the CEV group (30 subjects) was given a CEV comprised of four shoulder exercises, while the physical therapy group (28 subjects) had one session with a PT as well as a handout of how to complete the exercises. Each subject practiced the exercises for one week and was then videotaped performing them during a return visit. Videos were scored with the shoulder exam assessment tool (SEAT) created by the authors. Results There was no difference between the groups on total SEAT score (13.66 ± 0.29 vs 13.46 ± 0.30 for CEV vs PT, p = 0.64, 95% CI [−0.06, 0.037]). Average scores for individual exercises also showed no significant difference. Conclusion/Clinical Relevance These results demonstrate that the inexpensive and accessible CEV is as beneficial as direct instruction in teaching subjects to properly perform shoulder rehabilitation exercises. Level of Evidence 1b PMID:27757288

  1. PET/MRI for Oncologic Brain Imaging: A Comparison of Standard MR-Based Attenuation Corrections with a Model-Based Approach for the Siemens mMR PET/MR System.

    PubMed

    Rausch, Ivo; Rischka, Lucas; Ladefoged, Claes N; Furtner, Julia; Fenchel, Matthias; Hahn, Andreas; Lanzenberger, Rupert; Mayerhoefer, Marius E; Traub-Weidinger, Tatjana; Beyer, Thomas

    2017-09-01

    The aim of this study was to compare attenuation-correction (AC) approaches for PET/MRI in clinical neurooncology. Methods: Forty-nine PET/MRI brain scans were included: brain tumor studies using (18)F-fluoro-ethyl-tyrosine ((18)F-FET) (n = 31) and (68)Ga-DOTANOC (n = 7) and studies of healthy subjects using (18)F-FDG (n = 11). For each subject, MR-based AC maps (MR-AC) were acquired using the standard DIXON- and ultrashort echo time (UTE)-based approaches. A third MR-AC was calculated using a model-based, postprocessing approach to account for bone attenuation values (BD, noncommercial prototype software by Siemens Healthcare). As a reference, AC maps were derived from patient-specific CT images (CTref). PET data were reconstructed using standard settings after AC with all 4 AC methods. We report changes in diagnosis for all brain tumor patients and the following relative difference values (RDs [%]) with regard to AC-CTref: (A) for (18)F-FET, SUVs as well as volumes of interest (VOIs) defined by a 70% threshold for all segmented lesions, and lesion-to-background ratios; (B) for (68)Ga-DOTANOC, SUVs as well as VOIs defined by a 50% threshold for all lesions and the pituitary gland; and (C) for (18)F-FDG, RDs of SUVs for the whole brain and 10 anatomic regions segmented on MR images. Results: For brain tumor imaging (A and B), the standard PET-based diagnosis was not affected by any of the 3 MR-AC methods. For A, the average RDs of SUVmean were -10%, -4%, and -3% and of the VOIs 1%, 2%, and 7% for DIXON, UTE, and BD, respectively. Lesion-to-background ratios for all MR-AC methods were similar to those of CTref. For B, average RDs of SUVmean were -11%, -11%, and -3% and of the VOIs 1%, -4%, and -3%, respectively. In the case of (18)F-FDG PET/MRI (C), RDs for the whole brain were -11%, -8%, and -5% for DIXON, UTE, and BD, respectively. Conclusion: The diagnostic reading of PET/MR patients with brain tumors did not change with the chosen AC method. Quantitative accuracy of
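
    The relative-difference metric used throughout is the percentage deviation from the CT-derived reference; a one-line sketch (names illustrative):

        def relative_difference_pct(val_mrac, val_ctref):
            """RD [%] of an MR-based AC method with respect to CT-based AC."""
            return 100.0 * (val_mrac - val_ctref) / val_ctref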

  2. Correction of Murine Rag2 Severe Combined Immunodeficiency by Lentiviral Gene Therapy Using a Codon-optimized RAG2 Therapeutic Transgene

    PubMed Central

    van Til, Niek P; de Boer, Helen; Mashamba, Nomusa; Wabik, Agnieszka; Huston, Marshall; Visser, Trudi P; Fontana, Elena; Poliani, Pietro Luigi; Cassani, Barbara; Zhang, Fang; Thrasher, Adrian J; Villa, Anna; Wagemaker, Gerard

    2012-01-01

    Recombination activating gene 2 (RAG2) deficiency results in severe combined immunodeficiency (SCID) with complete lack of T and B lymphocytes. Initial gammaretroviral gene therapy trials for other types of SCID proved effective, but also revealed the necessity of safe vector design. We report the development of lentiviral vectors with the spleen focus forming virus (SF) promoter driving codon-optimized human RAG2 (RAG2co), which improved phenotype amelioration compared to native RAG2 in Rag2−/− mice. With the RAG2co therapeutic transgene, T-cell receptor (TCR) and immunoglobulin repertoire, T-cell mitogen responses, plasma immunoglobulin levels and T-cell dependent and independent specific antibody responses were restored. However, the thymus double positive T-cell population remained subnormal, possibly due to the SF virus-derived element being sensitive to methylation/silencing in the thymus; this was prevented by replacing the SF promoter with the previously reported silencing-resistant ubiquitous chromatin opening element (UCOE), which also improved B-cell reconstitution to eventually near-normal levels. Weak cellular promoters were effective in T-cell reconstitution, but deficient in B-cell reconstitution. We conclude that immune functions are corrected in Rag2−/− mice by genetic modification of stem cells using the UCOE-driven codon-optimized RAG2, providing a valid optional vector for clinical implementation. PMID:22692499

  3. A Physical Model-based Correction for Charge Traps in the Hubble Space Telescope’s Wide Field Camera 3 Near-IR Detector and Its Applications to Transiting Exoplanets and Brown Dwarfs

    NASA Astrophysics Data System (ADS)

    Zhou, Yifan; Apai, Dániel; Lew, Ben W. P.; Schneider, Glenn

    2017-06-01

    The Hubble Space Telescope Wide Field Camera 3 (WFC3) near-IR channel is extensively used in time-resolved observations, especially for transiting exoplanet spectroscopy as well as brown dwarf and directly imaged exoplanet rotational phase mapping. The ramp effect is the dominant source of systematics in the WFC3 for time-resolved observations, which limits its photometric precision. Current mitigation strategies are based on empirical fits and require additional orbits to help the telescope reach a thermal equilibrium. We show that the ramp-effect profiles can be explained and corrected with high fidelity using charge trapping theories. We also present a model for this process that can be used to predict and to correct charge trap systematics. Our model is based on a very small number of parameters that are intrinsic to the detector. We find that these parameters are very stable between the different data sets, and we provide best-fit values. Our model is tested with more than 120 orbits (∼40 visits) of WFC3 observations and proved able to provide near photon noise limited corrections for observations made with both staring and scanning modes of transiting exoplanets as well as for staring-mode observations of brown dwarfs. After our model correction, the light curve of the first orbit in each visit has the same photometric precision as subsequent orbits, so data from the first orbit no longer need to be discarded. Near-IR arrays with the same physical characteristics (e.g., JWST/NIRCam) may also benefit from the extension of this model if similar systematic profiles are observed.
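
    A single-population toy version of such a charge-trapping model conveys the mechanism: traps fill at a rate proportional to the incoming flux and the remaining capacity, and release on a fixed timescale, so the early frames of a visit read low. This sketch is a simplification under assumed parameters; the published model uses detector-calibrated constants and more than one trap population.

        import numpy as np

        def ramp_profile(flux, n_frames, dt, e_max, eta, tau, e0=0.0):
            """Observed count rates distorted by charge trapping (toy model).

            flux : intrinsic count rate (e-/s); eta: trapping efficiency
            e_max: trap capacity; tau: trap release timescale (s)
            """
            e = e0
            observed = np.empty(n_frames)
            for k in range(n_frames):
                trapping = eta * flux * (1.0 - e / e_max)  # slows as traps fill
                release = e / tau
                e += (trapping - release) * dt
                observed[k] = flux - trapping + release  # trapped charge is missing
            return observed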

  4. Model based manipulator control

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1989-01-01

    The feasibility of using model based control (MBC) for robotic manipulators was investigated. A double inverted pendulum system was constructed as the experimental system for a general study of dynamically stable manipulation. The original interest in dynamically stable systems was driven by the objective of high vertical reach (balancing), and the planning of inertially favorable trajectories for force and payload demands. The model-based control approach is described and the results of experimental tests are summarized. Results directly demonstrate that MBC can provide stable control at all speeds of operation and support operations requiring dynamic stability such as balancing. The application of MBC to systems with flexible links is also discussed.

  5. Model-Based Improvement

    DTIC Science & Technology

    2006-10-01

    Report documentation page fields only; no abstract is available. Performing organization: Carnegie Mellon University, Software Engineering Institute (SEI), Pittsburgh, PA 15213.

  6. Inkjet printer model-based halftoning.

    PubMed

    Lee, Je-Ho; Allebach, Jan P

    2005-05-01

    The quality of halftone prints produced by inkjet (IJ) printers can be limited by random dot-placement errors. While a large literature addresses model-based halftoning for electrophotographic printers, little work has been done on model-based halftoning for IJ printers. In this paper, we propose model-based approaches to both iterative least-squares halftoning and tone-dependent error diffusion (TDED). The particular approach to iterative least-squares halftoning that we use is direct binary search (DBS). For DBS, we use a stochastic model for the equivalent gray-scale image, based on measured dot statistics of printed IJ halftone patterns. For TDED, we train the tone-dependent weights and thresholds to mimic the spectrum of halftone textures generated by model-based DBS. We do this under a metric that enforces both the correct radially averaged spectral profile and angular symmetry at each radial frequency. Experimental results generated with simulated printers and a real printer show that both IJ model-based DBS and IJ model-based TDED very effectively suppress IJ printer-induced artifacts.
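
    As a concrete illustration of the TDED half of the paper, here is a minimal serpentine error-diffusion loop whose weights and threshold are looked up per input tone. The lookup functions would be trained offline (against DBS textures, as the abstract describes); the tables and names below are placeholders, not the trained values.

        import numpy as np

        def tded(img, weights_of, threshold_of):
            """Tone-dependent error diffusion for a grayscale image in [0, 255].

            weights_of(g)   -> (ahead, behind-below, below, ahead-below) weights
            threshold_of(g) -> binarization threshold for tone g
            """
            h, w = img.shape
            acc = img.astype(float).copy()
            out = np.zeros((h, w), dtype=np.uint8)
            for i in range(h):
                s = 1 if i % 2 == 0 else -1  # serpentine scan direction
                for j in (range(w) if s == 1 else range(w - 1, -1, -1)):
                    g = int(img[i, j])
                    out[i, j] = 255 if acc[i, j] >= threshold_of(g) else 0
                    err = acc[i, j] - out[i, j]
                    wa, wbb, wb, wab = weights_of(g)
                    if 0 <= j + s < w:
                        acc[i, j + s] += err * wa
                    if i + 1 < h:
                        if 0 <= j - s < w:
                            acc[i + 1, j - s] += err * wbb
                        acc[i + 1, j] += err * wb
                        if 0 <= j + s < w:
                            acc[i + 1, j + s] += err * wab
            return out

        # Example: Floyd-Steinberg weights and a mid-gray threshold for all tones.
        # halftone = tded(gray, lambda g: (7/16, 3/16, 5/16, 1/16), lambda g: 127.5)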

  7. The lumbar lordosis index: a new ratio to detect spinal malalignment with a therapeutic impact for sagittal balance correction decisions in adult scoliosis surgery.

    PubMed

    Boissière, Louis; Bourghli, Anouar; Vital, Jean-Marc; Gille, Olivier; Obeid, Ibrahim

    2013-06-01

    Sagittal malalignment is frequently observed in adult scoliosis. The C7 plumb line, lumbar lordosis and pelvic tilt are the main factors used to evaluate sagittal balance and the need for a vertebral osteotomy to correct it. We described a ratio, the lumbar lordosis index (LLI, the ratio of lumbar lordosis to pelvic incidence), and analyzed its relationships with spinal malalignment and vertebral osteotomies. 53 consecutive patients with surgical adult scoliosis had preoperative and postoperative full-spine EOS radiographs to measure spino-pelvic parameters and LLI. The lack of lordosis was calculated after prediction of the theoretical lumbar lordosis. Correlation analysis between the different parameters was performed. All parameters were correlated with spinal malalignment, but LLI was the most strongly correlated parameter (r = -0.978). It was also the best parameter in this study for predicting the need for a spinal osteotomy (r = 1 if LLI < 0.5). LLI is a statistically validated parameter for sagittal malalignment analysis. It can be used as a mathematical tool to detect spinal malalignment in adult scoliosis and guides the surgeon's decision to perform a vertebral osteotomy for sagittal correction in adult scoliosis. It can be used as well for the interpretation of clinical series in adult scoliosis.
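
    The index itself is a one-line ratio, and the decision rule quoted above follows directly (threshold from the abstract; names illustrative):

        def lumbar_lordosis_index(lumbar_lordosis_deg, pelvic_incidence_deg):
            """LLI = lumbar lordosis / pelvic incidence."""
            return lumbar_lordosis_deg / pelvic_incidence_deg

        def osteotomy_indicated(lli):
            """In the study, LLI < 0.5 predicted the need for a vertebral osteotomy."""
            return lli < 0.5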

  8. Model-Based Systems

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    2007-01-01

    Engineers, who design systems using text specification documents, focus their work upon the completed system to meet performance, time, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system and more difficult to maintain as several systems are combined into higher-level systems, are maintained over decades, and evolve technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in time and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data. They interrelate all of the data to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. The consistency and integrity of the model is assured; therefore, the consistency and integrity of the various specification documents is ensured. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed and how needs are being addressed by international standards writing teams.

  9. Correction of low HDL cholesterol to reduce cardiovascular risk: practical considerations relating to the therapeutic use of prolonged-release nicotinic acid (Niaspan).

    PubMed

    Vogt, A; Kassner, U; Hostalek, U; Steinhagen-Thiessen, E

    2007-11-01

    Substantial residual cardiovascular risk persists despite effective LDL lowering treatment in populations at elevated risk for adverse cardiovascular outcomes. Low HDL cholesterol is an independent cardiovascular risk factor and occurs in about one-third of patients treated for dyslipidaemia in Europe. Moreover, randomised intervention studies have shown that increasing HDL cholesterol improves cardiovascular outcomes. Correcting low HDL cholesterol therefore presents a rational and proven strategy for intervention to produce further reductions in cardiovascular risk beyond those possible with a statin alone. Nicotinic acid (niacin in the USA) is the most effective agent currently available for increasing levels of HDL cholesterol. A once-daily, prolonged-release formulation of nicotinic acid (Niaspan) is as effective on HDL cholesterol as the immediate-release formulation, and is equally effective at increasing HDL cholesterol whether or not patients are already taking a statin. Niaspan also shares the antiatherogenic benefit of nicotinic acid, and induced regression of atherosclerosis in patients with cardiovascular disease during a period of treatment of up to 2 years. The incidence of flushing, the principal side effect of nicotinic acid, is lower with Niaspan than with immediate-release nicotinic acid. Simple practical measures are available to minimise the incidence and impact of flushing, including careful dose titration and avoiding hot or spicy foods near the time of ingestion of Niaspan. The potential for hepatotoxicity, muscle toxicity or marked exacerbation of hyperglycaemia in diabetes with Niaspan is very low, with or without concomitant statin treatment. Niaspan provides a practical means of delivering the cardioprotective benefits associated with correction of low HDL cholesterol.

  10. Political Correctness--Correct?

    ERIC Educational Resources Information Center

    Boase, Paul H.

    1993-01-01

    Examines the phenomenon of political correctness, its roots and objectives, and its successes and failures in coping with the conflicts and clashes of multicultural campuses. Argues that speech codes indicate failure in academia's primary mission to civilize and educate through talk, discussion, thought, and persuasion. (SR)

  11. Model Based Definition

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  12. Principles of models based engineering

    SciTech Connect

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  13. Model-Based Fault Tolerant Control

    NASA Technical Reports Server (NTRS)

    Kumar, Aditya; Viassolo, Daniel

    2008-01-01

    The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. The identification of the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in their presence. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
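
    Both detection approaches consume residuals from an Extended Kalman Filter. Their common building block, gating the normalized innovation against a chi-square threshold, can be sketched as below; the task's multiple-hypothesis and neural-network classifiers are not shown, and the names are illustrative.

        import numpy as np

        def residual_fault_gate(innovation, innovation_cov, chi2_threshold):
            """Flag a fault when the normalized innovation squared exceeds a gate.

            innovation    : EKF measurement residual, z - h(x_pred)
            innovation_cov: predicted innovation covariance S
            """
            nis = float(innovation.T @ np.linalg.solve(innovation_cov, innovation))
            return nis > chi2_threshold, nis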

  14. Model-based tomographic reconstruction

    DOEpatents

    Chambers, David H; Lehman, Sean K; Goodman, Dennis M

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.

  15. Model-based machine learning

    PubMed Central

    Bishop, Christopher M.

    2013-01-01

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications. PMID:23277612

  16. Model-based machine learning.

    PubMed

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.

  17. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  18. Corrective work.

    ERIC Educational Resources Information Center

    Hill, Leslie A.

    1978-01-01

    Discusses some general principles for planning corrective instruction and exercises in English as a second language, and follows with examples from the areas of phonemics, phonology, lexicon, idioms, morphology, and syntax. (IFS/WGA)

  19. The Challenge of Configuring Model-Based Space Mission Planners

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy D.; Clement, Bradley J.; Chachere, John M.; Smith, Tristan B.; Swanson, Keith J.

    2011-01-01

    Mission planning is central to space mission operations, and has benefited from advances in model-based planning software. Constraints arise from many sources, including simulators and engineering specification documents, and ensuring that constraints are correctly represented in the planner is a challenge. As mission constraints evolve, planning domain modelers need help with modeling constraints efficiently using the available source data, catching errors quickly, and correcting the model. This paper describes the current state of the practice in designing model-based mission planning tools, the challenges facing model developers, and a proposed Interactive Model Development Environment (IMDE) to configure mission planning systems. We describe current and future technology developments that can be integrated into an IMDE.

  1. A tool for model based diagnostics of the AGS Booster

    SciTech Connect

    Luccio, A.

    1993-12-31

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on an HP-Apollo workstation system, has proved very general, and its results lend themselves to immediate physical interpretation.
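
    The iterative fitting step amounts to repeated linearized least squares: predict the orbit readings from the MAD-computed transfer matrices, difference against measurement, and solve for the error parameters. A minimal sketch of one iteration (all names illustrative):

        import numpy as np

        def fit_lattice_errors(jacobian, measured, predicted):
            """One linearized least-squares step of model-based orbit diagnostics.

            jacobian[i, k]: sensitivity of orbit reading i to error parameter k,
                            built from transfer matrices between ring locations
            """
            residual = np.asarray(measured) - np.asarray(predicted)
            params, *_ = np.linalg.lstsq(jacobian, residual, rcond=None)
            return params  # fitted displacements/angles or field errors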

  2. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  3. Therapeutic ERCP

    MedlinePlus

    ... the recovery unit. You should not drive a car for the rest of the day although most patients can return to full activity the next day. What are possible complications of therapeutic ERCP? The overall ERCP complication rate requiring hospitalization ...

  4. Therapeutic Recreation

    ERIC Educational Resources Information Center

    Parks and Recreation, 1971

    1971-01-01

    Graphic profiles of (1) the professional membership of the National Therapeutic Recreation Society, (2) state-level employment opportunities in the field, and (3) educational opportunities at U.S. colleges and universities. (MB)

  5. Jitter Correction

    NASA Technical Reports Server (NTRS)

    Waegell, Mordecai J.; Palacios, David M.

    2011-01-01

    Jitter_Correct.m is a MATLAB function that automatically measures and corrects inter-frame jitter in an image sequence to a user-specified precision. In addition, the algorithm dynamically adjusts the image sample size to increase the accuracy of the measurement. The Jitter_Correct.m function takes an image sequence with unknown frame-to-frame jitter and computes the translations of each frame (column and row, in pixels) relative to a chosen reference frame with sub-pixel accuracy. The translations are measured using a cross-correlation Fourier transform method in which the relative phase of the two transformed images is fit to a plane. The measured translations are then used to correct the inter-frame jitter of the image sequence. The function also dynamically expands the image sample size over which the cross-correlation is measured to increase the accuracy of the measurement. This increases the robustness of the measurement to variable magnitudes of inter-frame jitter.
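
    The integer-pixel core of such a cross-correlation Fourier method is compact in any array language; a NumPy sketch follows. The abstract's sub-pixel step, fitting the relative phase of the two transforms to a plane, is omitted here, and the names are illustrative.

        import numpy as np

        def measure_shift(ref, frame):
            """(row, col) translation aligning `frame` to `ref`, to integer pixels,
            taken from the peak of the phase-correlation surface."""
            cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
            cross /= np.abs(cross) + 1e-12  # keep phase only
            corr = np.abs(np.fft.ifft2(cross))
            peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), float)
            dims = np.array(ref.shape, float)
            peak[peak > dims / 2] -= dims[peak > dims / 2]  # wrap to signed shifts
            return peak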

  6. Model-based phase-shifting interferometer

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A model-based phase-shifting interferometer (MPI) is developed, in which a novel calculation technique replaces the traditionally complicated system structure to achieve versatile, high-precision, quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With a set of alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing construction (ROR) method is employed for the retrace error correction of the non-null test, as well as figure error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical surface tests are presented to validate the flexibility and accuracy. The test results are compared with those of a ZYGO interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI holds great potential for modern optical shop testing.

  7. Speech Correction in the Schools.

    ERIC Educational Resources Information Center

    Eisenson, Jon; Ogilvie, Mardel

    An introduction to the problems and therapeutic needs of school age children whose speech requires remedial attention, the text is intended for both the classroom teacher and the speech correctionist. General considerations include classification and incidence of speech defects, speech correction services, the teacher as a speaker, the mechanism…

  8. Model-based Utility Functions

    NASA Astrophysics Data System (ADS)

    Hibbard, Bill

    2012-05-01

    Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment, so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that, if provided with the possibility of modifying their utility functions, agents will not choose to do so, under some usual assumptions.
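
    The two-step formulation is easy to state as code, which makes the separation explicit: utility is evaluated on the inferred model, never directly on the raw interaction history. A schematic sketch (all names illustrative):

        def model_based_utility(history, learn_model, utility_of_model):
            """Two-step utility: infer an environment model from the agent's
            interaction history, then score the model rather than the history;
            this is the structure argued to avoid self-delusion."""
            environment_model = learn_model(history)
            return utility_of_model(environment_model)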

  9. MACROMOLECULAR THERAPEUTICS

    PubMed Central

    Yang, Jiyuan; Kopeček, Jindřich

    2014-01-01

    This review covers water-soluble polymer-drug conjugates and macromolecules that possess biological activity without attached low molecular weight drugs. The main design principles of traditional and backbone degradable polymer-drug conjugates as well as the development of a new paradigm in nanomedicines – (low molecular weight) drug-free macromolecular therapeutics are discussed. To address the biological features of cancer, macromolecular therapeutics directed to stem/progenitor cells and the tumor microenvironment are deliberated. Finally, the future perspectives of the field are briefly debated. PMID:24747162

  10. Macromolecular therapeutics.

    PubMed

    Yang, Jiyuan; Kopeček, Jindřich

    2014-09-28

    This review covers water-soluble polymer-drug conjugates and macromolecules that possess biological activity without attached low molecular weight drugs. The main design principles of traditional and backbone degradable polymer-drug conjugates as well as the development of a new paradigm in nanomedicines - (low molecular weight) drug-free macromolecular therapeutics are discussed. To address the biological features of cancer, macromolecular therapeutics directed to stem/progenitor cells and the tumor microenvironment are deliberated. Finally, the future perspectives of the field are briefly debated.

  11. Therapeutic Nanodevices

    NASA Astrophysics Data System (ADS)

    Lee, Stephen C.; Ruegsegger, Mark; Barnes, Philip D.; Smith, Bryan R.; Ferrari, Mauro

    Therapeutic nanotechnology offers minimally invasive therapies with high densities of function concentrated in small volumes, features that may reduce patient morbidity and mortality. Unlike other areas of nanotechnology, novel physical properties associated with nanoscale dimensionality are not the raison d'être of therapeutic nanotechnology, whereas the aggregation of multiple biochemical (or comparably precise) functions into controlled nanoarchitectures is. Multifunctionality is a hallmark of emerging nanotherapeutic devices, and multifunctionality can allow nanotherapeutic devices to perform multi-step work processes, with each functional component contributing to one or more nanodevice subroutine such that, in aggregate, subroutines sum to a cogent work process. Canonical nanotherapeutic subroutines include tethering (targeting) to sites of disease, dispensing measured doses of drug (or bioactive compound), detection of residual disease after therapy and communication with an external clinician/operator. Emerging nanotherapeutics thus blur the boundaries between medical devices and traditional pharmaceuticals. Assembly of therapeutic nanodevices generally exploits either (bio)material self assembly properties or chemoselective bioconjugation techniques, or both. Given the complexity, composition, and the necessity for their tight chemical and structural definition inherent in the nature of nanotherapeutics, their cost of goods (COGs) might exceed that of (already expensive) biologics. Early therapeutic nanodevices will likely be applied to disease states which exhibit significant unmet patient need (cancer and cardiovascular disease), while application to other disease states well-served by conventional therapy may await perfection of nanotherapeutic design and assembly protocols.

  12. Therapeutic Nanodevices

    NASA Astrophysics Data System (ADS)

    Lee, Stephen; Ruegsegger, Mark; Barnes, Philip; Smith, Bryan; Ferrari, Mauro

    Therapeutic nanotechnology offers minimally invasive therapies with high densities of function concentrated in small volumes, features that may reduce patient morbidity and mortality. Unlike other areas of nanotechnology, novel physical properties associated with nanoscale dimensionality are not the raison d'être of therapeutic nanotechnology, whereas the aggregation of multiple biochemical (or comparably precise) functions into controlled nanoarchitectures is. Multifunctionality is a hallmark of emerging nanotherapeutic devices, and multifunctionality can allow nanotherapeutic devices to perform multistep work processes, with each functional component contributing to one or more nanodevice subroutine such that, in aggregate, subroutines sum to a cogent work process. Canonical nanotherapeutic subroutines include tethering (targeting) to sites of disease, dispensing measured doses of drug (or bioactive compound), detection of residual disease after therapy and communication with an external clinician/operator. Emerging nanotherapeutics thus blur the boundaries between medical devices and traditional pharmaceuticals. Assembly of therapeutic nanodevices generally exploits either (bio)material self-assembly properties or chemoselective bioconjugation techniques, or both. Given the complexity, composition, and the necessity for their tight chemical and structural definition inherent in the nature of nanotherapeutics, their cost of goods (COGs) might exceed that of (already expensive) biologics. Early therapeutic nanodevices will likely be applied to disease states which exhibit significant unmet patient need (cancer and cardiovascular disease), while application to other disease states well-served by conventional therapy may await perfection of nanotherapeutic design and assembly protocols.

  13. Feedlot therapeutics.

    PubMed

    Apley, M D; Fajt, V R

    1998-07-01

    This article discusses therapeutic approaches to conditions commonly encountered in feedlots. Challenges discussed include bovine respiratory complex, tracheal edema, atypical interstitial pneumonia, footrot, toe abscesses, mycoplasma arthritis, cardiovascular disease, lactic acidosis, bloat, coccidiosis, central nervous system diseases, abscesses and cellulitis, pregnancy management and abortion, and ocular disease.

  14. Therapeutic proteins.

    PubMed

    Dimitrov, Dimiter S

    2012-01-01

    Protein-based therapeutics are highly successful in the clinic and currently enjoy unprecedented recognition of their potential. More than 100 genuine and a similar number of modified therapeutic proteins are approved for clinical use in the European Union and the USA, with 2010 sales of US$108 billion; monoclonal antibodies (mAbs) accounted for almost half (48%) of the sales. Based on their pharmacological activity, they can be divided into five groups: (a) replacing a protein that is deficient or abnormal; (b) augmenting an existing pathway; (c) providing a novel function or activity; (d) interfering with a molecule or organism; and (e) delivering other compounds or proteins, such as a radionuclide, cytotoxic drug, or effector proteins. Therapeutic proteins can also be grouped based on their molecular types, which include antibody-based drugs, Fc fusion proteins, anticoagulants, blood factors, bone morphogenetic proteins, engineered protein scaffolds, enzymes, growth factors, hormones, interferons, interleukins, and thrombolytics. They can also be classified based on their molecular mechanism of activity as (a) binding non-covalently to a target, e.g., mAbs; (b) affecting covalent bonds, e.g., enzymes; and (c) exerting activity without specific interactions, e.g., serum albumin. Most protein therapeutics currently on the market are recombinant, and hundreds of them are in clinical trials for therapy of cancers, immune disorders, infections, and other diseases. New engineered proteins, including bispecific mAbs and multispecific fusion proteins, mAbs conjugated with small-molecule drugs, and proteins with optimized pharmacokinetics, are currently under development. However, in the last several decades there have been no conceptually new methodological developments comparable, e.g., to the genetic engineering that led to recombinant therapeutic proteins. It appears that a paradigm change in methodologies and understanding of mechanisms is needed to overcome major

  15. Platelet-delivered therapeutics.

    PubMed

    Lyde, R; Sabatino, D; Sullivan, S K; Poncz, M

    2015-06-01

    We have proposed that modified platelets could potentially be used to correct intrinsic platelet defects as well as for targeted delivery of therapeutic molecules to sites of vascular injury. Ectopic expression of proteins within α-granules prior to platelet activation has been achieved for several proteins, including urokinase, factor (F) VIII, and partially for FIX. Potential uses of platelet-directed therapeutics will be discussed, focusing on targeted delivery of urokinase as a thromboprophylactic agent and FVIII for the treatment of hemophilia A patients with intractable inhibitors. This presentation will discuss new strategies that may be useful in the care of patients with vascular injury, as well as remaining challenges and limitations of these approaches.

  16. A CORRECTION.

    PubMed

    Johnson, D

    1940-03-22

    In a recently published volume on "The Origin of Submarine Canyons" the writer inadvertently credited to A. C. Veatch an excerpt from a submarine chart actually contoured by P. A. Smith, of the U. S. Coast and Geodetic Survey. The chart in question is Chart IVB of Special Paper No. 7 of the Geological Society of America, entitled "Atlantic Submarine Valleys of the United States and the Congo Submarine Valley," by A. C. Veatch and P. A. Smith, and the excerpt appears as Plate III of the volume first cited above. In view of the heavy labor involved in contouring the charts accompanying the paper by Veatch and Smith and the beauty of the finished product, it would be unfair to Mr. Smith to permit the error to go uncorrected. Excerpts from two other charts are correctly ascribed to Dr. Veatch.

  17. Fast model-based estimation of ancestry in unrelated individuals

    PubMed Central

    Alexander, David H.; Novembre, John; Lange, Kenneth

    2009-01-01

    Population stratification has long been recognized as a confounding factor in genetic association studies. Estimated ancestries, derived from multi-locus genotype data, can be used to perform a statistical correction for population stratification. One popular technique for estimation of ancestry is the model-based approach embodied by the widely applied program structure. Another approach, implemented in the program EIGENSTRAT, relies on Principal Component Analysis rather than model-based estimation and does not directly deliver admixture fractions. EIGENSTRAT has gained in popularity in part owing to its remarkable speed in comparison to structure. We present a new algorithm and a program, ADMIXTURE, for model-based estimation of ancestry in unrelated individuals. ADMIXTURE adopts the likelihood model embedded in structure. However, ADMIXTURE runs considerably faster, solving problems in minutes that take structure hours. In many of our experiments, we have found that ADMIXTURE is almost as fast as EIGENSTRAT. The runtime improvements of ADMIXTURE rely on a fast block relaxation scheme using sequential quadratic programming for block updates, coupled with a novel quasi-Newton acceleration of convergence. Our algorithm also runs faster and with greater accuracy than the implementation of an Expectation-Maximization (EM) algorithm incorporated in the program FRAPPE. Our simulations show that ADMIXTURE's maximum likelihood estimates of the underlying admixture coefficients and ancestral allele frequencies are as accurate as structure's Bayesian estimates. On real-world data sets, ADMIXTURE's estimates are directly comparable to those from structure and EIGENSTRAT. Taken together, our results show that ADMIXTURE's computational speed opens up the possibility of using a much larger set of markers in model-based ancestry estimation and that its estimates are suitable for use in correcting for population stratification in association studies. PMID:19648217
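
    To make the likelihood model concrete, the sketch below evaluates the binomial admixture log-likelihood that structure and ADMIXTURE both maximize: each genotype g_ij (a 0/1/2 minor-allele count) is binomial with success probability sum_k q_ik f_kj. This is a minimal illustration of the model, not code from either program; the array names and toy dimensions are our own assumptions.

      import numpy as np

      def admixture_loglik(G, Q, F):
          """Binomial log-likelihood of the admixture model.

          G : (I, J) genotypes, entries in {0, 1, 2} (minor-allele counts).
          Q : (I, K) admixture fractions; each row sums to 1.
          F : (K, J) ancestral minor-allele frequencies in (0, 1).
          """
          P = np.clip(Q @ F, 1e-12, 1 - 1e-12)   # expected allele frequencies
          return np.sum(G * np.log(P) + (2 - G) * np.log(1 - P))

      # Toy example: 4 individuals, 6 SNPs, 2 ancestral populations.
      rng = np.random.default_rng(0)
      Q = rng.dirichlet([1.0, 1.0], size=4)
      F = rng.uniform(0.05, 0.95, size=(2, 6))
      G = rng.binomial(2, Q @ F)                 # simulated genotypes
      print(admixture_loglik(G, Q, F))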

  19. 77 FR 72199 - Technical Corrections; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    ... COMMISSION 10 CFR Part 171 RIN 3150-AJ16 Technical Corrections; Correction AGENCY: Nuclear Regulatory... corrections, including updating the street address for the Region I office, correcting authority citations and... rule. DATES: The correction is effective on December 5, 2012. FOR FURTHER INFORMATION CONTACT:...

  20. Hot blast stove process model and model-based controller

    SciTech Connect

    Muske, K.R.; Howse, J.W.; Hansen, G.A.; Cagliostro, D.J.; Chaubal, P.C.

    1998-12-31

    This paper describes the process model and model-based control techniques implemented on the hot blast stoves for the No. 7 Blast Furnace at the Inland Steel facility in East Chicago, Indiana. A detailed heat transfer model of the stoves is developed and verified using plant data. This model is used as part of a predictive control scheme to determine the minimum amount of fuel necessary to achieve the blast air requirements. The model is also used to predict maximum and minimum temperature constraint violations within the stove so that the controller can take corrective actions while still achieving the required stove performance.

  1. Model-based fault detection and diagnosis in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Carrasco, Rodrigo A.

    2016-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) observatory, with its 66 individual telescopes and other central equipment, generates a massive set of monitoring data every day, collecting information on the performance of a variety of critical and complex electrical, electronic and mechanical components. This data is crucial for most troubleshooting efforts performed by engineering teams. More than 5 years of accumulated data and expertise allow for a more systematic approach to fault detection and diagnosis. This paper presents model-based fault detection and diagnosis techniques to support corrective and predictive maintenance in a 24/7 minimum-downtime observatory.
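
    The abstract does not specify ALMA's detection algorithms, but a common model-based pattern is residual thresholding: compare each monitored signal against its model prediction and flag excursions. The sketch below is a generic, hypothetical example of that pattern; the threshold rule and robust scale estimate are our assumptions, not the observatory's pipeline.

      import numpy as np

      def detect_faults(measured, predicted, k=3.0):
          """Flag samples whose model residual exceeds k robust sigmas.

          measured, predicted : equal-length 1-D arrays (sensor vs. model).
          Returns a boolean mask of suspected fault samples.
          """
          r = measured - predicted
          med = np.median(r)
          mad = np.median(np.abs(r - med))
          sigma = 1.4826 * mad if mad > 0 else r.std()   # robust scale
          return np.abs(r - med) > k * sigma

      # Toy usage: a temperature channel against a constant model prediction.
      model = np.full(200, 20.0)
      sensor = model + np.random.default_rng(1).normal(0.0, 0.1, 200)
      sensor[150:] += 2.0                                # injected fault
      print(np.where(detect_faults(sensor, model))[0][:5])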

  2. 78 FR 75449 - Miscellaneous Corrections; Corrections

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ..., 50, 52, and 70 RIN 3150-AJ23 Miscellaneous Corrections; Corrections AGENCY: Nuclear Regulatory... final rule in the Federal Register on June 7, 2013, to make miscellaneous corrections to its regulations... miscellaneous corrections to its regulations in chapter I of Title 10 of the Code of Federal Regulations (10...

  3. Study of model based etch bias retarget for OPC

    NASA Astrophysics Data System (ADS)

    Liu, Qingwei; Cheng, Renqiang; Zhang, Liguo

    2010-04-01

    Model-based optical proximity correction (OPC) is usually used to compensate for pattern distortion during the microlithography process. Currently, almost all lithography effects, such as proximity effects from the limited NA, 3D mask effects due to the shrinking critical dimension, photoresist effects, and other well-known physical processes, can be incorporated into the model used by the OPC algorithm. However, microlithography is not the final step in transferring the pattern from the mask to the wafer; the etch process is also a very important stage, and to date it still cannot be fully explained by physical theory. The final critical dimension (CD) is determined by both the lithography and the etch process. If the etch bias, defined as the difference between the post-development CD and the post-etch CD, were constant, controlling the final CD would be simple, but unfortunately this is not always the case. For advanced technology nodes with shrinking critical dimensions, the etch loading effect is the dominant factor affecting final CD control. Etch-based models have been used for optical proximity correction, but one drawback is that OPC runtime efficiency suffers. In this paper, we demonstrate our study of model-based etch bias retargeting for OPC.
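
    The retargeting idea can be summarized in a few lines: before running OPC, shift each post-etch CD target by a modeled etch bias, so the OPC engine corrects toward the post-development contour that will etch to the desired final CD. The sketch below is a hypothetical illustration; the linear density-dependent bias model is an assumption, not the paper's calibrated model.

      def retarget_for_etch(final_cd_nm, local_density, bias_model):
          """Convert a post-etch CD target into a post-develop (resist) target.

          OPC then corrects toward the retargeted CD so that, after etch,
          the wafer lands on the desired final CD.

          bias_model : callable mapping local pattern density in [0, 1] to
                       etch bias (post-develop CD minus post-etch CD) in nm.
          """
          return final_cd_nm + bias_model(local_density)

      def etch_bias_nm(density):
          # Hypothetical empirical model: denser regions see more bias.
          return 2.0 + 6.0 * density

      for cd, rho in [(45.0, 0.2), (45.0, 0.7)]:
          print(f"final CD {cd} nm at density {rho} -> resist target "
                f"{retarget_for_etch(cd, rho, etch_bias_nm):.1f} nm")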

  4. Kitaev models based on unitary quantum groupoids

    SciTech Connect

    Chang, Liang

    2014-04-15

    We establish a generalization of Kitaev models based on unitary quantum groupoids. In particular, when inputting a Kitaev-Kong quantum groupoid H{sub C}, we show that the ground state manifold of the generalized model is canonically isomorphic to that of the Levin-Wen model based on a unitary fusion category C. Therefore, the generalized Kitaev models provide realizations of the target space of the Turaev-Viro topological quantum field theory based on C.

  6. Model Based Filtered Backprojection Algorithm: A Tutorial

    PubMed Central

    2014-01-01

    Purpose: People have long wondered whether a filtered backprojection (FBP) algorithm is able to incorporate measurement noise in image reconstruction. The purpose of this tutorial is to develop such an FBP algorithm that is able to minimize an objective function with an embedded noise model. Methods: An objective function is first set up to model measurement noise and to enforce some constraints so that the resultant image has some pre-specified properties. An iterative algorithm is used to minimize the objective function, and then the result of the iterative algorithm is converted into the Fourier domain, which in turn leads to an FBP algorithm. The model based FBP algorithm is almost the same as the conventional FBP algorithm, except for the filtering step. Results: The model based FBP algorithm has been applied to low-dose x-ray CT, nuclear medicine, and real-time MRI applications. Compared with the conventional FBP algorithm, the model based FBP algorithm is more effective in reducing noise. Even though an iterative algorithm can achieve the same noise-reducing performance, the model based FBP algorithm is much more computationally efficient. Conclusions: The model based FBP algorithm is an efficient and effective image reconstruction tool. In many applications, it can replace the state-of-the-art iterative algorithms, which usually have a heavy computational cost. The model based FBP algorithm is linear and it has advantages over a nonlinear iterative algorithm in parametric image reconstruction and noise analysis. PMID:25574421
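
    As the tutorial notes, a model-based FBP differs from conventional FBP only in the filtering step. The sketch below shows one illustrative way such a modified filter can look: a ramp filter rolled off by a noise-penalty term. The specific beta-weighted form is our assumption for illustration, not the filter derived in the tutorial.

      import numpy as np

      def regularized_ramp(n, beta=0.05):
          """Noise-penalized ramp filter; beta=0 recovers the conventional ramp."""
          w = np.abs(np.fft.fftfreq(n))                  # normalized |frequency|
          return w / (1.0 + beta * (w / w.max()) ** 2)   # roll off high freqs

      def filter_projection(row, beta=0.05):
          """Apply the modified filter to one sinogram row (detector samples)."""
          H = regularized_ramp(row.size, beta)
          return np.real(np.fft.ifft(np.fft.fft(row) * H))

      row = np.hanning(256) + np.random.default_rng(2).normal(0.0, 0.1, 256)
      print(filter_projection(row).shape)                # (256,) filtered row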

  7. Model-Based Prognostics of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhury, Indranil; Bregon, Anibal

    2015-01-01

    Model-based prognostics has become a popular approach to solving the prognostics problem. However, almost all work has focused on prognostics of systems with continuous dynamics. In this paper, we extend the model-based prognostics framework to hybrid systems models that combine both continuous and discrete dynamics. In general, most systems are hybrid in nature, including those that combine physical processes with software. We generalize the model-based prognostics formulation to hybrid systems, and describe the challenges involved. We present a general approach for modeling hybrid systems, and overview methods for solving estimation and prediction in hybrid systems. As a case study, we consider the problem of conflict (i.e., loss of separation) prediction in the National Airspace System, in which the aircraft models are hybrid dynamical systems.

  8. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  9. Model-based internal wave processing

    SciTech Connect

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane-wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Väisälä frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, model-based solutions to the signal enhancement problem for internal waves are investigated.

  10. Multimode model based defect characterization in composites

    NASA Astrophysics Data System (ADS)

    Roberts, R.; Holland, S.; Gregory, E.

    2016-02-01

    A newly-initiated research program for model-based defect characterization in CFRP composites is summarized. The work utilizes computational models of the interaction of NDE probing energy fields (ultrasound and thermography), to determine 1) the measured signal dependence on material and defect properties (forward problem), and 2) an assessment of performance-critical defect properties from analysis of measured NDE signals (inverse problem). Work is reported on model implementation for inspection of CFRP laminates containing delamination and porosity. Forward predictions of measurement response are presented, as well as examples of model-based inversion of measured data for the estimation of defect parameters.

  11. 77 FR 2435 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-18

    ...- Free Treatment Under the Generalized System of Preferences and for Other Purposes Correction In... following correction: On page 407, the date following the proclamation number should read ``December...

  12. 78 FR 2193 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-10

    ... United States-Panama Trade Promotion Agreement and for Other Purposes Correction In Presidential document... correction: On page 66507, the proclamation identification heading on line one should read...

  13. Model-Based Systems Engineering Approach to Managing Mass Margin

    NASA Technical Reports Server (NTRS)

    Chung, Seung H.; Bayer, Todd J.; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Christopher; Lam, Doris

    2012-01-01

    When designing a flight system from concept through implementation, one of the fundamental systems engineering tasks is managing the mass margin and a mass equipment list (MEL) of the flight system. While generating a MEL and computing a mass margin is conceptually a trivial task, maintaining consistent and correct MELs and mass margins can be challenging due to the current practices of maintaining duplicate information in various forms, such as diagrams and tables, and in various media, such as files and emails. We have overcome this challenge through a model-based systems engineering (MBSE) approach within which we allow only a single source of truth. In this paper we describe the modeling patterns used to capture the single source of truth and the views that have been developed for the Europa Habitability Mission (EHM) project, a mission concept study, at the Jet Propulsion Laboratory (JPL).
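
    For readers unfamiliar with MELs, the sketch below shows the arithmetic a single-source-of-truth model must keep consistent: each entry carries a current best estimate (CBE) and a contingency, and the margin compares the summed maximum expected values (MEV) against the allocation. The field names and margin convention are common practice assumed here for illustration, not taken from the EHM models.

      from dataclasses import dataclass

      @dataclass
      class MelEntry:
          name: str
          cbe_kg: float          # current best estimate mass
          contingency: float     # growth allowance, e.g. 0.30 = 30%

          @property
          def mev_kg(self):      # maximum expected value
              return self.cbe_kg * (1.0 + self.contingency)

      def mass_margin(mel, allocation_kg):
          """Margin = (allocation - total MEV mass) / allocation."""
          return (allocation_kg - sum(e.mev_kg for e in mel)) / allocation_kg

      mel = [MelEntry("structure", 120.0, 0.30), MelEntry("avionics", 45.0, 0.25)]
      print(f"mass margin: {mass_margin(mel, 250.0):.1%}")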

  15. Reducing Centroid Error Through Model-Based Noise Reduction

    NASA Technical Reports Server (NTRS)

    Lee, Shinhak

    2006-01-01

    A method of processing the digitized output of a charge-coupled device (CCD) image detector has been devised to enable reduction of the error in computed centroid of the image of a point source of light. The method involves model-based estimation of, and correction for, the contributions of bias and noise to the image data. The method could be used to advantage in any of a variety of applications in which there are requirements for measuring precise locations of, and/or precisely aiming optical instruments toward, point light sources. In the present method, prior to normal operations of the CCD, one measures the point-spread function (PSF) of the telescope or other optical system used to project images on the CCD. The PSF is used to construct a database of spot models representing the nominal CCD pixel outputs for a point light source projected onto the CCD at various positions incremented by small fractions of a pixel.
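
    A simplified version of the idea, omitting the PSF spot-model database, is to subtract the modeled bias and zero out pixels consistent with noise before computing the centroid, so that bias and noise cannot drag the estimate. The sketch below illustrates only this thresholded-centroid step; the parameter values are illustrative assumptions.

      import numpy as np

      def corrected_centroid(img, bias, noise_sigma, k=1.0):
          """Centroid after bias subtraction and noise thresholding.

          img         : 2-D pixel array from the CCD window.
          bias        : modeled per-pixel bias (scalar or same shape).
          noise_sigma : read-noise standard deviation; pixels below
                        k*sigma after bias removal are zeroed so they
                        cannot drag the centroid estimate.
          """
          d = img.astype(float) - bias
          d[d < k * noise_sigma] = 0.0
          ys, xs = np.indices(d.shape)
          return (xs * d).sum() / d.sum(), (ys * d).sum() / d.sum()

      spot = np.zeros((9, 9)); spot[4, 4], spot[4, 5] = 10.0, 6.0
      print(corrected_centroid(spot + 0.5, bias=0.5, noise_sigma=0.2))
      # -> (4.375, 4.0): the faint tail at (4, 5) pulls x past the peak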

  16. Model-Based Inquiries in Chemistry

    ERIC Educational Resources Information Center

    Khan, Samia

    2007-01-01

    In this paper, instructional strategies for sustaining model-based inquiry in an undergraduate chemistry class were analyzed through data collected from classroom observations, a student survey, and in-depth problem-solving sessions with the instructor and students. Analysis of teacher-student interactions revealed a cyclical pattern in which…

  17. Sandboxes for Model-Based Inquiry

    ERIC Educational Resources Information Center

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-01-01

    In this article, we introduce a class of constructionist learning environments that we call "Emergent Systems Sandboxes" ("ESSs"), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual…

  20. Opinion dynamics model based on quantum formalism

    SciTech Connect

    Artawan, I. Nengah; Trisnawati, N. L. P.

    2016-03-11

    An opinion dynamics model based on quantum formalism is proposed. The core of the quantum formalism is a half-spin dynamics system. In this research the implicit time evolution operators are derived. The analogy between this model and the Deffuant and Sznajd models is discussed.

  1. Model-based Training of Situated Skills.

    ERIC Educational Resources Information Center

    Khan, Tariq M.; Brown, Keith

    2000-01-01

    Addresses areas of situated knowledge (metacognitive skills and affective skills) that have been ignored in intelligent computer-aided learning systems. Focuses on model-based reasoning, including contextualized and decontextualized knowledge, and examines an instructional method that supports situated knowledge by providing opportunities for…

  2. Model-based Processing of Microcantilever Sensor Arrays

    SciTech Connect

    Tringe, J W; Clague, D S; Candy, J V; Sinensky, A K; Lee, C L; Rudd, R E; Burnham, A K

    2005-04-27

    We have developed a model-based processor (MBP) for a microcantilever-array sensor to detect target species in solution. We perform a proof-of-concept experiment, fit model parameters to the measured data and use them to develop a Gauss-Markov simulation. We then investigate two cases of interest, averaged deflection data and multi-channel data. For this evaluation we extract model parameters via a model-based estimation, perform a Gauss-Markov simulation, design the optimal MBP and apply it to measured experimental data. The performance of the MBP in the multi-channel case is evaluated by comparison to a ''smoother'' (averager) typically used for microcantilever signal analysis. It is shown that the MBP not only provides a significant gain ({approx} 80dB) in signal-to-noise ratio (SNR), but also consistently outperforms the smoother by 40-60 dB. Finally, we apply the processor to the smoothed experimental data and demonstrate its capability for chemical detection. The MBP performs quite well, apart from a correctable systematic bias error.
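
    The abstract's Gauss-Markov formulation suggests a recursive estimator of the Kalman type; the sketch below shows a minimal scalar Kalman filter of that family applied to a simulated deflection signal. It is a generic illustration of model-based processing, not the MBP design used in the paper, and the noise variances are assumed values.

      import numpy as np

      def scalar_kalman(z, a=1.0, q=1e-3, r=1e-2, x0=0.0, p0=1.0):
          """Minimal scalar Kalman filter: x_k = a*x_{k-1} + w,  z_k = x_k + v.

          q and r are the (assumed) process and measurement noise variances
          of the Gauss-Markov model; returns the filtered state sequence.
          """
          x, p, out = x0, p0, []
          for zk in z:
              x, p = a * x, a * a * p + q              # predict
              k = p / (p + r)                          # Kalman gain
              x, p = x + k * (zk - x), (1.0 - k) * p   # update with zk
              out.append(x)
          return np.array(out)

      truth = np.linspace(0.0, 1.0, 100)               # slow deflection ramp
      meas = truth + np.random.default_rng(3).normal(0.0, 0.1, 100)
      est = scalar_kalman(meas)
      print(f"raw RMSE {np.sqrt(np.mean((meas - truth) ** 2)):.3f}, "
            f"filtered RMSE {np.sqrt(np.mean((est - truth) ** 2)):.3f}")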

  3. Model-based image processing using snakes and mutual information

    NASA Astrophysics Data System (ADS)

    von Klinski, Sebastian; Derz, Claus; Weese, David; Tolxdorff, Thomas

    2000-06-01

    Any segmentation approach assumes certain knowledge concerning data modalities, relevant organs and their imaging characteristics. These assumptions are necessary for developing criteria by which to separate the organ in question from the surrounding tissue. Typical assumptions are that the organs have homogeneous gray-value characteristics (region growing, region merging, etc.), specific gray-value patterns (classification methods), continuous edges (edge-based approaches), smooth and strong edges (snake approaches), or any combination of these. In most cases, such assumptions are invalid, at least locally. Consequently, these approaches prove to be time-consuming either in their parameterization or execution. Further, the low result quality makes post-processing necessary. Our aim was to develop a segmentation approach for large 3D data sets (e.g., CT and MRI) that requires a short interaction time and that can easily be adapted to different organs and data materials. This has been achieved by exploiting available knowledge about data material and organ topology using anatomical models that have been constructed from previously segmented data sets. In the first step, the user manually specifies the general context of the data material and specifies anatomical landmarks. Then this information is used to automatically select a corresponding reference model, which is geometrically adjusted to the current data set. In the third step, a model-based snake approach is applied to determine the correct segmentation of the organ in question. Analogously, this approach can be used for model-based interpolation and registration.

  4. [Tuberculosis in African children: epidemiologic, clinical and therapeutic aspects (corrected)].

    PubMed

    Anane, R

    2003-01-01

    In the 10 years since the World Health Assembly declared the re-emergence of tuberculosis a worldwide emergency, most countries have initiated control strategies based on the recommendations of the World Health Organization. Implementation of national control programs has not only been encouraged but has become a necessity in the face of the constantly increasing number of cases and the HIV epidemic. Diagnosis of tuberculosis in children is difficult, as is reliable estimation of its prevalence. The purpose of this study was to analyze diagnostic criteria in 1128 children in Algiers, Algeria. Short-course chemotherapy was also evaluated. A prospective study in 733 children showed that 6-month regimens administered in the framework of a national program are effective and led to few complications. Short-course treatment also promotes better patient compliance. Consideration was also given to prophylactic treatment for contact children and adverse reactions to BCG.

  5. Integrated Image Reconstruction and Gradient Nonlinearity Correction

    PubMed Central

    Tao, Shengzhen; Trzasko, Joshua D.; Shu, Yunhong; Huston, John; Bernstein, Matt A.

    2014-01-01

    Purpose To describe a model-based reconstruction strategy for routine magnetic resonance imaging (MRI) that accounts for gradient nonlinearity (GNL) during rather than after transformation to the image domain, and demonstrate that this approach reduces the spatial resolution loss that occurs during strictly image-domain GNL-correction. Methods After reviewing conventional GNL-correction methods, we propose a generic signal model for GNL-affected MRI acquisitions, discuss how it incorporates into contemporary image reconstruction platforms, and describe efficient non-uniform fast Fourier transform (NUFFT)-based computational routines for these. The impact of GNL-correction on spatial resolution by the conventional and proposed approaches is investigated on phantom data acquired at varying offsets from gradient isocenter, as well as on fully-sampled and (retrospectively) undersampled in vivo acquisitions. Results Phantom results demonstrate that resolution loss that occurs during GNL-correction is significantly less for the proposed strategy than for the standard approach at distances >10 cm from isocenter with a 35 cm FOV gradient coil. The in vivo results suggest that the proposed strategy better preserves fine anatomical detail than retrospective GNL-correction while offering comparable geometric correction. Conclusion Accounting for GNL during image reconstruction allows geometric distortion to be corrected with less spatial resolution loss than is typically observed with the conventional image domain correction strategy. PMID:25298258

  6. TPX correction coil studies

    SciTech Connect

    Hanson, J.D.

    1994-11-03

    Error correction coils are planned for the TPX (Tokamak Plasma Experiment) in order to avoid error field induced locked modes and disruption. The FT (Fix Tokamak) code is used to evaluate the ability of these correction coils to remove islands caused by symmetry breaking magnetic field errors. The proposed correction coils are capable of correcting a variety of error fields.

  7. [Spigelian hernia: clinical, diagnostic and therapeutical aspects].

    PubMed

    Versaci, A; Rossitto, M; Centorrino, T; Barbera, A; Fonti, M T; Broccio, M; Ciccolo, A

    1998-01-01

    The authors, reporting an observed case of Spigelian hernia, point out clinical, diagnostic, and therapeutic considerations regarding this rare pathology of the abdominal wall. They describe the anatomic characteristics of the region and note that diagnostic difficulties can be bypassed by the use of ultrasound and CT imaging to formulate a correct preoperative diagnosis. They confirm that surgical treatment through a correct access route does not differ from a standard hernioplasty and guarantees a good long-term surgical outcome.

  8. Model-based correction of diffraction effects of the virtual source element.

    PubMed

    Wennerström, Erik; Stepinski, Tadeusz

    2007-08-01

    A method for ultrasonic synthetic aperture imaging using finite-sized transducers is introduced that is based on a virtual source (VS) concept. In this setup, a focused transducer creates a VS element at its focal point that facilitates the use of the synthetic aperture focusing technique (SAFT). It is shown that the performance of the VS method may be unsatisfactory due to the distortion introduced by the diffraction effects of the aperture used for creating the VS element. A solution to this problem is proposed that consists of replacing the classical SAFT by the extended synthetic aperture focusing technique (ESAFT) algorithm presented in our earlier works. In ESAFT, the full geometry of the VS is modeled, instead of applying the simplified point-source approximation used when VS is combined with classical SAFT. The proposed method yields a substantial improvement in spatial resolution compared to that obtained using SAFT. Performance of the proposed algorithm is first demonstrated on simulated data, then verified on real data acquired with an array system.
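
    The point-source approximation that the paper improves upon can be written compactly as a delay-and-sum over virtual sources: each image pixel accumulates A-scan samples at the two-way travel time from the VS, offset by the transducer-to-focus time. The sketch below implements only this classical VS-SAFT baseline, with assumed argument conventions; modeling the full VS geometry, as ESAFT does, is beyond a few lines.

      import numpy as np

      def vs_saft(rf, vs_xy, fs, c, grid_x, grid_z, t_focus):
          """Point-source virtual-source SAFT (delay-and-sum) in 2-D.

          rf      : (n_positions, n_samples) pulse-echo A-scans.
          vs_xy   : (n_positions, 2) virtual-source (focal point) positions [m].
          fs, c   : sampling rate [Hz] and sound speed [m/s].
          t_focus : transducer-to-focus travel time added back to each
                    two-way virtual-source delay.
          """
          img = np.zeros((grid_z.size, grid_x.size))
          for (vx, vz), ascan in zip(vs_xy, rf):
              dx = grid_x[None, :] - vx
              dz = grid_z[:, None] - vz
              t = 2.0 * np.hypot(dx, dz) / c + t_focus       # two-way delay
              idx = np.clip((t * fs).astype(int), 0, ascan.size - 1)
              img += ascan[idx]                              # coherent sum
          return img

      # Toy usage: one scan position with an impulse echo at sample 100.
      rf = np.zeros((1, 512)); rf[0, 100] = 1.0
      img = vs_saft(rf, np.array([[0.0, 0.01]]), fs=50e6, c=1540.0,
                    grid_x=np.linspace(-5e-3, 5e-3, 64),
                    grid_z=np.linspace(12e-3, 30e-3, 64), t_focus=0.0)
      print(img.shape)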

  9. Technical Note: A model-based sinogram correction for beam hardening artifact reduction in CT.

    PubMed

    Lee, Sung Min; Seo, Jin Keun; Chung, Yong Eun; Baek, Jongduk; Park, Hyoung Suk

    2017-09-01

    This study aims to propose a physics-based method of reducing beam-hardening artifacts induced by high-attenuation materials such as metal stents or other metallic implants. The proposed approach consists of deriving a sinogram inconsistency formula representing the energy dependence of the attenuation coefficient of high-attenuation materials. This inconsistency formula represents the inconsistencies of the sinogram more accurately than a previously reported formula (the MAC-BC method), by considering the properties of the high-attenuation materials, including their shapes and locations and their effects on the incident X-ray spectrum and attenuation coefficients. Numerical simulations and a phantom experiment demonstrate that the modeling errors of the MAC-BC method are nearly completely removed by the proposed method. The proposed method reduces beam-hardening artifacts arising from high-attenuation materials by relaxing the assumptions of the MAC-BC method and, in doing so, outperforms the original MAC-BC method. Further research is required to address other potential sources of metal artifacts, such as photon starvation, scattering, and noise. © 2017 American Association of Physicists in Medicine.
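
    The simplest instance of sinogram-domain beam-hardening correction, which underlies more elaborate schemes like the one proposed here, is single-material linearization: simulate polychromatic projections over a range of path lengths, then fit a polynomial that maps measured values back to ideal monochromatic line integrals. The sketch below demonstrates that baseline with an invented 3-bin spectrum; it is not the paper's metal-specific inconsistency formula.

      import numpy as np

      def poly_projection(thickness, spectrum, mu):
          """-ln(I/I0) for a polychromatic beam through one material."""
          I = np.sum(spectrum * np.exp(-mu * thickness))
          return -np.log(I / spectrum.sum())

      # Hypothetical 3-bin spectrum with energy-dependent attenuation [1/cm].
      spectrum = np.array([0.3, 0.5, 0.2])
      mu = np.array([0.35, 0.22, 0.18])

      # Calibration: relate measured projections to ideal monochromatic ones.
      L = np.linspace(0.0, 20.0, 50)                   # path lengths [cm]
      p = np.array([poly_projection(l, spectrum, mu) for l in L])
      mu_eff = np.sum(spectrum * mu) / spectrum.sum()  # reference attenuation
      coeffs = np.polyfit(p, mu_eff * L, 3)            # correction polynomial

      p_lin = np.polyval(coeffs, p)                    # linearized sinogram values
      print(f"max residual: {np.max(np.abs(p_lin - mu_eff * L)):.2e}")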

  10. Mitochondrial diseases: therapeutic approaches.

    PubMed

    DiMauro, Salvatore; Mancuso, Michelangelo

    2007-06-01

    Therapy of mitochondrial encephalomyopathies (defined restrictively as defects of the mitochondrial respiratory chain) is woefully inadequate, despite great progress in our understanding of the molecular bases of these disorders. In this review, we consider sequentially several different therapeutic approaches. Palliative therapy is dictated by good medical practice and includes anticonvulsant medication, control of endocrine dysfunction, and surgical procedures. Removal of noxious metabolites is centered on combating lactic acidosis, but extends to other metabolites. Attempts to bypass blocks in the respiratory chain by administration of electron acceptors have not been successful, but this may be amenable to genetic engineering. Administration of metabolites and cofactors is the mainstay of real-life therapy and is especially important in disorders due to primary deficiencies of specific compounds, such as carnitine or coenzyme Q10. There is increasing interest in the administration of reactive oxygen species scavengers both in primary mitochondrial diseases and in neurodegenerative diseases directly or indirectly related to mitochondrial dysfunction. Aerobic exercise and physical therapy prevent or correct deconditioning and improve exercise tolerance in patients with mitochondrial myopathies due to mitochondrial DNA (mtDNA) mutations. Gene therapy is a challenge because of polyplasmy and heteroplasmy, but interesting experimental approaches are being pursued and include, for example, decreasing the ratio of mutant to wild-type mitochondrial genomes (gene shifting), converting mutated mtDNA genes into normal nuclear DNA genes (allotopic expression), importing cognate genes from other species, or correcting mtDNA mutations with specific restriction endonucleases. Germline therapy raises ethical problems but is being considered for prevention of maternal transmission of mtDNA mutations. Preventive therapy through genetic counseling and prenatal diagnosis is

  11. Efficient Model-Based Diagnosis Engine

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Vatan, Farrokh; Barrett, Anthony; James, Mark; Mackey, Ryan; Williams, Colin

    2009-01-01

    An efficient diagnosis engine - a combination of mathematical models and algorithms - has been developed for identifying faulty components in a possibly complex engineering system. This model-based diagnosis engine embodies a twofold approach to reducing, relative to prior model-based diagnosis engines, the amount of computation needed to perform a thorough, accurate diagnosis. The first part of the approach involves a reconstruction of the general diagnostic engine to reduce the complexity of the mathematical-model calculations and of the software needed to perform them. The second part of the approach involves algorithms for computing a minimal diagnosis (the term "minimal diagnosis" is defined below). A somewhat lengthy background discussion is prerequisite to a meaningful summary of the innovative aspects of the present efficient model-based diagnosis engine. In model-based diagnosis, the function of each component and the relationships among all the components of the engineering system to be diagnosed are represented as a logical system denoted the system description (SD). Hence, the expected normal behavior of the engineering system is the set of logical consequences of the SD. Faulty components lead to inconsistencies between the observed behaviors of the system and the SD (see figure). Diagnosis - the task of finding faulty components - is reduced to finding those components, the abnormalities of which could explain all the inconsistencies. The solution of the diagnosis problem should be a minimal diagnosis, which is a minimal set of faulty components. A minimal diagnosis stands in contradistinction to the trivial solution, in which all components are deemed to be faulty, and which, therefore, always explains all inconsistencies.
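
    The notion of a minimal diagnosis has a compact combinatorial reading: given the conflict sets implied by the observations, a diagnosis is a hitting set of those conflicts, and a minimal diagnosis is an inclusion-minimal one. The brute-force sketch below illustrates that definition on a toy example; the efficient engine described above uses far better algorithms than this exponential enumeration.

      from itertools import combinations

      def minimal_diagnoses(conflicts):
          """Brute-force minimal diagnoses from conflict sets.

          conflicts : list of sets; each set names components whose joint
                      normality is inconsistent with the observations.
          A diagnosis must intersect every conflict (a hitting set); only
          inclusion-minimal hitting sets are returned.
          """
          comps = sorted(set().union(*conflicts))
          hits = [set(c) for r in range(1, len(comps) + 1)
                  for c in combinations(comps, r)
                  if all(set(c) & k for k in conflicts)]
          return [h for h in hits if not any(o < h for o in hits)]

      # Two conflicts over components A, B, C.
      print(minimal_diagnoses([{"A", "B"}, {"B", "C"}]))  # [{'B'}, {'A', 'C'}]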

  12. Systems Engineering Interfaces: A Model Based Approach

    NASA Technical Reports Server (NTRS)

    Fosse, Elyse; Delp, Christopher

    2013-01-01

    Currently: Ops Rev developed and maintains a framework that includes interface-specific language, patterns, and Viewpoints. Ops Rev implements the framework to design MOS 2.0 and its 5 Mission Services. Implementation de-couples interfaces and instances of interaction. Future: A Mission MOSE implements the approach and uses the model-based artifacts for reviews. The framework extends further into the ground data layers and provides a unified methodology.

  14. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems, the software engineering process has to be improved in at least two aspects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g., state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.

  15. Model-based control of mechanical ventilation: design and clinical validation.

    PubMed

    Martinoni, E P; Pfister, Ch A; Stadler, K S; Schumacher, P M; Leibundgut, D; Bouillon, T; Böhlen, T; Zbinden, A M

    2004-06-01

    We developed a model-based control system using the end-tidal carbon dioxide fraction (FE'CO2) to adjust a ventilator during clinical anaesthesia. We studied 16 ASA I-II patients (mean age 38 (range 20-59) yr; weight 67 (54-87) kg) during i.v. anaesthesia for elective surgery. After periods of normal ventilation the patients were either hyper- or hypoventilated to assess the precision and dynamic behaviour of the control system. These data were compared with a previous group where a fuzzy-logic controller had been used. Responses to different clinical events (invalid carbon dioxide measurement, limb tourniquet release, tube cuff leak, exhaustion of carbon dioxide absorbent, simulation of pulmonary embolism) were also noted. The model-based controller correctly maintained the setpoint. No significant difference was found for the static performance between the two controllers. The dynamic response of the model-based controller was more rapid (P<0.05). The mean rise time after a setpoint increase of 1 vol% was 313 (sd 90) s and 142 (17) s for fuzzy-logic and model-based control, respectively, and after a 1 vol% decrease was 355 (127) s and 177 (36) s, respectively. The new model-based controller had a consistent response to clinical artefacts. A model-based FE'CO2 controller can be used in a clinical setting. It reacts appropriately to artefacts, and has a better dynamic response to setpoint changes than a previously described fuzzy-logic controller.
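
    A toy version of such a controller inverts a one-compartment steady-state model: the end-tidal CO2 fraction is roughly CO2 output divided by alveolar ventilation, so the current measurement yields a CO2-output estimate from which the ventilation needed for the setpoint follows. The sketch below is a deliberately simplified illustration with an assumed fixed dead-space fraction, not the published controller.

      def minute_ventilation_for_target(fe_co2_meas, mv_current, fe_co2_target,
                                        dead_space_fraction=0.3):
          """Invert a one-compartment CO2 model for a new end-tidal target.

          Steady state: FE'CO2 ~ VCO2 / alveolar ventilation, so current
          readings yield a CO2-output estimate, and the alveolar
          ventilation is rescaled for the new target.
          """
          va_current = mv_current * (1.0 - dead_space_fraction)
          vco2_est = fe_co2_meas * va_current      # inferred CO2 elimination
          va_needed = vco2_est / fe_co2_target
          return va_needed / (1.0 - dead_space_fraction)

      # Measured 5.4 vol% at 5 L/min; aim for 4.5 vol%.
      print(f"{minute_ventilation_for_target(0.054, 5.0, 0.045):.2f} L/min")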

  16. TU-G-210-02: TRANS-FUSIMO - An Integrative Approach to Model-Based Treatment Planning of Liver FUS

    SciTech Connect

    Preusser, T.

    2015-06-15

    Modeling can play a vital role in predicting, optimizing and analyzing the results of therapeutic ultrasound treatments. Simulating the propagating acoustic beam in various targeted regions of the body allows for the prediction of the resulting power deposition and temperature profiles. In this session we will apply various modeling approaches to breast, abdominal organ and brain treatments. Of particular interest is the effectiveness of procedures for correcting for phase aberrations caused by intervening irregular tissues, such as the skull in transcranial applications or inhomogeneous breast tissues. Also described are methods to compensate for motion in targeted abdominal organs such as the liver or kidney. Douglas Christensen – Modeling for Breast and Brain HIFU Treatment Planning; Tobias Preusser – TRANS-FUSIMO – An Integrative Approach to Model-Based Treatment Planning of Liver FUS. Learning Objectives: Understand the role of acoustic beam modeling for predicting the effectiveness of therapeutic ultrasound treatments. Apply acoustic modeling to specific breast, liver, kidney and transcranial anatomies. Determine how to obtain appropriate acoustic modeling parameters from clinical images. Understand the separate role of absorption and scattering in energy delivery to tissues. See how organ motion can be compensated for in ultrasound therapies. Compare simulated data with clinical temperature measurements in transcranial applications. Supported by NIH R01 HL172787 and R01 EB013433 (DC); EU Seventh Framework Programme (FP7/2007-2013) under 270186 (FUSIMO) and 611889 (TRANS-FUSIMO) (TP); and P01 CA159992, GE, FUSF and InSightec (UV)

  17. 75 FR 18747 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... Day: A National Day of Celebration of Greek and American Democracy, 2010 Correction In Presidential... correction: On page 15601, the first line of the heading should read ``Proclamation 8485 of March 24,...

  18. 77 FR 45469 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-01

    ... Respect to the Former Liberian Regime of Charles Taylor Correction In Presidential document 2012-17703 beginning on page 42415 in the issue of Wednesday, July 18, 2012, make the following correction: On...

  19. 78 FR 7255 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-01

    ... Unobligated Funds Under the American Recovery and Reinvestment Act of 2009 Correction In Presidential document... correction: On page 70883, the document identification heading on line one should read ``Notice of...

  20. 75 FR 68413 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... Correction In Presidential document 2010-27676 beginning on page 67019 in the issue of Monday, November 1, 2010, make the following correction: On page 67019, the Presidential Determination number should...

  1. 75 FR 1013 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-08

    ... Correction In Presidential document E9-31418 beginning on page 707 in the issue of Tuesday, January 5, 2010, make the following correction: On page 731, the date line below the President's signature should...

  2. 75 FR 68409 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... Migration Needs Resulting From Flooding In Pakistan Correction In Presidential document 2010-27673 beginning on page 67015 in the issue of Monday, November 1, 2010, make the following correction: On page...

  3. 78 FR 73377 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-06

    ...--Continuation of U.S. Drug Interdiction Assistance to the Government of Colombia Correction In Presidential... correction: On page 51647, the heading of the document was omitted and should read ``Continuation of...

  4. 77 FR 60037 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-02

    ... Commit, Threaten To Commit, or Support Terrorism Correction In Presidential document 2012-22710 beginning on page 56519 in the issue of Wednesday, September 12, 2012, make the following correction: On...

  5. 75 FR 68407 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... Migration Needs Resulting from Violence in Kyrgyzstan Correction In Presidential document 2010-27672 beginning on page 67013 in the issue of Monday, November 1, 2010, make the following correction: On...

  6. Model-based Tomographic Reconstruction Literature Search

    SciTech Connect

    Chambers, D H; Lehman, S K

    2005-11-30

    In the process of preparing a proposal for internal research funding, a literature search was conducted on the subject of model-based tomographic reconstruction (MBTR). The purpose of the search was to ensure that the proposed research would not replicate any previous work. We found that the overwhelming majority of work on MBTR which used parameterized models of the object was theoretical in nature. Only three researchers had applied the technique to actual data. In this note, we summarize the findings of the literature search.

  7. Model based defect characterization in composites

    NASA Astrophysics Data System (ADS)

    Roberts, R.; Holland, S.

    2017-02-01

    Work is reported on model-based defect characterization in CFRP composites. The work utilizes computational models of the interaction of NDE probing energy fields (ultrasound and thermography), to determine 1) the measured signal dependence on material and defect properties (forward problem), and 2) an assessment of performance-critical defect properties from analysis of measured NDE signals (inverse problem). Work is reported on model implementation for inspection of CFRP laminates containing multi-ply impact-induced delamination, with application in this paper focusing on ultrasound. A companion paper in these proceedings summarizes corresponding activity in thermography. Inversion of ultrasound data is demonstrated showing the quantitative extraction of damage properties.

  8. Model-based multiple patterning layout decomposition

    NASA Astrophysics Data System (ADS)

    Guo, Daifeng; Tian, Haitong; Du, Yuelin; Wong, Martin D. F.

    2015-10-01

    As one of the most promising next-generation lithography technologies, multiple patterning lithography (MPL) plays an important role in the attempts to keep pace with the 10 nm technology node and beyond. With feature sizes continuing to shrink, it has become impossible to print dense layouts within a single exposure. As a result, MPL such as double patterning lithography (DPL) and triple patterning lithography (TPL) has been widely adopted. There is a large volume of literature on DPL/TPL layout decomposition, and the current approach is to formulate the problem as a classical graph-coloring problem: layout features (polygons) are represented by vertices in a graph G, and there is an edge between two vertices if and only if the distance between the two corresponding features is less than a minimum distance threshold value dmin. The problem is to color the vertices of G using k colors (k = 2 for DPL, k = 3 for TPL) such that no two vertices connected by an edge are given the same color. This is a rule-based approach, which imposes a geometric distance as a minimum constraint and simply decomposes polygons within that distance into different masks. It is not desirable in practice because this criterion cannot completely capture the behavior of the optics; for example, it lacks information such as the optical source characteristics and the interactions between polygons beyond the minimum distance. To remedy this deficiency, a model-based layout decomposition approach that bases the decomposition criteria on simulation results was first introduced at SPIE 2013.1 However, that algorithm1 is based on simplified assumptions about the optical simulation model, and therefore its usage on real layouts is limited. Recently AMSL2 also proposed a model-based approach to layout decomposition that iteratively simulates the layout, which requires excessive computational resources and may lead to sub-optimal solutions. That approach2 also potentially generates too many stitches. In this
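
    The rule-based baseline that the paper argues against is easy to state in code: build the conflict graph from the dmin rule and k-color it greedily, as sketched below. This is a hypothetical illustration; a production decomposer adds stitch insertion and backtracking, and a model-based one replaces the distance rule with simulation-derived conflicts.

      def decompose_layout(conflicts, k):
          """Greedy k-coloring of a layout conflict graph.

          conflicts : dict mapping each polygon id to the set of polygons
                      closer than dmin (the rule-based conflict edges).
          k         : number of masks (2 for DPL, 3 for TPL).
          Returns {polygon: mask index}; raises where a real decomposer
          would backtrack or insert a stitch.
          """
          masks = {}
          # Color high-degree polygons first to reduce dead ends.
          for poly in sorted(conflicts, key=lambda p: -len(conflicts[p])):
              used = {masks[n] for n in conflicts[poly] if n in masks}
              free = [m for m in range(k) if m not in used]
              if not free:
                  raise ValueError(f"{poly} needs a stitch or backtracking")
              masks[poly] = free[0]
          return masks

      # A triangle of mutual conflicts is 3-colorable but not 2-colorable.
      tri = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B"}}
      print(decompose_layout(tri, k=3))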

  9. Model-based neuroimaging for cognitive computing.

    PubMed

    Poznanski, Roman R

    2009-09-01

    The continuity of the mind is suggested to mean the continuous spatiotemporal dynamics arising from the electrochemical signature of the neocortex: (i) globally through volume transmission in the gray matter as fields of neural activity, and (ii) locally through extrasynaptic signaling between fine distal dendrites of cortical neurons. If the continuity of dynamical systems across spatiotemporal scales defines a stream of consciousness then intentional metarepresentations as templates of dynamic continuity allow qualia to be semantically mapped during neuroimaging of specific cognitive tasks. When interfaced with a computer, such model-based neuroimaging requiring new mathematics of the brain will begin to decipher higher cognitive operations not possible with existing brain-machine interfaces.

  10. Model-based vision using geometric hashing

    NASA Astrophysics Data System (ADS)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

    The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed upon the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm -- invariant under translation, scale, and 3D rotations of the target -- hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture, and is thus potentially implementable in real time.

  11. Student Modeling Based on Problem Solving Times

    ERIC Educational Resources Information Center

    Pelánek, Radek; Jarušek, Petr

    2015-01-01

    Student modeling in intelligent tutoring systems is mostly concerned with modeling correctness of students' answers. As interactive problem solving activities become increasingly common in educational systems, it is useful to focus also on timing information associated with problem solving. We argue that the focus on timing is natural for certain…

  13. Research in Correctional Rehabilitation.

    ERIC Educational Resources Information Center

    Rehabilitation Services Administration (DHEW), Washington, DC.

    Forty-three leaders in corrections and rehabilitation participated in the seminar, planned to provide an indication of the status of research in correctional rehabilitation. Papers include: (1) "Program Trends in Correctional Rehabilitation" by John P. Conrad, (2) "Federal Offenders Rehabilitation Program" by Percy B. Bell and Merlyn Mathews, (3)…

  14. Teaching Politically Correct Language

    ERIC Educational Resources Information Center

    Tsehelska, Maryna

    2006-01-01

    This article argues that teaching politically correct language to English learners provides them with important information and opportunities to be exposed to cultural issues. The author offers a brief review of how political correctness became an issue and how being politically correct influences the use of language. The article then presents…

  15. Model-based Bayesian inference for ROC data analysis

    NASA Astrophysics Data System (ADS)

    Lei, Tianhu; Bae, K. Ty

    2013-03-01

    This paper presents a study of model-based Bayesian inference applied to Receiver Operating Characteristic (ROC) data. The model is a simple version of a general non-linear regression model. Unlike the Dorfman model, it uses a probit link function with a binary (zero-one) covariate to express the binormal distributions in a single formula. The model also includes a scale parameter. Bayesian inference is implemented by the Markov Chain Monte Carlo (MCMC) method, carried out with Bayesian analysis Using Gibbs Sampling (BUGS). In contrast to classical statistical theory, the Bayesian approach treats model parameters as random variables characterized by prior distributions. With a substantial number of simulated samples generated by the sampling algorithm, posterior distributions of the parameters, as well as the parameters themselves, can be accurately estimated. MCMC-based BUGS adopts the Adaptive Rejection Sampling (ARS) protocol, which requires that the probability density function (pdf) from which samples are drawn be log-concave with respect to the targeted parameters. Our study corrects a common misconception and proves that the pdf of this regression model is log-concave with respect to its scale parameter. Therefore, ARS's requirement is satisfied, and a Gaussian prior, which is conjugate and possesses many analytic and computational advantages, is assigned to the scale parameter. A cohort of 20 simulated data sets and 20 simulations from each data set are used in our study. Output analysis and convergence diagnostics for the MCMC method are assessed with the CODA package. Models and methods using a continuous Gaussian prior and a discrete categorical prior are compared. Intensive simulations and performance measures are given to illustrate our practice in the framework of model-based Bayesian inference using the MCMC method.
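
    As a rough illustration of the inference setup (probit link with a binary covariate, Gaussian priors), here is a minimal random-walk Metropolis sketch. It stands in for the paper's BUGS/ARS machinery, which it does not reproduce; the data generation, prior scales, and step size below are all assumptions.

```python
# Hedged stand-in for the paper's inference: random-walk Metropolis for a
# probit model with a binary covariate and Gaussian priors.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated data: x = 0 or 1 (the zero-one covariate), y = binary response
x = np.repeat([0, 1], 100)
true_a, true_b = -0.5, 1.2
y = rng.random(x.size) < norm.cdf(true_a + true_b * x)

def log_post(theta):
    """Log-posterior: probit likelihood plus independent N(0, 10^2) priors."""
    a, b = theta
    p = norm.cdf(a + b * x).clip(1e-12, 1 - 1e-12)
    loglik = np.sum(np.where(y, np.log(p), np.log1p(-p)))
    return loglik + norm.logpdf(a, 0, 10) + norm.logpdf(b, 0, 10)

theta, lp = np.zeros(2), log_post(np.zeros(2))
chain = []
for _ in range(5000):
    prop = theta + 0.1 * rng.standard_normal(2)    # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:        # Metropolis accept step
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

print(np.mean(chain[1000:], axis=0))   # posterior means of (a, b) after burn-in
```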

  16. Lower-order effects adjustment in quantitative traits model-based multifactor dimensionality reduction.

    PubMed

    Mahachie John, Jestinah M; Cattaert, Tom; Lishout, François Van; Gusareva, Elena S; Steen, Kristel Van

    2012-01-01

    Identifying gene-gene interactions or gene-environment interactions in studies of human complex diseases remains a major challenge in genetic epidemiology. An additional challenge, often forgotten, is to account for important lower-order genetic effects. These may hamper the identification of genuine epistasis. If lower-order genetic effects contribute to the genetic variance of a trait, identified statistical interactions may simply be due to a signal boost of these effects. In this study, we restrict attention to quantitative traits and bi-allelic SNPs as genetic markers. Moreover, our interaction study focuses on 2-way SNP-SNP interactions. Via simulations, we assess the performance of different corrective measures for lower-order genetic effects in Model-Based Multifactor Dimensionality Reduction epistasis detection, using additive and co-dominant coding schemes. Performance is evaluated in terms of power and familywise error rate. Our simulations indicate that empirical power estimates are reduced with correction of lower-order effects, as are familywise error rates. Easy-to-use automatic SNP selection procedures, SNP selection based on "top" findings, or SNP selection based on a p-value criterion for interesting main effects result in reduced power but also almost zero false positive rates. Always accounting for main effects in the SNP-SNP pair under investigation during Model-Based Multifactor Dimensionality Reduction analysis adequately controls false positive epistasis findings. This is particularly true when adopting a co-dominant corrective coding scheme. In conclusion, automatic search procedures that identify lower-order effects to correct for during epistasis screening should be avoided. The same is true for procedures that adjust for lower-order effects prior to Model-Based Multifactor Dimensionality Reduction and involve using residuals as the new trait. We advocate "on-the-fly" adjustment for lower-order effects when screening for SNP-SNP interactions.

  17. Feature-driven model-based segmentation

    NASA Astrophysics Data System (ADS)

    Qazi, Arish A.; Kim, John; Jaffray, David A.; Pekar, Vladimir

    2011-03-01

    The accurate delineation of anatomical structures is required in many medical image analysis applications. One example is radiation therapy planning (RTP), where traditional manual delineation is tedious, labor intensive, and can require hours of a clinician's valuable time. The majority of automated segmentation methods in RTP belong to either model-based or atlas-based approaches. One substantial limitation of model-based segmentation is that its accuracy may be restricted by uncertainties in image content, specifically when segmenting low-contrast anatomical structures, e.g. soft tissue organs in computed tomography images. In this paper, we introduce a non-parametric feature enhancement filter that replaces raw intensity image data with a high-level probabilistic map which guides the deformable model to reliably segment low-contrast regions. The method is evaluated by segmenting the submandibular and parotid glands in the head and neck region and comparing the results to manual segmentations in terms of volume overlap. Quantitative results show overall good agreement with expert segmentations, achieving volume overlap of up to 80%. Qualitatively, we demonstrate that we are able to segment low-contrast regions which are otherwise difficult to delineate with deformable models relying on distinct object boundaries in the original image data.

  18. Sandboxes for Model-Based Inquiry

    NASA Astrophysics Data System (ADS)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-04-01

    In this article, we introduce a class of constructionist learning environments that we call Emergent Systems Sandboxes (ESSs), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual construction environment that supports students in creating, exploring, and sharing computational models of dynamic systems that exhibit emergent phenomena. They provide learners with "entity"-level construction primitives that reflect an underlying scientific model. These primitives can be directly "painted" into a sandbox space, where they can then be combined, arranged, and manipulated to construct complex systems and explore the emergent properties of those systems. We argue that ESSs offer a means of addressing some of the key barriers to adopting rich, constructionist model-based inquiry approaches in science classrooms at scale. Situating the ESS in a large-scale science modeling curriculum we are implementing across the USA, we describe how the unique "entity-level" primitive design of an ESS facilitates knowledge system refinement at both an individual and social level, how it supports flexible modeling practices by providing both continuous and discrete modes of executability, and how it offers students a variety of opportunities for validating their qualitative understandings of emergent systems as they develop.

  19. Model-based Processing of Micro-cantilever Sensor Arrays

    SciTech Connect

    Tringe, J W; Clague, D S; Candy, J V; Lee, C L; Rudd, R E; Burnham, A K

    2004-11-17

    We develop a model-based processor (MBP) for a micro-cantilever array sensor to detect target species in solution. After discussing the generalized framework for this problem, we develop the specific model used in this study. We perform a proof-of-concept experiment, fit the model parameters to the measured data and use them to develop a Gauss-Markov simulation. We then investigate two cases of interest: (1) averaged deflection data, and (2) multi-channel data. In both cases the evaluation proceeds by first performing a model-based parameter estimation to extract the model parameters, next performing a Gauss-Markov simulation, designing the optimal MBP and finally applying it to measured experimental data. The simulation is used to evaluate the performance of the MBP in the multi-channel case and compare it to a "smoother" ("averager") typically used in this application. It was shown that the MBP not only provides a significant gain (~80 dB) in signal-to-noise ratio (SNR), but also consistently outperforms the smoother by 40-60 dB. Finally, we apply the processor to the smoothed experimental data and demonstrate its capability for chemical detection. The MBP performs quite well, though it includes a correctable systematic bias error. The project's primary accomplishment was the successful application of model-based processing to signals from micro-cantilever arrays: a 40-60 dB improvement vs. the smoother algorithm was demonstrated. This result was achieved through the development of appropriate mathematical descriptions for the chemical and mechanical phenomena, and incorporation of these descriptions directly into the model-based signal processor. A significant challenge was the development of the framework which would maximize the usefulness of the signal processing algorithms while ensuring the accuracy of the mathematical description of the chemical-mechanical signal. Experimentally, the difficulty was to identify and characterize the non
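
    A minimal sketch of the MBP idea is given below, assuming a scalar Gauss-Markov state model with first-order relaxation dynamics (a stand-in for the paper's chemical-mechanical model): the Kalman filter plays the role of the model-based processor and is compared against the raw measurements.

```python
# Minimal scalar Gauss-Markov / Kalman-filter sketch of a model-based processor
# for noisy deflection data. The state model and all constants are assumptions.
import numpy as np

def mbp_kalman(z, a=0.99, q=1e-6, r=1e-2):
    """Scalar Kalman filter for x[k+1] = a x[k] + w,  z[k] = x[k] + v."""
    x, p, xs = 0.0, 1.0, []
    for zk in z:
        x, p = a * x, a * a * p + q        # predict with the state model
        g = p / (p + r)                    # Kalman gain
        x += g * (zk - x)                  # correct with the innovation
        p *= 1 - g
        xs.append(x)
    return np.array(xs)

# Synthetic deflection: exponential approach to full-scale plus sensor noise
rng = np.random.default_rng(1)
t = np.arange(500)
truth = 1.0 - np.exp(-t / 100)
z = truth + 0.1 * rng.standard_normal(t.size)
est = mbp_kalman(z)
print(f"raw RMSE {np.sqrt(np.mean((z - truth) ** 2)):.3f}, "
      f"MBP RMSE {np.sqrt(np.mean((est - truth) ** 2)):.3f}")
```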

  20. [Therapeutic strategy for different types of epicanthus].

    PubMed

    Gaofeng, Li; Jun, Tan; Zihan, Wu; Wei, Ding; Huawei, Ouyang; Fan, Zhang; Mingcan, Luo

    2015-11-01

    To explore a reasonable therapeutic strategy for different types of epicanthus. Patients with epicanthus were classified according to shape, extent, and inner canthal distance, and treated appropriately with different methods. A modified asymmetric Z-plasty with two-curve method was used for lower eyelid type, inner canthus type, and severe upper eyelid type epicanthus. Moderate upper eyelid epicanthus underwent the '-' shape method. Mild upper eyelid epicanthus needed no corrective surgery in two situations: when nasal augmentation or double eyelid formation had been performed and the inner canthal distance was normal. The other mild cases underwent the '-' shape method. A total of 66 cases underwent classification and the appropriate treatment. All wounds healed well. During a 3- to 12-month follow-up period, all epicanthus were corrected completely, with natural contours and inconspicuous scars. All patients were satisfied with the results. Classification of epicanthus based on shape, extent, and inner canthal distance, with correction by appropriate methods, is a reasonable therapeutic strategy.

  1. Treatment Ideology and Correctional Bureaucracy: A Study of Organizational Change.

    ERIC Educational Resources Information Center

    Martinson, Robert Magnus

    A study was made of organizational change induced by a staff training project in six correctional institutions for youth in the California system, which is currently engaged in introducing "therapeutic community" into correctional facilities. Part I described and evaluated a federally financed training project. The "resource…

  2. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden-state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief-state update and control sequence generation.

  3. Model-based reasoning in SSF ECLSS

    NASA Technical Reports Server (NTRS)

    Miller, J. K.; Williams, George P. W., Jr.

    1992-01-01

    The interacting processes and reconfigurable subsystems of the Space Station Freedom Environmental Control and Life Support System (ECLSS) present a tremendous technical challenge to Freedom's crew and ground support. ECLSS operation and problem analysis is time-consuming for crew members and difficult for current computerized control, monitoring, and diagnostic software. These challenges can be at least partially mitigated by the use of advanced techniques such as Model-Based Reasoning (MBR). This paper will provide an overview of MBR as it is being applied to Space Station Freedom ECLSS. It will report on work being done to produce intelligent systems to help design, control, monitor, and diagnose Freedom's ECLSS. Specifically, work on predictive monitoring, diagnosability, and diagnosis, with emphasis on the automated diagnosis of the regenerative water recovery and air revitalization processes will be discussed.

  5. Model-based vision for space applications

    NASA Technical Reports Server (NTRS)

    Chaconas, Karen; Nashman, Marilyn; Lumia, Ronald

    1992-01-01

    This paper describes a method for tracking moving image features by combining spatial and temporal edge information with model-based feature information. The algorithm updates the two-dimensional position of object features by correlating predicted model features with current image data. The results of the correlation process are used to compute an updated model. The algorithm makes use of a high temporal sampling rate with respect to spatial changes of the image features and operates in a real-time multiprocessing environment. Preliminary results demonstrate successful tracking for image feature velocities between 1.1 and 4.5 pixels per image frame. This work has applications for docking, assembly, retrieval of floating objects and a host of other space-related tasks.

  6. A Cognitive Model Based on Neuromodulated Plasticity

    PubMed Central

    Ruan, Xiaogang

    2016-01-01

    Associative learning, including classical conditioning and operant conditioning, is regarded as the most fundamental type of learning for animals and human beings. Many models have been proposed for classical conditioning or operant conditioning. However, a unified and integrated model to explain the two types of conditioning is much less studied. Here, a model based on neuromodulated synaptic plasticity is presented. The model is bio-inspired, including a multi-store memory module and simulated VTA dopaminergic neurons that produce the reward signal. The synaptic weights are modified according to the reward signal, which simulates the change of associative strengths in associative learning. Experimental results on real robots demonstrate the suitability and validity of the proposed model. PMID:27872638

  7. Model-based reconfiguration: Diagnosis and recovery

    NASA Technical Reports Server (NTRS)

    Crow, Judy; Rushby, John

    1994-01-01

    We extend Reiter's general theory of model-based diagnosis to a theory of fault detection, identification, and reconfiguration (FDIR). The generality of Reiter's theory readily supports an extension in which the problem of reconfiguration is viewed as a close analog of the problem of diagnosis. Using a reconfiguration predicate 'rcfg' analogous to the abnormality predicate 'ab,' we derive a strategy for reconfiguration by transforming the corresponding strategy for diagnosis. There are two obvious benefits of this approach: algorithms for diagnosis can be exploited as algorithms for reconfiguration and we have a theoretical framework for an integrated approach to FDIR. As a first step toward realizing these benefits we show that a class of diagnosis engines can be used for reconfiguration and we discuss algorithms for integrated FDIR. We argue that integrating recovery and diagnosis is an essential next step if this technology is to be useful for practical applications.

  8. Fast Algorithms for Model-Based Diagnosis

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Barrett, Anthony; Vatan, Farrokh; Mackey, Ryan

    2005-01-01

    Two improved new methods for automated diagnosis of complex engineering systems involve the use of novel algorithms that are more efficient than prior algorithms used for the same purpose. Both the recently developed algorithms and the prior algorithms in question are instances of model-based diagnosis, which is based on exploring the logical inconsistency between an observation and a description of the system to be diagnosed. As engineering systems grow more complex and increasingly autonomous in their functions, the need for automated diagnosis increases concomitantly. In model-based diagnosis, the function of each component and the interconnections among all the components of the system to be diagnosed are represented as a logical system, called the system description (SD). Hence, the expected behavior of the system is the set of logical consequences of the SD. Faulty components lead to inconsistency between the observed behaviors of the system and the SD. The task of finding the faulty components (diagnosis) reduces to finding the components, the abnormalities of which could explain all the inconsistencies. Of course, the meaningful solution should be a minimal set of faulty components (called a minimal diagnosis), because the trivial solution, in which all components are assumed to be faulty, always explains all inconsistencies. Although the prior algorithms in question implement powerful methods of diagnosis, they are not practical because they essentially require exhaustive searches among all possible combinations of faulty components and therefore entail amounts of computation that grow exponentially with the number of components of the system.
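
    The classical formulation can be made concrete with a small sketch: diagnoses are minimal hitting sets of the conflict sets derived from the SD and observations. The brute-force enumeration below is exactly the kind of exponential search the new algorithms avoid; it is included only to illustrate the problem statement, and the component names are hypothetical.

```python
# Brute-force minimal diagnosis as minimal hitting sets of conflict sets.
from itertools import combinations

def minimal_diagnoses(conflicts):
    """conflicts: sets of components that cannot all be healthy. A diagnosis
    hits every conflict; return only the subset-minimal ones."""
    universe = sorted(set().union(*conflicts))
    minimal = []
    for r in range(1, len(universe) + 1):
        for cand in combinations(universe, r):
            s = set(cand)
            if all(s & c for c in conflicts) and not any(m <= s for m in minimal):
                minimal.append(s)
    return minimal

# Observations implicate two conflict sets among components A1, A2, M1
print(minimal_diagnoses([{"A1", "A2"}, {"A2", "M1"}]))
# -> [{'A2'}, {'A1', 'M1'}] (element order within sets may vary)
```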

  9. Model-based ocean acoustic passive localization

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1994-01-24

    The detection, localization and classification of acoustic sources (targets) in a hostile ocean environment is a difficult problem -- especially in light of the improved design of modern submarines and continual improvements in quieting technology. Further, the advent of more and more diesel-powered vessels makes the detection problem more formidable than ever before. It has recently been recognized that the incorporation of a mathematical model that accurately represents the phenomenology under investigation can vastly improve the performance of any processor, assuming, of course, that the model is accurate. Therefore, it is necessary to incorporate more knowledge about the ocean environment into detection and localization algorithms in order to enhance the overall signal-to-noise ratios and improve performance. An alternative methodology to matched-field/matched-mode processing is the so-called model-based processor, which is based on a state-space representation of the normal-mode propagation model. If state-space solutions can be accomplished, then many current ocean acoustic processing problems can be analyzed and solved within this framework, with performance results assessed on firm statistical and system-theoretic grounds. The model-based approach is (simply) "incorporating mathematical models of both the physical phenomenology and the measurement processes, including noise, into the processor to extract the desired information." In this application, we seek techniques to incorporate the: (1) ocean acoustic propagation model; (2) sensor array measurement model; and (3) noise models (ambient, shipping, surface and measurement) into a processor to solve the associated localization/detection problems.

  10. An application of model-based reasoning to accounting systems

    SciTech Connect

    Nado, R.; Chams, M.; Delisio, J.; Hamscher, W.

    1996-12-31

    An important problem faced by auditors is gauging how much reliance can be placed on the accounting systems that process millions of transactions to produce the numbers summarized in a company's financial statements. Accounting systems contain internal controls, procedures designed to detect and correct errors and irregularities that may occur in the processing of transactions. In a complex accounting system, it can be an extremely difficult task for the auditor to anticipate the possible errors that can occur and to evaluate the effectiveness of the controls at detecting them. An accurate analysis must take into account the unique features of each company's business processes. To cope with this complexity and variability, the Comet system applies a model-based reasoning approach to the analysis of accounting systems and their controls. An auditor uses Comet to create a hierarchical flowchart model that describes the intended processing of business transactions by an accounting system and the operation of its controls. Comet uses the constructed model to automatically analyze the effectiveness of the controls in detecting potential errors. Price Waterhouse auditors have used Comet on a variety of real audits in several countries around the world.

  11. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system-level V&V using modeling and simulation, and to use scarce hardware testing time to validate models, as has been the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated, enabling a more complete set of test cases than is possible on flight hardware. SysML simulations provide access to and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated that such an approach is possible.

  12. A Generative Control Capability for a Model-based Executive

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Nayak, P. Pandurang

    1997-01-01

    This paper describes Burton, a core element of a new generation of goal-directed model-based autonomous executives. This executive makes extensive use of component-based declarative models to analyze novel situations and generate novel control actions at both the goal and hardware levels. It uses an extremely efficient online propositional inference engine to determine likely states consistent with current observations and optimal target states that achieve high-level goals. It incorporates a flexible generative control sequencing algorithm within the reactive loop to bridge the gap between current and target states. The system is able to detect and avoid damaging and irreversible situations. After every control action it uses its model and sensors to detect anomalous situations and immediately take corrective action. Efficiency is achieved through a series of model compilation and online policy construction methods, and by exploiting general conventions of hardware design that permit a divide-and-conquer approach to planning. The paper presents a formal characterization of Burton's capability, develops efficient algorithms, and reports on experience with the implementation in the domain of spacecraft autonomy. Burton is being incorporated as one of the key elements of the Remote Agent core autonomy architecture for Deep Space One, the first spacecraft in NASA's New Millennium program.

  13. Development of explicit diffraction corrections for absolute measurements of acoustic nonlinearity parameters in the quasilinear regime.

    PubMed

    Jeong, Hyunjo; Zhang, Shuzeng; Cho, Sungjong; Li, Xiongbing

    2016-08-01

    In absolute measurements of acoustic nonlinearity parameters, amplitudes of harmonics must be corrected for diffraction effects. In this study, we develop explicit multi-Gaussian beam (MGB) model-based diffraction corrections for the first three harmonics in weakly nonlinear, axisymmetric sound beams. The effects of making diffraction corrections on nonlinearity parameter estimation are investigated by defining "total diffraction correction (TDC)". The results demonstrate that TDC cannot be neglected even for harmonic generation experiments in the nearfield region.

  14. Corrective Primary Impression Technique

    PubMed Central

    Fernandes, Aquaviva; Dua, Neha; Herekar, Manisha

    2010-01-01

    The article describes a simple, quick and corrective technique for making the preliminary impression. It records the extensions better than impressions made using only impression compound. This technique is accurate and gives a properly extended custom tray. Any deficiencies seen in the compound primary impression are corrected using this technique; hence, it is called a “corrective primary impression technique”. PMID:20502648

  15. Request for Correction 10003

    EPA Pesticide Factsheets

    Letter from Jeff Rush requesting rescission and correction of online and printed information regarding alleged greenhouse gas emissions reductions resulting from beneficial use of coal combustion waste products.

  16. 78 FR 55169 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-10

    ... Commodities and Services From Any Agency of the United States Government to the Syrian Opposition Coalition (SOC) and the Syrian Opposition's Supreme Military Council (SMC) Correction In Presidential...

  17. 3-D model-based vehicle tracking.

    PubMed

    Lou, Jianguang; Tan, Tieniu; Hu, Weiming; Yang, Hao; Maybank, Steven J

    2005-10-01

    This paper aims at tracking vehicles from monocular intensity image sequences and presents an efficient and robust approach to three-dimensional (3-D) model-based vehicle tracking. Under the weak perspective assumption and the ground-plane constraint, the movements of model projection in the two-dimensional image plane can be decomposed into two motions: translation and rotation. They are the results of the corresponding movements of 3-D translation on the ground plane (GP) and rotation around the normal of the GP, which can be determined separately. A new metric based on point-to-line segment distance is proposed to evaluate the similarity between an image region and an instantiation of a 3-D vehicle model under a given pose. Based on this, we provide an efficient pose refinement method to refine the vehicle's pose parameters. An improved EKF is also proposed to track and to predict vehicle motion with a precise kinematics model. Experimental results with both indoor and outdoor data show that the algorithm obtains desirable performance even under severe occlusion and clutter.
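
    The core geometric primitive of the proposed metric, the distance from a point to a line segment, is easy to state in code. The sketch below shows only that primitive under generic 2-D assumptions; the paper's full metric aggregates such distances between image evidence and projected model edges under a candidate pose.

```python
# Point-to-line-segment distance, the primitive behind the similarity metric.
import numpy as np

def point_segment_distance(p, a, b):
    """Euclidean distance from point p to the segment from a to b."""
    p, a, b = map(np.asarray, (p, a, b))
    ab = b - a
    denom = ab @ ab
    if denom == 0.0:                       # degenerate segment: a == b
        return float(np.linalg.norm(p - a))
    t = np.clip((p - a) @ ab / denom, 0.0, 1.0)   # clamp projection onto segment
    return float(np.linalg.norm(p - (a + t * ab)))

print(point_segment_distance([1.0, 1.0], [0.0, 0.0], [2.0, 0.0]))  # 1.0
print(point_segment_distance([3.0, 1.0], [0.0, 0.0], [2.0, 0.0]))  # ~1.414
```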

  18. Model-Based Estimation of Knee Stiffness

    PubMed Central

    Pfeifer, Serge; Vallery, Heike; Hardegger, Michael; Riener, Robert; Perreault, Eric J.

    2013-01-01

    During natural locomotion, the stiffness of the human knee is modulated continuously and subconsciously according to the demands of activity and terrain. Given modern actuator technology, powered transfemoral prostheses could theoretically provide a similar degree of sophistication and function. However, experimentally quantifying knee stiffness modulation during natural gait is challenging. Alternatively, joint stiffness could be estimated in a less disruptive manner using electromyography (EMG) combined with kinetic and kinematic measurements to estimate muscle force, together with models that relate muscle force to stiffness. Here we present the first step in that process, in which we develop such an approach and evaluate it in isometric conditions, where experimental measurements are more feasible. Our EMG-guided modeling approach allows us to consider conditions with antagonistic muscle activation, a phenomenon commonly observed in physiological gait. Our validation shows that model-based estimates of knee joint stiffness coincide well with experimental data obtained using conventional perturbation techniques. We conclude that knee stiffness can be accurately estimated in isometric conditions without applying perturbations, which represents an important step towards our ultimate goal of quantifying knee stiffness during gait. PMID:22801482

  19. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Such methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).

  20. Model-based target and background characterization

    NASA Astrophysics Data System (ADS)

    Mueller, Markus; Krueger, Wolfgang; Heinze, Norbert

    2000-07-01

    Up to now, most approaches to target and background characterization (and exploitation) concentrate solely on the information given by pixels. In many cases this is a complex and unprofitable task. During the development of automatic exploitation algorithms, the main goal is the optimization of certain performance parameters. These parameters are measured during test runs in which one algorithm with one parameter set is applied to images that consist of image domains with very different characteristics (targets and various types of background clutter). Model-based geocoding and registration approaches provide means for utilizing the information stored in GIS (Geographical Information Systems). The geographical information stored in the various GIS layers can define ROEs (Regions of Expectation) and may allow for dedicated algorithm parametrization and development. ROI (Region of Interest) detection algorithms (in most cases MMO (Man-Made Object) detection) use implicit target and/or background models. The ROI detection algorithms utilize gradient direction models that have to be matched with transformed image domain data. In most cases, simple threshold calculations on the match results discriminate target object signatures from the background. The geocoding approaches extract line-like structures (street signatures) from the image domain and match the graph constellation against a vector model extracted from a GIS (Geographical Information System) data base. Apart from geocoding, the algorithms can also be used for image-to-image registration (multi-sensor and data fusion) and may be used for creation and validation of geographical maps.

  1. Model based systems engineering for astronomical projects

    NASA Astrophysics Data System (ADS)

    Karban, R.; Andolfato, L.; Bristow, P.; Chiozzi, G.; Esselborn, M.; Schilling, M.; Schmid, C.; Sommer, H.; Zamparelli, M.

    2014-08-01

    Model Based Systems Engineering (MBSE) is an emerging field of systems engineering for which the Systems Modeling Language (SysML) is a key enabler for descriptive, prescriptive and predictive models. This paper surveys some of the capabilities, expectations and peculiarities of tool-assisted MBSE experienced in real-life astronomical projects. The examples range in depth and scope across a wide spectrum of applications (for example documentation, requirements, analysis, trade studies) and purposes (addressing a particular development need, or accompanying a project throughout many - if not all - of its lifecycle phases, fostering reuse and minimizing ambiguity). From the beginnings of the Active Phasing Experiment, through VLT instrumentation, VLTI infrastructure, and the Telescope Control System for the E-ELT, to Wavefront Control for the E-ELT, we show how stepwise refinements of tools, processes and methods have provided tangible benefits to customary systems engineering activities like requirement flow-down, design trade studies, interface definition, and validation, by means of a variety of approaches (like Model Checking, Simulation, Model Transformation) and methodologies (like OOSEM, State Analysis).

  2. Model-Based Wavefront Control for CCAT

    NASA Technical Reports Server (NTRS)

    Redding, David; Lou, John Z.; Kissil, Andy; Bradford, Matt; Padin, Steve; Woody, David

    2011-01-01

    The 25-m aperture CCAT submillimeter-wave telescope will have a primary mirror that is divided into 162 individual segments, each of which is provided with 3 positioning actuators. CCAT will be equipped with innovative Imaging Displacement Sensors (IDS), inexpensive optical edge sensors capable of accurately measuring all segment relative motions. These measurements are used in a Kalman-filter-based Optical State Estimator to estimate wavefront errors, permitting use of a minimum-wavefront controller without direct wavefront measurement. This controller corrects the optical impact of errors in 6 degrees of freedom per segment, including lateral translations of the segments, using only the 3 actuated degrees of freedom per segment. The global motions of the Primary and Secondary Mirrors are not measured by the edge sensors; these are controlled using a gravity-sag look-up table. Predicted performance is illustrated by the simulated response to errors such as gravity sag.
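
    The estimator/controller pattern described here can be sketched generically: estimate segment states from relative edge-sensor readings, then solve for the actuator commands that minimize the predicted wavefront error in a least-squares sense. All matrices below are random placeholders, not CCAT sensitivity models, and the sketch omits the Kalman filtering and the gravity-sag table.

```python
# Generic least-squares sketch of edge-sensor-based segment control.
import numpy as np

rng = np.random.default_rng(2)
n_state, n_sens, n_act = 12, 20, 6
H = rng.standard_normal((n_sens, n_state))   # edge-sensor response to states
W = rng.standard_normal((8, n_state))        # wavefront response to states
A = rng.standard_normal((n_state, n_act))    # actuator influence on states

x_true = rng.standard_normal(n_state)                  # segment misalignments
y = H @ x_true + 1e-3 * rng.standard_normal(n_sens)    # noisy edge readings

x_hat = np.linalg.lstsq(H, y, rcond=None)[0]           # state estimate
u = -np.linalg.lstsq(W @ A, W @ x_hat, rcond=None)[0]  # min-wavefront command
print(f"wavefront residual norm: {np.linalg.norm(W @ (x_true + A @ u)):.3e}")
```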

  3. The fast correction coil feedback control system

    SciTech Connect

    Coffield, F.; Caporaso, G.; Zentler, J.M.

    1989-01-01

    A model-based feedback control system has been developed to correct beam displacement errors in the Advanced Test Accelerator (ATA) electron beam accelerator. The feedback control system drives an X/Y dipole steering system that has a 40-MHz bandwidth and can produce ±300-Gauss-cm dipole fields. A simulator was used to develop the control algorithm and to quantify the expected performance in the presence of beam position measurement noise and accelerator timing jitter. The major problem to date has been protecting the amplifiers from the voltage that is inductively coupled to the steering bars by the beam. 3 refs., 8 figs.

  4. Model Based Autonomy for Robust Mars Operations

    NASA Technical Reports Server (NTRS)

    Kurien, James A.; Nayak, P. Pandurang; Williams, Brian C.; Lau, Sonie (Technical Monitor)

    1998-01-01

    Space missions have historically relied upon a large ground staff, numbering in the hundreds for complex missions, to maintain routine operations. When an anomaly occurs, this small army of engineers attempts to identify and work around the problem. A piloted Mars mission, with its multiyear duration, cost pressures, half-hour communication delays and two-week blackouts, cannot be closely controlled by a battalion of engineers on Earth. Flight crew involvement in routine system operations must also be minimized to maximize science return. It may also be unrealistic to require that the crew have the expertise in each mission subsystem needed to diagnose a system failure and effect a timely repair, as engineers did for Apollo 13. Enter model-based autonomy, which allows complex systems to autonomously maintain operation despite failures or anomalous conditions, contributing to safe, robust, and minimally supervised operation of spacecraft, life support, In Situ Resource Utilization (ISRU) and power systems. Autonomous reasoning is central to the approach. A reasoning algorithm uses a logical or mathematical model of a system to infer how to operate the system, diagnose failures and generate appropriate behavior to repair or reconfigure the system in response. The 'plug and play' nature of the models enables low-cost development of autonomy for multiple platforms. Declarative, reusable models capture relevant aspects of the behavior of simple devices (e.g. valves or thrusters). Reasoning algorithms combine device models to create a model of the system-wide interactions and behavior of a complex, unique artifact such as a spacecraft. Rather than requiring engineers to anticipate all possible interactions and failures at design time or perform analysis during the mission, the reasoning engine generates the appropriate response to the current situation, taking into account its system-wide knowledge, the current state, and even sensor failures or unexpected behavior.

  5. Statistical appearance models based on probabilistic correspondences.

    PubMed

    Krüger, Julia; Ehrhardt, Jan; Handels, Heinz

    2017-04-01

    Model-based image analysis is indispensable in medical image processing. One key aspect of building statistical shape and appearance models is the determination of one-to-one correspondences in the training data set. At the same time, the identification of these correspondences is the most challenging part of such methods. In our earlier work, we developed an alternative method using correspondence probabilities instead of exact one-to-one correspondences for a statistical shape model (Hufnagel et al., 2008). In this work, a new approach for statistical appearance models without one-to-one correspondences is proposed. A sparse image representation is used to build a model that combines point position and appearance information at the same time. Probabilistic correspondences between the derived multi-dimensional feature vectors are used to omit the need for extensive preprocessing to find landmarks and correspondences, as well as to reduce the dependence of the generated model on the landmark positions. Model generation and model fitting can now be expressed by optimizing a single global criterion, derived from a maximum a-posteriori (MAP) approach, with respect to model parameters that directly affect both shape and appearance of the considered objects inside the images. The proposed approach describes statistical appearance modeling in a concise and flexible mathematical framework. Besides eliminating the demand for costly correspondence determination, the method allows for additional constraints such as topological regularity in the modeling process. In the evaluation, the model was applied for segmentation and landmark identification in hand X-ray images. The results demonstrate the feasibility of the model to detect hand contours as well as the positions of the joints between finger bones for unseen test images. Further, we evaluated the model on brain data of stroke patients to show the ability of the proposed model to handle partially corrupted data and to

  6. Suspected myelinolysis following rapid correction of hyponatremia in a dog.

    PubMed

    Churcher, R K; Watson, A D; Eaton, A

    1999-01-01

    A dog developed signs of neurological dysfunction five days after rapid correction of severe electrolyte derangements, including hyponatremia, caused by gastrointestinal parasitism (i.e., trichuriasis). History, laboratory findings, and onset of neurological signs following correction of hyponatremia led to a diagnosis of myelinolysis. Myelinolysis is a noninflammatory, demyelinating brain disease caused by sudden, upward osmotic shifts in central nervous system plasma, often a result of rapid correction of chronic hyponatremia. The pathogenesis is complex, but recovery is possible. Iatrogenic damage due to myelinolysis can be avoided by adherence to therapeutic guidelines for correction of chronic hyponatremia.

  7. Laser correcting mirror

    DOEpatents

    Sawicki, Richard H.

    1994-01-01

    An improved laser correction mirror (10) for correcting aberrations in a laser beam wavefront having a rectangular mirror body (12) with a plurality of legs (14, 16, 18, 20, 22, 24, 26, 28) arranged into opposing pairs (34, 36, 38, 40) along the long sides (30, 32) of the mirror body (12). Vector force pairs (49, 50, 52, 54) are applied by adjustment mechanisms (42, 44, 46, 48) between members of the opposing pairs (34, 36, 38, 40) for bending a reflective surface (13) of the mirror body (12) into a shape defining a function which can be used to correct for comatic aberrations.

  8. Trends in Therapeutic Recreation.

    ERIC Educational Resources Information Center

    Smith, Ralph W.

    1995-01-01

    Discusses the implications of the rapid, dramatic changes taking place in therapeutic recreation for individuals with physical disabilities. The article notes the impact of changes in managed care, examines programming trends in therapeutic recreation (adventure/outdoor education, competitive sports, handcycling, health enhancement activities, and…

  9. Chicanoizing the Therapeutic Community

    ERIC Educational Resources Information Center

    Aron, William S.; And Others

    1974-01-01

    Focusing on the drug addiction problem and its antecedent conditions in a Chicano population, the article examines several therapeutic interventions suggested by these conditions and indicates how they might be incorporated into a drug addiction Therapeutic Community treatment program designed to meet the needs of Chicano drug addicts. (Author/NQ)

  10. Therapeutic Recreation Practicum Manual.

    ERIC Educational Resources Information Center

    Schneegas, Kay

    This manual provides information on the practicum program offered by Moraine Valley Community College (MVCC) for students in its therapeutic recreation program. Sections I and II outline the rationale and goals for providing practical, on-the-job work experiences for therapeutic recreation students. Section III specifies MVCC's responsibilities…

  12. Impact of Therapeutic Camping

    ERIC Educational Resources Information Center

    Shniderman, Craig M.

    1974-01-01

    There has been little interest in, and only slight illumination of, the impact of therapeutic camping for emotionally disturbed children. This study seeks to validate the belief that camping is therapeutic. Subjects were 52 boys, 5 to 11 1/2 years of age. Results support the hypothesis. (Author/HMV)

  13. Correcting Hubble Vision.

    ERIC Educational Resources Information Center

    Shaw, John M.; Sheahen, Thomas P.

    1994-01-01

    Describes the theory behind the workings of the Hubble Space Telescope, the spherical aberration in the primary mirror that caused a reduction in image quality, and the corrective device that compensated for the error. (JRH)

  14. Corrected Age for Preemies

    MedlinePlus


  16. A Nonhydrostatic Model Based On A New Approach

    NASA Astrophysics Data System (ADS)

    Janjic, Z. I.

    Considerable experience with nonhydrostatic models has been accumulated on the scales of convective clouds and storms. However, numerical weather prediction (NWP) deals with motions on a much wider range of temporal and spatial scales. Thus, difficulties that may not be significant on the small scales may become important in NWP applications. With these considerations in mind, a new approach has been proposed and applied in developing nonhydrostatic models intended for NWP applications. Namely, instead of extending the cloud models to synoptic scales, the hydrostatic approximation is relaxed in a hydrostatic NWP model. In this way the model's validity is extended to nonhydrostatic motions, and at the same time favorable features of the hydrostatic formulation are preserved. To apply this approach, the system of nonhydrostatic equations is split into two parts: (a) the part that corresponds to the hydrostatic system, except for corrections due to vertical acceleration, and (b) the system of equations that allows computation of the corrections appearing in the first system. This procedure does not require any additional approximation. In the model, "isotropic" horizontal finite differencing is employed that conserves a number of basic and derived dynamical and quadratic quantities. The hybrid pressure-sigma vertical coordinate has been chosen as the primary option. The forward-backward scheme is used for horizontally propagating fast waves, and an implicit scheme is used for vertically propagating sound waves. The Adams-Bashforth scheme is applied for the advection of the basic dynamical variables and for the Coriolis terms. In real-data runs, the nonhydrostatic dynamics does not require extra computational boundary conditions at the top. The philosophy of the physical package and possible future developments of physical parameterizations are also reviewed. A two-dimensional model based on the described approach successfully reproduced classical
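
    The forward-backward treatment of fast waves can be illustrated on the 1-D linearized shallow-water equations, a standard toy problem rather than the model's actual dynamics: the velocity is advanced first, and the height is then advanced using the already-updated velocity. The grid, depth, and time step below are arbitrary assumptions chosen only to show the scheme.

```python
# Forward-backward time stepping for fast gravity waves on the 1-D linearized
# shallow-water equations: update u from the old h, then h from the *new* u.
import numpy as np

g, Hdep = 9.81, 1000.0                       # gravity, mean depth
nx, dx = 200, 10_000.0
dt = 0.5 * dx / np.sqrt(g * Hdep)            # well within the stability limit

h = np.exp(-(((np.arange(nx) - nx / 2) * dx / 5e4) ** 2))  # initial height bump
u = np.zeros(nx)

def ddx(f):
    """Centered difference with periodic boundaries."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2 * dx)

for _ in range(500):
    u = u - dt * g * ddx(h)        # forward step: velocity from old height
    h = h - dt * Hdep * ddx(u)     # backward step: height from updated velocity
print(f"height extrema after integration: {h.min():.3f}, {h.max():.3f}")
```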

  17. [Diagnostic-therapeutic approach for retroperitoneal tumors].

    PubMed

    Cariati, A

    1993-12-01

    After a careful review of the literature, diagnostic and therapeutic strategies for Primary Retroperitoneal Tumours (PRT) are reported. The author analyzes the experience of the Institute of Clinica Chirurgica "R" (Chief: Prof. E. Tosatti) as well as that of Anatomia Chirurgica (Chief: Prof. E. Cariati) --University of Genoa-- in the management of PRT, stressing the importance of preoperative staging for a correct surgical approach.

  18. Model-based wavefront sensorless adaptive optics system for large aberrations and extended objects.

    PubMed

    Yang, Huizhen; Soloviev, Oleg; Verhaegen, Michel

    2015-09-21

    A model-based wavefront sensorless (WFSless) adaptive optics (AO) system with a 61-element deformable mirror (DM) is simulated to correct the imaging of a turbulence-degraded extended object. A fast closed-loop control algorithm, based on the linear relation between the mean square of the aberration gradients and the second moment of the image intensity distribution, is used to generate the control signals for the actuators of the DM. The restoration capability and the convergence rate of the AO system are investigated for wavefront aberrations of different turbulence strengths. Simulation results show that the model-based WFSless AO system can successfully restore images degraded by different turbulence strengths and achieve a correction very close to the achievable capability of the given DM. Compared with the ideal correction of the 61-element DM, the averaged relative error of the RMS value is 6%. The convergence rate of the AO system is independent of the turbulence strength and depends only on the number of DM actuators.
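
    The flavor of model-based WFSless control can be conveyed with a modal sketch: probe each DM mode with a +/- bias, use the locally quadratic dependence of the metric on the residual aberration to solve for the modal coefficient, and apply the negative as the correction. The quadratic toy metric below stands in for the image second moment, and the mode count, bias, and noise level are assumptions, not the paper's algorithm.

```python
# Modal sketch of model-based wavefront-sensorless correction.
import numpy as np

rng = np.random.default_rng(3)
n_modes = 10
a_true = rng.standard_normal(n_modes)        # unknown aberration coefficients

def metric(residual):
    """Toy metric: grows with the mean-square residual aberration (noisy)."""
    return np.sum(residual ** 2) + 1e-4 * rng.standard_normal()

correction, bias = np.zeros(n_modes), 0.5
for i in range(n_modes):
    probe = np.zeros(n_modes)
    probe[i] = bias
    m_plus = metric(a_true + correction + probe)
    m_minus = metric(a_true + correction - probe)
    # For a quadratic metric, (m+ - m-) / (4 * bias) equals coefficient i
    correction[i] -= (m_plus - m_minus) / (4 * bias)

print(f"residual RMS: {np.sqrt(np.mean((a_true + correction) ** 2)):.4f}")
```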

  19. Quantum Error Correction

    NASA Astrophysics Data System (ADS)

    Lidar, Daniel A.; Brun, Todd A.

    2013-09-01

    Prologue; Preface; Part I. Background: 1. Introduction to decoherence and noise in open quantum systems Daniel Lidar and Todd Brun; 2. Introduction to quantum error correction Dave Bacon; 3. Introduction to decoherence-free subspaces and noiseless subsystems Daniel Lidar; 4. Introduction to quantum dynamical decoupling Lorenza Viola; 5. Introduction to quantum fault tolerance Panos Aliferis; Part II. Generalized Approaches to Quantum Error Correction: 6. Operator quantum error correction David Kribs and David Poulin; 7. Entanglement-assisted quantum error-correcting codes Todd Brun and Min-Hsiu Hsieh; 8. Continuous-time quantum error correction Ognyan Oreshkov; Part III. Advanced Quantum Codes: 9. Quantum convolutional codes Mark Wilde; 10. Non-additive quantum codes Markus Grassl and Martin Rötteler; 11. Iterative quantum coding systems David Poulin; 12. Algebraic quantum coding theory Andreas Klappenecker; 13. Optimization-based quantum error correction Andrew Fletcher; Part IV. Advanced Dynamical Decoupling: 14. High order dynamical decoupling Zhen-Yu Wang and Ren-Bao Liu; 15. Combinatorial approaches to dynamical decoupling Martin Rötteler and Pawel Wocjan; Part V. Alternative Quantum Computation Approaches: 16. Holonomic quantum computation Paolo Zanardi; 17. Fault tolerance for holonomic quantum computation Ognyan Oreshkov, Todd Brun and Daniel Lidar; 18. Fault tolerant measurement-based quantum computing Debbie Leung; Part VI. Topological Methods: 19. Topological codes Héctor Bombín; 20. Fault tolerant topological cluster state quantum computing Austin Fowler and Kovid Goyal; Part VII. Applications and Implementations: 21. Experimental quantum error correction Dave Bacon; 22. Experimental dynamical decoupling Lorenza Viola; 23. Architectures Jacob Taylor; 24. Error correction in quantum communication Mark Wilde; Part VIII. Critical Evaluation of Fault Tolerance: 25. Hamiltonian methods in QEC and fault tolerance Eduardo Novais, Eduardo Mucciolo and

  20. Adaptable DC offset correction

    NASA Technical Reports Server (NTRS)

    Golusky, John M. (Inventor); Muldoon, Kelly P. (Inventor)

    2009-01-01

    Methods and systems for adaptable DC offset correction are provided. An exemplary adaptable DC offset correction system evaluates an incoming baseband signal to determine an appropriate DC offset removal scheme; removes a DC offset from the incoming baseband signal based on the appropriate DC offset scheme in response to the evaluated incoming baseband signal; and outputs a reduced DC baseband signal in response to the DC offset removed from the incoming baseband signal.
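
    As a minimal illustration of one DC-offset removal scheme (not the patented system, which first evaluates the signal to select among schemes), the sketch below tracks the offset with a leaky integrator and subtracts it from the baseband signal; the smoothing constant is an assumption.

```python
# One-pole DC tracker: subtract a slowly varying offset estimate.
import numpy as np

def remove_dc(x, alpha=0.995):
    """Track the DC level with a leaky integrator and subtract it."""
    dc, out = 0.0, np.empty(len(x))
    for i, s in enumerate(x):
        dc = alpha * dc + (1 - alpha) * s    # slowly varying offset estimate
        out[i] = s - dc
    return out

# Baseband tone riding on a 0.7 DC offset
t = np.arange(4000)
x = 0.7 + np.sin(2 * np.pi * t / 50)
y = remove_dc(x)
print(f"mean before: {x.mean():.3f}, after settling: {y[1000:].mean():.3f}")
```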

  1. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  3. MACE: model based analysis of ChIP-exo

    PubMed Central

    Wang, Liguo; Chen, Junsheng; Wang, Chen; Uusküla-Reimand, Liis; Chen, Kaifu; Medina-Rivera, Alejandra; Young, Edwin J.; Zimmermann, Michael T.; Yan, Huihuang; Sun, Zhifu; Zhang, Yuji; Wu, Stephen T.; Huang, Haojie; Wilson, Michael D.; Kocher, Jean-Pierre A.; Li, Wei

    2014-01-01

    Understanding the role of a given transcription factor (TF) in regulating gene expression requires precise mapping of its binding sites in the genome. Chromatin immunoprecipitation-exo, an emerging technique using λ exonuclease to digest TF unbound DNA after ChIP, is designed to reveal transcription factor binding site (TFBS) boundaries with near-single nucleotide resolution. Although ChIP-exo promises deeper insights into transcription regulation, no dedicated bioinformatics tool exists to leverage its advantages. Most ChIP-seq and ChIP-chip analytic methods are not tailored for ChIP-exo, and thus cannot take full advantage of high-resolution ChIP-exo data. Here we describe a novel analysis framework, termed MACE (model-based analysis of ChIP-exo) dedicated to ChIP-exo data analysis. The MACE workflow consists of four steps: (i) sequencing data normalization and bias correction; (ii) signal consolidation and noise reduction; (iii) single-nucleotide resolution border peak detection using the Chebyshev Inequality and (iv) border matching using the Gale-Shapley stable matching algorithm. When applied to published human CTCF, yeast Reb1 and our own mouse ONECUT1/HNF6 ChIP-exo data, MACE is able to define TFBSs with high sensitivity, specificity and spatial resolution, as evidenced by multiple criteria including motif enrichment, sequence conservation, direct sequence pileup, nucleosome positioning and open chromatin states. In addition, we show that the fundamental advance of MACE is the identification of two boundaries of a TFBS with high resolution, whereas other methods only report a single location of the same event. The two boundaries help elucidate the in vivo binding structure of a given TF, e.g. whether the TF may bind as dimers or in a complex with other co-factors. PMID:25249628
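
    Step (iii) is the most self-contained piece of the workflow: the Chebyshev inequality P(|X - mu| >= k*sigma) <= 1/k^2 gives a distribution-free cutoff for calling extreme coverage positions. A minimal sketch of that idea (names and the per-strand handling are simplified assumptions; the real tool applies this after the normalization and consolidation steps):

    ```python
    import numpy as np

    def chebyshev_border_candidates(coverage, p_cutoff=0.05):
        """Flag positions whose coverage is extreme under the Chebyshev
        inequality: P(|X - mu| >= k*sigma) <= 1/k**2, so k = sqrt(1/p)
        gives a distribution-free p-value cutoff."""
        cov = np.asarray(coverage, dtype=float)
        mu, sigma = cov.mean(), cov.std()
        k = np.sqrt(1.0 / p_cutoff)
        return np.flatnonzero(cov >= mu + k * sigma)
    ```

    Left and right border candidates found this way would then be paired by the Gale-Shapley stable matching of step (iv).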

  4. A satellite and model based flood inundation climatology of Australia

    NASA Astrophysics Data System (ADS)

    Schumann, G.; Andreadis, K.; Castillo, C. J.

    2013-12-01

    To date there is no coherent and consistent database on observed or simulated flood event inundation and magnitude at large scales (continental to global). The only compiled data set showing a consistent history of flood inundation area and extent at a near global scale is provided by the MODIS-based Dartmouth Flood Observatory. However, MODIS satellite imagery is only available from 2000 and is hampered by a number of issues associated with flood mapping using optical images (e.g. classification algorithms, cloud cover, vegetation). Here, we present for the first time a proof-of-concept study in which we employ a computationally efficient 2-D hydrodynamic model (LISFLOOD-FP) complemented with a sub-grid channel formulation to generate a complete flood inundation climatology of the past 40 years (1973-2012) for the entire Australian continent. The model was built completely from freely available SRTM-derived data, including channel widths, bank heights and floodplain topography, which was corrected for vegetation canopy height using a global ICESat canopy dataset. Channel hydraulics were resolved using actual channel data and bathymetry was estimated within the model using hydraulic geometry. On the floodplain, the model simulated the flow paths and inundation variables at a 1 km resolution. The developed model was run over a period of 40 years and a floodplain inundation climatology was generated and compared to satellite flood event observations. Our proof-of-concept study demonstrates that this type of model can reliably simulate past flood events with reasonable accuracy in both time and space. The Australian model was forced with both observed flow climatology and VIC-simulated flows in order to assess the feasibility of a model-based flood inundation climatology at the global scale.
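
    The bathymetry step relies on hydraulic geometry, i.e. power-law relations between discharge and channel dimensions. A minimal sketch under assumed textbook-scale coefficients (not the values calibrated for the Australian model):

    ```python
    def depth_from_hydraulic_geometry(q_bankfull, c=0.27, f=0.39):
        """Estimate bankfull channel depth (m) from bankfull discharge
        (m^3/s) using the downstream hydraulic-geometry power law
        d = c * Q**f. Coefficients c and f are illustrative assumptions."""
        return c * q_bankfull ** f

    # e.g. a 500 m^3/s bankfull flow implies a depth of roughly
    # 0.27 * 500**0.39 ~ 3 m under these assumed coefficients.
    ```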

  5. MACE: model based analysis of ChIP-exo.

    PubMed

    Wang, Liguo; Chen, Junsheng; Wang, Chen; Uusküla-Reimand, Liis; Chen, Kaifu; Medina-Rivera, Alejandra; Young, Edwin J; Zimmermann, Michael T; Yan, Huihuang; Sun, Zhifu; Zhang, Yuji; Wu, Stephen T; Huang, Haojie; Wilson, Michael D; Kocher, Jean-Pierre A; Li, Wei

    2014-11-10

    Understanding the role of a given transcription factor (TF) in regulating gene expression requires precise mapping of its binding sites in the genome. Chromatin immunoprecipitation-exo, an emerging technique using λ exonuclease to digest TF unbound DNA after ChIP, is designed to reveal transcription factor binding site (TFBS) boundaries with near-single nucleotide resolution. Although ChIP-exo promises deeper insights into transcription regulation, no dedicated bioinformatics tool exists to leverage its advantages. Most ChIP-seq and ChIP-chip analytic methods are not tailored for ChIP-exo, and thus cannot take full advantage of high-resolution ChIP-exo data. Here we describe a novel analysis framework, termed MACE (model-based analysis of ChIP-exo) dedicated to ChIP-exo data analysis. The MACE workflow consists of four steps: (i) sequencing data normalization and bias correction; (ii) signal consolidation and noise reduction; (iii) single-nucleotide resolution border peak detection using the Chebyshev Inequality and (iv) border matching using the Gale-Shapley stable matching algorithm. When applied to published human CTCF, yeast Reb1 and our own mouse ONECUT1/HNF6 ChIP-exo data, MACE is able to define TFBSs with high sensitivity, specificity and spatial resolution, as evidenced by multiple criteria including motif enrichment, sequence conservation, direct sequence pileup, nucleosome positioning and open chromatin states. In addition, we show that the fundamental advance of MACE is the identification of two boundaries of a TFBS with high resolution, whereas other methods only report a single location of the same event. The two boundaries help elucidate the in vivo binding structure of a given TF, e.g. whether the TF may bind as dimers or in a complex with other co-factors. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Limitations of Non Model-Based Recognition Schemes

    DTIC Science & Technology

    1991-05-01

    general classes: model-based vs. non model-based schemes. In this paper we establish some limitations on the class of non model-based recognition schemes. A ...perfect, but is allowed to make mistakes and misidentify each object from a substantial fraction of viewing directions. It follows that every...symmetric objects) a nontrivial recognition scheme exists. We define the notion of the discrimination power of a consistent recognition function for a class

  7. Cognitive components underpinning the development of model-based learning.

    PubMed

    Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A

    2016-10-29

    Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning.

  8. Therapeutic gene editing: delivery and regulatory perspectives

    PubMed Central

    Shim, Gayong; Kim, Dongyoon; Park, Gyu Thae; Jin, Hyerim; Suh, Soo-Kyung; Oh, Yu-Kyoung

    2017-01-01

    Gene-editing technology is an emerging therapeutic modality for manipulating the eukaryotic genome by using target-sequence-specific engineered nucleases. Because of the exceptional advantages that gene-editing technology offers in facilitating the accurate correction of sequences in a genome, gene editing-based therapy is being aggressively developed as a next-generation therapeutic approach to treat a wide range of diseases. However, strategies for precise engineering and delivery of gene-editing nucleases, including zinc finger nucleases, transcription activator-like effector nuclease, and CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats-associated nuclease Cas9), present major obstacles to the development of gene-editing therapies, as with other gene-targeting therapeutics. Currently, viral and non-viral vectors are being studied for the delivery of these nucleases into cells in the form of DNA, mRNA, or proteins. Clinical trials are already ongoing, and in vivo studies are actively investigating the applicability of CRISPR/Cas9 techniques. However, the concept of correcting the genome poses major concerns from a regulatory perspective, especially in terms of safety. This review addresses current research trends and delivery strategies for gene editing-based therapeutics in non-clinical and clinical settings and considers the associated regulatory issues. PMID:28392568

  9. Therapeutic gene editing: delivery and regulatory perspectives.

    PubMed

    Shim, Gayong; Kim, Dongyoon; Park, Gyu Thae; Jin, Hyerim; Suh, Soo-Kyung; Oh, Yu-Kyoung

    2017-04-10

    Gene-editing technology is an emerging therapeutic modality for manipulating the eukaryotic genome by using target-sequence-specific engineered nucleases. Because of the exceptional advantages that gene-editing technology offers in facilitating the accurate correction of sequences in a genome, gene editing-based therapy is being aggressively developed as a next-generation therapeutic approach to treat a wide range of diseases. However, strategies for precise engineering and delivery of gene-editing nucleases, including zinc finger nucleases, transcription activator-like effector nuclease, and CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats-associated nuclease Cas9), present major obstacles to the development of gene-editing therapies, as with other gene-targeting therapeutics. Currently, viral and non-viral vectors are being studied for the delivery of these nucleases into cells in the form of DNA, mRNA, or proteins. Clinical trials are already ongoing, and in vivo studies are actively investigating the applicability of CRISPR/Cas9 techniques. However, the concept of correcting the genome poses major concerns from a regulatory perspective, especially in terms of safety. This review addresses current research trends and delivery strategies for gene editing-based therapeutics in non-clinical and clinical settings and considers the associated regulatory issues.

  10. Mars 2020 Model Based Systems Engineering Pilot

    NASA Technical Reports Server (NTRS)

    Dukes, Alexandra Marie

    2017-01-01

    The pilot study is led by the Integration Engineering group in NASA's Launch Services Program (LSP). The Integration Engineering (IE) group is responsible for managing the interfaces between the spacecraft and launch vehicle. This pilot investigates the utility of Model-Based Systems Engineering (MBSE) with respect to managing and verifying interface requirements. The main objectives of the pilot are to model several key aspects of the Mars 2020 integrated operations and interface requirements based on the design and verification artifacts from Mars Science Laboratory (MSL) and to demonstrate how MBSE could be used by LSP to gain further insight on the interface between the spacecraft and launch vehicle as well as to enhance how LSP manages the launch service. The method used to accomplish this pilot started through familiarization of SysML, MagicDraw, and the Mars 2020 and MSL systems through books, tutorials, and NASA documentation. MSL was chosen as the focus of the model since its processes and verifications translate easily to the Mars 2020 mission. The study was further focused by modeling specialized systems and processes within MSL in order to demonstrate the utility of MBSE for the rest of the mission. The systems chosen were the In-Flight Disconnect (IFD) system and the Mass Properties process. The IFD was chosen as a system of focus since it is an interface between the spacecraft and launch vehicle which can demonstrate the usefulness of MBSE from a system perspective. The Mass Properties process was chosen as a process of focus since the verifications for mass properties occur throughout the lifecycle and can demonstrate the usefulness of MBSE from a multi-discipline perspective. Several iterations of both perspectives have been modeled and evaluated. While the pilot study will continue for another 2 weeks, pros and cons of using MBSE for LSP IE have been identified. A pro of using MBSE includes an integrated view of the disciplines, requirements, and

  11. Geological Corrections in Gravimetry

    NASA Astrophysics Data System (ADS)

    Mikuška, J.; Marušiak, I.

    2015-12-01

    Applying corrections for the known geology to gravity data can be traced back to the first quarter of the 20th century. Later on, mostly in areas with sedimentary cover, at local and regional scales, the correction known as gravity stripping has been in use since the mid 1960s, provided that there was enough geological information. Stripping at regional to global scales became possible after the release of the CRUST 2.0 and CRUST 1.0 models in the years 2000 and 2013, respectively. The latter model in particular provides quite a new view on the relevant geometries and on the topographic and crustal densities, as well as on the crust/mantle density contrast. Thus, the isostatic corrections, which have often been used in the past, can now be replaced by procedures working with independent information interpreted primarily from seismic studies. We have developed software for performing geological corrections in the space domain, based on a priori geometry and density grids, which can be of either rectangular or spherical/ellipsoidal type with cells shaped as rectangles, tesseroids or triangles. It enables us to calculate the required gravitational effects not only in the form of surface maps or profiles but, for instance, also along vertical lines, which can shed some additional light on the nature of the geological correction. The software can work at a variety of scales and considers the input information to an optional distance from the calculation point up to the antipodes. Our main objective is to treat the geological correction as an alternative to accounting for the topography with varying densities, since the bottoms of the topographic masses, namely the geoid or ellipsoid, generally do not represent geological boundaries. We would also like to call attention to possible distortions of the corrected gravity anomalies. This work was supported by the Slovak Research and Development Agency under the contract APVV-0827-12.
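
    In the crudest approximation, the gravitational effects described here reduce to summing the vertical attraction of the density cells. The sketch below collapses every cell (rectangle, tesseroid, or triangle in the text) to a point mass at its centre; names and units are assumptions, and the real software integrates the exact cell geometry.

    ```python
    import numpy as np

    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def gz_from_density_model(stations, cell_centres, densities, volumes):
        """Vertical gravitational effect (mGal) of a density model at a set
        of calculation points, with each cell treated as a point mass at its
        centre. A deliberately crude stand-in for exact cell integration."""
        stations = np.atleast_2d(np.asarray(stations, dtype=float))
        cells = np.atleast_2d(np.asarray(cell_centres, dtype=float))
        masses = np.asarray(densities, dtype=float) * np.asarray(volumes, dtype=float)
        out = np.empty(len(stations))
        for i, s in enumerate(stations):
            d = cells - s                      # station-to-cell vectors
            r = np.linalg.norm(d, axis=1)
            out[i] = 1e5 * G * np.sum(masses * d[:, 2] / r ** 3)  # m/s^2 -> mGal
        return out
    ```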

  12. Biomimetic particles as therapeutics.

    PubMed

    Meyer, Randall A; Sunshine, Joel C; Green, Jordan J

    2015-09-01

    In recent years, there have been major advances in the development of novel nanoparticle- and microparticle-based therapeutics. An emerging paradigm is the incorporation of biomimetic features into these synthetic therapeutic constructs to enable them to better interface with biological systems. Through the control of size, shape, and material consistency, particle cores have been generated that better mimic natural cells and viruses. In addition, there have been significant advances in biomimetic surface functionalization of particles through the integration of bio-inspired artificial cell membranes and naturally derived cell membranes. Biomimetic technologies enable therapeutic particles to have increased potency to benefit human health.

  13. Biomimetic Particles as Therapeutics

    PubMed Central

    Green, Jordan J.

    2015-01-01

    In recent years, there have been major advances in the development of novel nanoparticle and microparticle-based therapeutics. An emerging paradigm is the incorporation of biomimetic features into these synthetic therapeutic constructs to enable them to better interface with biological systems. Through the control of size, shape, and material consistency, particle cores have been generated that better mimic natural cells and viruses. In addition, there have been significant advances in biomimetic surface functionalization of particles through the integration of bio-inspired artificial cell membranes and naturally derived cell membranes. Biomimetic technologies enable therapeutic particles to have increased potency to benefit human health. PMID:26277289

  14. Peteye detection and correction

    NASA Astrophysics Data System (ADS)

    Yen, Jonathan; Luo, Huitao; Tretter, Daniel

    2007-01-01

    Redeyes are caused by the camera flash light reflecting off the retina. Peteyes refer to similar artifacts in the eyes of other mammals caused by camera flash. In this paper we present a peteye removal algorithm for detecting and correcting peteye artifacts in digital images. Peteye removal for animals is significantly more difficult than redeye removal for humans, because peteyes can be any of a variety of colors, and human face detection cannot be used to localize the animal eyes. In many animals, including dogs and cats, the retina has a special reflective layer that can cause a variety of peteye colors, depending on the animal's breed, age, or fur color, etc. This makes the peteye correction more challenging. We have developed a semi-automatic algorithm for peteye removal that can detect peteyes based on the cursor position provided by the user and correct them by neutralizing the colors with glare reduction and glint retention.
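
    The correction step ("neutralizing the colors with glare reduction and glint retention") can be sketched compactly. The snippet below assumes the semi-automatic detection has already produced a pupil circle from the user's cursor position; the thresholds and darkening factor are illustrative assumptions.

    ```python
    import numpy as np

    def neutralize_peteye(image, cx, cy, radius, glint_thresh=0.9, darken=0.4):
        """Desaturate and darken a circular pupil region (glare reduction)
        while leaving the bright specular highlight untouched (glint
        retention). image: float RGB array in [0, 1]; circle from detection."""
        h, w = image.shape[:2]
        yy, xx = np.mgrid[0:h, 0:w]
        pupil = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2
        luma = image.mean(axis=-1)
        glint = pupil & (luma > glint_thresh)   # keep the catchlight
        fix = pupil & ~glint
        out = image.copy()
        out[fix] = np.repeat((darken * luma[fix])[:, None], 3, axis=1)
        return out
    ```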

  15. Phaeochromocytoma [corrected] crisis.

    PubMed

    Whitelaw, B C; Prague, J K; Mustafa, O G; Schulte, K-M; Hopkins, P A; Gilbert, J A; McGregor, A M; Aylwin, S J B

    2014-01-01

    Phaeochromocytoma [corrected] crisis is an endocrine emergency associated with significant mortality. There is little published guidance on the management of phaeochromocytoma [corrected] crisis. This clinical practice update summarizes the relevant published literature, including a detailed review of cases published in the past 5 years, and a proposed classification system. We review the recommended management of phaeochromocytoma [corrected] crisis including the use of alpha-blockade, which is strongly associated with survival of a crisis. Mechanical circulatory supportive therapy (including intra-aortic balloon pump or extra-corporeal membrane oxygenation) is strongly recommended for patients with sustained hypotension. Surgical intervention should be deferred until medical stabilization is achieved. © 2013 John Wiley & Sons Ltd.

  16. Anxiety and therapeutic touch.

    PubMed

    Olson, M; Sneed, N

    1995-01-01

    This four-group, repeated-measures experimental design divided 40 healthy professional caregivers/students into high- and low-anxiety groups and further into "therapeutic touch" and comparison groups. The effectiveness of the use of therapeutic touch in reducing anxiety was evaluated, as were the methodologies used. Three self-report measures of anxiety (Profile of Mood States, Spielberger's State/Trait Anxiety Inventory, and visual analogue scales) were evaluated for equivalence and concurrent validity to determine their potential for use in future studies. The correlations among these instruments were highly significant. The small sample size prevented differences between groups from reaching statistical significance, but the reduction of anxiety in the high-anxiety group was greater for those who had received therapeutic touch than for those who did not. Using variability data, the sample size necessary to find statistically significant differences between those who had therapeutic touch and those who did not was determined.

  17. Learning of Chemical Equilibrium through Modelling-Based Teaching

    ERIC Educational Resources Information Center

    Maia, Poliana Flavia; Justi, Rosaria

    2009-01-01

    This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students…

  18. Models-Based Practice: Great White Hope or White Elephant?

    ERIC Educational Resources Information Center

    Casey, Ashley

    2014-01-01

    Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…

  19. Assessment of Energy Efficient and Model Based Control

    DTIC Science & Technology

    2017-06-15

    ARL-TR-8042 ● JUNE 2017 ● US Army Research Laboratory. Assessment of Energy-Efficient and Model-Based Control, by Craig Lennon.

  20. A Model-Based Process for Translating Test Programs.

    DTIC Science & Technology

    1996-09-13

    first language, converting the extracted test strategy into an asymmetric dependency model, converting the dependency model into a model-based test strategy, extracting code segments from the existing test program, translating the extracted code segments into the second language, and merging the model-based test strategy and the translated code segments into a new test program in the second

  1. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  2. Learning of Chemical Equilibrium through Modelling-Based Teaching

    ERIC Educational Resources Information Center

    Maia, Poliana Flavia; Justi, Rosaria

    2009-01-01

    This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students…

  3. Models-Based Practice: Great White Hope or White Elephant?

    ERIC Educational Resources Information Center

    Casey, Ashley

    2014-01-01

    Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…

  4. Model based design introduction: modeling game controllers to microprocessor architectures

    NASA Astrophysics Data System (ADS)

    Jungwirth, Patrick; Badawy, Abdel-Hameed

    2017-04-01

    We present an introduction to model based design. Model based design is a visual representation, generally a block diagram, used to model and incrementally develop a complex system. Model based design is a commonly used design methodology for digital signal processing, control systems, and embedded systems. Model based design's philosophy is to solve a problem one step at a time; the approach can be compared to a series of steps that converge to a solution. A block diagram simulation tool allows a design to be simulated with real-world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded. The digital control algorithm can then be simulated with the real-world sensor data, and the output from the simulated digital control system can be compared to the old analog-based control system. Model based design can be compared to Agile software development. The Agile software development goal is to develop working software in incremental steps, with progress measured in completed and tested code units; progress in model based design is measured in completed and tested blocks. We present a concept for a video game controller and then use model based design to iterate the design towards a working system. We will also describe a model based design effort to develop an OS Friendly Microprocessor Architecture based on the RISC-V.
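
    The analog-to-digital upgrade example lends itself to a compact sketch: replay the recorded sensor signal through the candidate digital control law and compare its commands to the logged analog controller output. The PI structure, gains, and sample time below are illustrative assumptions.

    ```python
    import numpy as np

    def replay_digital_controller(setpoint, recorded_sensor, kp=1.0, ki=0.1,
                                  dt=0.01):
        """Run a candidate digital PI control law over logged analog sensor
        samples, producing the command trace to compare against the legacy
        analog controller."""
        integral, commands = 0.0, []
        for y in recorded_sensor:
            error = setpoint - y
            integral += error * dt
            commands.append(kp * error + ki * integral)
        return np.array(commands)

    # Usage: mismatch = np.max(np.abs(
    #     replay_digital_controller(1.0, samples) - logged_analog_commands))
    ```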

  5. The Effect of Modeling Based Science Education on Critical Thinking

    ERIC Educational Resources Information Center

    Bati, Kaan; Kaptan, Fitnat

    2015-01-01

    This study investigated the degree to which modeling-based science education can influence the development of students' critical thinking skills. The research was based on a pre-test/post-test quasi-experimental design with a control group. The Modeling Based Science Education Program, which was prepared with the purpose of exploring…

  6. Correction coil cable

    DOEpatents

    Wang, S.T.

    1994-11-01

    A wire cable assembly adapted for the winding of electrical coils is taught. A primary intended use is in particle tube assemblies for the Superconducting Super Collider. The correction coil cables have wires collected in a wire array with a center rib sandwiched therebetween to form a core assembly. The core assembly is surrounded by an assembly housing having an inner spiral wrap and a counter-wound outer spiral wrap. An alternate embodiment of the invention is rolled into a keystoned shape to improve radial alignment of the correction coil cable on a particle tube in a particle tube assembly. 7 figs.

  7. Target mass corrections revisited

    SciTech Connect

    Steffens, F.M.; Melnitchouk, W.

    2006-05-15

    We propose a new implementation of target mass corrections to nucleon structure functions which, unlike existing treatments, has the correct kinematic threshold behavior at finite Q{sup 2} in the x{yields}1 limit. We illustrate the differences between the new approach and existing prescriptions by considering specific examples for the F{sub 2} and F{sub L} structure functions, and discuss the broader implications of our results, which call into question the notion of universal parton distribution at finite Q{sup 2}.
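
    For orientation, the conventional operator-product-expansion (Georgi-Politzer) form of the target mass correction that the authors revisit can be written, to leading order in M^2/Q^2, in terms of the Nachtmann variable. This is a sketch of the standard textbook expressions, not the new implementation proposed in the record:

    ```latex
    \gamma = \sqrt{1 + \frac{4x^{2}M^{2}}{Q^{2}}}, \qquad
    \xi = \frac{2x}{1+\gamma}, \qquad
    F_{2}^{\mathrm{TMC}}(x,Q^{2}) \simeq
    \frac{x^{2}}{\xi^{2}\gamma^{3}}\, F_{2}^{(0)}(\xi,Q^{2})
    + \frac{6M^{2}x^{3}}{Q^{2}\gamma^{4}}
      \int_{\xi}^{1} \frac{du}{u^{2}}\, F_{2}^{(0)}(u,Q^{2})
    ```

    The threshold problem motivating the abstract is visible here: at x = 1 the Nachtmann variable satisfies xi = 2/(1 + gamma) < 1 at finite Q^2, so the corrected F_2 fails to vanish at the kinematic limit unless the prescription is modified.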

  8. Target Mass Corrections Revisited

    SciTech Connect

    W. Melnitchouk; F. Steffens

    2006-03-07

    We propose a new implementation of target mass corrections to nucleon structure functions which, unlike existing treatments, has the correct kinematic threshold behavior at finite Q{sup 2} in the x {yields} 1 limit. We illustrate the differences between the new approach and existing prescriptions by considering specific examples for the F{sub 2} and F{sub L} structure functions, and discuss the broader implications of our results, which call into question the notion of universal parton distribution at finite Q{sup 2}.

  9. Corrective midfoot osteotomies.

    PubMed

    Stapleton, John J; DiDomenico, Lawrence A; Zgonis, Thomas

    2008-10-01

    Corrective midfoot osteotomies involve complete separation of the forefoot and hindfoot through the level of the midfoot, followed by uni-, bi-, or triplanar realignment and arthrodesis. This technique can be performed through various approaches; however, in the high-risk patient, percutaneous and minimum incision techniques are necessary to limit the potential of developing soft tissue injury. These master level techniques require extensive surgical experience and detailed knowledge of lower extremity biomechanics. The authors discuss preoperative clinical and radiographic evaluation, specific operative techniques used, and postoperative management for the high-risk patient undergoing corrective midfoot osteotomy.

  10. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1979-01-01

    Optical measurements of range and elevation angle are distorted by the earth's atmosphere. High precision refraction correction equations are presented which are ideally suited for surveying because their inputs are optically measured range and optically measured elevation angle. The outputs are true straight line range and true geometric elevation angle. The 'short distances' used in surveying allow the calculations of true range and true elevation angle to be quickly made using a programmable pocket calculator. Topics covered include the spherical form of Snell's Law; ray path equations; and integrating the equations. Short-, medium-, and long-range refraction corrections are presented in tables.
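
    The "spherical form of Snell's Law" the report builds on is Bouguer's invariant for a spherically stratified atmosphere; a sketch of the standard relation, with symbols assumed:

    ```latex
    n(r)\, r \sin z = n_{0}\, r_{0} \sin z_{0} = \text{const}
    ```

    Here n(r) is the refractive index at geocentric radius r and z is the local zenith angle of the ray. Integrating the ray path governed by this invariant from instrument to target yields the bent-ray range and the difference between apparent and true elevation angle, which is what the report's correction equations and tables encode.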

  11. Correction of ocular dystopia.

    PubMed

    Janecka, I P

    1996-04-01

    The purpose of this study was to examine results with elective surgical correction of enophthalmos. The study was a retrospective assessment in a university-based referral practice. A consecutive sample of 10 patients who developed ocular dystopia following orbital trauma was examined. The main outcome measures were a subjective evaluation by patients and objective measurements of patients' eye position. The intervention was three-dimensional orbital reconstruction with titanium plates. It is concluded that satisfactory correction of enophthalmos and ocular dystopia can be achieved with elective surgery using titanium plates. In addition, intraoperative measurements of eye position in three planes increases the precision of surgery.

  12. Correction to ATel 10782

    NASA Astrophysics Data System (ADS)

    Zhang, Jujia

    2017-09-01

    I report a correction to the spectroscopic classification of the optical transients announced in ATEL #10782. In the main text of the telegram, the date of observation should be UT 2017 Sep. 25.6, which was written as UT 2017 Sep. 26.6 in the original report. I apologize for any confusion caused by this typographical error.

  13. Errors and Their Corrections

    ERIC Educational Resources Information Center

    Joosten, Albert Max

    2016-01-01

    "Our primary concern is not that the child learns to do something without mistakes. Our real concern is that the child does what he needs, with interest." The reaction of so many adults to the mistakes of children is to correct, immediately and directly, says Joosten. To truly aid the child in development, we must learn to control our…

  14. New Directions in Corrections.

    ERIC Educational Resources Information Center

    McKee, John M.

    A picture of the American prison situation in the past and in its present changing form is presented. The object of the correctional community is becoming more and more that of successfully reintegrating the ex-offender into the social community from which he has been separated. It is predicted that within the next five years: (1) Every state will…

  15. Correction to ATel 10681

    NASA Astrophysics Data System (ADS)

    Wang, Xiaofeng

    2017-08-01

    We report a correction to the spectroscopic classification of two optical transients announced in ATel #10681. In the main text of the telegram, SN 2017giq and MASTER OT J033744.97+723159.0 should be classified as type Ic and type IIb supernovae, respectively, which were reversed in the original report. We apologize for any confusion caused by this typographical error.

  16. Rethinking Correctional Staff Development.

    ERIC Educational Resources Information Center

    Williams, David C.

    There have been enduring conflicts in correctional institutions between personnel charged with rehabilitative duties and those who oversee authority. It is only within the past few years that realistic communication between these groups has been tolerated. The same period of time has been characterized by the infusion of training and staff…

  17. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1980-01-01

    Optical measurements of range and elevation angles are distorted by refraction of Earth's atmosphere. Theoretical discussion of effect, along with equations for determining exact range and elevation corrections, is presented in report. Potentially useful in optical site surveying and related applications, analysis is easily programmed on pocket calculator. Input to equation is measured range and measured elevation; output is true range and true elevation.

  18. Spelling Words Correctly.

    ERIC Educational Resources Information Center

    Ediger, Marlow

    Traditional methods of teaching spelling emphasized that pupils might write each new spelling word correctly and repeatedly from a weekly list in the spelling textbook. Some weaknesses in this approach are that rote learning is being stressed without emphasizing application of what has been learned, and that there is nothing which relates the…

  19. Thermodynamically Correct Bioavailability Estimations

    DTIC Science & Technology

    1992-04-30

    Approved for public release; distribution unlimited... research is to develop thermodynamically correct bioavailability estimations using chromatographic stationary phases as a model of the "interphase

  20. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1980-01-01

    Optical measurements of range and elevation angles are distorted by refraction of Earth's atmosphere. Theoretical discussion of effect, along with equations for determining exact range and elevation corrections, is presented in report. Potentially useful in optical site surveying and related applications, analysis is easily programmed on pocket calculator. Input to equation is measured range and measured elevation; output is true range and true elevation.

  1. Holographic Phase Correction.

    DTIC Science & Technology

    1987-06-01

    aberrated wavefront. With this in mind, the following example was considered. 3.2 REPLAY EFFICIENCY - AN EXAMPLE This example represents the phase...practical points to bear in mind when considering the phase correction - in particular, the flatness of the hologram input and output surfaces, and the...

  2. Issues in Correctional Training and Casework. Correctional Monograph.

    ERIC Educational Resources Information Center

    Wolford, Bruce I., Ed.; Lawrenz, Pam, Ed.

    The eight papers contained in this monograph were drawn from two national meetings on correctional training and casework. Titles and authors are: "The Challenge of Professionalism in Correctional Training" (Michael J. Gilbert); "A New Perspective in Correctional Training" (Jack Lewis); "Reasonable Expectations in Correctional Officer Training:…

  3. Atmospheric Correction Algorithm for Hyperspectral Imagery

    SciTech Connect

    R. J. Pollina

    1999-09-01

    In December 1997, the US Department of Energy (DOE) established a Center of Excellence (Hyperspectral-Multispectral Algorithm Research Center, HyMARC) for promoting the research and development of algorithms to exploit spectral imagery. This center is located at the DOE Remote Sensing Laboratory in Las Vegas, Nevada, and is operated for the DOE by Bechtel Nevada. This paper presents the results to date of a research project begun at the center during 1998 to investigate the correction of hyperspectral data for atmospheric aerosols. Results of a project conducted by the Rochester Institute of Technology to define, implement, and test procedures for absolute calibration and correction of hyperspectral data to absolute units of high spectral resolution imagery will be presented. Hybrid techniques for atmospheric correction using image or spectral scene data coupled through radiative propagation models will be specifically addressed. Results of this effort to analyze HYDICE sensor data will be included. Preliminary results based on studying the performance of standard routines, such as Atmospheric Pre-corrected Differential Absorption and Nonlinear Least Squares Spectral Fit, in retrieving reflectance spectra show overall reflectance retrieval errors of approximately one to two reflectance units in the 0.4- to 2.5-micron-wavelength region (outside of the absorption features). These results are based on HYDICE sensor data collected from the Southern Great Plains Atmospheric Radiation Measurement site during overflights conducted in July of 1997. Results of an upgrade made in the model-based atmospheric correction techniques, which take advantage of updates made to the moderate resolution atmospheric transmittance model (MODTRAN 4.0) software, will also be presented. Data will be shown to demonstrate how the reflectance retrieval in the shorter wavelengths of the blue-green region will be improved because of enhanced modeling of multiple scattering effects.
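
    For a flat Lambertian surface, and ignoring adjacency and spherical-albedo coupling, the model-based retrieval being tested amounts to inverting a linear radiative-transfer equation per band. A minimal sketch with assumed variable names; MODTRAN-class runs would supply the three atmospheric terms:

    ```python
    import math

    def surface_reflectance(l_sensor, l_path, t_up, e_ground):
        """Invert L_sensor = L_path + t_up * rho * E_ground / pi for rho.
        l_sensor, l_path: at-sensor and path radiance (W m^-2 sr^-1 um^-1);
        t_up: upward transmittance; e_ground: total irradiance reaching the
        surface (W m^-2 um^-1). Adjacency and multiple-scattering coupling
        between surface and atmosphere are deliberately ignored."""
        return math.pi * (l_sensor - l_path) / (t_up * e_ground)
    ```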

  4. Model-Based Reasoning in Humans Becomes Automatic with Training

    PubMed Central

    Lübbert, Annika; Guitart-Masip, Marc; Dolan, Raymond J.

    2015-01-01

    Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load—a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders. PMID:26379239
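
    The task referenced here is conventionally analyzed with a hybrid learner whose first-stage values mix a model-based component (planning through the transition model) and a model-free component (TD-learned action values). A minimal sketch of that mixture; the array conventions are assumptions, and the full analysis model also fits learning rates and a perseveration term:

    ```python
    import numpy as np

    def hybrid_first_stage_values(q2, trans, q_mf, w):
        """Mix model-based and model-free first-stage action values.
        q2[s]: learned second-stage state values; trans[a, s]: transition
        probabilities (e.g. 0.7 common / 0.3 rare); q_mf[a]: TD-learned
        first-stage values; w: model-based weight in [0, 1]."""
        q_mb = trans @ np.asarray(q2)      # plan through the world model
        return w * q_mb + (1.0 - w) * np.asarray(q_mf)

    # w is the quantity that cognitive load is hypothesized to reduce in
    # untrained subjects, e.g.:
    # trans = np.array([[0.7, 0.3], [0.3, 0.7]])
    # hybrid_first_stage_values([0.6, 0.2], trans, q_mf=[0.5, 0.4], w=0.8)
    ```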

  5. Model-Based Reasoning in Humans Becomes Automatic with Training.

    PubMed

    Economides, Marcos; Kurth-Nelson, Zeb; Lübbert, Annika; Guitart-Masip, Marc; Dolan, Raymond J

    2015-09-01

    Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  6. Model-based HSF using by target point control function

    NASA Astrophysics Data System (ADS)

    Kim, Seongjin; Do, Munhoe; An, Yongbae; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu

    2015-03-01

    As the technology node shrinks, ArF immersion lithography reaches the limits of wafer patterning; furthermore, weak points are easily generated during mask processing. In order to achieve robust patterning results, the design house conducts lithography rule checking (LRC). Despite LRC, we found weak points at the verification stage of optical proximity correction (OPC); such a location is called a hot spot point (HSP). Many studies have been performed on fixing HSPs. One of the most common hot spot fixing (HSF) methods is modification biasing, which consists of "Line-Resizing" and "Space-Resizing". In addition to rule-based biasing, resolution enhancement techniques (RET), including inverse lithography technology (ILT) and model-based assist features (MBAF), have been adopted to remove hot spots and maximize the process window. If an HSP is found during the OPC verification stage, various HSF methods can be applied; however, an HSF process added to the regular OPC procedure increases OPC turn-around time (TAT). In this paper, we introduce a new HSF method that makes OPC TAT shorter than the common HSF methods. The new method consists of two concepts. The first is that the OPC target point is controlled to fix the HSP: the target point is moved to the optimum position, where the edge placement error (EPE) can be 0 at critical points. Many parameters, such as model accuracy or the OPC recipe, can cause larger EPE. The second is control of the model offset error through target point adjustment. Figure 1 shows a case in which EPE is not 0, meaning that the simulation contour did not meet the target after the OPC process. In contrast, Figure 2 shows the target point moved by -2.5 nm using the target point control function; as a result, the simulation contour matches the original layout. This function can be powerfully applied to the OPC procedures of memory and logic devices.
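
    The core of the target point control function reduces to a small feedback rule: measure the signed EPE at each critical evaluation point and move the OPC target the other way. A minimal sketch under assumed conventions (nm units, a hypothetical damping factor):

    ```python
    def adjust_target_points(targets, epe, damping=0.7):
        """Shift each OPC target point opposite the signed edge placement
        error so the next OPC iteration converges toward EPE = 0 at the
        critical points. targets and epe are per-evaluation-point values in
        nm; damping < 1 guards against overshoot and is an illustrative
        assumption."""
        return [t - damping * e for t, e in zip(targets, epe)]

    # A residual EPE of about +3.6 nm with damping 0.7 would yield the
    # -2.5 nm target move quoted for Figure 2.
    ```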

  7. Overcoming limitations of model-based diagnostic reasoning systems

    NASA Technical Reports Server (NTRS)

    Holtzblatt, Lester J.; Marcotte, Richard A.; Piazza, Richard L.

    1989-01-01

    The development of a model-based diagnostic system to overcome the limitations of model-based reasoning systems is discussed. It is noted that model-based reasoning techniques can be used to analyze the failure behavior and diagnosability of system and circuit designs as part of the system process itself. One goal of current research is the development of a diagnostic algorithm which can reason efficiently about large numbers of diagnostic suspects and can handle both combinational and sequential circuits. A second goal is to address the model-creation problem by developing an approach for using design models to construct the GMODS model in an automated fashion.

  8. Overcoming limitations of model-based diagnostic reasoning systems

    NASA Technical Reports Server (NTRS)

    Holtzblatt, Lester J.; Marcotte, Richard A.; Piazza, Richard L.

    1989-01-01

    The development of a model-based diagnostic system to overcome the limitations of model-based reasoning systems is discussed. It is noted that model-based reasoning techniques can be used to analyze the failure behavior and diagnosability of system and circuit designs as part of the system process itself. One goal of current research is the development of a diagnostic algorithm which can reason efficiently about large numbers of diagnostic suspects and can handle both combinational and sequential circuits. A second goal is to address the model-creation problem by developing an approach for using design models to construct the GMODS model in an automated fashion.

  9. Using Ground Spectral Irradiance for Model Correction of AVIRIS Data

    NASA Technical Reports Server (NTRS)

    Goetz, Alexander F. H.; Heidebrecht, Kathleen B.; Kindel, Bruce; Boardman, Joseph W.

    1998-01-01

    Over the last decade a series of techniques has been developed to correct hyperspectral imaging sensor data to apparent surface reflectance. The techniques range from the empirical line method that makes use of ground target measurements to model-based methods such as ATREM that derive parameters from the data themselves to convert radiance to reflectance, and combinations of the above. Here we describe a technique that combines ground measurements of spectral irradiance with existing radiative transfer models to derive the model equivalent of an empirical line method correction without the need for uniform ground targets of different reflectance.
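
    As a reference point for the "model equivalent of an empirical line method" phrasing: the classic empirical line correction is a per-band linear regression from at-sensor radiance to ground-measured reflectance over uniform targets, and the paper's contribution is deriving the equivalent gain and offset from spectral irradiance measurements plus a radiative transfer model instead. A minimal sketch of the classic fit (array shapes and names are assumptions):

    ```python
    import numpy as np

    def empirical_line(radiance, reflectance):
        """Classic empirical line method: per-band linear fit
        reflectance = gain * radiance + offset over calibration targets.
        radiance, reflectance: (n_targets, n_bands) arrays.
        Returns per-band gain and offset."""
        n_bands = radiance.shape[1]
        gains, offsets = np.empty(n_bands), np.empty(n_bands)
        for b in range(n_bands):
            gains[b], offsets[b] = np.polyfit(radiance[:, b],
                                              reflectance[:, b], 1)
        return gains, offsets
    ```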

  10. Correction coil cable

    DOEpatents

    Wang, Sou-Tien

    1994-11-01

    A wire cable assembly (10, 310) adapted for the winding of electrical coils is taught. A primary intended use is in particle tube assemblies (532) for the Superconducting Super Collider. The correction coil cables (10, 310) have wires (14, 314) collected in wire arrays (12, 312) with a center rib (16, 316) sandwiched therebetween to form a core assembly (18, 318). The core assembly (18, 318) is surrounded by an assembly housing (20, 320) having an inner spiral wrap (22, 322) and a counter-wound outer spiral wrap (24, 324). An alternate embodiment (410) of the invention is rolled into a keystoned shape to improve radial alignment of the correction coil cable (410) on a particle tube (733) in a particle tube assembly (732).

  11. CTI Correction Code

    NASA Astrophysics Data System (ADS)

    Massey, Richard; Stoughton, Chris; Leauthaud, Alexie; Rhodes, Jason; Koekemoer, Anton; Ellis, Richard; Shaghoulian, Edgar

    2013-07-01

    Charge Transfer Inefficiency (CTI) due to radiation damage above the Earth's atmosphere creates spurious trailing in images from Charge-Coupled Device (CCD) imaging detectors. Radiation damage also creates unrelated warm pixels, which can be used to measure CTI. This code provides pixel-based correction for CTI and has proven effective in Hubble Space Telescope Advanced Camera for Surveys raw images, successfully reducing the CTI trails by a factor of ~30 everywhere in the CCD and at all flux levels. The core is written in java for speed, and a front-end user interface is provided in IDL. The code operates on raw data by returning individual electrons to pixels from which they were unintentionally dragged during readout. Correction takes about 25 minutes per ACS exposure, but is trivially parallelisable to multiple processors.
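
    The pixel-based strategy ("returning individual electrons to pixels from which they were unintentionally dragged") is usually realized by iterating a forward readout model until it reproduces the observed image. The toy sketch below uses a single trap species and a fixed trail fraction, illustrative assumptions far simpler than the multi-trap model in the real code:

    ```python
    import numpy as np

    def correct_cti(raw_column, trap_fraction=0.01, n_iter=5):
        """Iteratively undo charge-transfer trailing in one CCD column by
        solving readout(x) = raw for x with fixed-point iteration."""
        def readout(col):
            out = col.copy()
            trailed = trap_fraction * out[:-1]
            out[:-1] -= trailed        # charge captured during transfer...
            out[1:] += trailed         # ...is released into the trailing pixel
            return out

        x = np.asarray(raw_column, dtype=float).copy()
        for _ in range(n_iter):
            x += raw_column - readout(x)   # push readout(x) toward raw
        return x
    ```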

  12. Voltage correction power flow

    SciTech Connect

    Rajicic, D.; Ackovski, R.; Taleski, R. . Dept. of Electrical Engineering)

    1994-04-01

    A method for power flow solution of weakly meshed distribution and transmission networks is presented. It is based on oriented ordering of network elements. That allows an efficient construction of the loop impedance matrix and rational organization of the processes such as: power summation (backward sweep), current summation (backward sweep) and node voltage calculation (forward sweep). The first step of the algorithm is calculation of node voltages on the radial part of the network. The second step is calculation of the breakpoint currents. Then, the procedure continues with the first step, which is preceded by voltage correction. It is illustrated that using voltage correction approach, the iterative process of weakly meshed network voltage calculation is faster and more reliable.
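
    The three sweeps named in the abstract compose the classic ladder algorithm for a radial feeder. Below is a minimal sketch for a single radial chain (the breakpoint-current step for weakly meshed networks is omitted), under assumed conventions: complex voltages, branch impedances, and complex power loads.

    ```python
    def backward_forward_sweep(v_source, z_branch, s_load, tol=1e-6,
                               max_iter=50):
        """Ladder power flow for a purely radial chain feeder.
        z_branch[i]: complex impedance of the branch feeding node i+1 from
        node i; s_load[i]: complex power demand (VA) at node i+1;
        v_source: complex slack-bus voltage."""
        n = len(z_branch)
        v = [v_source] * (n + 1)
        for _ in range(max_iter):
            # Backward sweep (power summation): receiving-end power of each
            # branch, adding each branch's series loss into the upstream flow.
            s_flow, s_down = [0j] * n, 0j
            for i in reversed(range(n)):
                s_down += s_load[i]
                s_flow[i] = s_down
                s_down += z_branch[i] * abs(s_flow[i] / v[i + 1]) ** 2
            # Forward sweep: update voltages from the source outward.
            v_new = [v_source]
            for i in range(n):
                current = (s_flow[i] / v[i + 1]).conjugate()
                v_new.append(v_new[i] - z_branch[i] * current)
            if max(abs(a - b) for a, b in zip(v, v_new)) < tol:
                return v_new
            v = v_new
        return v
    ```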

  13. Pluristem Therapeutics, Inc.

    PubMed

    Prather, William

    2008-01-01

    Pluristem Therapeutics, Inc., based in Haifa, Israel, is a regenerative, biotherapeutics Company dedicated to the commercialization of nonpersonalized (allogeneic) cell therapy products. The Company is expanding noncontroversial placental-derived mesenchymal stem cells via a proprietary 3D process, named PluriX, into therapeutics for a variety of degenerative, malignant and autoimmune disorders. Pluristem will be conducting Phase I trials in the USA with its first product, PLX-I, which addresses the global shortfall of matched tissue for bone marrow transplantation by improving the engraftment of hematopoietic stem cells contained in umbilical cord blood.

  14. DELIVERY OF THERAPEUTIC PROTEINS

    PubMed Central

    Pisal, Dipak S.; Kosloski, Matthew P.; Balu-Iyer, Sathy V.

    2009-01-01

    The safety and efficacy of protein therapeutics are limited by three interrelated pharmaceutical issues: in vitro and in vivo instability, immunogenicity, and shorter half-lives. Novel drug modifications for overcoming these issues are under investigation and include covalent attachment of poly(ethylene glycol) (PEG), polysialic acid, or glycolic acid, as well as developing new formulations containing nanoparticulate or colloidal systems (e.g. liposomes, polymeric microspheres, polymeric nanoparticles). Such strategies have the potential to become the next generation of protein therapeutics. This review includes a general discussion on these delivery approaches. PMID:20049941

  15. Lymphedema and Therapeutic Lymphangiogenesis

    PubMed Central

    Nakagami, Hironori; Kaneda, Yasufumi; Morishita, Ryuichi

    2013-01-01

    Lymphedema is a disorder of the lymphatic vascular system characterized by impaired lymphatic return and swelling of the extremities. Lymphedema is divided into primary and secondary forms based on the underlying etiology. Despite substantial advances in both surgical and conservative techniques, therapeutic options for the management of lymphedema are limited. Although rarely lethal, lymphedema is a disfiguring and disabling condition with an associated decrease in the quality of life. The recent impressive expansion of knowledge on the molecular mechanisms governing lymphangiogenesis provides new possibilities for the treatment of lymphedema. This review highlights the lymphatic biology, the pathophysiology of lymphedema, and the therapeutic lymphangiogenesis using hepatocyte growth factor. PMID:24222916

  16. Therapeutics for cognitive aging

    PubMed Central

    Shineman, Diana W.; Salthouse, Timothy A.; Launer, Lenore J.; Hof, Patrick R.; Bartzokis, George; Kleiman, Robin; Luine, Victoria; Buccafusco, Jerry J.; Small, Gary W.; Aisen, Paul S.; Lowe, David A.; Fillit, Howard M.

    2011-01-01

    This review summarizes the scientific talks presented at the conference “Therapeutics for Cognitive Aging,” hosted by the New York Academy of Sciences and the Alzheimer’s Drug Discovery Foundation on May 15, 2009. Attended by scientists from industry and academia, as well as by a number of lay people—approximately 200 in all—the conference specifically tackled the many aspects of developing therapeutic interventions for cognitive impairment. Discussion also focused on how to define cognitive aging and whether it should be considered a treatable, tractable disease. PMID:20392284

  17. Therapeutics for cognitive aging.

    PubMed

    Shineman, Diana W; Salthouse, Timothy A; Launer, Lenore J; Hof, Patrick R; Bartzokis, George; Kleiman, Robin; Luine, Victoria; Buccafusco, Jerry J; Small, Gary W; Aisen, Paul S; Lowe, David A; Fillit, Howard M

    2010-04-01

    This review summarizes the scientific talks presented at the conference "Therapeutics for Cognitive Aging," hosted by the New York Academy of Sciences and the Alzheimer's Drug Discovery Foundation on May 15, 2009. Attended by scientists from industry and academia, as well as by a number of lay people-approximately 200 in all-the conference specifically tackled the many aspects of developing therapeutic interventions for cognitive impairment. Discussion also focused on how to define cognitive aging and whether it should be considered a treatable, tractable disease.

  18. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  19. Correcting Duporcq's theorem

    PubMed Central

    Nawratil, Georg

    2014-01-01

    In 1898, Ernest Duporcq stated a famous theorem about rigid-body motions with spherical trajectories, without giving a rigorous proof. Today, this theorem is again of interest, as it is strongly connected with the topic of self-motions of planar Stewart–Gough platforms. We discuss Duporcq's theorem from this point of view and demonstrate that it is not correct. Moreover, we also present a revised version of this theorem. PMID:25540467

  20. Clinical Utility and Safety of a Model-Based Patient-Tailored Dose of Vancomycin in Neonates.

    PubMed

    Leroux, Stéphanie; Jacqz-Aigrain, Evelyne; Biran, Valérie; Lopez, Emmanuel; Madeleneau, Doriane; Wallon, Camille; Zana-Taïeb, Elodie; Virlouvet, Anne-Laure; Rioualen, Stéphane; Zhao, Wei

    2016-04-01

    Pharmacokinetic modeling has often been applied to evaluate vancomycin pharmacokinetics in neonates. However, clinical application of the model-based personalized vancomycin therapy is still limited. The objective of the present study was to evaluate the clinical utility and safety of a model-based patient-tailored dose of vancomycin in neonates. A model-based vancomycin dosing calculator, developed from a population pharmacokinetic study, has been integrated into the routine clinical care in 3 neonatal intensive care units (Robert Debré, Cochin Port Royal, and Clocheville hospitals) between 2012 and 2014. The target attainment rate, defined as the percentage of patients with a first therapeutic drug monitoring serum vancomycin concentration achieving the target window of 15 to 25 mg/liter, was selected as an endpoint for evaluating the clinical utility. The safety evaluation was focused on nephrotoxicity. The clinical application of the model-based patient-tailored dose of vancomycin has been demonstrated in 190 neonates. The mean (standard deviation) gestational and postnatal ages of the study population were 31.1 (4.9) weeks and 16.7 (21.7) days, respectively. The target attainment rate increased from 41% to 72% without any case of vancomycin-related nephrotoxicity. This proof-of-concept study provides evidence for integrating model-based antimicrobial therapy in neonatal routine care.
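
    The calculator concept rests on a standard steady-state identity: for a drug cleared at rate CL and dosed every tau hours, the average concentration is C_avg = dose / (CL x tau). The sketch below inverts that identity for a target inside the 15 to 25 mg/liter window mentioned in the record; the clearance covariate model is a hypothetical placeholder, not the published neonatal population model.

    ```python
    def vancomycin_maintenance_dose(weight_kg, clcr_ml_min,
                                    target_avg_mg_l=20.0, interval_h=12.0):
        """Dose (mg) per interval so the steady-state average concentration
        hits the target: dose = C_avg * CL * tau. The clearance model below
        is a hypothetical illustration, NOT the published covariate model."""
        cl_l_per_h = 0.05 * weight_kg * (clcr_ml_min / 100.0)  # assumed CL
        return target_avg_mg_l * cl_l_per_h * interval_h  # mg/L * L/h * h = mg
    ```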

  1. Model Based Iterative Reconstruction for Bright Field Electron Tomography (Postprint)

    DTIC Science & Technology

    2013-02-01

    Reconstruction Technique (SIRT) are applied to the data. Model based iterative reconstruction (MBIR) provides a powerful framework for tomographic...the reconstruction when the typical algorithms such as Filtered Back Projection (FBP) and Simultaneous Iterative Reconstruction Technique (SIRT) are
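
    In its simplest form, the MBIR framework this fragment refers to solves a regularized inverse problem rather than applying a one-shot analytic formula like FBP. A toy dense-matrix sketch (the prior, step size, and scale are assumptions; production MBIR uses a physics-based noise model and an edge-preserving prior):

    ```python
    import numpy as np

    def mbir_toy(A, b, lam=0.1, n_iter=200):
        """Minimize ||A x - b||^2 + lam * ||D x||^2 by gradient descent,
        with D a 1-D finite-difference roughness prior.
        A: (m, n) forward projection matrix; b: (m,) measured projections."""
        n = A.shape[1]
        D = np.eye(n) - np.eye(n, k=1)             # difference operator
        L = 2.0 * (np.linalg.norm(A, 2) ** 2 + 4.0 * lam)  # Lipschitz bound
        x = np.zeros(n)
        for _ in range(n_iter):
            grad = 2.0 * A.T @ (A @ x - b) + 2.0 * lam * D.T @ (D @ x)
            x -= grad / L                          # guaranteed-stable step
        return x
    ```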

  2. Model-Based Engineering for Supply Chain Risk Management

    DTIC Science & Technology

    2015-09-30

    Model-Based Engineering for Supply Chain Risk Management. Dan Shoemaker, Ph.D., University of Detroit Mercy; Carol Woody, Ph.D., Carnegie Mellon...University Software Engineering Institute. Abstract—Expanded use of commercial components has increased the complexity of system assurance...verification. Model-based engineering (MBE) offers a means to design, develop, analyze, and maintain a complex system architecture. Architecture Analysis

  3. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  4. Reduced model-based decision-making in schizophrenia.

    PubMed

    Culbreth, Adam J; Westbrook, Andrew; Daw, Nathaniel D; Botvinick, Matthew; Barch, Deanna M

    2016-08-01

    Individuals with schizophrenia have a diminished ability to use reward history to adaptively guide behavior. However, tasks traditionally used to assess such deficits often rely on multiple cognitive and neural processes, leaving etiology unresolved. In the current study, we adopted recent computational formalisms of reinforcement learning to distinguish between model-based and model-free decision-making in hopes of specifying mechanisms associated with reinforcement-learning dysfunction in schizophrenia. Under this framework, decision-making is model-free to the extent that it relies solely on prior reward history, and model-based if it relies on prospective information such as motivational state, future consequences, and the likelihood of obtaining various outcomes. Model-based and model-free decision-making was assessed in 33 schizophrenia patients and 30 controls using a 2-stage 2-alternative forced choice task previously demonstrated to discern individual differences in reliance on the 2 forms of reinforcement-learning. We show that, compared with controls, schizophrenia patients demonstrate decreased reliance on model-based decision-making. Further, parameter estimates of model-based behavior correlate positively with IQ and working memory measures, suggesting that model-based deficits seen in schizophrenia may be partially explained by higher-order cognitive deficits. These findings demonstrate specific reinforcement-learning and decision-making deficits and thereby provide valuable insights for understanding disordered behavior in schizophrenia.
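
    A minimal sketch of the hybrid valuation commonly used to analyse such two-stage tasks (in the style of Daw-type analyses; the transition probabilities and learned values below are illustrative assumptions, not the study's estimates). The weight w that best explains a subject's choices indexes reliance on model-based control:

        # Hybrid of model-based and model-free stage-1 action values.
        P = {  # assumed first-stage action -> second-stage state probabilities
            "a1": {"s1": 0.7, "s2": 0.3},
            "a2": {"s1": 0.3, "s2": 0.7},
        }
        Q2 = {"s1": 0.8, "s2": 0.2}     # learned second-stage state values
        Q_mf = {"a1": 0.4, "a2": 0.5}   # model-free (TD) values of stage-1 actions

        def hybrid_value(action, w):
            # model-based value: expected second-stage value under the model
            q_mb = sum(p * Q2[s] for s, p in P[action].items())
            return w * q_mb + (1.0 - w) * Q_mf[action]

        for w in (0.0, 0.5, 1.0):       # w=0: purely model-free; w=1: model-based
            print(w, {a: round(hybrid_value(a, w), 2) for a in P})
        # a low fitted w (model-free preference for a2) is the kind of pattern
        # the study reports in patients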

  5. Measuring Therapeutic Effectiveness.

    ERIC Educational Resources Information Center

    Callister, Sheldon L.

    In the recent past, there has been a great deal of effort directed toward developing techniques for documenting therapeutic outcome. Funding sources and the general public seem to be demanding more meaningful data which indicate, in a clear manner, whether or not the services they are paying for are of value. Mental health centers, like other…

  6. Developing Therapeutic Listening

    ERIC Educational Resources Information Center

    Lee, Billy; Prior, Seamus

    2013-01-01

    We present an experience-near account of the development of therapeutic listening in first year counselling students. A phenomenological approach was employed to articulate the trainees' lived experiences of their learning. Six students who had just completed a one-year postgraduate certificate in counselling skills were interviewed and the…

  7. Antibody Therapeutics in Oncology

    PubMed Central

    Wold, Erik D; Smider, Vaughn V; Felding, Brunhilde H

    2016-01-01

    One of the newer classes of targeted cancer therapeutics is monoclonal antibodies. Monoclonal antibody therapeutics are a successful and rapidly expanding drug class due to their high specificity, activity, favourable pharmacokinetics, and standardized manufacturing processes. Antibodies are capable of recruiting the immune system to attack cancer cells through complement-dependent cytotoxicity or antibody-dependent cellular cytotoxicity. In an ideal scenario, the initial tumor cell destruction induced by administration of a therapeutic antibody can result in uptake of tumor-associated antigens by antigen-presenting cells, establishing a prolonged memory effect. Mechanisms of direct tumor cell killing by antibodies include antibody recognition of cell-surface-bound enzymes to neutralize enzyme activity and signaling, or induction of receptor agonist or antagonist activity. Both approaches result in cellular apoptosis. In another and very direct approach, antibodies are used to deliver drugs to target cells and cause cell death. Such antibody-drug conjugates (ADCs) direct cytotoxic compounds to tumor cells after selective binding to cell surface antigens, internalization, and intracellular drug release. The efficacy and safety of ADCs for cancer therapy have recently been greatly advanced by innovative approaches for site-specific drug conjugation to the antibody structure. This technology has enabled rational optimization of the function and pharmacokinetics of the resulting conjugates, and is now beginning to yield therapeutics with defined, uniform molecular characteristics and unprecedented promise to advance cancer treatment. PMID:27081677

  8. Therapeutic Homework Assignments.

    ERIC Educational Resources Information Center

    Corbishley, M. Anne; Yost, Elizabeth B.

    1985-01-01

    Outlines guidelines to follow in assigning therapeutic homework to students, focusing on student preparation, including behavior change, choosing and devising assignments, and checking on homework. With modification, counseling homework can be used with all students who are beyond second or third grade. (BL)

  9. Hair regrowth. Therapeutic agents.

    PubMed

    Shapiro, J; Price, V H

    1998-04-01

    Today there are new classes of hair growth promoters with proven efficacy. This article reviews the current state-of-the-art agents for treatment of two of the most common forms of hair loss encountered in clinical practice, androgenetic alopecia and alopecia areata. Current therapeutic strategies are based on recent advances in the understanding of disordered hair growth. Practical treatment protocols are presented.

  10. Therapeutic Recombinant Monoclonal Antibodies

    ERIC Educational Resources Information Center

    Bakhtiar, Ray

    2012-01-01

    During the last two decades, the rapid growth of biotechnology-derived techniques has led to a myriad of therapeutic recombinant monoclonal antibodies with significant clinical benefits. Recombinant monoclonal antibodies can be obtained from a number of natural sources such as animal cell cultures using recombinant DNA engineering. In contrast to…

  12. Therapeutic HPV DNA vaccines

    PubMed Central

    Lin, Ken; Roosinovich, Elena; Ma, Barbara; Hung, Chien-Fu

    2010-01-01

    It is now well established that most cervical cancers are causally associated with HPV infection. This realization has led to efforts to control HPV-associated malignancy through prevention or treatment of HPV infection. Currently, commercially available HPV vaccines are not designed to control established HPV infection and associated premalignant and malignant lesions. To treat and eradicate pre-existing HPV infections and associated lesions which remain prevalent in the U.S. and worldwide, effective therapeutic HPV vaccines are needed. DNA vaccination has emerged as a particularly promising form of therapeutic HPV vaccines due to its safety, stability and ability to induce antigen-specific immunity. This review focuses on improving the potency of therapeutic HPV vaccines through modification of dendritic cells (DCs) by [1] increasing the number of antigen-expressing/antigen-loaded DCs, [2] improving HPV antigen expression, processing and presentation in DCs, and [3] enhancing DC and T cell interaction. Continued improvement in therapeutic HPV DNA vaccines may ultimately lead to an effective DNA vaccine for the treatment of HPV-associated malignancies. PMID:20066511

  13. Rethinking therapeutic action.

    PubMed

    Gabbard, Glen O; Westen, Drew

    2003-08-01

    Like other core psychoanalytic constructs, the theory of therapeutic action is currently in flux, as theorists of differing persuasions propose different mechanisms. In this article, the authors attempt to integrate developments within and without psychoanalysis to provide a working model of the multifaceted processes involved in producing change in psychoanalysis and psychoanalytic psychotherapy. A theory of therapeutic action must describe both what changes (the aims of treatment) and what strategies are likely to be useful in facilitating those changes (technique). The authors believe that single-mechanism theories of therapeutic action, no matter how complex, are unlikely to prove useful at this point because of the variety of targets of change and the variety of methods useful in effecting change in those targets (such as techniques aimed at altering different kinds of conscious and unconscious processes). Interventions that facilitate change may be classified into one of three categories: those that foster insight, those that make use of various mutative aspects of the treatment relationship and a variety of secondary strategies that can be of tremendous importance. They propose that, in all forms of psychoanalytic treatment, we would be more accurate to speak of the therapeutic actions, rather than action.

  16. Carbohydrates in therapeutics.

    PubMed

    Kilcoyne, Michelle; Joshi, Lokesh

    2007-07-01

    Awareness of the importance of carbohydrates in living systems and medicine is growing due to the increasing understanding of their biological and pharmacological relevance. Carbohydrates are ubiquitous and perform a wide array of biological roles. Carbohydrate-based or -modified therapeutics are used extensively in cardiovascular and hematological treatments ranging from inflammatory diseases and anti-thrombotic treatments to wound healing. Heparin is a well-known and widely used example of a carbohydrate-based drug but will not be discussed, as it has been extensively reviewed. We will detail carbohydrate-based and -modified therapeutics, both those that are currently marketed or in various stages of clinical trials and those that are potential therapeutics based on promising preclinical investigations. Carbohydrate-based therapeutics include polysaccharide and oligosaccharide anti-inflammatory, anti-coagulant and anti-thrombotic agents from natural and synthetic sources, some as alternatives to heparin and others designed on the basis of known structure-function relationships. Some of these compounds have multiple biological effects, showing anti-adhesive, anti-HIV and anti-arthritic activities. Small molecules, derivatives or mimetics of complement inhibitors, are detailed for use in limiting ischemia/reperfusion injuries. Monosaccharides, both natural and synthetic, have been investigated for their in vivo anti-inflammatory and cardioprotective properties. Modification by glycosylation of natural products, or glycosylation-mimicking modification, has a significant effect on the parent molecule, including increased plasma half-life and refined or enhanced desired functions. It is hoped that this review will highlight the vast therapeutic potential of these natural bioactive molecules.

  17. Experimental repetitive quantum error correction.

    PubMed

    Schindler, Philipp; Barreiro, Julio T; Monz, Thomas; Nebendahl, Volckmar; Nigg, Daniel; Chwalla, Michael; Hennrich, Markus; Blatt, Rainer

    2011-05-27

    The computational potential of a quantum processor can only be unleashed if errors during a quantum computation can be controlled and corrected for. Quantum error correction works if imperfections of quantum gate operations and measurements are below a certain threshold and corrections can be applied repeatedly. We implement multiple quantum error correction cycles for phase-flip errors on qubits encoded with trapped ions. Errors are corrected by a quantum-feedback algorithm using high-fidelity gate operations and a reset technique for the auxiliary qubits. Up to three consecutive correction cycles are realized, and the behavior of the algorithm for different noise environments is analyzed.
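
    In the conjugate (Hadamard) basis, a phase-flip code behaves like a classical three-bit repetition code, so the repeated-cycle logic can be sketched classically. The error probability and cycle count below are arbitrary assumptions, and the sketch deliberately ignores the gate and measurement imperfections the experiment must contend with:

        # Repeated correction cycles on a 3-qubit repetition (phase-flip) code.
        import random

        def cycle(bits, p):
            # independent flip on each qubit with probability p
            bits = [b ^ (random.random() < p) for b in bits]
            # syndrome: pairwise parities, as read out via ancilla qubits
            s = (bits[0] ^ bits[1], bits[1] ^ bits[2])
            flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)  # lookup decoder
            if flip is not None:
                bits[flip] ^= 1          # apply the correction
            return bits

        random.seed(1)
        failures = 0
        for _ in range(10_000):
            bits = [0, 0, 0]             # encoded logical state
            for _ in range(3):           # three consecutive correction cycles
                bits = cycle(bits, p=0.05)
            failures += bits != [0, 0, 0]
        print(failures / 10_000)         # residual logical error rate, ~0.02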

  18. Biasing errors and corrections

    NASA Technical Reports Server (NTRS)

    Meyers, James F.

    1991-01-01

    The dependence of laser velocimeter measurement rate on flow velocity is discussed. Investigations showing that any dependence is purely statistical, and nonstationary both spatially and temporally, are described. The main conclusions are that the times between successive particle arrivals should be routinely measured, and that the velocity/data-rate correlation coefficient should be calculated to determine whether a dependency exists. If none is found, the data ensemble can be accepted as an independent sample of the flow. If a dependency is found, the data should be modified to obtain an independent sample. Universal correcting procedures should never be applied, because their underlying assumptions are not valid.
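
    A sketch of the recommended check on synthetic numbers (the velocity-dependent arrival model below is an assumption chosen to make the dependency visible, not measured data):

        # Test for velocity/data-rate dependence before accepting an ensemble.
        import numpy as np

        rng = np.random.default_rng(0)
        velocity = rng.normal(50.0, 5.0, 5000)      # m/s, synthetic samples
        # inter-arrival times, here deliberately velocity-dependent:
        # faster flow sweeps more particles through the probe volume
        dt = rng.exponential(1.0 / velocity)
        rate = 1.0 / dt                             # instantaneous data rate

        r = np.corrcoef(velocity, rate)[0, 1]
        print(f"velocity/data-rate correlation: {r:.2f}")
        # |r| near 0: accept the ensemble as independent samples of the flow;
        # otherwise modify the data (e.g., resample) before computing statistics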

  19. [Correctional health care].

    PubMed

    Fix, Michel

    2013-01-01

    Court decisions that take away a person's freedom by requiring them to serve a jail sentence should not deny them access to the same health care available to free citizens, delivered in full compliance with patient confidentiality. Health institutions, responsible for administering somatic care, offer a comprehensive response to the medical needs of those under justice control, both in jails and in conventional care units. For a physician, working in the correctional setting means accepting its constraints and its violence while protecting and enforcing fundamental rights, including the rights to dignity, confidential care, and the freedom to accept or refuse treatment.

  20. [Correction of paralytic lagophthalmos].

    PubMed

    Iskusnykh, N S; Grusha, Y O

    2015-01-01

    Current options for correction of paralytic lagophthalmos are either temporary (external eyelid weight placement, hyaluronic acid gel or botulinum toxin A injection) or permanent (various procedures for narrowing of the palpebral fissure, upper eyelid weights or spring implantation). Neuroplastic surgery (cross-facial nerve grafting, nerve anastomoses) and muscle transposition surgery is not effective enough. The majority of elderly and medically compromised patients should not be considered for such complicated and long procedures. Upper eyelid weight implantation thus appears the most reliable and simple treatment.

  1. [Acute dysphagia of oncological origin. Therapeutic management].

    PubMed

    Arias, F; Manterola, A; Domínguez, M A; Martínez, E; Villafranca, E; Romero, P; Vera, R

    2004-01-01

    Dysphagia is one of the most frequent syndromes in patients with tumours of the head and neck, and the oesophagus. It can be the initial symptom or, more frequently, a consequence of the oncological treatment. We review the most important therapeutic and pathophysiological aspects of acute dysphagia of oncological origin. Deglutition is a complex process in which numerous musculoskeletal structures intervene under the neurological control of different cranial nerves. The complex neuromuscular coordination needed for correct deglutition can be affected by numerous situations, both by the effect of the tumours and by their treatment, basically surgery or radiotherapy. In conclusion, suitable management of oncological dysphagia requires a correct initial evaluation and active treatment, since control of the dysphagia determines not only the patient's quality of life but, on numerous occasions, also the possibility of continuing treatment and thus of maintaining the chances of a cure.

  2. [New therapeutic developments in cystic fibrosis].

    PubMed

    Bui, S; Macey, J; Fayon, M; Bihouée, T; Burgel, P-R; Colomb, V; Corvol, H; Durieu, I; Hubert, D; Marguet, C; Mas, E; Munck, A; Murris-Espin, M; Reix, P; Sermet-Gaudelus, I

    2016-12-01

    Since the demonstration of defective chloride secretion in cystic fibrosis in 1983, and the identification of the CFTR (cystic fibrosis transmembrane conductance regulator) gene in 1989, knowledge about CFTR synthesis, maturation, intracellular transfer and function has dramatically expanded. These discoveries have led to the classification of CF mutations into 6 classes with different pathophysiological mechanisms. In this article we explore the state of the art of CFTR synthesis and its chloride secretion function. We then explore the consequences of the 6 classes of mutations on CFTR protein function and describe the new therapeutic developments aiming to correct these defects. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  3. Attenuation correction in molecular fluorescence imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Yang, Bin; Tunnell, James W.

    2016-03-01

    Fluorescence-guided surgery has demonstrated more complete tumor resections in both preclinical models and clinical applications. However, intraoperative fluorescence-based imaging can be challenging due to attenuation of the fluorescence by intrinsic tissue scattering and absorption. Removing attenuation in fluorescence imaging is critical in many applications. We have developed both a model-based approach and an experimental approach to retrieve attenuation-corrected fluorescence based on spatial frequency domain imaging (SFDI). In the model-based approach, we extended an attenuation correction model initially developed for point measurements into wide-field imaging with SFDI. To achieve attenuation correction, tissue optical properties were evaluated at both excitation and emission wavelengths and then applied in the model. In an in-vitro phantom study, we achieved a relatively flat intensity profile over the entire absorption range, compared to an over 80% drop at the highest absorption level before correction. Similar performance was also observed in an ex-vivo tissue study. However, lengthy image acquisition and image processing make this method better suited to static imaging than to video-rate imaging. To achieve video-rate correction, we developed an experimental approach that reduces the effect of absorption by limiting the imaging depth using a high spatial frequency pattern. The absorption-reduced fluorescence image was obtained by performing a simple demodulation. The in-vitro phantom study showed an approximately 20% intensity drop at the highest absorption level, compared to an over 70% drop before correction. This approach enabled video-rate attenuation-corrected imaging at 19 fps, making the technique viable for clinical image-guided surgery.
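
    For the experimental (video-rate) approach, the demodulation step is the standard three-phase SFDI amplitude calculation. The sketch below applies it to synthetic one-dimensional "images"; the pattern frequency, offset, and modulation depth are illustrative assumptions:

        # Three-phase demodulation: keeps the spatially modulated component.
        import numpy as np

        def demodulate_ac(i1, i2, i3):
            # standard SFDI amplitude demodulation of three phase-shifted images
            return (np.sqrt(2.0) / 3.0) * np.sqrt(
                (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)

        x = np.arange(256)                   # pixel coordinate
        k = 2 * np.pi * 0.05                 # assumed pattern frequency, rad/pixel
        ac_true = 0.3                        # modulated (depth-limited) component
        imgs = [1.0 + ac_true * np.cos(k * x + p)
                for p in (0.0, 2 * np.pi / 3, 4 * np.pi / 3)]

        ac = demodulate_ac(*imgs)
        print(float(ac.mean()))              # -> 0.3, the demodulated component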

  4. Using Online Annotations to Support Error Correction and Corrective Feedback

    ERIC Educational Resources Information Center

    Yeh, Shiou-Wen; Lo, Jia-Jiunn

    2009-01-01

    Giving feedback on second language (L2) writing is a challenging task. This research proposed an interactive environment for error correction and corrective feedback. First, we developed an online corrective feedback and error analysis system called "Online Annotator for EFL Writing". The system consisted of five facilities: Document Maker,…

  5. Mental Health in Corrections: An Overview for Correctional Staff.

    ERIC Educational Resources Information Center

    Sowers, Wesley; Thompson, Kenneth; Mullins, Stephen

    This volume is designed to provide corrections practitioners with basic staff training on the needs of those with mental illness and impairments in our correctional systems. Chapter titles are: (1) "Mental Illness in the Correctional Setting"; (2) "Substance Use Disorders"; (3) "Problems with Mood"; (4) "Problems…

  8. Smooth eigenvalue correction

    NASA Astrophysics Data System (ADS)

    Hendrikse, Anne; Veldhuis, Raymond; Spreeuwers, Luuk

    2013-12-01

    Second-order statistics play an important role in data modeling. Nowadays, there is a tendency toward measuring more signals with higher resolution (e.g., high-resolution video), causing a rapid increase in the dimensionality of the measured samples, while the number of samples remains more or less the same. As a result the eigenvalue estimates are significantly biased, as described by the Marčenko–Pastur equation for the limit of both the number of samples and their dimensionality going to infinity. By introducing a smoothness factor, we show that the Marčenko–Pastur equation can be used in practical situations where both the number of samples and their dimensionality remain finite. Based on this result we derive two methods, one already known and one, to our knowledge, new, to estimate the sample eigenvalues when the population eigenvalues are known. However, usually the sample eigenvalues are known and the population eigenvalues are required. We therefore applied one of these methods in a feedback loop, resulting in an eigenvalue bias correction method. We compare this eigenvalue correction method with the state-of-the-art methods and show that our method outperforms other methods particularly in real-life situations often encountered in biometrics: underdetermined configurations, high-dimensional configurations, and configurations where the eigenvalues are exponentially distributed.
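
    The bias being corrected is easy to reproduce. With an identity population covariance all population eigenvalues are 1, yet the sample eigenvalues spread over roughly the Marčenko–Pastur support [(1-sqrt(p/n))^2, (1+sqrt(p/n))^2]; the sizes below are illustrative assumptions:

        # Eigenvalue bias when dimensionality p is comparable to sample count n.
        import numpy as np

        rng = np.random.default_rng(42)
        n, p = 200, 100                        # samples, dimensionality (assumed)
        X = rng.standard_normal((n, p))        # population covariance = identity
        S = (X.T @ X) / n                      # sample covariance matrix
        eig = np.linalg.eigvalsh(S)

        gamma = p / n
        print(eig.min(), eig.max())            # spread far away from 1
        print((1 - gamma**0.5) ** 2, (1 + gamma**0.5) ** 2)  # MP edges: 0.09, 2.91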

  9. Worldwide radiosonde temperature corrections

    SciTech Connect

    Luers, J.; Eskridge, R.

    1997-11-01

    Detailed heat transfer analyses have been performed on ten of the world's most commonly used radiosondes from 1960 to the present. These radiosondes are the USA VIZ and Space Data, the Vaisala RS-80, RS-185/21, and RS12/15, the Japanese RS2-80, the Russian MARS, RKZ, and A22, and the Chinese GZZ. The temperature error of each radiosonde has been calculated as a function of altitude and of the sonde and environmental parameters that influence its magnitude. Computer models have been developed that allow the correction of temperature data from each sonde as a function of these parameters. Recommendations are made concerning the use of data from each of the radiosondes for climate studies. For some radiosondes, nighttime data require no corrections; for others, correcting daytime data is not feasible because parameters of significance, such as balloon rise rate, are not retrievable. The results from this study provide essential information for anyone attempting to perform climate studies using radiosonde data.

  10. Turbulence compressibility corrections

    NASA Technical Reports Server (NTRS)

    Coakley, T. J.; Horstman, C. C.; Marvin, J. G.; Viegas, J. R.; Bardina, J. E.; Huang, P. G.; Kussoy, M. I.

    1994-01-01

    The basic objective of this research was to identify, develop, and recommend turbulence models that could be incorporated into CFD codes used in the design of the National AeroSpace Plane vehicles. To accomplish this goal, a combined effort consisting of experimental and theoretical phases was undertaken. The experimental phase consisted of a literature survey to collect and assess a database of well-documented experimental flows, with emphasis on high-speed or hypersonic flows, which could be used to validate turbulence models. Since it was anticipated that this database would be incomplete and would need supplementing, additional experiments in the NASA Ames 3.5-Foot Hypersonic Wind Tunnel (HWT) were also undertaken. The theoretical phase consisted of identifying promising turbulence models through applications to simple flows, and then investigating the more promising models in applications to complex flows. The complex flows were selected from the database developed in the first phase of the study. For these flows it was anticipated that model performance would not be entirely satisfactory, so that model improvements or corrections would be required. The primary goals of the investigation were essentially achieved. A large database of flows was collected and assessed, a number of additional hypersonic experiments were conducted in the Ames HWT, and two turbulence models (kappa-epsilon and kappa-omega models with corrections) were identified that gave superior performance for most of the flows studied; these are now recommended for NASP applications.

  11. Complications of auricular correction

    PubMed Central

    Staindl, Otto; Siedek, Vanessa

    2008-01-01

    The risk of complications of auricular correction is underestimated. There is around a 5% risk of early complications (haematoma, infection, fistulae caused by stitches and granulomata, allergic reactions, pressure ulcers, feelings of pain, and asymmetry in side-to-side comparison) and a 20% risk of late complications (recurrences, telephone ear, excessive edge formation, auricle fitting too closely, narrowing of the auditory canal, keloids, and complete collapse of the ear). Deformities are evaluated less critically by patients than by surgeons, provided they do not concern how the ear is positioned. The causes of complications and deformities are, in the vast majority of cases, incorrect diagnosis and the wrong choice of operating procedure. The choice of operating procedure must be adapted to suit the individual ear morphology. In addition to operating technique, bandaging technique, inspections and, if necessary, early revision are of great importance for the occurrence and course of early complications. In cases of late complications such as keloids and auricles fitting too closely, unfixed full-thickness skin flaps have proved the most successful remedy. Large deformities can often be corrected only to a limited degree of satisfaction. PMID:22073079

  12. Multistage vector (MSV) therapeutics.

    PubMed

    Wolfram, Joy; Shen, Haifa; Ferrari, Mauro

    2015-12-10

    One of the greatest challenges in the field of medicine is obtaining controlled distribution of systemically administered therapeutic agents within the body. Indeed, biological barriers such as physical compartmentalization, pressure gradients, and excretion pathways adversely affect localized delivery of drugs to pathological tissue. The diverse nature of these barriers requires the use of multifunctional drug delivery vehicles that can overcome a wide range of sequential obstacles. In this review, we explore the role of multifunctionality in nanomedicine by primarily focusing on multistage vectors (MSVs). The MSV is an example of a promising therapeutic platform that incorporates several components, including a microparticle, nanoparticles, and small molecules. In particular, these components are activated in a sequential manner in order to successively address transport barriers. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Strategies for therapeutic hypometabothermia

    PubMed Central

    Liu, Shimin; Chen, Jiang-Fan

    2013-01-01

    Although therapeutic hypothermia and metabolic suppression have shown robust neuroprotection in experimental brain ischemia, systemic complications have limited their use in treating acute stroke patients. The core temperature and basal metabolic rate are tightly regulated and maintained at a very stable level in mammals. Simply lowering body temperature or metabolic rate is a brutal therapy that may cause more systemic and regional problems than it prevents; such problems are commonly seen in hypothermia and barbiturate coma. The main innovative concept of this review is to propose a thermogenically optimal and synergistic reduction of core temperature and metabolic rate in therapeutic hypometabothermia, using novel and clinically practical approaches. When metabolism and body temperature are reduced in a systemically synergistic manner, the outcome will be maximal protection and safe recovery, as occurs in natural processes such as hibernation, daily torpor and estivation. PMID:24179563

  14. Therapeutic use of cannabis.

    PubMed

    de Vries, Kay; Green, Anita J

    Therapeutic cannabis use raises a number of dilemmas for nurses. This article examines the legal, political and ethical challenges raised by the use of cannabis by people with life-limiting or terminal illnesses in their own homes. (Throughout this paper, the term cannabis refers to illegal cannabis unless specified.) A literature review of databases from 1996 was conducted and internet material was also examined. Evidence on the therapeutic use of cannabis suggests it may produce improvements in quality of life, which has led to increased use among people with life-limiting illnesses. The cannabis used is usually obtained illegally, which can have consequences for both those who use it and nurses who provide treatment in the community.

  15. Therapeutic Hypothermia for Neuroprotection

    PubMed Central

    Karnatovskaia, Lioudmila V.; Wartenberg, Katja E.

    2014-01-01

    The earliest recorded application of therapeutic hypothermia in medicine spans about 5000 years; however, its use has become widespread since 2002, following the demonstration of both safety and efficacy of regimens requiring only a mild (32°C-35°C) degree of cooling after cardiac arrest. We review the mechanisms by which hypothermia confers neuroprotection as well as its physiological effects by body system and its associated risks. With regard to clinical applications, we present evidence on the role of hypothermia in traumatic brain injury, intracranial pressure elevation, stroke, subarachnoid hemorrhage, spinal cord injury, hepatic encephalopathy, and neonatal peripartum encephalopathy. Based on the current knowledge and areas undergoing or in need of further exploration, we feel that therapeutic hypothermia holds promise in the treatment of patients with various forms of neurologic injury; however, additional quality studies are needed before its true role is fully known. PMID:24982721

  16. Toxicity of therapeutic nanoparticles.

    PubMed

    Maurer-Jones, Melissa A; Bantz, Kyle C; Love, Sara A; Marquis, Bryce J; Haynes, Christy L

    2009-02-01

    A total of six nanotherapeutic formulations are already approved for medical use and more are in the approval pipeline currently. Despite the massive research effort in nanotherapeutic materials, there is relatively little information about the toxicity of these materials or the tools needed to assess this toxicity. Recently, the scientific community has begun to respond to the paucity of information by investing in the field of nanoparticle toxicology. This review is intended to provide an overview of the techniques needed to assess toxicity of these therapeutic nanoparticles and to summarize the current state of the field. We begin with background on the toxicological assessment techniques used currently as well as considerations in nanoparticle dosing. The toxicological research overview is divided into the most common applications of therapeutic nanoparticles: drug delivery, photodynamic therapy and bioimaging. We end with a perspective section discussing the current technological gaps and promising research aimed at addressing those gaps.

  17. Complement-targeted therapeutics

    PubMed Central

    Ricklin, Daniel; Lambris, John D

    2010-01-01

    The complement system is a central component of innate immunity and bridges the innate to the adaptive immune response. However, it can also turn its destructive capabilities against host cells and is involved in numerous diseases and pathological conditions. Modulation of the complement system has been recognized as a promising strategy in drug discovery, and a large number of therapeutic modalities have been developed. However, successful marketing of complement-targeted drugs has proved to be more difficult than initially expected, and many strategies have been discontinued. The US Food and Drug Administration's approval of the first complement-specific drug, an antibody against complement component C5 (eculizumab; Soliris), in March 2007 was a long-awaited breakthrough in the field. Approval of eculizumab validates the complement system as a therapeutic target and might facilitate clinical development of other promising drug candidates. PMID:17989689

  19. Therapeutic cancer vaccines

    PubMed Central

    Melief, Cornelis J.M.; van Hall, Thorbald; Arens, Ramon; Ossendorp, Ferry; van der Burg, Sjoerd H.

    2015-01-01

    The clinical benefit of therapeutic cancer vaccines has been established. Whereas regression of lesions was shown for premalignant lesions caused by HPV, clinical benefit in cancer patients was mostly noted as prolonged survival. Suboptimal vaccine design and an immunosuppressive cancer microenvironment are the root causes of the lack of cancer eradication. Effective cancer vaccines deliver concentrated antigen to both HLA class I and II molecules of DCs, promoting both CD4 and CD8 T cell responses. Optimal vaccine platforms include DNA and RNA vaccines and synthetic long peptides. Antigens of choice include mutant sequences, selected cancer testis antigens, and viral antigens. Drugs or physical treatments can mitigate the immunosuppressive cancer microenvironment and include chemotherapeutics, radiation, indoleamine 2,3-dioxygenase (IDO) inhibitors, inhibitors of T cell checkpoints, agonists of selected TNF receptor family members, and inhibitors of undesirable cytokines. The specificity of therapeutic vaccination combined with such immunomodulation offers an attractive avenue for the development of future cancer therapies. PMID:26214521

  20. Contact Lenses for Vision Correction

    MedlinePlus

  1. Argumentation in Science Education: A Model-based Framework

    NASA Astrophysics Data System (ADS)

    Böttcher, Florian; Meisert, Anke

    2011-02-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model if necessary in relation to alternative models. Secondly, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach in comparison with other analytical models will be presented to demonstrate the explicatory power and depth of the model-based perspective. Primarily, the framework of Toulmin to structurally analyse arguments is contrasted with the approach presented here. It will be demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second more complex argumentative sequence will also be analysed according to the invented analytical scheme to give a broader impression of its potential in practical use.

  2. Applying model-based diagnostics to space power distribution

    NASA Astrophysics Data System (ADS)

    Quinn, Todd M.; Schlegelmilch, Richard F.

    1994-03-01

    When engineers diagnose system failures, they often use models to confirm system operation. This concept has produced a class of advanced expert systems that perform model-based diagnosis. A model-based diagnostic expert system for a Space Station Freedom electrical power distribution testbed is currently being developed at the NASA Lewis Research Center. The objective of this expert system is to autonomously detect and isolate electrical fault conditions. Marple, a software package developed at TRW, provides a model-based environment utilizing constraint suspension. Originally, constraint suspension techniques were developed for digital systems. However, Marple provides the mechanisms for applying this approach to analog systems, such as the testbed, as well. The expert system was developed using Marple and Lucid Common Lisp running on a Sun Sparc-2 workstation. The Marple modeling environment has proved to be a useful tool for investigating the various aspects of model-based diagnostics. This paper describes work completed to date and lessons learned while employing model-based diagnostics using constraint suspension within an analog system.
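
    Constraint suspension itself is compact enough to sketch. The toy below uses the textbook "polybox" circuit (three multipliers feeding two adders) rather than the power-distribution testbed, and forward propagation only; a component is a diagnosis candidate if suspending its constraint removes every conflict with the observations:

        # Diagnosis by constraint suspension on the classic polybox circuit.
        inputs = {"a": 3, "b": 2, "c": 2, "d": 3, "e": 3}
        observed = {"f": 10, "g": 10}       # with all components OK, f = g = 12

        def predict(suspended):
            # forward-propagate values; a suspended component predicts nothing
            x = inputs["a"] * inputs["c"] if suspended != "M1" else None
            y = inputs["b"] * inputs["d"] if suspended != "M2" else None
            z = inputs["c"] * inputs["e"] if suspended != "M3" else None
            f = x + y if suspended != "A1" and None not in (x, y) else None
            g = y + z if suspended != "A2" and None not in (y, z) else None
            return {"f": f, "g": g}

        def consistent(pred):
            # a missing prediction (None) cannot conflict with an observation
            return all(pred[k] is None or pred[k] == observed[k] for k in observed)

        candidates = [c for c in ("M1", "M2", "M3", "A1", "A2", "none")
                      if consistent(predict(c))]
        print(candidates)   # -> ['M2']: only suspending M2 explains f and g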

  3. Embracing model-based designs for dose-finding trials

    PubMed Central

    Love, Sharon B; Brown, Sarah; Weir, Christopher J; Harbron, Chris; Yap, Christina; Gaschler-Markefski, Birgit; Matcham, James; Caffrey, Louise; McKevitt, Christopher; Clive, Sally; Craddock, Charlie; Spicer, James; Cornelius, Victoria

    2017-01-01

    Background: Dose-finding trials are essential to drug development as they establish recommended doses for later-phase testing. We aim to motivate wider use of model-based designs for dose finding, such as the continual reassessment method (CRM). Methods: We carried out a literature review of dose-finding designs and conducted a survey to identify perceived barriers to their implementation. Results: We describe the benefits of model-based designs (flexibility, superior operating characteristics, extended scope), their current uptake, and existing resources. The most prominent barriers to implementation of a model-based design were lack of suitable training, chief investigators’ preference for algorithm-based designs (e.g., 3+3), and limited resources for study design before funding. We use a real-world example to illustrate how these barriers can be overcome. Conclusions: There is overwhelming evidence for the benefits of CRM. Many leading pharmaceutical companies routinely implement model-based designs. Our analysis identified barriers for academic statisticians and clinical academics in mirroring the progress industry has made in trial design. Unified support from funders, regulators, and journal editors could result in more accurate doses for later-phase testing, and increase the efficiency and success of clinical drug development. We give recommendations for increasing the uptake of model-based designs for dose-finding trials in academia. PMID:28664918
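
    For a concrete sense of what "model-based" means here, the sketch below implements a bare-bones CRM update: a one-parameter power model, a grid-evaluated normal prior, and recommendation of the dose whose posterior-mean toxicity is closest to target. The skeleton, target, and patient outcomes are all illustrative assumptions:

        # Bare-bones continual reassessment method (CRM) update.
        import numpy as np

        skeleton = np.array([0.05, 0.10, 0.20, 0.35, 0.50])  # prior DLT guesses
        target = 0.25                                        # target toxicity rate
        a = np.linspace(-3, 3, 601)                          # grid over model slope
        prior = np.exp(-a**2 / 2)                            # N(0,1), unnormalised

        # (dose index, toxicity observed?) for each patient treated so far
        data = [(2, 0), (2, 0), (3, 1), (2, 0)]

        like = np.ones_like(a)
        for d, tox in data:                 # power model: p_d = skeleton_d^exp(a)
            p = skeleton[d] ** np.exp(a)
            like *= p if tox else 1.0 - p
        post = prior * like
        post /= post.sum()

        p_hat = np.array([(skeleton[d] ** np.exp(a) * post).sum()
                          for d in range(len(skeleton))])
        print(p_hat.round(3))
        print("recommended next dose index:", int(np.argmin(np.abs(p_hat - target))))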

  4. Antioxidant therapeutics for schizophrenia.

    PubMed

    Reddy, Ravinder; Reddy, Rajiv

    2011-10-01

    Pharmaceutical treatment for the millions worldwide who have schizophrenia is limited to a handful of antipsychotics. Despite the proven efficacy of these drugs, the overall outcome for schizophrenia remains suboptimal. Thus, alternative treatment options are urgently needed. One possible approach may be antioxidant therapy. The extant evidence for the role of oxidative stress in the pathophysiology of schizophrenia offers a hypothesis-derived therapeutic approach in the form of antioxidants. Vitamins C and E, for example, are suitable for human clinical trials because they are readily available, inexpensive, and relatively safe. Research into the therapeutic use of antioxidants in schizophrenia can be grouped into two main clusters: for psychopathology and for side effects. Of these studies, some have been carefully conducted, but the majority are open-label. The use of antioxidants for treatment-related side effects has been more extensively investigated. The totality of the evidence to date suggests that specific antioxidants, such as N-acetyl cysteine, may offer tangible benefits for the clinical syndrome of schizophrenia, and that vitamin E may offer salutary effects on the glycemic effects of antipsychotics. However, a great deal of fundamental clinical research remains to be done before antioxidants can be routinely used therapeutically for schizophrenia and treatment-related complications.

  5. Polycyclic peptide therapeutics.

    PubMed

    Baeriswyl, Vanessa; Heinis, Christian

    2013-03-01

    Owing to their excellent binding properties, high stability, and low off-target toxicity, polycyclic peptides are an attractive molecule format for the development of therapeutics. Currently, only a handful of polycyclic peptides are used in the clinic; examples include the antibiotic vancomycin, the anticancer drugs actinomycin D and romidepsin, and the analgesic agent ziconotide. All clinically used polycyclic peptide drugs are derived from natural sources, such as soil bacteria in the case of vancomycin, actinomycin D and romidepsin, or the venom of a fish-hunting cone snail in the case of ziconotide. Unfortunately, nature provides peptide macrocyclic ligands for only a small fraction of therapeutic targets. For the generation of ligands of targets of choice, researchers have inserted artificial binding sites into natural polycyclic peptide scaffolds, such as cystine knot proteins, using rational design or directed evolution approaches. More recently, large combinatorial libraries of genetically encoded bicyclic peptides have been generated de novo and screened by phage display. In this Minireview, the properties of existing polycyclic peptide drugs are discussed and related to their interesting molecular architectures. Furthermore, technologies that allow the development of unnatural polycyclic peptide ligands are discussed. Recent application of these technologies has generated promising results, suggesting that polycyclic peptide therapeutics could potentially be developed for a broad range of diseases. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Proteases as therapeutics

    PubMed Central

    Craik, Charles S.; Page, Michael J.; Madison, Edwin L.

    2015-01-01

    Proteases are an expanding class of drugs that hold great promise. The U.S. FDA (Food and Drug Administration) has approved 12 protease therapies, and a number of next generation or completely new proteases are in clinical development. Although they are a well-recognized class of targets for inhibitors, proteases themselves have not typically been considered as a drug class despite their application in the clinic over the last several decades; initially as plasma fractions and later as purified products. Although the predominant use of proteases has been in treating cardiovascular disease, they are also emerging as useful agents in the treatment of sepsis, digestive disorders, inflammation, cystic fibrosis, retinal disorders, psoriasis and other diseases. In the present review, we outline the history of proteases as therapeutics, provide an overview of their current clinical application, and describe several approaches to improve and expand their clinical application. Undoubtedly, our ability to harness proteolysis for disease treatment will increase with our understanding of protease biology and the molecular mechanisms responsible. New technologies for rationally engineering proteases, as well as improved delivery options, will expand greatly the potential applications of these enzymes. The recognition that proteases are, in fact, an established class of safe and efficacious drugs will stimulate investigation of additional therapeutic applications for these enzymes. Proteases therefore have a bright future as a distinct therapeutic class with diverse clinical applications. PMID:21406063

  7. Therapeutic antibody engineering

    PubMed Central

    Parren, Paul W.H.I.; Lugovskoy, Alexey A.

    2013-01-01

    It is an important event in any knowledge area when an authority in the field decides that it is time to share all accumulated knowledge and learnings by writing a text book. This does not occur often in the biopharmaceutical industry, likely due to both the highly dynamic environment with tight timelines and policies and procedures at many pharmaceutical companies that hamper knowledge sharing. To take on a task like this successfully, a strong drive combined with a desire and talent to teach, but also an accommodating and stimulating environment is required. Luckily for those interested in therapeutic monoclonal antibodies, Dr. William R. Strohl decided about two years ago that the time was right to write a book about the past, present and future of these fascinating molecules. Dr. Strohl’s great expertise and passion for biotechnology is evident from his life story and his strong academic and industry track record. Dr. Strohl pioneered natural product biotechnology, first in academia as a full professor of microbiology and biochemistry at Ohio State University in Columbus, Ohio and later in industry while at Merck. Despite his notable advances in recombinant natural products, industry interest in this area waned and in 2001 Dr. Strohl sought new opportunities by entering the field of antibody therapeutics. He initiated antibody discovery through phage display at Merck, and then moved to Centocor Research and Development Inc. (now Janssen Biotech, Inc.) in 2008 to head Biologics Research, where he now directs the discovery of innovative therapeutic antibody candidates.

  8. Yearbook of Correctional Education 1989.

    ERIC Educational Resources Information Center

    Duguid, Stephen, Ed.

    This yearbook contains conference papers, commissioned papers, reprints of earlier works, and research-in-progress. They offer a retrospective view as well as address the mission and perspective of correctional education, its international dimension, correctional education in action, and current research. Papers include "Correctional Education and…

  9. 75 FR 16516 - Dates Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-01

    From the Federal Register Online via the Government Publishing Office. NATIONAL ARCHIVES AND RECORDS ADMINISTRATION, Office of the Federal Register. Dates Correction. In the Notices section beginning on page 15401 in the issue of March 29, 2010, make the following correction: On pages...

  10. Political Correctness and Cultural Studies.

    ERIC Educational Resources Information Center

    Carey, James W.

    1992-01-01

    Discusses political correctness and cultural studies, dealing with cultural studies and the left, the conservative assault on cultural studies, and political correctness in the university. Describes some of the underlying changes in the university, largely unaddressed in the political correctness debate, that provide the deep structure to the…

  11. Radiation camera motion correction system

    DOEpatents

    Hoffer, P.B.

    1973-12-18

    The device determines the ratio of the intensity of radiation received by a radiation camera from two separate portions of the object. A correction signal is developed to maintain this ratio at a substantially constant value and this correction signal is combined with the camera signal to correct for object motion. (Official Gazette)

  12. Spectroscopic imaging with prospective motion correction and retrospective phase correction.

    PubMed

    Lange, Thomas; Maclaren, Julian; Buechert, Martin; Zaitsev, Maxim

    2012-06-01

    Motion-induced artifacts are much harder to recognize in magnetic resonance spectroscopic imaging than in imaging experiments and can therefore lead to erroneous interpretation. A method for prospective motion correction based on an optical tracking system has recently been proposed and has already been successfully applied to single voxel spectroscopy. In this work, the utility of prospective motion correction in combination with retrospective phase correction is evaluated for spectroscopic imaging in the human brain. Retrospective phase correction, based on the interleaved reference scan method, is used to correct for motion-induced frequency shifts and ensure correct phasing of the spectra across the whole spectroscopic imaging slice. It is demonstrated that the presented correction methodology can reduce motion-induced degradation of spectroscopic imaging data. Copyright © 2011 Wiley-Liss, Inc.
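
    Per voxel, the retrospective part reduces to removing a measured frequency offset and zero-order phase from each FID. A synthetic sketch (the offset, phase, decay, and timing values are illustrative assumptions; the actual method derives them from the interleaved reference scans):

        # Remove a motion-induced frequency shift and zero-order phase from an FID.
        import numpy as np

        t = np.arange(1024) / 2000.0                    # time axis, s (2 kHz bandwidth)
        fid = np.exp((2j * np.pi * 150.0 - 20.0) * t)   # ideal 150 Hz resonance

        df, phi0 = 4.5, 0.6         # shift (Hz) and phase (rad) from reference scan
        corrupted = fid * np.exp(2j * np.pi * df * t + 1j * phi0)

        corrected = corrupted * np.exp(-2j * np.pi * df * t - 1j * phi0)
        print(np.allclose(corrected, fid))              # -> True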

  13. mb-FLIM: model-based fluorescence lifetime imaging

    NASA Astrophysics Data System (ADS)

    Zhao, Qiaole; Young, Ian Ted; Schouten, Raymond; Stallinga, Sjoerd; Jalink, Kees; de Jong, Sander

    2012-03-01

    We have developed a model-based, parallel procedure to estimate fluorescence lifetimes. Multiple frequencies are present in the excitation signal. Modeling the entire fluorescence and measurement process produces an analytical ratio of polynomials in the lifetime variable τ. A non-linear model-fitting procedure is then used to estimate τ. We have analyzed this model-based approach by simulating a 10 μM fluorescein solution (τ = 4 ns) and all relevant noise sources. We have used real LED data to drive the simulation. Using 240 μs of data, we estimate τ = 3.99 ns. Preliminary experiments on real fluorescent images taken from fluorescein solutions (measured τ = 4.1 ns), green plastic test slides (measured τ = 3.0 ns), and GFP in U2OS (osteosarcoma) cells (measured τ = 2.1 ns) demonstrate that this model-based measurement technique works.
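
    The underlying frequency-domain relations are the textbook single-frequency ones; the paper's estimator fits a full multi-frequency model, which this sketch does not reproduce, and the frequency and lifetime below are illustrative:

        # Single-frequency FLIM relations: lifetime from phase or modulation depth.
        import numpy as np

        f = 40e6                                    # modulation frequency, Hz
        tau_true = 4e-9                             # fluorescein-like lifetime, s
        w = 2 * np.pi * f
        phi = np.arctan(w * tau_true)               # simulated measured phase shift
        m = 1.0 / np.sqrt(1.0 + (w * tau_true)**2)  # simulated modulation depth

        tau_phase = np.tan(phi) / w                 # invert the phase relation
        tau_mod = np.sqrt(1.0 / m**2 - 1.0) / w     # invert the modulation relation
        print(tau_phase, tau_mod)                   # both recover 4e-09 s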

  14. Model-based hierarchical reinforcement learning and human action control.

    PubMed

    Botvinick, Matthew; Weinstein, Ari

    2014-11-05

    Recent work has reawakened interest in goal-directed or 'model-based' choice, where decisions are based on prospective evaluation of potential action outcomes. Concurrently, there has been growing attention to the role of hierarchy in decision-making and action control. We focus here on the intersection between these two areas of interest, considering the topic of hierarchical model-based control. To characterize this form of action control, we draw on the computational framework of hierarchical reinforcement learning, using this to interpret recent empirical findings. The resulting picture reveals how hierarchical model-based mechanisms might play a special and pivotal role in human decision-making, dramatically extending the scope and complexity of human behaviour.

  15. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    NASA Technical Reports Server (NTRS)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiencies across the assurance functions. The MBSE environment supports not only system architecture development, but provides for support of Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failures avoidance and mitigation in flight systems. This also is leading new assurance functions including model assurance and management of uncertainty in the modeling environment. Further, the assurance cases, a structured hierarchal argument or model, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  16. EDITORIAL: Politically correct physics?

    NASA Astrophysics Data System (ADS)

    Pople Deputy Editor, Stephen

    1997-03-01

    If you were a caring, thinking, liberally minded person in the 1960s, you marched against the bomb, against the Vietnam war, and for civil rights. By the 1980s, your voice was raised about the destruction of the rainforests and the threat to our whole planetary environment. At the same time, you opposed discrimination against any group because of race, sex or sexual orientation. You reasoned that people who spoke or acted in a discriminatory manner should be discriminated against. In other words, you became politically correct. Despite its oft-quoted excesses, the political correctness movement sprang from well-founded concerns about injustices in our society. So, on balance, I am all for it. Or, at least, I was until it started to invade science. Biologists were the first to feel the impact. No longer could they refer to 'higher' and 'lower' orders, or 'primitive' forms of life. To the list of undesirable 'isms' - sexism, racism, ageism - had been added a new one: speciesism. Chemists remained immune to the PC invasion, but what else could you expect from a group of people so steeped in tradition that their principal unit, the mole, requires the use of the thoroughly unreconstructed gram? Now it is the turn of the physicists. This time, the offenders are not those who talk disparagingly about other people or animals, but those who refer to 'forms of energy' and 'heat'. Political correctness has evolved into physical correctness. I was always rather fond of the various forms of energy: potential, kinetic, chemical, electrical, sound and so on. My students might merge heat and internal energy into a single, fuzzy concept loosely associated with moving molecules. They might be a little confused at a whole new crop of energies - hydroelectric, solar, wind, geothermal and tidal - but they could tell me what devices turned chemical energy into electrical energy, even if they couldn't quite appreciate that turning tidal energy into geothermal energy wasn't part of the

  17. Updating and correction.

    PubMed

    1994-09-09

    The current editions of two books edited by William T. Golden, Science Advice to the President and Science and Technology Advice to the President, Congress, and Judiciary, published this year by AAAS Press, are now being distributed by Transaction Publishers, New Brunswick, NJ 08903, at the prices $22.95 and $27.95 (paper), respectively, and are no longer available from AAAS. A related work, Golden's 1991 compilation Worldwide Science and Technology Advice to the Highest Levels of Government, originally published by Pergamon Press, is also being distributed by Transaction Publishers, at $25.95. For more information about the books see Science 1 July, p. 127. In the review of K. S. Thorne's Black Holes and Time Warps (13 May, p. 999-1000), the captions and illustrations on p. 1000 were mismatched. The correct order of the captions is (i) "A heavy rock..."; (ii) "Cosmic radio waves..."; and (iii) "The trajectories in space...."

  18. Endoscopic orientation correction.

    PubMed

    Höller, Kurt; Penne, Jochen; Schneider, Armin; Jahn, Jasper; Guttiérrez Boronat, Javier; Wittenberg, Thomas; Feussner, Hubertus; Hornegger, Joachim

    2009-01-01

An open problem in endoscopic surgery (especially with flexible endoscopes) is the absence of a stable horizon in endoscopic images. With our "Endorientation" approach, image rotation correction, even in non-rigid endoscopic surgery (particularly NOTES), can be realized with a tiny MEMS tri-axial inertial sensor placed on the tip of an endoscope. It measures the impact of gravity on each of the three orthogonal accelerometer axes. After an initial calibration and filtering of these three values, the rotation angle is estimated directly. The achievable repetition rate is above the usual endoscopic video frame rate of 30 Hz; accuracy is about one degree. The image rotation is performed in real time by digitally rotating the analog endoscopic video signal. Improvements and benefits have been evaluated in animal studies: coordination of different instruments and estimation of tissue behavior regarding gravity-related deformation and movement were rated to be much more intuitive with a stable horizon on endoscopic images.
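    The gravity-based angle estimate described above reduces to a few lines of code. A minimal Python sketch, assuming the endoscope axis coincides with the sensor's x axis so that the roll angle follows from the y and z gravity components (function and parameter names are illustrative, not from the paper):

```python
import math

def estimate_roll(ay, az, prev_angle=None, alpha=0.9):
    """Roll angle (degrees) of the endoscope tip, from the gravity
    components measured on the accelerometer's y and z axes."""
    angle = math.degrees(math.atan2(ay, az))
    if prev_angle is not None:
        # crude exponential smoothing standing in for the paper's
        # filtering step (note: ignores wraparound at +/-180 degrees)
        angle = alpha * prev_angle + (1 - alpha) * angle
    return angle

# a reading of (ay, az) = (0.17, 0.98) g gives roughly 10 degrees,
# so the video frame would be counter-rotated by that amount
```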

  19. Temperature Corrected Bootstrap Algorithm

    NASA Technical Reports Server (NTRS)

    Comiso, Joey C.; Zwally, H. Jay

    1997-01-01

A temperature corrected Bootstrap Algorithm has been developed using Nimbus-7 Scanning Multichannel Microwave Radiometer data in preparation for the upcoming AMSR instrument aboard ADEOS and EOS-PM. The procedure first calculates the effective surface emissivity using emissivities of ice and water at 6 GHz and a mixing formulation that utilizes ice concentrations derived using the current Bootstrap algorithm, but using brightness temperatures from the 6 GHz and 37 GHz channels. These effective emissivities are then used to calculate surface ice temperature, which in turn is used to convert the 18 GHz and 37 GHz brightness temperatures to emissivities. Ice concentrations are then derived using the same technique as with the Bootstrap algorithm, but using emissivities instead of brightness temperatures. The results show significant improvement in areas where ice temperature is expected to vary considerably, such as near the continental areas in the Antarctic, where the ice temperature is colder than average, and in marginal ice zones.
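    The mixing and conversion steps can be sketched compactly. The Python fragment below is illustrative only: it assumes the Rayleigh-Jeans relation (brightness temperature = emissivity × surface temperature), ignores atmospheric contributions, and uses placeholder emissivity constants rather than the algorithm's actual tie points:

```python
def effective_emissivity(c_ice, e_ice=0.92, e_water=0.55):
    """Linear mixing of ice and open-water emissivities at 6 GHz,
    weighted by the Bootstrap-derived ice concentration c_ice."""
    return c_ice * e_ice + (1.0 - c_ice) * e_water

def surface_temperature(tb_6ghz, c_ice):
    # Rayleigh-Jeans approximation TB = e * Ts, solved for Ts
    return tb_6ghz / effective_emissivity(c_ice)

def channel_emissivity(tb, ts):
    # convert an 18 GHz or 37 GHz brightness temperature to emissivity
    return tb / ts
```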

  20. Anomaly corrected heterotic horizons

    NASA Astrophysics Data System (ADS)

    Fontanella, A.; Gutowski, J. B.; Papadopoulos, G.

    2016-10-01

We consider supersymmetric near-horizon geometries in heterotic supergravity up to two loop order in sigma model perturbation theory. We identify the conditions for the horizons to admit enhancement of supersymmetry. We show that solutions which undergo supersymmetry enhancement exhibit an 𝔰𝔩(2,ℝ) symmetry, and we describe the geometry of their horizon sections. We also prove a modified Lichnerowicz type theorem, incorporating α' corrections, which relates Killing spinors to zero modes of near-horizon Dirac operators. Furthermore, we demonstrate that there are no AdS₂ solutions in heterotic supergravity up to second order in α' for which the fields are smooth and the internal space is smooth and compact without boundary. We investigate a class of nearly supersymmetric horizons, for which the gravitino Killing spinor equation is satisfied on the spatial cross sections but not the dilatino one, and present a description of their geometry.

  2. Rethinking political correctness.

    PubMed

    Ely, Robin J; Meyerson, Debra E; Davidson, Martin N

    2006-09-01

Legal and cultural changes over the past 40 years ushered unprecedented numbers of women and people of color into companies' professional ranks. Laws now protect these traditionally underrepresented groups from blatant forms of discrimination in hiring and promotion. Meanwhile, political correctness has reset the standards for civility and respect in people's day-to-day interactions. Despite this obvious progress, the authors' research has shown that political correctness is a double-edged sword. While it has helped many employees feel unlimited by their race, gender, or religion, the PC rule book can hinder people's ability to develop effective relationships across race, gender, and religious lines. Companies need to equip workers with skills--not rules--for building these relationships. The authors offer the following five principles for healthy resolution of the tensions that commonly arise over difference: Pause to short-circuit the emotion and reflect; connect with others, affirming the importance of relationships; question yourself to identify blind spots and discover what makes you defensive; get genuine support that helps you gain a broader perspective; and shift your mind-set from one that says, "You need to change," to one that asks, "What can I change?" When people treat their cultural differences--and related conflicts and tensions--as opportunities to gain a more accurate view of themselves, one another, and the situation, trust builds and relationships become stronger. Leaders should put aside the PC rule book and instead model and encourage risk taking in the service of building the organization's relational capacity. The benefits will reverberate through every dimension of the company's work.

  3. Do Model-Based Studies in Chronic Obstructive Pulmonary Disease Measure Correct Values of Utility? A Meta-Analysis.

    PubMed

    Moayeri, Foruhar; Hsueh, Ya-Seng Arthur; Clarke, Philip; Dunt, David

    2016-06-01

Chronic obstructive pulmonary disease (COPD) is a progressive chronic disease that has considerable impact on utility-based health-related quality of life. Utility is a key input of many decision analytic models used for economic evaluations. The aim was to systematically review COPD-related utilities and to compare these with alternative values used in decision models. The literature review comprised studies that generated utilities for COPD-related stages based on EuroQol five-dimensional questionnaire surveys of patients, and decision models of COPD progression that have been used for economic evaluations. The utility values used in modeling studies and those from the meta-analysis of actual patient-level studies were compared and the differences quantified. Twenty decision modeling studies that used utility values as input parameters were found. Within the same publication period, 13 studies involving patient-level utility data were identified and included in the meta-analysis. The estimated mean utility values ranged from 0.806 (95% confidence interval [CI] 0.747-0.866) for stage I to 0.616 (95% CI 0.556-0.676) for stage IV. The utility scores for comparable stages in modeling studies were different (significant difference of 0.045 [95% CI 0.041-0.052] for stage III), and modeling studies consistently used higher utility values than the average reported patient-level data. COPD decision analytic models are based on a limited range of utility values that are systematically different from average values estimated using a meta-analysis. A more systematic approach to the application of utility measures in economic evaluation is required to appropriately reflect the current literature. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  4. [Corrected transposition of the great arteries].

    PubMed

    Alva-Espinosa, Carlos

    2016-01-01

Corrected transposition of the great arteries is one of the most fascinating entities in congenital heart disease. The apparently corrected condition is only temporary: over time, most patients develop systemic heart failure, even in the absence of associated lesions. With current imaging studies, precise visualization is achieved in each case, though the treatment strategy remains unresolved. In asymptomatic patients or cases without associated lesions, focused follow-up to assess systemic ventricular function and the degree of tricuspid valve regurgitation is important. In cases with normal ventricular function and mild tricuspid regurgitation, it seems unreasonable to intervene surgically. In patients with significant associated lesions, surgery is indicated. In the long term, the traditional approach may not help tricuspid regurgitation and systemic ventricular failure. Anatomical correction is the proposed alternative to ease the right ventricular overload and to restore systemic left ventricular function. However, this is a prolonged operation, not without risks and long-term complications. In this review the clinical, diagnostic, and therapeutic aspects are reviewed in the light of the most significant recent literature.

  5. Outcome following therapeutic abortion.

    PubMed

    Payne, E C; Kravitz, A R; Notman, M T; Anderson, J V

    1976-06-01

Psychological outcome of abortion was studied in 102 patients, measuring multiple variables over four time intervals. Five measured affects--anxiety, depression, anger, guilt, and shame--were significantly lower six months after the preabortion period. The following variables describe subgroups of patients with significant variations in patterns of responses, as indicated by changes in affects: marital status, personality diagnosis, character of object relations, past psychopathologic factors, relationship to husband or lover, relationship to mother, ambivalence about abortion, religion, and previous parity. A complex multivariate model, based on conflict and conflict resolution, is appropriate to conceptualize the unwanted pregnancy and abortion experience. Data suggest that women most vulnerable to conflict are those who are single and nulliparous, those with a previous history of serious emotional problems, conflictual relationships with lovers, past negative relationships with their mothers, strong ambivalence toward abortion, or negative religious or cultural attitudes about abortion.

  6. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  7. A Model Based Mars Climate Database for the Mission Design

    NASA Technical Reports Server (NTRS)

    2005-01-01

A viewgraph presentation on a model based climate database is shown. The topics include: 1) Why a model based climate database?; 2) Mars Climate Database v3.1: Who uses it? (approx. 60 users!); 3) The new Mars Climate Database MCD v4.0; 4) MCD v4.0: what's new?; 5) Simulation of water ice clouds; 6) Simulation of the water ice cycle; 7) A new tool for surface pressure prediction; 8) Access to the database MCD 4.0; 9) How to access the database; and 10) New web access.

  8. Extending model-based diagnosis for analog thermodynamical devices

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas; Chien, Steve; Robertson, Charles

    1993-01-01

The increasing complexity of process control applications has posed difficult problems in fault detection, isolation, and recovery. Deep knowledge-based approaches, such as model-based diagnosis, have offered some promise in addressing these problems. However, the difficulties of adapting these techniques to situations involving numerical reasoning and noise have limited their applicability. This paper describes an extension of classical model-based diagnosis techniques to deal with sparse data, noise, and complex noninvertible numerical models. These diagnosis techniques are being applied to the External Active Thermal Control System for Space Station Freedom.

  9. Cascaded process model based control: packed absorption column application.

    PubMed

    Govindarajan, Anand; Jayaraman, Suresh Kumar; Sethuraman, Vijayalakshmi; Raul, Pramod R; Rhinehart, R Russell

    2014-03-01

    Nonlinear, adaptive, process-model based control is demonstrated in a cascaded single-input-single-output mode for pressure drop control in a pilot-scale packed absorption column. The process is shown to be nonlinear. Control is demonstrated in both servo and regulatory modes, for no wind-up in a constrained situation, and for bumpless transfer. Model adaptation is demonstrated and shown to provide process insight. The application procedure is revealed as a design guide to aid others in implementing process-model based control.

  10. The impact of a model-based clinical regional registry for attention-deficit hyperactivity disorder.

    PubMed

    Zanetti, Michele; Cartabia, Massimo; Didoni, Anna; Fortinguerra, Filomena; Reale, Laura; Mondini, Matteo; Bonati, Maurizio

    2016-03-17

This article describes the development and clinical impact of the Italian Regional ADHD Registry, launched by the Italian Lombardy Region in June 2011 and aimed at collecting and monitoring diagnostic and therapeutic pathways of care for children and adolescents with attention-deficit hyperactivity disorder. In particular, the model-based software used to run the registry and manage clinical care data acquisition and monitoring is described. This software was developed using the PROSAFE programme, which is already used for data collection in many Italian intensive care units, as a stand-alone case report form interface. The use of the attention-deficit hyperactivity disorder regional registry led to an increase in the appropriateness of the clinical management of all patients included in the registry, proving to be an important instrument in ensuring an appropriate healthcare strategy for children and adolescents with attention-deficit/hyperactivity disorder.

  11. Relativistic quantum corrections to laser wakefield acceleration.

    PubMed

    Zhu, Jun; Ji, Peiyong

    2010-03-01

The influence of quantum effects on the interaction of intense laser fields with plasmas is investigated by using a hydrodynamic model based on the framework of relativistic quantum theory. Starting from the covariant Wigner function and the Dirac equation, the hydrodynamic equations for relativistic quantum plasmas are derived. Based on the relativistic quantum hydrodynamic equations and the Poisson equation, the perturbations of electron number densities and the electric field of the laser wakefield containing quantum effects are deduced. It is found that the corrections generated by quantum effects to the perturbations of electron number densities and the accelerating field of the laser wakefield cannot be neglected. Quantum effects suppress laser wakefields, which is a classical manifestation of quantum decoherence effects; however, the contribution of quantum effects to the laser wakefield correction is partially counteracted by relativistic effects. The analysis also reveals that quantum effects enlarge the effective frequencies of the plasma, and that the quantum behavior appears as a screening effect for plasma electrons.
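    For orientation on why quantum terms enlarge the effective plasma frequency, the textbook quantum-hydrodynamic (Bohm-corrected) dispersion relation for electron plasma oscillations can be written as follows; this is standard background, not an equation quoted from this paper:

```latex
\omega^{2} \;=\; \omega_{p}^{2} \;+\; 3k^{2}v_{\mathrm{th}}^{2}
\;+\; \frac{\hbar^{2}k^{4}}{4m_{e}^{2}}
```

    The final term is the quantum (Bohm potential) correction, which grows with wavenumber k, while the middle term is the familiar thermal contribution.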

  12. Relativistic quantum corrections to laser wakefield acceleration

    SciTech Connect

    Zhu Jun; Ji Peiyong

    2010-03-15

The influence of quantum effects on the interaction of intense laser fields with plasmas is investigated by using a hydrodynamic model based on the framework of relativistic quantum theory. Starting from the covariant Wigner function and the Dirac equation, the hydrodynamic equations for relativistic quantum plasmas are derived. Based on the relativistic quantum hydrodynamic equations and the Poisson equation, the perturbations of electron number densities and the electric field of the laser wakefield containing quantum effects are deduced. It is found that the corrections generated by quantum effects to the perturbations of electron number densities and the accelerating field of the laser wakefield cannot be neglected. Quantum effects suppress laser wakefields, which is a classical manifestation of quantum decoherence effects; however, the contribution of quantum effects to the laser wakefield correction is partially counteracted by relativistic effects. The analysis also reveals that quantum effects enlarge the effective frequencies of the plasma, and that the quantum behavior appears as a screening effect for plasma electrons.

  13. Management of antipsychotic treatment discontinuation and interruptions using model-based simulations

    PubMed Central

    Samtani, Mahesh N; Sheehan, John J; Fu, Dong-Jing; Remmerie, Bart; Sliwa, Jennifer Kern; Alphs, Larry

    2012-01-01

Background: Medication nonadherence is a well described and prevalent clinical occurrence in schizophrenia. These pharmacokinetic model-based simulations analyze predicted antipsychotic plasma concentrations in nonadherence and treatment interruption scenarios and with treatment reinitiation. Methods: Starting from steady state, pharmacokinetic model-based simulations of active moiety plasma concentrations of oral, immediate-release risperidone 3 mg/day, risperidone long-acting injection 37.5 mg/14 days, oral paliperidone extended-release 6 mg/day, and paliperidone palmitate 117 mg (75 mg equivalents)/28 days were assessed under three treatment discontinuation/interruption scenarios, i.e., complete discontinuation, one week of interruption, and four weeks of interruption. In the treatment interruption scenarios, pharmacokinetic simulations were performed using medication-specific reinitiation strategies. Results: Following complete treatment discontinuation, plasma concentrations persisted longest with paliperidone palmitate, followed by risperidone long-acting injection, while oral formulations exhibited the most rapid decrease. One week of oral paliperidone or risperidone interruption resulted in near complete elimination from the systemic circulation within that timeframe, reflecting the rapid elimination rate of the active moiety. After 1 and 4 weeks of interruption, minimum plasma concentrations were higher with paliperidone palmitate than risperidone long-acting injection over the simulated period. Four weeks of treatment interruption followed by reinitiation resulted in plasma levels returning to predicted therapeutic levels within 1 week. Conclusion: Due to the long half-life of paliperidone palmitate (25–49 days), putative therapeutic plasma concentrations persisted longest in simulated cases of complete discontinuation or treatment interruption. These simulations may help clinicians better conceptualize the impact of antipsychotic nonadherence on plasma concentrations.
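    The persistence contrast these simulations describe follows directly from first-order elimination. A minimal sketch, assuming a one-compartment model; the ~25-day half-life is taken from the abstract, while the ~1-day oral active-moiety half-life is an assumption for illustration:

```python
import numpy as np

def fraction_remaining(half_life_days, t_days):
    """Fraction of the steady-state plasma concentration remaining
    t_days after the last dose, under first-order elimination."""
    k = np.log(2.0) / half_life_days          # elimination rate constant
    return np.exp(-k * t_days)

# after a 4-week interruption:
print(fraction_remaining(25.0, 28))  # paliperidone palmitate: ~0.46
print(fraction_remaining(1.0, 28))   # assumed ~1-day oral half-life: ~0
```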

  14. Paediatric models in motion: requirements for model-based decision support at the bedside

    PubMed Central

    Barrett, Jeffrey S

    2015-01-01

Optimal paediatric pharmacotherapy relies on a detailed understanding of the individual patient, including their developmental status and disease state, as well as the pharmaceutical agents they are receiving for treatment or for management of side effects. Our appreciation of size and maturation effects on pharmacokinetic/pharmacodynamic (PK/PD) phenomena has improved to the point that we can develop predictive models that permit us to individualize therapy, especially where we are monitoring drug effects or therapeutic concentrations. The growth of efforts to guide paediatric pharmacotherapy via model-based decision support necessitates a coordinated and systematic approach to ensuring reliable and robust output to caregivers that represents the current standard of care and adheres to governance imposed by the host institution or coalition responsible. Model-based systems which guide caregivers on dosing paediatric patients in a more comprehensive manner are in development at several institutions. Care must be taken that these systems provide robust guidance consistent with current best practice. These systems must evolve as new information becomes available and ultimately are best constructed from diverse data representing global input on demographics, ethnic/racial diversity, diet, and other lifestyle factors. Multidisciplinary involvement at the project team level is key to the ultimate clinical evaluation. Likewise, early engagement of clinical champions is critical for the success of model-based tools. Adherence to regulatory requirements, as well as best practices with respect to software development and testing, is essential if these tools are to be used as part of the routine standard of care. PMID:24251868

  15. In silico model-based inference: a contemporary approach for hypothesis testing in network biology

    PubMed Central

    Klinke, David J.

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900’s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. PMID:25139179

  16. In silico model-based inference: a contemporary approach for hypothesis testing in network biology.

    PubMed

    Klinke, David J

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics.

  17. The Therapeutic Roller Coaster

    PubMed Central

    CHU, JAMES A.

    1992-01-01

    Survivors of severe childhood abuse often encounter profound difficulties. In addition to posttraumatic and dissociative symptomatology, abuse survivors frequently have characterologic problems, particularly regarding self-care and maintaining relationships. Backgrounds of abuse, abandonment, and betrayal are often recapitulated and reenacted in therapy, making the therapeutic experience arduous and confusing for therapists and patients. Efforts must be directed at building an adequate psychotherapeutic foundation before undertaking exploration and abreaction of past traumatic experiences. This discussion sets out a model for treatment of childhood abuse survivors, describing stages of treatment and suggesting interventions. Common treatment dilemmas or "traps" are discussed, with recommendations for their resolution. PMID:22700116

  18. [Therapeutic patient education revisited].

    PubMed

    Ruiz, Juan

    2014-06-04

Therapeutic patient education is an absolute necessity in the management of chronic diseases, including diabetes. This discipline promotes personal autonomy, enabling patients to live optimally and to pursue personal and professional projects despite the constraints of the disease and its treatments. The DAWN2 study demonstrates the systemic effects of this disease, which go beyond simple glycemic control. The biopsychosocial dimension needs to be better explored, and other assessment tools should be used to manage these patients better. Assessing health literacy and numeracy provides further tools for exploring the problems faced by socially disadvantaged patients. The main goal is to develop the capabilities of the patient and his or her environment, in the service of the development of the whole person.

  19. The therapeutic helminth?

    PubMed

    McKay, Derek M

    2009-03-01

By definition, parasites harm their hosts. Yet substantial evidence from animal models of human disease supports the hypothesis that infection with helminths can suppress the development of other maladies. Here, the view is presented that assessment of the immunophysiological response to helminths could identify specific parasites whose infection would be therapeutically useful (although many helminths could not fulfil this role), and could lead to precise knowledge of the immune events following infection, identifying ways to intervene in disease processes (in the absence of infection per se) that can be used to treat, and eventually cure, inflammatory and autoimmune disease.

  20. Therapeutic approaches to cellulite.

    PubMed

    Green, Jeremy B; Cohen, Joel L; Kaufman, Joely; Metelitsa, Andrei I; Kaminer, Michael S

    2015-09-01

    Cellulite is a condition that affects the vast majority of women. Although it is of no danger to one's overall health, cellulite can be psychosocially debilitating. Consequently, much research has been devoted to understanding cellulite and its etiopathogenesis. With additional insights into the underlying causes of its clinical presentation, therapeutic modalities have been developed that offer hope to cellulite sufferers. This review examines evidence for topical treatments, noninvasive energy-based devices, and recently developed minimally invasive interventions that may finally provide a solution. ©2015 Frontline Medical Communications.

  1. [Achievement of therapeutic objectives].

    PubMed

    Mantilla, Teresa

    2014-07-01

Therapeutic objectives for patients with atherogenic dyslipidemia are achieved by improving patient compliance and adherence. Clinical practice guidelines address the importance of treatment compliance for achieving objectives. The combination of a fixed dose of pravastatin and fenofibrate increases adherence by simplifying the drug regimen and reducing the number of daily doses. The good tolerance, the cost of the combination, and the possibility of adjusting administration to the patient's lifestyle help achieve the objectives for these patients with high cardiovascular risk. Copyright © 2014 Sociedad Española de Arteriosclerosis y Elsevier España, S.L. All rights reserved.

  2. An Integrated Framework for Model-Based Distributed Diagnosis and Prognosis

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Daigle, Matthew J.; Roychoudhury, Indranil

    2012-01-01

    Diagnosis and prognosis are necessary tasks for system reconfiguration and fault-adaptive control in complex systems. Diagnosis consists of detection, isolation and identification of faults, while prognosis consists of prediction of the remaining useful life of systems. This paper presents a novel integrated framework for model-based distributed diagnosis and prognosis, where system decomposition is used to enable the diagnosis and prognosis tasks to be performed in a distributed way. We show how different submodels can be automatically constructed to solve the local diagnosis and prognosis problems. We illustrate our approach using a simulated four-wheeled rover for different fault scenarios. Our experiments show that our approach correctly performs distributed fault diagnosis and prognosis in an efficient and robust manner.

  3. [Research on point cloud smoothing in knee joint prosthesis modeling based on reverse engineering].

    PubMed

    Zhang, Guoliang; Yao, Jin; Wei, Xing; Pei, Fuxing; Zhou, Zongke

    2008-10-01

At present, standard knee joint prostheses from abroad are mostly used in clinical practice; they represent the biological characteristics of the human knee joint well. This paper therefore adopts reverse engineering technology in that connection: it presents a novel positioning method for acquiring point data on the surface of a knee joint prosthesis, and puts forward a three-point angle algorithm for removing noise errors, combined with a correction based on the least-squares plane, to smooth the point cloud. A surface of the knee joint prosthesis with better accuracy and smoothness can then be generated, and from it the knee joint prosthesis model. This provides a basis for the localization of the knee joint prosthesis. The new algorithm is mainly used for surface modeling based on point cloud smoothing, including the surface of the knee joint prosthesis, surfaces of regular shape, and surfaces with gentle changes in curvature.
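    The least-squares-plane step lends itself to a short sketch. The fragment below (not the authors' code; the three-point angle test is omitted) fits a local plane by SVD and projects a noisy point onto it:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through an (n, 3) neighborhood: returns the
    centroid and the unit normal (smallest right singular vector)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def smooth_point(p, neighborhood):
    # project the noisy point onto the local least-squares plane
    c, n = fit_plane(neighborhood)
    return p - np.dot(p - c, n) * n
```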

  4. 3D reconstruction of organ surfaces using model-based snakes.

    PubMed

    Tolxdorff, Thomas; Derz, Claus

    2003-01-01

    In this article a new segmentation approach is described that is based on case-based reasoning and a combination of various established image processing concepts described in the current literature. Previously segmented data sets are used as anatomical models that represent the cases, called reference models. They describe the expected surface shape and representation of the organ in the data material. The segmentation task is solved by finding a reference model that is similar to the current data set and then by adapting the reference segmentation to the current data set. Image segmentation can be divided into the steps "determination of the image context", "selection and adjustment of the reference model", and "application of the model-based snake". The necessary interaction time was reduced by more than 60%, including postprocessing to correct for possible segmentation errors.

5. Improved model-based infrared reflectometry for measuring deep trench structures.

    PubMed

    Zhang, Chuanwei; Liu, Shiyuan; Shi, Tielin; Tang, Zirong

    2009-11-01

Model-based infrared reflectometry (MBIR) has recently been introduced for the characterization of high-aspect-ratio deep trench structures in microelectronics. The success of this technique relies heavily on accurate modeling of trench structures and fast extraction of trench parameters. In this paper, we propose a modeling method named corrected effective medium approximation (CEMA) for accurate and fast reflectivity calculation of deep trench structures. We also develop a method combining an artificial neural network (ANN) and a Levenberg-Marquardt (LM) algorithm for robust and fast extraction of geometric parameters from the measured reflectance spectrum. The simulation and experimental work conducted on typical deep trench structures has verified the proposed methods and demonstrated that the improved MBIR metrology achieves highly accurate measurement results as well as fast computation speed.
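    The LM extraction step can be sketched as a standard nonlinear least-squares fit. Here `model_reflectivity` is a hypothetical stand-in for the CEMA forward model, and the signature is assumed for illustration:

```python
from scipy.optimize import least_squares

def extract_trench_params(measured, wavelengths, model_reflectivity, p0):
    """Refine trench geometry (e.g. depth, width) so that the modeled
    reflectance spectrum matches the measurement (Levenberg-Marquardt)."""
    def residuals(p):
        # model_reflectivity: hypothetical CEMA forward model
        return model_reflectivity(wavelengths, *p) - measured
    return least_squares(residuals, p0, method="lm").x
```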

  6. Model-based near-wall reconstructions for immersed-boundary methods

    NASA Astrophysics Data System (ADS)

    Posa, Antonio; Balaras, Elias

    2014-08-01

In immersed-boundary methods, the cost of resolving the thin boundary layers on a solid boundary at high Reynolds numbers is prohibitive. In the present work, we propose a new model-based near-wall reconstruction to account for the lack of resolution and provide the correct wall shear stress and hydrodynamic forces. The models are analytical versions of a generalization of the two-layer model developed by Balaras et al. (AIAA J 34:1111-1119, 1996) for large-eddy simulations. We present results for the flow around a cylinder and a sphere, using Cartesian and cylindrical coordinate grids, and demonstrate that the proposed treatment reproduces the wall stress very accurately on grids that are one order of magnitude coarser than those required for well-resolved simulations.
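    As a rough illustration of what an analytical near-wall model supplies, the sketch below estimates the wall shear stress from the velocity at the first off-wall grid point using an equilibrium log law; the paper's reconstructions are more general, and the constants here are conventional assumptions:

```python
import math

def wall_shear_stress(u_p, y_p, nu, rho, kappa=0.41, B=5.2):
    """Equilibrium wall stress from velocity u_p at wall distance y_p,
    via a fixed-point solve of the log law u_p/u_tau = ln(y+)/kappa + B."""
    u_tau = math.sqrt(nu * u_p / y_p)        # viscous-sublayer guess
    for _ in range(50):
        y_plus = y_p * u_tau / nu
        if y_plus < 11.0:                    # point lies in the sublayer
            return rho * nu * u_p / y_p
        u_tau = u_p / (math.log(y_plus) / kappa + B)
    return rho * u_tau ** 2
```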

  7. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  8. Impact of Model-Based Teaching on Argumentation Skills

    ERIC Educational Resources Information Center

    Ogan-Bekiroglu, Feral; Belek, Deniz Eren

    2014-01-01

The purpose of this study was to examine the effects of model-based teaching on students' argumentation skills. An experimental design guided the research. The participants of the study were pre-service physics teachers. The argumentative intervention lasted seven weeks. Data for this research were collected via video recordings and written arguments.…

  9. A new distance measure for model-based sequence clustering.

    PubMed

    García-García, Darío; Parrado Hernández, Emilio; Díaz-de María, Fernando

    2009-07-01

    We review the existing alternatives for defining model-based distances for clustering sequences and propose a new one based on the Kullback-Leibler divergence. This distance is shown to be especially useful in combination with spectral clustering. For improved performance in real-world scenarios, a model selection scheme is also proposed.
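    To make the construction concrete, the sketch below uses 1-D Gaussians as stand-ins for the sequence models (the paper works with richer probabilistic models; names are illustrative): the symmetrized KL divergence acts as the model-based distance, whose kernel feeds spectral clustering:

```python
import numpy as np

def kl_gauss(m0, s0, m1, s1):
    # KL( N(m0, s0^2) || N(m1, s1^2) ) for 1-D Gaussians
    return np.log(s1 / s0) + (s0**2 + (m0 - m1)**2) / (2 * s1**2) - 0.5

def model_distance(p, q):
    # symmetrized KL divergence as a model-based distance
    return 0.5 * (kl_gauss(*p, *q) + kl_gauss(*q, *p))

models = [(0.0, 1.0), (0.2, 1.1), (5.0, 2.0)]   # (mean, std) per sequence
D = np.array([[model_distance(p, q) for q in models] for p in models])
A = np.exp(-D)   # affinity matrix to hand to a spectral-clustering step
```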

  10. Evaluation Novelty in Modeling-Based and Interactive Engagement Instruction

    ERIC Educational Resources Information Center

    Örnek, Funda

    2007-01-01

    A calculus-based introductory physics course, which is based on the Matter and Interactions curriculum of Chabay and Sherwood (2002), has been taught at Purdue University. Characteristic of this course is its emphasis on modeling. Therefore, I would like to investigate the effects of modeling-based instruction and interactive engagement on…

  11. Problem Solving: Physics Modeling-Based Interactive Engagement

    ERIC Educational Resources Information Center

    Ornek, Funda

    2009-01-01

    The purpose of this study was to investigate how modeling-based instruction combined with an interactive-engagement teaching approach promotes students' problem solving abilities. I focused on students in a calculus-based introductory physics course, based on the matter and interactions curriculum of Chabay & Sherwood (2002) at a large state…

  12. Model-based drug development: the road to quantitative pharmacology.

    PubMed

    Zhang, Liping; Sinha, Vikram; Forgue, S Thomas; Callies, Sophie; Ni, Lan; Peck, Richard; Allerheiligen, Sandra R B

    2006-06-01

High development costs and low success rates in bringing new medicines to the market demand more efficient and effective approaches. Identified by the FDA as a valuable prognostic tool for fulfilling such a demand, model-based drug development is a mathematical and statistical approach that constructs, validates, and utilizes disease models, drug exposure-response models, and pharmacometric models to facilitate drug development. Quantitative pharmacology is a discipline that learns and confirms the key characteristics of new molecular entities in a quantitative manner, with the goal of providing explicit, reproducible, and predictive evidence for optimizing drug development plans and enabling critical decision making. Model-based drug development serves as an integral part of quantitative pharmacology. This work reviews the general concept, basic elements, and evolving role of model-based drug development in quantitative pharmacology. Two case studies are presented to illustrate how the model-based drug development approach can facilitate knowledge management and decision making during drug development. The case studies also highlight the organizational learning that comes through implementation of quantitative pharmacology as a discipline. Finally, the prospects of quantitative pharmacology as an emerging discipline are discussed. Advances in this discipline will require continued collaboration between academia, industry, and regulatory agencies.

  13. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    SciTech Connect

    Noonan, Nicholas James

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  14. Expediting model-based optoacoustic reconstructions with tomographic symmetries

    SciTech Connect

    Lutzweiler, Christian; Deán-Ben, Xosé Luís; Razansky, Daniel

    2014-01-15

Purpose: Image quantification in optoacoustic tomography implies the use of accurate forward models of excitation, propagation, and detection of optoacoustic signals, while inversions with high spatial resolution usually involve very large matrices, leading to unreasonably long computation times. The development of fast and memory-efficient model-based approaches thus represents an important challenge in advancing the quantitative and dynamic imaging capabilities of tomographic optoacoustic imaging. Methods: Herein, a method for the simplification and acceleration of model-based inversions, relying on inherent symmetries present in common tomographic acquisition geometries, is introduced. The method is showcased for the case of cylindrical symmetries by using a polar image discretization of the time-domain optoacoustic forward model combined with efficient storage and inversion strategies. Results: The suggested methodology is shown to render fast and accurate model-based inversions in both numerical simulations and post mortem small animal experiments. In the case of a full-view detection scheme, the memory requirements are reduced by one order of magnitude while high-resolution reconstructions are achieved at video rate. Conclusions: By considering the rotational symmetry present in many tomographic optoacoustic imaging systems, the proposed methodology allows exploiting the advantages of model-based algorithms with feasible computational requirements and fast reconstruction times, so that its convenience and general applicability in optoacoustic imaging systems with tomographic symmetries is anticipated.
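    The symmetry argument can be pictured in a few lines: with a polar grid whose angular sampling matches the detector positions, the forward-model block of a single detector serves all detectors via a circular shift of the angular index. A hedged sketch; the array shapes and names are assumptions, not the authors' implementation:

```python
import numpy as np

def forward_model_symmetric(A0, image_polar):
    """A0: model matrix for detector 0, shape (n_time, n_ang * n_rad).
    image_polar: image on a polar grid, shape (n_ang, n_rad).
    Rotational symmetry: detector j sees the image rotated by j bins,
    so only one detector's model block needs to be stored."""
    n_ang = image_polar.shape[0]
    signals = np.empty((n_ang, A0.shape[0]))
    for j in range(n_ang):
        rotated = np.roll(image_polar, -j, axis=0)  # shift angular index
        signals[j] = A0 @ rotated.ravel()
    return signals
```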

  15. Educational Value and Models-Based Practice in Physical Education

    ERIC Educational Resources Information Center

    Kirk, David

    2013-01-01

    A models-based approach has been advocated as a means of overcoming the serious limitations of the traditional approach to physical education. One of the difficulties with this approach is that physical educators have sought to use it to achieve diverse and sometimes competing educational benefits, and these wide-ranging aspirations are rarely if…

  16. Models Based Practices in Physical Education: A Sociocritical Reflection

    ERIC Educational Resources Information Center

    Landi, Dillon; Fitzpatrick, Katie; McGlashan, Hayley

    2016-01-01

    In this paper, we reflect on models-based practices in physical education using a sociocritical lens. Drawing links between neoliberal moves in education, and critical approaches to the body and physicality, we take a view that models are useful tools that are worth integrating into physical education, but we are apprehensive to suggest they…

  17. Wind field model-based estimation of Seasat scatterometer winds

    NASA Technical Reports Server (NTRS)

    Long, David G.

    1993-01-01

    A model-based approach to estimating near-surface wind fields over the ocean from Seasat scatterometer (SASS) measurements is presented. The approach is a direct assimilation technique in which wind field model parameters are estimated directly from the scatterometer measurements of the radar backscatter of the ocean's surface using maximum likelihood principles. The wind field estimate is then computed from the estimated model parameters. The wind field model used in this approach is based on geostrophic approximation and on simplistic assumptions about the wind field vorticity and divergence but includes ageostrophic winds. Nine days of SASS data were processed to obtain unique wind estimates. Comparisons in performance to the traditional two-step (point-wise wind retrieval followed by ambiguity removal) wind estimate method and the model-based method are provided using both simulated radar backscatter measurements and actual SASS measurements. In the latter case the results are compared to wind fields determined using subjective ambiguity removal. While the traditional approach results in missing measurements and reduced effective swath width due to fore/aft beam cell coregistration problems, the model-based approach uses all available measurements to increase the effective swath width and to reduce data gaps. The results reveal that the model-based wind estimates have accuracy comparable to traditionally estimated winds with less 'noise' in the directional estimates, particularly at low wind speeds.

  18. A simple computational algorithm of model-based choice preference.

    PubMed

    Toyama, Asako; Katahira, Kentaro; Ohira, Hideki

    2017-06-01

A broadly used computational framework posits that two learning systems operate in parallel during the learning of choice preferences--namely, the model-free and model-based reinforcement-learning systems. In this study, we examined another possibility, through which model-free learning is the basic system and model-based information is its modulator. Accordingly, we proposed several modified versions of a temporal-difference learning model to explain the choice-learning process. Using the two-stage decision task developed by Daw, Gershman, Seymour, Dayan, and Dolan (2011), we compared their original computational model, which assumes a parallel learning process, and our proposed models, which assume a sequential learning process. Choice data from 23 participants showed a better fit with the proposed models. More specifically, the proposed eligibility adjustment model, which assumes that the environmental model can weight the degree of the eligibility trace, can explain choices better under both model-free and model-based controls and has a simpler computational algorithm than the original model. In addition, the forgetting learning model and its variation, which assume changes in the values of unchosen actions, substantially improved the fits to the data. Overall, we show that a hybrid computational model best fits the data. The parameters used in this model succeed in capturing individual tendencies with respect to both model use in learning and exploration behavior. This computational model provides novel insights into learning with interacting model-free and model-based components.
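    The eligibility-adjustment idea can be sketched as a SARSA(λ)-style update in which the model-based system sets the trace decay; this is a hedged illustration with assumed parameter names, not the authors' published code:

```python
import numpy as np

class EligibilityAdjustedTD:
    """SARSA(lambda)-style learner; lam_mb is supplied per step by a
    model-based system that weights the eligibility trace."""
    def __init__(self, n_states, n_actions, alpha=0.1, gamma=0.95):
        self.Q = np.zeros((n_states, n_actions))  # model-free values
        self.E = np.zeros_like(self.Q)            # eligibility traces
        self.alpha, self.gamma = alpha, gamma

    def step(self, s, a, r, s_next, a_next, lam_mb):
        delta = r + self.gamma * self.Q[s_next, a_next] - self.Q[s, a]
        self.E[s, a] += 1.0
        self.Q += self.alpha * delta * self.E
        self.E *= self.gamma * lam_mb   # model-based trace weighting
```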

  19. Therapeutic Community in a California Prison: Treatment Outcomes after 5 Years

    ERIC Educational Resources Information Center

    Zhang, Sheldon X.; Roberts, Robert E. L.; McCollister, Kathryn E.

    2011-01-01

    Therapeutic communities have become increasingly popular among correctional agencies with drug-involved offenders. This quasi-experimental study followed a group of inmates who participated in a prison-based therapeutic community in a California state prison, with a comparison group of matched offenders, for more than 5 years after their initial…

  1. Therapeutic endoscopy in gastroenterology.

    PubMed

    Celiński, K; Cichoz-Lach, H

    2007-08-01

The role of therapeutic endoscopy in current gastroenterology is very important. Therapeutic endoscopy is useful in the treatment of gastrointestinal bleeding. Endoscopic control of gastrointestinal bleeding includes the following haemostasis techniques: photocoagulation, electrocoagulation, thermocoagulation, and injection methods. Owing to these procedures, mortality has significantly decreased. Endoscopic haemostasis eliminates the risk of surgery, is less expensive, and is better tolerated by patients. Colonoscopic polypectomy is a widely used technique; by removal of polyps the incidence of colon cancer can be decreased. The "hot biopsy" forceps can be used to excise polyps of up to 6 mm. Larger polyps can be removed safely by snare electrocautery and retrieved for histologic study. Endoscopic retrograde cholangiopancreatography has a therapeutic application designed to cut the sphincter of Oddi fibers of the distal common bile duct, which is currently indicated in choledocholithiasis and in papillary stenosis with ascending cholangitis or acute gallstone pancreatitis. Endoscopic sphincterotomy is now an established procedure indicated in patients with common bile duct calculi. Endoscopic decompression of the biliary tree - dilatation of benign strictures of the biliary tree with balloon catheters and placement of an internal endoprosthesis - allows nonoperative decompression and significant palliation for patients with obstructing tumors.

  2. Person-centered Therapeutics

    PubMed Central

    Cloninger, C. Robert; Cloninger, Kevin M.

    2015-01-01

    A clinician’s effectiveness in treatment depends substantially on his or her attitude toward -- and understanding of -- the patient as a person endowed with self-awareness and the will to direct his or her own future. The assessment of personality in the therapeutic encounter is a crucial foundation for forming an effective working alliance with shared goals. Helping a person to reflect on their personality provides a mirror image of their strengths and weaknesses in adapting to life’s many challenges. The Temperament and Character Inventory (TCI) provides an effective way to describe personality thoroughly and to predict both the positive and negative aspects of health. Strengths and weaknesses in TCI personality traits allow strong predictions of individual differences of all aspects of well-being. Diverse therapeutic techniques, such as diet, exercise, mood self-regulation, meditation, or acts of kindness, influence health and personality development in ways that are largely indistinguishable from one another or from effective allopathic treatments. Hence the development of well-being appears to be the result of activating a synergistic set of mechanisms of well-being, which are expressed as fuller functioning, plasticity, and virtue in adapting to life’s challenges PMID:26052429

  3. Mechanisms of Plasma Therapeutics

    NASA Astrophysics Data System (ADS)

    Graves, David

    2015-09-01

In this talk, I address research directed towards biomedical applications of atmospheric pressure plasma such as sterilization, surgery, wound healing and anti-cancer therapy. The field has seen remarkable growth in the last 3-5 years, but the mechanisms responsible for the biomedical effects have remained mysterious. It is known that plasmas readily create reactive oxygen species (ROS) and reactive nitrogen species (RNS). ROS and RNS (or RONS), in addition to a suite of other radical and non-radical reactive species, are essential actors in an important sub-field of aerobic biology termed "redox" (or oxidation-reduction) biology. It is postulated that cold atmospheric plasma (CAP) can trigger a therapeutic shielding response in tissue in part by creating a time- and space-localized, burst-like form of oxy-nitrosative stress on near-surface exposed cells through the flux of plasma-generated RONS. RONS-exposed surface layers of cells communicate to the deeper levels of tissue via a form of the "bystander effect," similar to responses to other forms of cell stress. In this proposed model of CAP therapeutics, the plasma stimulates a cellular survival mechanism through which aerobic organisms shield themselves from infection and other challenges.

  4. Engineering therapeutic protein disaggregases

    PubMed Central

    Shorter, James

    2016-01-01

    Therapeutic agents are urgently required to cure several common and fatal neurodegenerative disorders caused by protein misfolding and aggregation, including amyotrophic lateral sclerosis (ALS), Parkinson’s disease (PD), and Alzheimer’s disease (AD). Protein disaggregases that reverse protein misfolding and restore proteins to native structure, function, and localization could mitigate neurodegeneration by simultaneously reversing 1) any toxic gain of function of the misfolded form and 2) any loss of function due to misfolding. Potentiated variants of Hsp104, a hexameric AAA+ ATPase and protein disaggregase from yeast, have been engineered to robustly disaggregate misfolded proteins connected with ALS (e.g., TDP-43 and FUS) and PD (e.g., α-synuclein). However, Hsp104 has no metazoan homologue. Metazoa possess protein disaggregase systems distinct from Hsp104, including Hsp110, Hsp70, and Hsp40, as well as HtrA1, which might be harnessed to reverse deleterious protein misfolding. Nevertheless, vicissitudes of aging, environment, or genetics conspire to negate these disaggregase systems in neurodegenerative disease. Thus, engineering potentiated human protein disaggregases or isolating small-molecule enhancers of their activity could yield transformative therapeutics for ALS, PD, and AD. PMID:27255695

  5. Nitrones as Therapeutics

    PubMed Central

    Floyd, Robert A.; Kopke, Richard D.; Choi, Chul-Hee; Foster, Steven B.; Doblas, Sabrina; Towner, Rheal A.

    2008-01-01

Nitrones have the general chemical formula X-CH=NO-Y. They were first used to trap free radicals in chemical systems and then subsequently in biochemical systems. More recently several nitrones including PBN (α-phenyl-tert-butylnitrone) have been shown to have potent biological activity in many experimental animal models. Many diseases of aging including stroke, cancer development, Parkinson’s disease and Alzheimer’s disease are known to have enhanced levels of free radicals and oxidative stress. Some derivatives of PBN are significantly more potent than PBN and have undergone extensive commercial development in stroke. Recent research has shown that PBN-related nitrones also have anti-cancer activity in several experimental cancer models and have potential as therapeutics in some cancers. Also in recent observations nitrones have been shown to act synergistically in combination with antioxidants in the prevention of acute acoustic noise induced hearing loss. The mechanistic basis of the potent biological activity of PBN-related nitrones is not known. Even though PBN-related nitrones do decrease oxidative stress and oxidative damage, their potent biological anti-inflammatory activity and their ability to alter cellular signaling processes cannot readily be explained by conventional notions of free radical trapping biochemistry. This review is focused on our observations and others where the use of selected nitrones as novel therapeutics have been evaluated in experimental models in the context of free radical biochemical and cellular processes considered important in pathologic conditions and age-related diseases. PMID:18793715

  6. Data Entry Errors and Design for Model-Based Tight Glycemic Control in Critical Care

    PubMed Central

    Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey

    2012-01-01

Introduction: Tight glycemic control (TGC) has shown benefits but has been difficult to achieve consistently. Model-based methods and computerized protocols offer the opportunity to improve TGC quality but require human data entry, particularly of blood glucose (BG) values, which can be significantly prone to error. This study presents the design and optimization of data entry methods to minimize error for a computerized and model-based TGC method prior to pilot clinical trials. Method: To minimize data entry error, two tests were carried out to optimize a method with errors less than the 5%-plus reported in other studies. Four initial methods were tested on 40 subjects in random order, and the best two were tested more rigorously on 34 subjects. The tests measured entry speed and accuracy. Errors were reported as corrected and uncorrected errors, with the sum comprising a total error rate. The first set of tests used randomly selected values, while the second set used the same values for all subjects to allow comparisons across users and direct assessment of the magnitude of errors. These research tests were approved by the University of Canterbury Ethics Committee. Results: The final data entry method tested reduced errors to less than 1–2%, a 60–80% reduction from reported values. The magnitude of errors was clinically significant, typically by 10.0 mmol/liter or an order of magnitude, but only for extreme values of BG < 2.0 mmol/liter or BG > 15.0–20.0 mmol/liter, both of which could be easily corrected with automated checking of extreme values for safety. Conclusions: The data entry method selected significantly reduced data entry errors in the limited design tests presented, and is in use in a clinical pilot TGC study. The overall approach and testing methods are easily performed and generalizable to other applications and protocols. PMID:22401331
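    The automated checking of extreme values suggested in the conclusions is straightforward to realize; a sketch with assumed plausibility bounds:

```python
def check_bg_entry(bg_mmol_per_l, low=2.0, high=15.0):
    """Ask the user to confirm blood-glucose entries outside the
    plausible range, catching order-of-magnitude entry errors."""
    if not low <= bg_mmol_per_l <= high:
        raise ValueError(
            f"BG {bg_mmol_per_l} mmol/L is outside [{low}, {high}]: "
            "please confirm or re-enter the value")
    return bg_mmol_per_l
```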

  7. Adaptive model-based control systems and methods for controlling a gas turbine

    NASA Technical Reports Server (NTRS)

    Brunell, Brent Jerome (Inventor); Mathews, Jr., Harry Kirk (Inventor); Kumar, Aditya (Inventor)

    2004-01-01

    Adaptive model-based control systems and methods are described so that performance and/or operability of a gas turbine in an aircraft engine, power plant, marine propulsion, or industrial application can be optimized under normal, deteriorated, faulted, failed and/or damaged operation. First, a model of each relevant system or component is created, and the model is adapted to the engine. Then, if/when deterioration, a fault, a failure or some kind of damage to an engine component or system is detected, that information is input to the model-based control as changes to the model, constraints, objective function, or other control parameters. With all of this information about the engine's condition and state, and with the control goals expressed as an objective function and constraints, the control then solves an optimization problem so that the optimal control action can be determined and taken. This model and control may be updated in real time to account for engine-to-engine variation, deterioration, damage, faults and/or failures using optimal corrective control action command(s).
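
    The core loop the patent describes, updating the model with diagnosed condition information and then solving a constrained optimization for the control action, can be illustrated with a deliberately simple sketch. The engine model, numbers, and "health" parameter below are invented stand-ins, not the patented system:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy engine model: thrust and turbine temperature as functions of fuel flow u.
    # 'health' scales thrust to mimic deterioration reported by diagnostics.
    def engine_model(u, health):
        thrust = health * 100.0 * u
        temp = 600.0 + 350.0 * u**1.2
        return thrust, temp

    def control_action(thrust_cmd, health, temp_limit=900.0):
        """Solve for the fuel flow that best meets the thrust command,
        subject to a temperature constraint and actuator bounds."""
        def objective(u):
            thrust, _ = engine_model(u[0], health)
            return (thrust - thrust_cmd) ** 2
        cons = [{"type": "ineq",
                 "fun": lambda u: temp_limit - engine_model(u[0], health)[1]}]
        res = minimize(objective, x0=[0.5], bounds=[(0.0, 1.0)], constraints=cons)
        return res.x[0]

    print(control_action(80.0, health=1.0))   # nominal engine meets the command
    print(control_action(80.0, health=0.85))  # deteriorated: more fuel, until the
                                              # temperature limit binds
    ```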

  8. Hierarchical searching in model-based LADAR ATR using statistical separability tests

    NASA Astrophysics Data System (ADS)

    DelMarco, Stephen; Sobel, Erik; Douglas, Joel

    2006-05-01

    In this work we investigate simultaneous object identification improvement and efficient library search for model-based object recognition applications. We develop an algorithm to provide efficient, prioritized, hierarchical searching of the object model database. A common approach to model-based object recognition chooses the object label corresponding to the best match score. However, due to corrupting effects the best match score does not always correspond to the correct object model. To address this problem, we propose a search strategy which exploits information contained in a number of representative elements of the library to drill down to a small class with high probability of containing the object. We first optimally partition the library into a hierarchic taxonomy of disjoint classes. A small number of representative elements are used to characterize each object model class. At each hierarchy level, the observed object is matched against the representative elements of each class to generate score sets. A hypothesis testing problem, using a distribution-free statistical test, is defined on the score sets and used to choose the appropriate class for a prioritized search. We conduct a probabilistic analysis of the computational cost savings, and provide a formula measuring the computational advantage of the proposed approach. We generate numerical results using match scores derived from matching highly-detailed CAD models of civilian ground vehicles used in 3-D LADAR ATR. We present numerical results showing effects on classification performance of significance level and representative element number in the score set hypothesis testing problem.
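
    The prioritized search can be sketched compactly: score the observed object against each class's representative elements, then rank classes with a distribution-free test on the score sets. The abstract does not name the specific test, so the Mann-Whitney U test is used here purely as a stand-in, and the score data are synthetic:

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    def prioritize_classes(score_sets):
        """Rank candidate classes for a prioritized model-library search.

        score_sets maps a class label to the match scores between the observed
        object and that class's representative elements.
        """
        ranking = []
        for label, scores in score_sets.items():
            others = np.concatenate([s for l, s in score_sets.items() if l != label])
            # One-sided test: are this class's scores stochastically larger?
            _, p = mannwhitneyu(scores, others, alternative="greater")
            ranking.append((p, label))
        return [label for p, label in sorted(ranking)]  # smallest p searched first

    rng = np.random.default_rng(0)
    sets = {"truck": rng.normal(0.80, 0.05, 8),
            "sedan": rng.normal(0.60, 0.05, 8),
            "tank":  rng.normal(0.55, 0.05, 8)}
    print(prioritize_classes(sets))  # drill down into "truck" models first
    ```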

  9. Model-based coding of facial images based on facial muscle motion through isodensity maps

    NASA Astrophysics Data System (ADS)

    So, Ikken; Nakamura, Osamu; Minami, Toshi

    1991-11-01

    A model-based coding system has come under serious consideration for the next generation of image coding schemes, aimed at greater efficiency in TV telephone and TV conference systems. In this model-based coding system, the sender's model image is transmitted and stored at the receiving side before the start of the conversation. During the conversation, feature points are extracted from the facial image of the sender and are transmitted to the receiver. The facial expression of the sender is reconstructed from the feature points received and a wireframe model constructed at the receiving side. However, the conventional methods have the following problems: (1) extreme changes of the gray level, such as in wrinkles caused by a change of expression, cannot be reconstructed at the receiving side; (2) extraction of stable feature points from facial images with irregular features such as spectacles or facial hair is very difficult. To cope with the first problem, a new algorithm based on isodensity lines, which can represent detailed changes in expression through density correction, has already been proposed and has given good results. As for the second problem, we propose in this paper a new algorithm that reconstructs facial images by transmitting additional feature points extracted from isodensity maps.

  10. Full-Chip Layout Optimization for Process Margin Enhancement Using Model-Based Hotspot Fixing System

    NASA Astrophysics Data System (ADS)

    Kobayashi, Sachiko; Kyoh, Suigen; Kotani, Toshiya; Takekawa, Yoko; Inoue, Soichi; Nakamae, Koji

    2010-06-01

    As the design rules of integrated circuits shrink rapidly, it is necessary to use low-k1 lithography technologies. With low-k1 lithography, even if aggressive optical proximity correction is adopted, many sites become marginless spots, known as "hotspots". To address this problem, hotspot fixers (HSF) in the design-for-manufacturability flow have been studied. In our previous work, we showed the feasibility of layout modification using a simple line/space sizing rule for metal layers in 65-nm-node logic devices. However, in view of continuous design-rule shrinkage and growing design complexity, a more flexible modification method has become necessary to fix various types of hotspots. In this work, we have developed a brute-force model-based HSF. To further reduce the processing time, a hybrid flow of rule- and model-based HSFs is studied. The feasibility of this hybrid flow is evaluated by applying it to the full-chip layout modification of a logic test chip.

  11. StarPlan: A model-based diagnostic system for spacecraft

    NASA Technical Reports Server (NTRS)

    Heher, Dennis; Pownall, Paul

    1990-01-01

    The Sunnyvale Division of Ford Aerospace created a model-based reasoning capability for diagnosing faults in space systems. The approach employs reasoning about a model of the domain (as it is designed to operate) to explain differences between expected and actual telemetry; i.e., to identify the root cause of the discrepancy (at an appropriate level of detail) and determine necessary corrective action. A development environment, named Paragon, was implemented to support both model-building and reasoning. The major benefit of the model-based approach is the capability for the intelligent system to handle faults that were not anticipated by a human expert. The feasibility of this approach for diagnosing problems in a spacecraft was demonstrated in a prototype system, named StarPlan. Reasoning modules within StarPlan detect anomalous telemetry, establish goals for returning the telemetry to nominal values, and create a command plan for attaining the goals. Before commands are implemented, their effects are simulated to assure convergence toward the goal. After the commands are issued, the telemetry is monitored to assure that the plan is successful. These features of StarPlan, along with associated concerns, issues and future directions, are discussed.

  12. GOLD: Integration of model-based control systems with artificial intelligence and workstations

    SciTech Connect

    Lee, M.; Clearwater, S.

    1987-08-01

    Our experience with model-based accelerator control started at SPEAR. Since that time nearly all accelerator beam lines have been controlled using model-based application programs, for example, PEP and SLC at SLAC. In order to take advantage of state-of-the-art hardware and software technology, the design and implementation of the accelerator control programs have undergone radical changes over time. Consequently, SPEAR, PEP, and SLC all use different control programs. Since many of these application programs are embedded deep in the control system, they had to be rewritten each time. Each time this rewriting has occurred, a great deal of time and effort has been spent on training physicists and programmers to do the job. Now, we have developed these application programs for a fourth time. This time, however, the programs we are developing are generic, so that we will not have to do it again. We have developed an integrated system called GOLD (Generic Orbit and Lattice Debugger) for debugging and correcting trajectory errors in accelerator lattices. The system consists of a lattice modeling program (COMFORT), a beam simulator (PLUS), a graphical workstation environment (micro-VAX) and an expert system (ABLE). This paper will describe some of the features and applications of our integrated system with emphasis on the automation offered by expert systems. 5 refs.

  13. GOLD: Integration of model-based control systems with artificial intelligence and workstations

    SciTech Connect

    Lee, M.; Clearwater, S.

    1987-08-01

    Our experience with model-based accelerator control started at SPEAR. Since that time nearly all accelerator beamlines have been controlled using model-based application programs, for example, PEP and SLC at SLAC. In order to take advantage of state-of-the-art hardware and software technology, the design and implementation of the accelerator control programs have undergone radical changes with time. Consequently, SPEAR, PEP and SLC all use different control programs. Since many of these application programs are embedded deep into the control system, they had to be rewritten each time. Each time this rewriting has occurred, a great deal of time and effort has been spent on training physicists and programmers to do the job. Now, we have developed an integrated system called GOLD (Generic Orbit and Lattice Debugger) for debugging and correcting trajectory errors in accelerator lattices. The system consists of a lattice modeling program (COMFORT), a beam simulator (PLUS), a graphical workstation environment (micro-VAX) and an expert system (ABLE). This paper will describe some of the features and applications of our integrated system with emphasis on the automation offered by expert systems. 5 refs.

  14. Deconvolution with Correct Sampling

    NASA Astrophysics Data System (ADS)

    Magain, P.; Courbin, F.; Sohy, S.

    1998-02-01

    A new method for improving the resolution of astronomical images is presented. It is based on the principle that sampled data cannot be fully deconvolved without violating the sampling theorem. Thus, the sampled image should be deconvolved not by the total point-spread function but by a narrower function chosen so that the resolution of the deconvolved image is compatible with the adopted sampling. Our deconvolution method gives results that are, in at least some cases, superior to those of other commonly used techniques: in particular, it does not produce ringing around point sources superposed on a smooth background. Moreover, it allows researchers to perform accurate astrometry and photometry of crowded fields. These improvements are a consequence of both the correct treatment of sampling and the recognition that the most probable astronomical image is not a flat one. The method is also well adapted to the optimal combination of different images of the same object, as can be obtained, e.g., from infrared observations or via adaptive optics techniques.
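
    The central idea, dividing out only part of the point-spread function so that the result retains a narrower PSF compatible with the sampling, can be illustrated in a few lines. This is a schematic Fourier-domain illustration of the principle with Gaussian PSFs and an ad hoc Wiener regularizer, not the authors' actual algorithm:

    ```python
    import numpy as np

    def gaussian_psf(n, sigma):
        x = np.arange(n) - n // 2
        xx, yy = np.meshgrid(x, x)
        g = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
        return g / g.sum()

    n = 128
    total = gaussian_psf(n, 3.0)    # full PSF of the instrument
    target = gaussian_psf(n, 1.5)   # narrower PSF compatible with the sampling

    # Simulate an observation: two close point sources blurred by the full PSF
    scene = np.zeros((n, n)); scene[60, 60] = 1.0; scene[60, 68] = 0.7
    T = np.fft.fft2(np.fft.ifftshift(total))
    R = np.fft.fft2(np.fft.ifftshift(target))
    observed = np.real(np.fft.ifft2(np.fft.fft2(scene) * T))

    # Deconvolve only by the partial kernel s = total/target, so the result
    # still carries the narrower PSF instead of violating the sampling theorem.
    S = T / R                               # well behaved for these Gaussians
    W = np.conj(S) / (np.abs(S)**2 + 1e-3)  # Wiener-regularized inverse
    partial = np.real(np.fft.ifft2(np.fft.fft2(observed) * W))
    print(observed.max(), partial.max())    # sources sharpen but are not deltas
    ```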

  15. Thermodynamics of Error Correction

    NASA Astrophysics Data System (ADS)

    Sartori, Pablo; Pigolotti, Simone

    2015-10-01

    Information processing at the molecular scale is limited by thermal fluctuations. This can cause undesired consequences in copying information since thermal noise can lead to errors that can compromise the functionality of the copy. For example, a high error rate during DNA duplication can lead to cell death. Given the importance of accurate copying at the molecular scale, it is fundamental to understand its thermodynamic features. In this paper, we derive a universal expression for the copy error as a function of entropy production and work dissipated by the system during wrong incorporations. Its derivation is based on the second law of thermodynamics; hence, its validity is independent of the details of the molecular machinery, be it any polymerase or artificial copying device. Using this expression, we find that information can be copied in three different regimes. In two of them, work is dissipated to either increase or decrease the error. In the third regime, the protocol extracts work while correcting errors, reminiscent of a Maxwell demon. As a case study, we apply our framework to study a copy protocol assisted by kinetic proofreading, and show that it can operate in any of these three regimes. We finally show that, for any effective proofreading scheme, error reduction is limited by the chemical driving of the proofreading reaction.

  16. When Does Model-Based Control Pay Off?

    PubMed Central

    2016-01-01

    Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to "model-free" and "model-based" strategies in reinforcement learning. Model-free strategies are computationally cheap because action values can simply be read from a look-up table constructed through trial and error, but for the same reason they are sometimes inaccurate. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, does not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand. PMID:27564094
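
    The model-free/model-based distinction itself is easy to state in code: one system reads values from an incrementally updated look-up table, the other computes them by planning through a transition model. The toy problem below is a schematic of that distinction only, not the two-step task used in the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    P = np.array([[0.7, 0.3],   # P[a] = distribution over second-stage states
                  [0.3, 0.7]])  # for first-stage action a
    r = np.array([0.9, 0.1])    # reward probability in each second-stage state

    # Model-based: plan through the (here, known) transition model
    q_mb = P @ r

    # Model-free: update a look-up table from sampled experience
    q_mf, alpha = np.zeros(2), 0.1
    for _ in range(500):
        a = rng.integers(2)
        s2 = rng.choice(2, p=P[a])
        reward = float(rng.random() < r[s2])
        q_mf[a] += alpha * (reward - q_mf[a])

    print("model-based:", q_mb, "model-free:", q_mf)
    ```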

  17. Mining the Genome for Therapeutic Targets.

    PubMed

    Florez, Jose C

    2017-07-01

    Current pharmacological options for type 2 diabetes do not cure the disease. Despite the availability of multiple drug classes that modulate glycemia effectively and minimize long-term complications, these agents do not reverse pathogenesis, and in practice they are not selected to correct the molecular profile specific to the patient. Pharmaceutical companies find drug development programs increasingly costly and burdensome, and many promising compounds fail before launch to market. Human genetics can help advance the therapeutic enterprise. Genomic discovery that is agnostic to preexisting knowledge has uncovered dozens of loci that influence glycemic dysregulation. Physiological investigation has begun to define disease subtypes, clarifying heterogeneity and suggesting molecular pathways for intervention. Convincing genetic associations have paved the way for the identification of effector transcripts that underlie the phenotype, and genetic or experimental proof of gain or loss of function in select cases has clarified the direction of effect to guide therapeutic development. Genetic studies can also examine off-target effects and furnish causal inference. As this information is curated and made widely available to all stakeholders, it is hoped that it will enhance therapeutic development pipelines by accelerating efficiency, maximizing cost-effectiveness, and raising ultimate success rates. © 2017 by the American Diabetes Association.

  18. Combination therapeutics in complex diseases.

    PubMed

    He, Bing; Lu, Cheng; Zheng, Guang; He, Xiaojuan; Wang, Maolin; Chen, Gao; Zhang, Ge; Lu, Aiping

    2016-12-01

    The biological redundancies in molecular networks of complex diseases limit the efficacy of many single drug therapies. Combination therapeutics, as a common therapeutic method, involve pharmacological intervention using several drugs that interact with multiple targets in the molecular networks of diseases and may achieve better efficacy and/or less toxicity than monotherapy in practice. The development of combination therapeutics is complicated by several critical issues, including identifying multiple targets, targeting strategies and the drug combination. This review summarizes the current achievements in combination therapeutics, with a particular emphasis on the efforts to develop combination therapeutics for complex diseases.

  19. Gravitational correction to vacuum polarization

    NASA Astrophysics Data System (ADS)

    Jentschura, U. D.

    2015-02-01

    We consider the gravitational correction to (electronic) vacuum polarization in the presence of a gravitational background field. The Dirac propagators for the virtual fermions are modified to include the leading gravitational correction (potential term) which corresponds to a coordinate-dependent fermion mass. The mass term is assumed to be uniform over a length scale commensurate with the virtual electron-positron pair. The on-mass shell renormalization condition ensures that the gravitational correction vanishes on the mass shell of the photon, i.e., the speed of light is unaffected by the quantum field theoretical loop correction, in full agreement with the equivalence principle. Nontrivial corrections are obtained for off-shell, virtual photons. We compare our findings to other works on generalized Lorentz transformations and combined quantum-electrodynamic gravitational corrections to the speed of light which have recently appeared in the literature.

  20. Individualized correction of insulin measurement in hemolyzed serum samples.

    PubMed

    Wu, Zhi-Qi; Lu, Ju; Chen, Huanhuan; Chen, Wensen; Xu, Hua-Guo

    2016-11-05

    Insulin measurement plays a key role in the investigation of patients with hypoglycemia, subtype classification of diabetes mellitus, insulin resistance, and impaired beta cell function. However, even slight hemolysis can negatively affect insulin measurement due to the insulin-degrading enzyme (IDE) released from red blood cells. Here, we derived and validated an individualized correction equation in an attempt to eliminate the effects of hemolysis on insulin measurement. The effects of hemolysis on insulin measurement were studied by adding lysed self-RBCs to serum. A correction equation was derived, accounting for both the percentage and the exposure time of hemolysis. The performance of this individualized correction was evaluated in intentionally hemolyzed samples. Insulin concentration decreased with increasing percentage and exposure time of hemolysis. Based on the effects of hemolysis on insulin measurement in 17 donors (baseline insulin concentrations ranged from 156 to 2119 pmol/L), the individualized hemolysis correction equation was derived: INS_corr = INS_meas / (0.705 lg(Hb_plasma/Hb_serum) - 0.001 Time - 0.612). This equation reverted the insulin concentrations of the intentionally hemolyzed samples to values that were statistically not different from the corresponding baseline insulin concentrations (p = 0.1564). Hemolysis can thus lead to a negative interference on insulin measurement; with an individualized hemolysis correction equation, we can correct and report reliable serum insulin results for a wide range of degrees of sample hemolysis. This correction would increase diagnostic accuracy, reduce inappropriate therapeutic decisions, and improve patient satisfaction with care.
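
    The published equation is straightforward to apply. A minimal sketch, assuming Time is expressed in the study's units and reading "lg" as the base-10 logarithm; the variable names are ours:

    ```python
    import math

    def corrected_insulin(ins_meas, hb_plasma, hb_serum, exposure_time):
        """Individualized hemolysis correction for a measured insulin value.

        ins_meas: measured insulin (pmol/L); hb_plasma / hb_serum: hemoglobin
        concentrations quantifying the degree of hemolysis; exposure_time in
        the study's time units (not stated in the abstract).
        """
        denom = (0.705 * math.log10(hb_plasma / hb_serum)
                 - 0.001 * exposure_time - 0.612)
        if denom <= 0:
            raise ValueError("correction undefined for this degree of hemolysis")
        return ins_meas / denom
    ```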

  1. Non-linear control logics for vibrations suppression: a comparison between model-based and non-model-based techniques

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Orsini, Lorenzo; Resta, Ferruccio

    2015-04-01

    Non-linear behavior is present in the operating conditions of many mechanical systems. In these cases, a common engineering practice is to linearize the equation of motion around a particular operating point and to design a linear controller. The main disadvantage is that the stability properties and validity of the controller are only local. In order to improve controller performance, non-linear control techniques represent a very attractive solution for many smart structures. The aim of this paper is to compare non-linear model-based and non-model-based control techniques. In particular, the model-based sliding-mode control (SMC) technique is considered because of its easy implementation and the strong robustness of the controller even under heavy model uncertainties. Among the non-model-based control techniques, fuzzy control (FC), which allows the controller to be designed according to if-then rules, has been considered. It defines the controller without a reference model of the system, offering advantages such as intrinsic robustness. These techniques have been tested on a non-linear pendulum system.
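
    A minimal sliding-mode controller for a pendulum shows why the technique is attractive: the switching term enforces convergence to the sliding surface even when the model used for the equivalent control is imperfect. Plant, gains, and the smoothed switching function below are illustrative choices, not the paper's setup:

    ```python
    import numpy as np

    # Pendulum: theta'' = -(g/l) sin(theta) + u
    g_over_l, lam, k, dt = 9.81, 4.0, 12.0, 1e-3
    theta, omega, theta_ref = np.pi / 3, 0.0, 0.0

    for _ in range(5000):
        e, edot = theta - theta_ref, omega
        s = edot + lam * e                            # sliding surface
        u_eq = g_over_l * np.sin(theta) - lam * edot  # equivalent control
        u = u_eq - k * np.tanh(s / 0.05)              # smoothed switching term
        omega += (-g_over_l * np.sin(theta) + u) * dt
        theta += omega * dt

    print(abs(theta))  # converges toward the reference despite the nonlinearity
    ```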

  2. Processor register error correction management

    DOEpatents

    Bose, Pradip; Cher, Chen-Yong; Gupta, Meeta S.

    2016-12-27

    Processor register protection management is disclosed. In embodiments, a method of processor register protection management can include determining a sensitive logical register for executable code generated by a compiler, generating an error-correction table identifying the sensitive logical register, and storing the error-correction table in a memory accessible by a processor. The processor can be configured to generate a duplicate register of the sensitive logical register identified by the error-correction table.

  3. Antibody Engineering and Therapeutics

    PubMed Central

    Almagro, Juan Carlos; Gilliland, Gary L; Breden, Felix; Scott, Jamie K; Sok, Devin; Pauthner, Matthias; Reichert, Janice M; Helguera, Gustavo; Andrabi, Raiees; Mabry, Robert; Bléry, Mathieu; Voss, James E; Laurén, Juha; Abuqayyas, Lubna; Barghorn, Stefan; Ben-Jacob, Eshel; Crowe, James E; Huston, James S; Johnston, Stephen Albert; Krauland, Eric; Lund-Johansen, Fridtjof; Marasco, Wayne A; Parren, Paul WHI; Xu, Kai Y

    2014-01-01

    The 24th Antibody Engineering & Therapeutics meeting brought together a broad range of participants who were updated on the latest advances in antibody research and development. Organized by IBC Life Sciences, the gathering is the annual meeting of The Antibody Society, which serves as the scientific sponsor. Preconference workshops on 3D modeling and delineation of clonal lineages were featured, and the conference included sessions on a wide variety of topics relevant to researchers, including systems biology; antibody deep sequencing and repertoires; the effects of antibody gene variation and usage on antibody response; directed evolution; knowledge-based design; antibodies in a complex environment; polyreactive antibodies and polyspecificity; the interface between antibody therapy and cellular immunity in cancer; antibodies in cardiometabolic medicine; antibody pharmacokinetics, distribution and off-target toxicity; optimizing antibody formats for immunotherapy; polyclonals, oligoclonals and bispecifics; antibody discovery platforms; and antibody-drug conjugates. PMID:24589717

  4. Outpatient therapeutic nuclear oncology.

    PubMed

    Turner, J Harvey

    2012-05-01

    In the beginning, nuclear medicine was radionuclide therapy, which has evolved into molecular tumour-targeted control of metastatic cancer. Safe, efficacious clinical practice of therapeutic nuclear oncology may now be based upon accurate personalised dosimetry by quantitative gamma SPECT/CT imaging to prescribe tumoricidal activities without critical organ toxicity. Preferred therapy radionuclides possess gamma emission of modest energy and abundance to enable quantitative SPECT/CT imaging for calculation of the beta therapy dosimetry, without radiation exposure risk to hospital personnel, carers, family or members of the public. The safety of outpatient radiopharmaceutical therapy of cancer with Iodine-131, Samarium-153, Holmium-166, Rhenium-186, Rhenium-188, Lutetium-177 and Indium-111 is reviewed. Measured activity release rates and radiation exposure to carers and the public are all within the recommendations and guidelines of international regulatory agencies and, when permitted by local regulatory authorities, allow cost-effective, safe, outpatient radionuclide therapy of cancer without isolation in hospital.

  5. Antimicrobial peptides: therapeutic potentials.

    PubMed

    Kang, Su-Jin; Park, Sung Jean; Mishig-Ochir, Tsogbadrakh; Lee, Bong-Jin

    2014-12-01

    The increasing appearance of multidrug-resistant pathogens has created an urgent need for suitable alternatives to current antibiotics. Antimicrobial peptides (AMPs), which act as defensive weapons against microbes, have received great attention because of broad-spectrum activities, unique action mechanisms and rare antibiotic-resistant variants. Despite desirable characteristics, they have shown limitations in pharmaceutical development due to toxicity, stability and manufacturing costs. Because of these drawbacks, only a few AMPs have been tested in Phase III clinical trials and no AMPs have been approved by the US FDA yet. However, these obstacles could be overcome by well-known methods such as changing physicochemical characteristics and introducing nonnatural amino acids, acetylation or amidation, as well as modern techniques like molecular targeted AMPs, liposomal formulations and drug delivery systems. Thus, the current challenge in this field is to develop therapeutic AMPs at a reasonable cost as well as to overcome the limitations.

  6. Mitochondrial Energetics and Therapeutics

    PubMed Central

    Wallace, Douglas C.; Fan, Weiwei; Procaccio, Vincent

    2011-01-01

    Mitochondrial dysfunction has been linked to a wide range of degenerative and metabolic diseases, cancer, and aging. All these clinical manifestations arise from the central role of bioenergetics in cell biology. Although genetic therapies are maturing as the rules of bioenergetic genetics are clarified, metabolic therapies have been ineffectual. This failure results from our limited appreciation of the role of bioenergetics as the interface between the environment and the cell. A systems approach, which, ironically, was first successfully applied over 80 years ago with the introduction of the ketogenic diet, is required. Analysis of the many ways that a shift from carbohydrate glycolytic metabolism to fatty acid and ketone oxidative metabolism may modulate metabolism, signal transduction pathways, and the epigenome gives us an appreciation of the ketogenic diet and the potential for bioenergetic therapeutics. PMID:20078222

  7. Aptamers in Therapeutics

    PubMed Central

    2016-01-01

    Aptamers are single-stranded DNA or RNA molecules selected by an iterative process known as Systematic Evolution of Ligands by Exponential Enrichment (SELEX). Advantages such as high temperature stability, animal-free and cost-effective production, and high affinity and selectivity for their targets make aptamers attractive alternatives to monoclonal antibodies for diagnostic and therapeutic purposes. An aptamer has been generated against vascular endothelial growth factor 165, which is involved in age-related macular degeneration; Macugen was the first FDA-approved aptamer-based drug to be commercialized. Other aptamers were later developed against blood clotting proteins, cancer proteins, immunoglobulin E, agents involved in diabetic nephropathy, autoantibodies involved in autoimmune disorders, etc. Aptamers have also been developed against viruses and could work with other antiviral agents in treating infections. PMID:27504277

  8. Microfabricated therapeutic actuators

    DOEpatents

    Lee, Abraham P.; Northrup, M. Allen; Ciarlo, Dino R.; Krulevitch, Peter A.; Benett, William J.

    1999-01-01

    Microfabricated therapeutic actuators are fabricated using a shape memory polymer (SMP), a polyurethane-based material that undergoes a phase transformation at a specified temperature (Tg). At a temperature above Tg, the material is soft and can easily be reshaped into another configuration. As the temperature is lowered below Tg, the new shape is fixed and locked in as long as the material stays below Tg. Upon reheating the material to a temperature above Tg, it returns to its original shape. Using such SMP material, SMP microtubing can serve as a release actuator for the delivery of embolic coils through catheters into aneurysms, for example. The microtubing can be manufactured in various sizes, and the phase-change temperature Tg can be set for the intended temperature target and use.

  9. Microfabricated therapeutic actuators

    DOEpatents

    Lee, A.P.; Northrup, M.A.; Ciarlo, D.R.; Krulevitch, P.A.; Benett, W.J.

    1999-06-15

    Microfabricated therapeutic actuators are fabricated using a shape memory polymer (SMP), a polyurethane-based material that undergoes a phase transformation at a specified temperature (Tg). At a temperature above Tg, the material is soft and can easily be reshaped into another configuration. As the temperature is lowered below Tg, the new shape is fixed and locked in as long as the material stays below Tg. Upon reheating the material to a temperature above Tg, it returns to its original shape. Using such SMP material, SMP microtubing can serve as a release actuator for the delivery of embolic coils through catheters into aneurysms, for example. The microtubing can be manufactured in various sizes, and the phase-change temperature Tg can be set for the intended temperature target and use. 8 figs.

  10. Antioxidant therapeutics: Pandora's box.

    PubMed

    Day, Brian J

    2014-01-01

    Evolution has favored the utilization of dioxygen (O2) in the development of complex multicellular organisms. O2 is actually a toxic mutagenic gas that is highly oxidizing and combustible. It is thought that plants are largely to blame for polluting the earth's atmosphere with O2 owing to the development of photosynthesis by blue-green algae over 2 billion years ago. The rise of the plants and atmospheric O2 levels placed evolutionary stress on organisms to adapt or become extinct. This implies that all the surviving creatures on our planet are mutants that have adapted to the "abnormal biology" of O2. Much of the adaptation to the presence of O2 in biological systems comes from well-coordinated antioxidant and repair systems that focus on converting O2 to its most reduced form, water (H2O), and the repair and replacement of damaged cellular macromolecules. Biological systems have also harnessed O2's reactive properties for energy production, xenobiotic metabolism, and host defense and as a signaling messenger and redox modulator of a number of cell signaling pathways. Many of these systems involve electron transport systems and offer many different mechanisms by which antioxidant therapeutics can alternatively produce an antioxidant effect without directly scavenging oxygen-derived reactive species. It is likely that each agent will have a different set of mechanisms that may change depending on the model of oxidative stress, organ system, or disease state. An important point is that all biological processes of aerobes have coevolved with O2 and this creates a Pandora's box for trying to understand the mechanism(s) of action of antioxidants being developed as therapeutic agents. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. In Situ Mosaic Brightness Correction

    NASA Technical Reports Server (NTRS)

    Deen, Robert G.; Lorre, Jean J.

    2012-01-01

    In situ missions typically have pointable, mast-mounted cameras, which are capable of taking panoramic mosaics composed of many individual frames that are then mosaicked together. While the mosaic software applies radiometric correction to the images, in many cases brightness/contrast seams still exist between frames. This is largely due to errors in the radiometric correction and the absence of correction for photometric effects in the mosaic processing chain. The software analyzes the overlaps between adjacent frames in the mosaic and determines correction factors for each image in an attempt to reduce or eliminate these brightness seams.
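
    One standard way to compute such correction factors is a least-squares fit of per-frame gains to the brightness statistics of the overlap regions. The abstract does not specify the estimator used by the flight software, so the formulation below is a generic sketch:

    ```python
    import numpy as np

    def solve_gains(overlaps, n_frames, reg=1e-3):
        """Least-squares per-frame brightness gains from overlap statistics.

        overlaps: list of (i, j, mean_i, mean_j), where mean_i and mean_j are
        the average brightness of frames i and j inside their shared overlap.
        Minimizes sum (g_i*mean_i - g_j*mean_j)^2 with a weak pull toward
        g = 1 to fix the overall scale.
        """
        A, b = [], []
        for i, j, mi, mj in overlaps:
            row = np.zeros(n_frames); row[i], row[j] = mi, -mj
            A.append(row); b.append(0.0)
        for i in range(n_frames):            # regularization rows: g_i ~ 1
            row = np.zeros(n_frames); row[i] = reg
            A.append(row); b.append(reg)
        g, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
        return g

    print(solve_gains([(0, 1, 105.0, 98.0), (1, 2, 101.0, 110.0)], 3))
    ```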

  12. A panic attack in therapeutic recreation over being considered therapeutic.

    PubMed

    Lee, L L

    1987-01-01

    Ancillary professions have been called upon to account for therapeutic benefits from their services or be eliminated from the health care system. A singular focus on therapy, however, would negate the unique contribution of therapeutic recreation within the health care system while simultaneously restricting services to health care settings. It is proposed that panic over therapeutic recreation services meeting health care goals has hindered evaluation and solidification of the leisure-based philosophy presented in the NTRS Philosophical Position Statement (NTRS, 1982). It is argued that emphasizing the leisure orientation of the philosophical position statement can secure therapeutic recreation's position within the health care system without denying services to those outside it. An overview is presented of the adequacy of the position statement philosophy for therapeutic recreation. A potential danger of attempting to explain therapeutic recreation in terms of non-leisure-based philosophies is also discussed.

  13. [Systems analysis of colour music corrective effect].

    PubMed

    Gumeniuk, V A; Batova, N Ia; Mel'nikova, T S; Glazachev, O S; Golubeva, N K; Klimina, N V; Hubner, P

    1998-01-01

    In the context of P. K. Anokhin's theory of functional systems, the corrective effects of various combinations of medical therapeutic resonance music (MTRM) and dynamic colour exposure were analyzed. As compared to rehabilitative music programmes, MTRM was shown to have a more pronounced relaxing effect, manifested both in the optimization of emotion and in the activity of autonomic regulation of cardiovascular functions. With combined MTRM and dynamic colour flow exposure, the relaxing effect is most marked. In the examinees, personal and situational anxiety diminished, mood improved, cardiovascular parameters normalized, and the rate of metabolic processes and muscular rigidity decreased, while the spectral power of the alpha rhythm increased, predominantly in the anterior region of the brain. The findings suggest that the chosen approach is highly effective in normalizing human functional status.

  14. New orbit correction method uniting global and local orbit corrections

    NASA Astrophysics Data System (ADS)

    Nakamura, N.; Takaki, H.; Sakai, H.; Satoh, M.; Harada, K.; Kamiya, Y.

    2006-01-01

    A new orbit correction method, called the eigenvector method with constraints (EVC), is proposed and formulated to unite global and local orbit corrections for ring accelerators, especially synchrotron radiation (SR) sources. The EVC can exactly correct the beam positions at arbitrarily selected ring positions, such as light source points, while simultaneously reducing closed orbit distortion (COD) around the whole ring. Computer simulations clearly demonstrate these features of the EVC for both the Super-SOR light source and the Advanced Light Source (ALS), which have typical structures of high-brilliance SR sources. In addition, the effects of errors in beam position monitor (BPM) reading and steering magnet setting on the orbit correction are expressed analytically and compared with the computer simulations. Simulation results show that the EVC is very effective and useful for orbit correction and beam position stabilization in SR sources.
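
    The idea of zeroing the orbit exactly at selected monitors while reducing the rms orbit everywhere else can be written as an equality-constrained least-squares problem. The sketch below solves the corresponding KKT system with synthetic data; it captures the generic formulation, not the authors' exact eigenvector algorithm:

    ```python
    import numpy as np

    def evc_correction(R, x, constrained):
        """Steering corrections that zero the orbit at selected BPMs exactly
        while minimizing the residual orbit at all the others.

        R: orbit response matrix (BPMs x steerers); x: measured orbit;
        constrained: indices of BPMs (e.g., light source points) to zero.
        """
        C, xc = R[constrained, :], x[constrained]
        n, m = R.shape[1], len(constrained)
        K = np.block([[R.T @ R, C.T], [C, np.zeros((m, m))]])
        rhs = np.concatenate([-R.T @ x, -xc])
        return np.linalg.solve(K, rhs)[:n]

    rng = np.random.default_rng(2)
    R, x = rng.normal(size=(20, 8)), rng.normal(size=20)
    theta = evc_correction(R, x, [3, 12])
    print(np.abs(x + R @ theta)[[3, 12]])  # ~0 at the constrained monitors
    ```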

  15. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of a program to transform natural language specifications into formal notation, specifically to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification and design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, and the high learning curve for specification languages and associated tools, combined with schedule and budget pressure on projects, reduces training opportunities for engineers; and (4) formulating correctness properties for system models can be a difficult problem. This is relevant to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments, technology transfer potential, and next steps.
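
    At its simplest, such a translation can be driven by a table of natural-language patterns mapped to LTL templates. The toy mapper below is invented for illustration; a production system would need real parsing and disambiguation:

    ```python
    import re

    # Stylized requirement patterns -> LTL templates (illustrative only)
    PATTERNS = [
        (re.compile(r"globally, (\w+) shall always hold"), r"G(\1)"),
        (re.compile(r"(\w+) shall eventually hold"),       r"F(\1)"),
        (re.compile(r"after (\w+), (\w+) shall hold"),     r"G(\1 -> F(\2))"),
    ]

    def to_ltl(requirement):
        req = requirement.lower().rstrip(".")
        for pattern, template in PATTERNS:
            if pattern.fullmatch(req):
                return pattern.sub(template, req)
        raise ValueError("no pattern matched: " + requirement)

    print(to_ltl("Globally, safe_mode shall always hold"))  # G(safe_mode)
    print(to_ltl("After fault, recovery shall hold"))       # G(fault -> F(recovery))
    ```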

  16. Emerging Mitochondrial Therapeutic Targets in Optic Neuropathies.

    PubMed

    Lopez Sanchez, M I G; Crowston, J G; Mackey, D A; Trounce, I A

    2016-09-01

    Optic neuropathies are an important cause of blindness worldwide. The study of the most common inherited mitochondrial optic neuropathies, Leber hereditary optic neuropathy (LHON) and autosomal dominant optic atrophy (ADOA) has highlighted a fundamental role for mitochondrial function in the survival of the affected neuron-the retinal ganglion cell. A picture is now emerging that links mitochondrial dysfunction to optic nerve disease and other neurodegenerative processes. Insights gained from the peculiar susceptibility of retinal ganglion cells to mitochondrial dysfunction are likely to inform therapeutic development for glaucoma and other common neurodegenerative diseases of aging. Despite it being a fast-evolving field of research, a lack of access to human ocular tissues and limited animal models of mitochondrial disease have prevented direct retinal ganglion cell experimentation and delayed the development of efficient therapeutic strategies to prevent vision loss. Currently, there are no approved treatments for mitochondrial disease, including optic neuropathies caused by primary or secondary mitochondrial dysfunction. Recent advances in eye research have provided important insights into the molecular mechanisms that mediate pathogenesis, and new therapeutic strategies including gene correction approaches are currently being investigated. Here, we review the general principles of mitochondrial biology relevant to retinal ganglion cell function and provide an overview of the major optic neuropathies with mitochondrial involvement, LHON and ADOA, whilst highlighting the emerging link between mitochondrial dysfunction and glaucoma. The pharmacological strategies currently being trialed to improve mitochondrial dysfunction in these optic neuropathies are discussed in addition to emerging therapeutic approaches to preserve retinal ganglion cell function. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  18. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has been known for a long time, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
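
    The classic sequential detector at the heart of this framework is Wald's sequential probability ratio test (SPRT): accumulate the log-likelihood ratio sample by sample and stop as soon as it crosses one of two thresholds. A minimal sketch for two Gaussian means; in the model-based setting described in the chapter, the raw samples would be replaced by innovations from a Kalman-type processor:

    ```python
    import numpy as np

    def sprt_gaussian(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
        """Wald SPRT for H0: mean mu0 vs H1: mean mu1, known sigma."""
        upper = np.log((1 - beta) / alpha)   # accept H1 above this
        lower = np.log(beta / (1 - alpha))   # accept H0 below this
        llr = 0.0
        for k, z in enumerate(samples, 1):
            llr += (mu1 - mu0) / sigma**2 * (z - 0.5 * (mu0 + mu1))
            if llr >= upper:
                return "H1", k
            if llr <= lower:
                return "H0", k
        return "undecided", len(samples)

    rng = np.random.default_rng(3)
    print(sprt_gaussian(rng.normal(0.5, 1.0, 200), mu0=0.0, mu1=0.5, sigma=1.0))
    ```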

  19. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

    Sequential detection theory has been known for a long time, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  20. Model-based inversion for a shallow ocean application

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1994-03-01

    A model-based approach to invert or estimate the sound speed profile (SSP) from noisy pressure-field measurements is discussed. The resulting model-based processor (MBP) is based on the state-space representation of the normal-mode propagation model. Using data obtained from the well-known Hudson Canyon experiment, a noisy shallow water ocean environment, the processor is designed and the results compared to those predicted using various propagation models and data. It is shown that the MBP not only predicts the sound speed quite well, but also is able to simultaneously provide enhanced estimates of both modal and pressure-field measurements which are useful for localization and rapid ocean environmental characterization.

  1. Model based control of a rehabilitation robot for lower extremities.

    PubMed

    Xie, Xiao-Liang; Hou, Zeng-Guang; Li, Peng-Feng; Ji, Cheng; Zhang, Feng; Tan, Min; Wang, Hongbo; Hu, Guoqing

    2010-01-01

    This paper mainly focuses on the trajectory tracking control of a lower extremity rehabilitation robot during the passive training of patients. First, a mathematical model of the rehabilitation robot is derived using Lagrangian analysis. Then, a model-based computed-torque control scheme is designed to control the constrained four-link robot (with the patient's foot fixed on the robot's end-effector) to track a predefined trajectory. Simulation results are provided to illustrate the effectiveness of the proposed model-based computed-torque algorithm. In the simulation, the multi-body dynamics and motion software ADAMS is used; the combined simulation of ADAMS and MATLAB produces more realistic results for this complex integrated system.
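
    The computed-torque law itself is standard: feedback-linearize the manipulator dynamics M(q)q'' + C(q,q')q' + g(q) = tau and impose linear error dynamics. The generic sketch below uses a single-link example; the paper's four-link constrained robot and ADAMS co-simulation are not reproduced:

    ```python
    import numpy as np

    def computed_torque(q, qd, q_ref, qd_ref, qdd_ref, M, C, g, Kp, Kd):
        """tau = M(q)(qdd_ref + Kd*edot + Kp*e) + C(q,qd)qd + g(q)."""
        e, edot = q_ref - q, qd_ref - qd
        v = qdd_ref + Kd @ edot + Kp @ e        # auxiliary acceleration
        return M(q) @ v + C(q, qd) @ qd + g(q)  # required joint torques

    # Single link: M = m*l^2, C = 0, g = m*g0*l*cos(q)
    m, l, g0 = 1.0, 0.5, 9.81
    tau = computed_torque(
        q=np.array([0.1]), qd=np.array([0.0]),
        q_ref=np.array([0.5]), qd_ref=np.array([0.0]), qdd_ref=np.array([0.0]),
        M=lambda q: np.array([[m * l**2]]),
        C=lambda q, qd: np.array([[0.0]]),
        g=lambda q: np.array([m * g0 * l * np.cos(q[0])]),
        Kp=np.array([[25.0]]), Kd=np.array([[10.0]]),
    )
    print(tau)
    ```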

  2. Hierarchical model-based interferometric synthetic aperture radar image registration

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Huang, Haifeng; Dong, Zhen; Wu, Manqing

    2014-01-01

    With the rapid development of spaceborne interferometric synthetic aperture radar technology, classical image registration methods cannot deliver the efficiency and accuracy required for processing large volumes of real data. Based on this fact, we propose a new method consisting of two steps: coarse registration, realized by a cross-correlation algorithm, and fine registration, realized by a hierarchical model-based algorithm. The hierarchical model-based algorithm is a highly efficient optimization algorithm. Its key features are a global model that constrains the overall structure of the estimated motion, a local model that is used in the estimation process, and a coarse-to-fine refinement strategy. Experimental results from different kinds of simulated and real data have confirmed that the proposed method is very fast and has high accuracy. Compared with a conventional cross-correlation method, the proposed method provides markedly improved performance.
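
    The coarse registration stage can be sketched with an FFT-based correlation; the variant below uses phase correlation, a common choice for recovering integer-pixel offsets (the paper's hierarchical model-based fine stage is not reproduced):

    ```python
    import numpy as np

    def coarse_offset(master, slave):
        """Integer-pixel offset of slave relative to master via phase correlation."""
        F = np.fft.fft2(slave) * np.conj(np.fft.fft2(master))
        corr = np.abs(np.fft.ifft2(F / (np.abs(F) + 1e-12)))
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

    rng = np.random.default_rng(4)
    img = rng.normal(size=(128, 128))
    shifted = np.roll(img, (5, -9), axis=(0, 1))
    print(coarse_offset(img, shifted))  # ~ (5, -9)
    ```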

  3. Outlier Identification in Model-Based Cluster Analysis.

    PubMed

    Evans, Katie; Love, Tanzy; Thurston, Sally W

    2015-04-01

    In model-based clustering based on normal-mixture models, a few outlying observations can influence the cluster structure and number. This paper develops a method to identify such observations; however, it does not attempt to identify clusters amidst a large field of noisy observations. We identify outliers as those observations in a cluster with minimal membership proportion, or for which the cluster-specific variance with and without the observation is very different. Results from a simulation study demonstrate the ability of our method to detect true outliers without falsely identifying many non-outliers, and improved performance over other approaches under most scenarios. We use the contributed R package MCLUST for model-based clustering, but propose a modified prior for the cluster-specific variance which avoids degeneracies in the estimation procedures. We also compare results from our outlier method to published results on National Hockey League data.
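
    The paper works in R with MCLUST and a modified variance prior; the Python sketch below only illustrates the second flagging criterion, how much a cluster's variance changes when one observation is left out, on synthetic data with an injected outlier:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(5)
    X = np.vstack([rng.normal(0, 1, (100, 2)),
                   rng.normal(6, 1, (100, 2)),
                   [[14.0, 14.0]]])                      # one gross outlier

    gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
    labels = gmm.predict(X)

    # Flag observations whose cluster-specific variance changes markedly
    # when the observation is left out.
    ratios = np.zeros(len(X))
    for i in range(len(X)):
        members = X[labels == labels[i]]
        loo = members[~np.all(members == X[i], axis=1)]  # leave the point out
        ratios[i] = np.trace(np.cov(members.T)) / np.trace(np.cov(loo.T))
    print(np.argsort(ratios)[-3:])  # indices with the largest variance inflation
    ```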

  4. Outlier Identification in Model-Based Cluster Analysis

    PubMed Central

    Evans, Katie; Love, Tanzy; Thurston, Sally W.

    2015-01-01

    In model-based clustering based on normal-mixture models, a few outlying observations can influence the cluster structure and number. This paper develops a method to identify such observations; however, it does not attempt to identify clusters amidst a large field of noisy observations. We identify outliers as those observations in a cluster with minimal membership proportion, or for which the cluster-specific variance with and without the observation is very different. Results from a simulation study demonstrate the ability of our method to detect true outliers without falsely identifying many non-outliers, and improved performance over other approaches under most scenarios. We use the contributed R package MCLUST for model-based clustering, but propose a modified prior for the cluster-specific variance which avoids degeneracies in the estimation procedures. We also compare results from our outlier method to published results on National Hockey League data. PMID:26806993

  5. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer-aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in the formats required by various model-based reasoning tools.

  6. REAL-TIME MODEL-BASED ELECTRICAL POWERED WHEELCHAIR CONTROL

    PubMed Central

    Wang, Hongwu; Salatin, Benjamin; Grindle, Garrett G.; Ding, Dan; Cooper, Rory A.

    2009-01-01

    The purpose of this study was to evaluate the effects of three different control methods on driving speed variation and wheel-slip of an electric-powered wheelchair (EPW). A kinematic model as well as a 3-D dynamic model was developed to control the velocity and traction of the wheelchair. A smart wheelchair platform was designed and built with a computerized controller and encoders to record wheel speeds and to detect slip. A model-based, a proportional-integral-derivative (PID) and an open-loop controller were applied with the EPW driving on four different surfaces at three specified speeds. The speed errors, variation, rise time, settling time and slip coefficient were calculated and compared for a speed step-response input. Experimental results showed that model-based control performed best on all surfaces across the speeds. PMID:19733494

  7. Identifying Model-Based Reconfiguration Goals through Functional Deficiencies

    NASA Technical Reports Server (NTRS)

    Benazera, Emmanuel; Trave-Massuyes, Louise

    2004-01-01

    Model-based diagnosis is now advanced to the point where autonomous systems can face some uncertain and faulty situations with success. The next step toward more autonomy is to have the system recover itself after faults occur, a process known as model-based reconfiguration. After faults occur, given a prediction of the nominal behavior of the system and the result of the diagnosis operation, this paper details how to automatically determine the functional deficiencies of the system. These deficiencies are characterized in the case of uncertain state estimates. A methodology is then presented to determine the reconfiguration goals based on the deficiencies. Finally, a recovery process interleaves planning and model predictive control to restore the functionalities in prioritized order.

  8. Model-based reinforcement learning with dimension reduction.

    PubMed

    Tangkaratt, Voot; Morimoto, Jun; Sugiyama, Masashi

    2016-12-01

    The goal of reinforcement learning is to learn an optimal policy which controls an agent to acquire the maximum cumulative reward. The model-based reinforcement learning approach learns a transition model of the environment from data, and then derives the optimal policy using the transition model. However, learning an accurate transition model in high-dimensional environments requires a large amount of data which is difficult to obtain. To overcome this difficulty, in this paper, we propose to combine model-based reinforcement learning with the recently developed least-squares conditional entropy (LSCE) method, which simultaneously performs transition model estimation and dimension reduction. We also further extend the proposed method to imitation learning scenarios. The experimental results show that policy search combined with LSCE performs well for high-dimensional control tasks including real humanoid robot control.

  9. MTK: An AI tool for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Rudokas, Mary R.

    1988-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control, and trend analysis of the Space Station Thermal Control System (TCS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined here with examples from the thermal system to highlight the motivating factors behind them, followed by an overview of the capabilities of MTK, which was developed to address these issues in a generic fashion.

  10. MTK: An AI tool for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  11. Model Based Document and Report Generation for Systems Engineering

    NASA Technical Reports Server (NTRS)

    Delp, Christopher; Lam, Doris; Fosse, Elyse; Lee, Cin-Young

    2013-01-01

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  12. Fuzzy model-based observers for fault detection in CSTR.

    PubMed

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

    Given the vast variety of fuzzy model-based observers reported in the literature, which would be the proper one to use for fault detection in a given class of chemical reactor? In this study, four fuzzy model-based observers for sensor fault detection of a Continuous Stirred Tank Reactor were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) an Utkin fuzzy observer. A negative fault signal, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
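
    The building block shared by all four designs is the observer-based residual generator: run a model in parallel with the plant, feed back the output error, and monitor the residual for faults. Below is a discrete-time linear Luenberger sketch with an injected sensor fault; the matrices are invented for illustration, and the paper's fuzzy (Takagi-Sugeno) machinery is not reproduced:

    ```python
    import numpy as np

    # Linear(ized) plant surrogate: x+ = A x + B u, y = C x
    A = np.array([[0.95, 0.05], [0.0, 0.90]])
    B = np.array([[0.0], [0.1]])
    C = np.array([[1.0, 0.0]])
    L = np.array([[0.4], [0.2]])   # observer gain (A - L C is stable)

    x, xhat = np.array([[1.0], [0.5]]), np.zeros((2, 1))
    for k in range(100):
        u = np.array([[1.0]])
        y = C @ x
        if k >= 60:
            y = y + 0.5                    # additive sensor fault injected
        residual = y - C @ xhat            # persistent residual flags the fault
        xhat = A @ xhat + B @ u + L @ residual
        x = A @ x + B @ u
        if k in (59, 60, 99):
            print(k, residual.item())      # ~0 before the fault, nonzero after
    ```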

  14. Real-time model based electrical powered wheelchair control.

    PubMed

    Wang, Hongwu; Salatin, Benjamin; Grindle, Garrett G; Ding, Dan; Cooper, Rory A

    2009-12-01

    The purpose of this study was to evaluate the effects of three different control methods on driving speed variation and wheel slip of an electric-powered wheelchair (EPW). A kinematic model as well as a 3D dynamic model was developed to control the velocity and traction of the wheelchair. A smart wheelchair platform was designed and built with a computerized controller and encoders to record wheel speeds and to detect slip. Model-based, proportional-integral-derivative (PID), and open-loop controllers were applied with the EPW driving on four different surfaces at three specified speeds. The speed errors, variation, rise time, settling time and slip coefficient were calculated and compared for a speed step-response input. Experimental results showed that model-based control performed best on all surfaces across the speeds.
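
    The step-response comparison can be mimicked with a toy simulation. The sketch below (assumed first-order wheel dynamics, invented gains and surface drag, not the paper's models) shows why closed-loop control removes the steady-state speed error that an open-loop command leaves behind on a draggy surface.

    # First-order wheel-speed plant with surface drag; PID integral action
    # rejects the drag offset that the open-loop command cannot.
    def simulate(controller, tau=0.4, drag=0.3, dt=0.01, t_end=4.0, target=1.0):
        v, integ, prev_err = 0.0, 0.0, 0.0
        for _ in range(int(t_end / dt)):
            err = target - v
            integ += err * dt
            u = controller(err, integ, (err - prev_err) / dt)
            prev_err = err
            v += dt * (-v + u - drag) / tau     # wheel-speed dynamics
        return v

    pid = lambda e, i, d: 2.0 * e + 1.5 * i + 0.05 * d
    open_loop = lambda e, i, d: 1.0             # constant command, no feedback
    print("PID final speed:", round(simulate(pid), 3))
    print("open-loop final speed:", round(simulate(open_loop), 3))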

  15. 3-D model-based tracking for UAV indoor localization.

    PubMed

    Teulière, Céline; Marchand, Eric; Eck, Laurent

    2015-05-01

    This paper proposes a novel model-based tracking approach for 3-D localization. One main difficulty of the standard model-based approach lies in the presence of low-level ambiguities between different edges. In this paper, given a 3-D model of the edges of the environment, we derive a multiple-hypothesis tracker which retrieves the potential poses of the camera from the observations in the image. We also show how these candidate poses can be integrated into a particle filtering framework to guide the particle set toward the peaks of the distribution. Motivated by the UAV indoor localization problem, where GPS signal is not available, we validate the algorithm on real image sequences from UAV flights.
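
    The idea of steering a particle set toward externally supplied pose hypotheses can be sketched in a few lines. This one-dimensional Python toy (invented numbers, Gaussian likelihood) replaces the lowest-weight particles with samples around each candidate pose so that every ambiguous mode stays represented.

    import numpy as np

    # 1-D stand-in for pose space: inject detected candidate poses into the
    # particle set so each hypothesis mode survives resampling.
    rng = np.random.default_rng(0)
    particles = rng.normal(0.0, 1.0, size=200)      # current pose hypotheses

    def likelihood(pose, z, sigma=0.3):
        return np.exp(-0.5 * ((pose - z) / sigma) ** 2)

    z = 2.0                                         # image observation
    candidates = np.array([1.9, -1.8])              # ambiguous edge-match poses

    # Replace the 20 lowest-weight particles per candidate with samples
    # drawn around that candidate pose.
    w = likelihood(particles, z)
    idx = np.argsort(w)[: 20 * len(candidates)]
    particles[idx] = np.repeat(candidates, 20) + rng.normal(0, 0.1, idx.size)

    w = likelihood(particles, z)
    w /= w.sum()
    print("posterior mean pose:", round(float(particles @ w), 3))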

  16. Model based document and report generation for systems engineering

    NASA Astrophysics Data System (ADS)

    Delp, C.; Lam, D.; Fosse, E.; Lee, Cin-Young

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration are enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  17. Model-based control of fuel cells: (1) Regulatory control

    NASA Astrophysics Data System (ADS)

    Golbert, Joshua; Lewin, Daniel R.

    This paper describes a model-based controller for the regulation of a proton exchange membrane (PEM) fuel cell. The model accounts for spatial dependencies of voltage, current, material flows, and temperatures in the fuel channel. Analysis of the process model shows that the effective gain of the process undergoes a sign change in the normal operating range of the fuel cell, indicating that it cannot be stabilized using a linear controller with integral action. Consequently, a nonlinear model-predictive controller based on a simplified model has been developed, enabling the use of optimal control to satisfy power demands robustly. The models and controller have been realized in the MATLAB and SIMULINK environment. Initial results indicate improved performance and robustness when using model-based control in comparison with that obtained using an adaptive controller.
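
    The sign-changing gain is easy to reproduce with a toy static power curve P(i) = (v0 - r·i)·i, whose slope flips at the power peak. The sketch below (assumed v0 and r, a one-step optimization standing in for the paper's full receding-horizon NMPC) picks the current that best meets a power demand on either side of the peak.

    from scipy.optimize import minimize_scalar

    # Toy power curve: P(i) = (v0 - r*i)*i peaks at i = v0/(2r), where dP/di
    # changes sign -- the property that defeats a fixed-gain linear controller.
    v0, r = 1.0, 0.05
    power = lambda i: (v0 - r * i) * i              # peak power 5.0 W at 10 A

    def mpc_step(p_demand):
        """One-step stand-in for NMPC: the current that best meets the demand."""
        res = minimize_scalar(lambda i: (power(i) - p_demand) ** 2,
                              bounds=(0.0, v0 / r), method="bounded")
        return res.x

    for p in (2.0, 4.0, 5.0):                       # 5.0 W is the peak
        i = mpc_step(p)
        print(f"demand {p:.1f} W -> current {i:5.2f} A, delivered {power(i):.2f} W")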

  18. Model-based hierarchical reinforcement learning and human action control

    PubMed Central

    Botvinick, Matthew; Weinstein, Ari

    2014-01-01

    Recent work has reawakened interest in goal-directed or ‘model-based’ choice, where decisions are based on prospective evaluation of potential action outcomes. Concurrently, there has been growing attention to the role of hierarchy in decision-making and action control. We focus here on the intersection between these two areas of interest, considering the topic of hierarchical model-based control. To characterize this form of action control, we draw on the computational framework of hierarchical reinforcement learning, using this to interpret recent empirical findings. The resulting picture reveals how hierarchical model-based mechanisms might play a special and pivotal role in human decision-making, dramatically extending the scope and complexity of human behaviour. PMID:25267822

  19. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  20. Model based control of dynamic atomic force microscope

    SciTech Connect

    Lee, Chibum; Salapaka, Srinivasa M.

    2015-04-15

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement when compared to conventional proportional-integral designs and is verified by experiments.

  1. Model-Based Sensor Selection for Helicopter Gearbox Monitoring

    DTIC Science & Technology

    1996-04-01

    fault diagnosis of helicopter gearboxes is therefore necessary to prevent major breakdowns due to progression of undetected...in the gearbox. Once the presence of a fault is prompted by the fault detection network, fault diagnosis is performed by the Structure-Based...Components Figure 3: Overview of fault detection and diagnosis in the proposed model-based diagnostic system for helicopter gearboxes. the OH-58A gearbox

  2. GENI: A graphical environment for model-based control

    SciTech Connect

    Kleban, S.; Lee, M.; Zambre, Y.

    1989-10-01

    A new method to operate machine and beam simulation programs for accelerator control has been developed. Existing methods, although cumbersome, have been used in control systems for commissioning and operation of many machines. We developed GENI, a generalized graphical interface to these programs for model-based control. This "object-oriented"-like environment is described and some typical applications are presented. 4 refs., 5 figs.

  3. GENI: A graphical environment for model-based control

    NASA Astrophysics Data System (ADS)

    Kleban, Stephen; Lee, Martin; Zambre, Yadunath

    1990-08-01

    A new method of operating machine-modeling and beam-simulation programs for accelerator control has been developed. Existing methods, although cumbersome, have been used in control systems for commissioning and operation of many machines. We developed GENI, a generalized graphical interface to these programs for model-based control. This "object-oriented"-like environment is described and some typical applications are presented.

  4. Model-Based Reasoning in the Detection of Satellite Anomalies

    DTIC Science & Technology

    1990-12-01

    Conference on Artificial Intelligence. 1363-1368. Detroit, Michigan, August 89. Chu, Wei-Hai. "Generic Expert System Shell for Diagnostic Reasoning... Intelligence. 1324-1330. Detroit, Michigan, August 89. de Kleer, Johan and Brian C. Williams. "Diagnosing Multiple Faults," Artificial Intelligence, 32(1): 97...Benjamin Kuipers. "Model-Based Monitoring of Dynamic Systems," Proceedings of the Eleventh International Joint Conference on Artificial Intelligence. 1238

  5. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    System engineering practices for complex systems and networks now require that requirement, architecture, and concept of operations product development teams simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, the networks and their subject matter experts are geographically distributed, so the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from their top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

  6. Model based control of dynamic atomic force microscope

    NASA Astrophysics Data System (ADS)

    Lee, Chibum; Salapaka, Srinivasa M.

    2015-04-01

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement when compared to conventional proportional-integral designs and is verified by experiments.

  7. A Nursing Practice Model Based on Christ: The Agape Model.

    PubMed

    Eckerd, Nancy

    2017-06-07

    Nine out of 10 American adults believe Jesus was a real person, and almost two-thirds have made a commitment to Jesus Christ. Research further supports that spiritual beliefs and religious practices influence overall health and well-being. Christian nurses need a practice model that helps them serve as kingdom nurses. This article introduces the Agape Model, based on the agape love and characteristics of Christ, upon which Christian nurses may align their practice to provide Christ-centered care.

  8. Model based control of dynamic atomic force microscope.

    PubMed

    Lee, Chibum; Salapaka, Srinivasa M

    2015-04-01

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement when compared to conventional proportional-integral designs and is verified by experiments.

  9. The limited usefulness of models based on recollection and familiarity.

    PubMed

    Wais, Peter E

    2013-04-01

    A recent report concluded that magnetoencephalographic signals of neural activity associated with memory based on the recollection process are independent from signals associated with memory based on the familiarity process. These data can be interpreted equally well, however, as indications of memory aggregated from both processes and showing that signals associated with high-confidence recognition are dissociable from signals associated with low-confidence recognition. The usefulness of interpreting neural data according to psychological models based on recollection and familiarity is discussed.

  10. A cloud model-based approach for water quality assessment.

    PubMed

    Wang, Dong; Liu, Dengfeng; Ding, Hao; Singh, Vijay P; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun

    2016-07-01

    Water quality assessment entails essentially a multi-criteria decision-making process accounting for qualitative and quantitative uncertainties and their transformation. Considering uncertainties of randomness and fuzziness in water quality evaluation, a cloud model-based assessment approach is proposed. The cognitive cloud model, derived from information science, can realize the transformation between qualitative concept and quantitative data, based on probability and statistics and fuzzy set theory. When applying the cloud model to practical assessment, three technical issues are considered before the development of a complete cloud model-based approach: (1) a bilateral boundary formula with nonlinear boundary regression for parameter estimation, (2) a hybrid entropy-analytic hierarchy process technique for calculation of weights, and (3) the mean of repeated simulations for determining the degree of final certainty. The cloud model-based approach is tested by evaluating the eutrophication status of 12 typical lakes and reservoirs in China and comparing it with four other methods: the Scoring Index method, the Variable Fuzzy Sets method, the Hybrid Fuzzy and Optimal model, and the Neural Networks method. The proposed approach yields information concerning membership for each water quality status, which leads to the final status. The approach is found to be representative of the alternative methods and accurate.
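
    The forward normal cloud generator at the heart of such an approach is compact. In this Python sketch each grade is an (Ex, En, He) triple and a sample's membership is the mean certainty over repeated stochastic drops; the grade parameters and index value are invented, not the paper's calibration.

    import numpy as np

    # Forward normal cloud generator: certainty of x under a grade (Ex, En, He),
    # with the entropy blurred by the hyper-entropy on every drop.
    rng = np.random.default_rng(1)

    def certainty(x, Ex, En, He, n=5000):
        En_i = rng.normal(En, He, size=n)
        return float(np.exp(-(x - Ex) ** 2 / (2 * En_i ** 2)).mean())

    grades = {"oligotrophic": (10, 3, 0.3),     # illustrative (Ex, En, He)
              "mesotrophic":  (35, 8, 0.8),
              "eutrophic":    (65, 10, 1.0)}

    tli = 48.0                                  # observed trophic level index
    scores = {g: certainty(tli, *p) for g, p in grades.items()}
    print("assessed status:", max(scores, key=scores.get))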

  11. Gaussian model-based partitioning using iterated local search.

    PubMed

    Brusco, Michael J; Shireman, Emilie; Steinley, Douglas; Brudvig, Susan; Cradit, J Dennis

    2017-02-01

    The emergence of Gaussian model-based partitioning as a viable alternative to K-means clustering fosters a need for discrete optimization methods that can be efficiently implemented using model-based criteria. A variety of alternative partitioning criteria have been proposed for more general data conditions that permit elliptical clusters, different spatial orientations for the clusters, and unequal cluster sizes. Unfortunately, many of these partitioning criteria are computationally demanding, which makes the multiple-restart (multistart) approach commonly used for K-means partitioning less effective as a heuristic solution strategy. As an alternative, we propose an approach based on iterated local search (ILS), which has proved effective in previous combinatorial data analysis contexts. We compared multistart, ILS and hybrid multistart-ILS procedures for minimizing a very general model-based criterion that assumes no restrictions on cluster size or within-group covariance structure. This comparison, which used 23 data sets from the classification literature, revealed that the ILS and hybrid heuristics generally provided better criterion function values than the multistart approach when all three methods were constrained to the same 10-min time limit. In many instances, these differences in criterion function values reflected profound differences in the partitions obtained.
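
    An ILS skeleton is easy to state: run a local search to a local optimum, perturb the partition, re-run, and keep the best result. The Python sketch below uses within-cluster sum of squares as a stand-in for the paper's more general model-based (covariance-aware) criterion; the data and perturbation size are invented.

    import numpy as np

    # Iterated local search for partitioning: relocate single points until no
    # improvement, then perturb and repeat, keeping the best partition found.
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(m, 0.5, size=(30, 2)) for m in (0, 3, 6)])
    K = 3

    def criterion(labels):
        return sum(((X[labels == k] - X[labels == k].mean(0)) ** 2).sum()
                   for k in range(K) if np.any(labels == k))

    def local_search(labels):
        improved = True
        while improved:                          # single-point relocation moves
            improved = False
            for i in range(len(X)):
                best_k, base = labels[i], criterion(labels)
                for k in range(K):
                    labels[i] = k
                    if criterion(labels) < base:
                        best_k, base, improved = k, criterion(labels), True
                labels[i] = best_k
        return labels

    labels = local_search(rng.integers(0, K, len(X)))
    best = criterion(labels)
    for _ in range(10):                          # ILS outer loop
        trial = labels.copy()
        trial[rng.choice(len(X), 5, replace=False)] = rng.integers(0, K, 5)
        trial = local_search(trial)
        if criterion(trial) < best:
            labels, best = trial, criterion(trial)
    print("best criterion value:", round(best, 2))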

  12. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Goebel, Kai

    2011-01-01

    Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
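
    The joint state-parameter estimation plus prediction loop can be sketched compactly. In this Python toy the valve physics is replaced by an assumed linear wear law, the unknown wear rate is carried by the particles alongside the damage state, and remaining useful life is read off by propagating each particle to a failure threshold; all constants are invented.

    import numpy as np

    # Particle-filter prognostics on a toy wear law d' = w; w is hidden and
    # estimated jointly with d, then RUL = time to cross the threshold.
    rng = np.random.default_rng(3)
    N, thresh, w_true = 500, 1.0, 0.004
    d = np.zeros(N)                                 # damage particles
    w = rng.uniform(0.001, 0.01, N)                 # wear-rate particles

    d_true = 0.0
    for _ in range(100):
        d_true += w_true
        z = d_true + rng.normal(0, 0.02)            # noisy damage measurement
        d += w + rng.normal(0, 0.002, N)            # propagate particles
        wgt = np.exp(-0.5 * ((z - d) / 0.02) ** 2)
        wgt /= wgt.sum()
        idx = rng.choice(N, N, p=wgt)               # multinomial resampling
        d, w = d[idx], w[idx] + rng.normal(0, 1e-4, N)  # jitter keeps diversity

    rul = (thresh - d) / np.maximum(w, 1e-6)        # propagate to threshold
    print(f"median RUL: {np.median(rul):.0f} steps "
          f"(true: {(thresh - d_true) / w_true:.0f})")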

  13. Multiple Damage Progression Paths in Model-Based Prognostics

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Goebel, Kai Frank

    2011-01-01

    Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component wear is driven by several different degradation phenomena, each resulting in their own damage progression path, overlapping to contribute to the overall degradation of the component. We develop a model-based prognostics methodology using particle filters, in which the problem of characterizing multiple damage progression paths is cast as a joint state-parameter estimation problem. The estimate is represented as a probability distribution, allowing the prediction of end of life and remaining useful life within a probabilistic framework that supports uncertainty management. We also develop a novel variance control mechanism that maintains an uncertainty bound around the hidden parameters to limit the amount of estimation uncertainty and, consequently, reduce prediction uncertainty. We construct a detailed physics-based model of a centrifugal pump, to which we apply our model-based prognostics algorithms. We illustrate the operation of the prognostic solution with a number of simulation-based experiments and demonstrate the performance of the chosen approach when multiple damage mechanisms are active.

  14. Model-based pattern dummy generation for logic devices

    NASA Astrophysics Data System (ADS)

    Jang, Jongwon; Kim, Cheolkyun; Ko, Sungwoo; Byun, Seokyoung; Yang, Hyunjo; Yim, Donggyu

    2014-03-01

    The insertion of SRAFs (Sub-Resolution Assist Features) is one of the most frequently used methods to enlarge the process window area. In most cases, the size of the SRAF is proportional to the focus margin of the drawn patterns. However, there is a trade-off between SRAF size and SRAF printing, because the SRAF is not supposed to be patterned on the wafer. For this reason, many OPC engineers have tried to insert bigger and more SRAFs within the limits of the possible. The fact that many papers about predicting SRAF printability have been published in recent years reflects this circumstance. Pattern dummies are inserted to enhance the lithographic process margin and CD uniformity, unlike CMP dummies, which ensure uniform metal line height. It is common practice to place pattern dummies at designated locations, taking the pitch of the real patterns into account, at the design step. However, from the lithographic point of view it is not always desirable to generate pattern dummies based on rules. In this paper, we introduce a model-based pattern dummy insertion method, which places pattern dummies at the locations where model-based SRAFs would be located. We applied model-based pattern dummies to the layers of logic devices, and studied which layers benefit most from the insertion of dummies.

  15. Feature Referenced Error Correction Apparatus.

    DTIC Science & Technology

    A feature referenced error correction apparatus utilizing the multiple images of the interstage level image format to compensate for positional...images and by the generation of an error correction signal in response to the sub-frame registration errors. (Author)

  16. Diamagnetic Corrections and Pascal's Constants

    ERIC Educational Resources Information Center

    Bain, Gordon A.; Berry, John F.

    2008-01-01

    Measured magnetic susceptibilities of paramagnetic substances must typically be corrected for their underlying diamagnetism. This correction is often accomplished by using tabulated values for the diamagnetism of atoms, ions, or whole molecules. These tabulated values can be problematic since many sources contain incomplete and conflicting data.…

  18. Corrections Education Evaluation System Model.

    ERIC Educational Resources Information Center

    Nelson, Orville; And Others

    The purpose of this project was to develop an evaluation system for the competency-based vocational program developed by Wisconsin's Division of Corrections, Department of Public Instruction (DPI), and the Vocational, Technical, and Adult Education System (VTAE). Site visits were conducted at five correctional institutions in March and April of…

  19. Unsupervised exposure correction for video

    NASA Astrophysics Data System (ADS)

    Petrova, X.; Sedunov, S.; Ignatov, A.

    2009-02-01

    The paper describes an "off-the-shelf" algorithmic solution for unsupervised exposure correction for video. An important feature of the algorithm is accurate processing not only of natural video sequences, but also of edited, rendered or combined content, including content with letter-boxes or pillar-boxes captured from TV broadcasts. The algorithm allows the degree of exposure correction to change smoothly within continuous video scenes and to change promptly on cuts. The solution includes scene change detection, letter-box detection, pillar-box detection, exposure correction adaptation, exposure correction and color correction. Exposure correction adaptation is based on histogram analysis and soft logic inference. Decision rules are based on the relative number of entries in the low tones, mid tones and highlights, the maximum entries in the low tones and mid tones, the number of non-empty histogram entries and the width of the middle range of the histogram. All decision rules have physical meaning, which makes it easy to tune parameters for display devices of different classes. Exposure correction consists of computing a local average using edge-preserving filtering, applying local tone mapping and post-processing. At the final stage, color correction aiming to reduce color distortions is applied.
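
    A histogram-driven decision rule of the kind described can be sketched as follows; the tone-band thresholds and the gamma values are invented placeholders for the paper's soft-logic inference, and a single global gamma stands in for its local tone mapping.

    import numpy as np

    # Histogram-occupancy rule -> per-scene correction strength (a gamma here).
    def correction_gamma(frame):
        hist, _ = np.histogram(frame, bins=256, range=(0, 255))
        low = hist[:85].sum() / hist.sum()      # share of low-tone entries
        mid = hist[85:170].sum() / hist.sum()   # share of mid-tone entries
        if low > 0.6 and mid < 0.3:
            return 0.6                          # underexposed: brighten strongly
        if low < 0.2:
            return 1.0                          # already bright: leave alone
        return 0.8

    rng = np.random.default_rng(4)
    frame = np.clip(rng.normal(60, 25, (120, 160)), 0, 255)  # dark test frame
    g = correction_gamma(frame)
    corrected = 255.0 * (frame / 255.0) ** g    # tone mapping for this scene
    print(f"gamma {g}: mean {frame.mean():.1f} -> {corrected.mean():.1f}")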

  20. 75 FR 70951 - Notice, Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-19

    ... From the Federal Register Online via the Government Publishing Office NATIONAL COUNCIL ON DISABILITY (NCD) Sunshine Act Meetings Notice, Correction Type: Quarterly Meeting. Summary: NCD published a...., Suite 850, Washington, DC 20004; 202-272-2004 (voice), 202-272-2074 TTY; 202-272-2022 Fax. Correction...

  1. Error Correction, Revision, and Learning

    ERIC Educational Resources Information Center

    Truscott, John; Hsu, Angela Yi-ping

    2008-01-01

    Previous research has shown that corrective feedback on an assignment helps learners reduce their errors on that assignment during the revision process. Does this finding constitute evidence that learning resulted from the feedback? Differing answers play an important role in the ongoing debate over the effectiveness of error correction,…

  2. Correcting Slightly Less Simple Movements

    ERIC Educational Resources Information Center

    Aivar, M. P.; Brenner, E.; Smeets, J. B. J.

    2005-01-01

    Many studies have analysed how goal directed movements are corrected in response to changes in the properties of the target. However, only simple movements to single targets have been used in those studies, so little is known about movement corrections under more complex situations. Evidence from studies that ask for movements to several targets…

  3. Barometric and Earth Tide Correction

    SciTech Connect

    Toll, Nathaniel J.

    2005-11-10

    BETCO corrects for barometric and earth tide effects in long-term water level records. A regression deconvolution method is used to solve a series of linear equations to determine an impulse response function for the well pressure head. Using the response function, a pressure head correction is calculated and applied.
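
    Regression deconvolution reduces to ordinary least squares on a lagged design matrix. The sketch below builds synthetic data under an assumed exponential impulse response, recovers it by regression, and subtracts the predicted barometric component; the lag count and noise levels are arbitrary, not BETCO's defaults.

    import numpy as np

    # Synthetic well record: head responds to barometric pressure through an
    # assumed exponential impulse response; least squares recovers it.
    rng = np.random.default_rng(5)
    n, lags = 2000, 24
    baro = np.cumsum(rng.normal(0, 0.05, n))            # pressure record
    true_ir = 0.5 * np.exp(-np.arange(lags) / 6.0)      # well's true response
    head = -np.convolve(baro, true_ir)[:n] + rng.normal(0, 0.01, n)

    # Lagged design matrix -> least-squares impulse response estimate
    Xd = np.column_stack([baro[lags - 1 - j : n - 1 - j] for j in range(lags)])
    y = head[lags - 1 : n - 1]
    ir_hat, *_ = np.linalg.lstsq(Xd, y, rcond=None)

    corrected = y - Xd @ ir_hat                         # barometric part removed
    print("residual std after correction:", round(float(corrected.std()), 4))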

  4. Therapeutic use of nicergoline.

    PubMed

    Winblad, Bengt; Fioravanti, Mario; Dolezal, Tomas; Logina, Inara; Milanov, Ivan Gospodinov; Popescu, Dinu Cristian; Solomon, Alina

    2008-01-01

    The ergot alkaloid derivative nicergoline became clinically available about 35 years ago in the 1970s. Nicergoline has a broad spectrum of action: (i) as an alpha(1)-adrenoceptor antagonist, it induces vasodilation and increases arterial blood flow; (ii) it enhances cholinergic and catecholaminergic neurotransmitter function; (iii) it inhibits platelet aggregation; (iv) it promotes metabolic activity, resulting in increased utilization of oxygen and glucose; and (v) it has neurotrophic and antioxidant properties. Acting on several basic pathophysiological mechanisms, nicergoline has therapeutic potential in a number of disorders. This article provides an overview of the published clinical evidence relating to the efficacy and safety of nicergoline (30 mg twice daily) in the treatment of dementia (including Alzheimer's disease and vascular dementia) and vascular and balance disorders. For dementia of different aetiologies, the therapeutic benefit of nicergoline has been established, with up to 89% of patients showing improvements in cognition and behaviour. After as little as 2 months of treatment, symptom improvement is apparent compared with placebo, and most patients are still improved or stable after 12 months. Concomitant neurophysiological changes in the brain indicate (after only 4-8 weeks' treatment) improved vigilance and information processing. In patients with balance disorders, mean improvements of 44-78% in symptom severity and quality of life have been observed with nicergoline. Although clinical experience with nicergoline in vascular disorders is limited to relatively short-term, small-scale studies, it has been successfully used in rehabilitation therapy of patients with chronic ischaemic stroke. Open-label evaluations suggest that nicergoline may also be valuable in glaucoma, depression and peripheral arteriopathy. Adverse events of nicergoline, if any, are related to the central nervous system, the metabolic system and the overall body. Most are

  5. [Nuclear transfer and therapeutic cloning].

    PubMed

    Xu, Xiao-Ming; Lei, An-Min; Hua, Jin-Lian; Dou, Zhong-Ying

    2005-03-01

    Nuclear transfer and therapeutic cloning have widespread and attractive prospects in animal agriculture and biomedical applications. We review evidence that the quality of oocytes and the nuclear reprogramming of somatic donor cells are the main reasons for the common abnormalities in cloned animals and for the low efficiency of cloning, and we outline the problems and prospects of therapeutic cloning, noting that basic problems in nuclear transfer affect the clinical application of therapeutic cloning. Research on the isolation and culture of nuclear transfer embryonic stem (ntES) cells, and on the specific differentiation of ntES cells into important functional cells, should be emphasized and could enhance the efficiency. Adult stem cells could help to cure some serious diseases, but they cannot replace therapeutic cloning. Ethical concerns also impede the development of therapeutic cloning. Many techniques must be improved and basic theoretical research reinforced before somatic nuclear transfer and therapeutic cloning can be applied to agricultural reproduction and better benefit human life.

  6. Therapeutic touch: a healing modality.

    PubMed

    Mulloney, S S; Wells-Federman, C

    1996-04-01

    Therapeutic touch is a nursing intervention pioneered more than 20 years ago. A substantial body of literature encompassing theory, clinical practice, and research exists on this energetic healing modality. This article examines the scientific basis for healing through the human energy field, including the basic assumptions from which therapeutic touch was developed and a summary of related research. It also discusses integration of therapeutic touch into clinical practice and identifies resources for further exploration.

  7. Automated model-based calibration of imaging spectrographs

    NASA Astrophysics Data System (ADS)

    Kosec, Matjaž; Bürmen, Miran; Tomaževič, Dejan; Pernuš, Franjo; Likar, Boštjan

    2012-03-01

    Hyper-spectral imaging has gained recognition as an important non-invasive research tool in the field of biomedicine. Among the variety of available hyperspectral imaging systems, systems comprising an imaging spectrograph, lens, wideband illumination source and a corresponding camera stand out for their short acquisition time and good signal-to-noise ratio. The individual images acquired by imaging spectrograph-based systems contain full spectral information along one spatial dimension. Due to imperfections in the camera lens, and in particular in the optical components of the imaging spectrograph, the acquired images are subject to spatial and spectral distortions, resulting in scene-dependent nonlinear spectral degradations and spatial misalignments which need to be corrected. However, the existing correction methods require complex calibration setups and tedious manual involvement; therefore, the correction of the distortions is often neglected. Such a simplified approach can lead to significant errors in the analysis of the acquired hyperspectral images. In this paper, we present a novel fully automated method for correction of the geometric and spectral distortions in the acquired images. The method is based on automated non-rigid registration of the reference and acquired images corresponding to the proposed calibration object incorporating standardized spatial and spectral information. The obtained transformation was successfully used for sub-pixel correction of various hyperspectral images, resulting in significant improvement of the spectral and spatial alignment. It was found that the proposed calibration is highly accurate and suitable for routine use in applications involving either diffuse reflectance or transmittance measurement setups.
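
    The spirit of the method, estimating a smooth spatial-spectral distortion and resampling onto a common axis, can be sketched with a simple quadratic "smile" model; the paper's full non-rigid registration against a calibration object is replaced here by a polynomial fit to a synthetic reference line, and all numbers are invented.

    import numpy as np

    # Synthetic frame with a quadratic spectral "smile"; fit the distortion
    # from a reference line and resample rows onto a common wavelength axis.
    rows, cols = 64, 256
    y = np.arange(rows)
    shift_true = 9.6 * ((y - rows / 2) / rows) ** 2 * 4  # px shift per row
    image = np.zeros((rows, cols))
    for rr in range(rows):
        image[rr, int(100 + shift_true[rr])] = 1.0       # reference line

    measured = image.argmax(axis=1).astype(float)        # detected line centres
    coef = np.polyfit(y, measured - 100.0, 2)            # quadratic smile model
    model_shift = np.polyval(coef, y)

    xs = np.arange(cols, dtype=float)
    corrected = np.array([np.interp(xs + model_shift[rr], xs, image[rr])
                          for rr in range(rows)])        # undo the shift
    print("max residual shift (px):",
          int(np.abs(corrected.argmax(axis=1) - 100).max()))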

  8. Chin Ptosis: Classification, Anatomy, and Correction

    PubMed Central

    Garfein, Evan S.; Zide, Barry M.

    2008-01-01

    For years, the notion of chin ptosis was somehow integrated with the concept of witch's chin. That was a mistake on many levels because chin droop has four major causes, all different and with some overlap. With this article, the surgeon can quickly diagnose which type is present and which therapeutic modality would work best. In some cases the problem is a simple fix, in others the droop can only be stabilized, and in the final two, definite corrective procedures are available. Of note, in certain situations two types of chin ptosis may overlap because the patient and the surgeon may each contribute to the problems. For example, in dynamic ptosis, a droop that occurs with smile in the unoperated patient can be exacerbated, or even produced, by certain surgical methods. This paper classifies the variations of the problems and explains the anatomy, with the final emphasis on long-term surgical correction, well described herein. This article is the ninth on this subject, and a review of them all would be greatly helpful for understanding the enigmas of the lower face. PMID:22110784

  9. Using rule-based shot dose assignment in model-based MPC applications

    NASA Astrophysics Data System (ADS)

    Bork, Ingo; Buck, Peter; Wang, Lin; Müller, Uwe

    2014-10-01

    Shrinking feature sizes and the need for tighter CD (Critical Dimension) control require the introduction of new technologies in mask making processes. One of those methods is the dose assignment of individual shots on VSB (Variable Shaped Beam) mask writers to compensate for CD non-linearity effects and improve dose edge slope. Using increased dose levels only for the most critical features, generally only for the smallest CDs on a mask, the change in mask write time is minimal while the increase in image quality can be significant. This paper describes a method combining rule-based shot dose assignment with model-based shot size correction. This combination proves to be very efficient in correcting mask linearity errors while also improving the dose edge slope of small features. Shot dose assignment is based on tables assigning certain dose levels to a range of feature sizes. The dose to feature size assignment is derived from mask measurements in such a way that shape corrections are kept to a minimum. For example, if a 50nm drawn line on mask results in a 45nm chrome line using nominal dose, a dose level is chosen which is closest to getting the line back on target. Since CD non-linearity is different for lines, line-ends and contacts, different tables are generated for the different shape categories. The actual dose assignment is done via DRC rules in a pre-processing step before executing the shape correction in the MPC engine. Dose assignment to line ends can be restricted to critical line/space dimensions since it might not be required for all line ends. In addition, adding dose assignment to a wide range of line ends might increase shot count, which is undesirable. The dose assignment algorithm is very flexible and can be adjusted based on the type of layer and the best balance between accuracy and shot count. These methods can be optimized for the number of dose levels available for specific mask writers. The MPC engine now needs to be able to handle different dose
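
    A rule-based dose table of the sort described maps drawn CD ranges to relative dose levels; in this Python sketch the breakpoints, dose values, and line-end bump are invented, with the real table derived from mask linearity measurements as the abstract explains.

    # Invented lookup: (min CD in nm, max CD in nm, relative dose).
    DOSE_TABLE = [
        (0,   60,  1.30),
        (60,  80,  1.15),
        (80,  120, 1.05),
        (120, 1e9, 1.00),   # nominal dose for non-critical features
    ]

    def assign_dose(cd_nm, feature_type="line"):
        """Relative shot dose; line ends get one extra (invented) bump."""
        for lo, hi, dose in DOSE_TABLE:
            if lo <= cd_nm < hi:
                return round(dose * (1.05 if feature_type == "line_end" else 1.0), 3)
        return 1.0

    for shot in [(45, "line"), (45, "line_end"), (150, "line")]:
        print(shot, "->", assign_dose(*shot))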

  10. Homocystinuria: Therapeutic approach.

    PubMed

    Kumar, Tarun; Sharma, Gurumayum Suraj; Singh, Laishram Rajendrakumar

    2016-07-01

    Homocystinuria is a disorder of sulfur metabolism pathway caused by deficiency of cystathionine β-synthase (CBS). It is characterized by increased accumulation of homocysteine (Hcy) in the cells and plasma. Increased homocysteine results in various vascular and neurological complications. Present strategies to lower cellular and plasma homocysteine levels include vitamin B6 intake, dietary methionine restriction, betaine supplementation, folate and vitamin B12 administration. However, these strategies are inefficient for treatment of homocystinuria. In recent years, advances have been made towards developing new strategies to treat homocystinuria. These mainly include functional restoration to mutant CBS, enhanced clearance of Hcy from the body, prevention of N-homocysteinylation-induced toxicity and inhibition of homocysteine-induced oxidative stress. In this review, we have exclusively discussed the recent advances that have been achieved towards the treatment of homocystinuria. The review is an attempt to help clinicians in developing effective therapeutic strategies and designing novel drugs against homocystinuria. Copyright © 2016. Published by Elsevier B.V.

  11. Leech Therapeutic Applications

    PubMed Central

    Abdualkader, A. M.; Ghawi, A. M.; Alaama, M.; Awang, M.; Merzouk, A.

    2013-01-01

    Hematophagous animals, including leeches, have been known to possess biologically active compounds in their secretions, especially in their saliva. The blood-sucking annelids, leeches, have been used for therapeutic purposes since the beginning of civilization. Ancient Egyptian, Indian, Greek and Arab physicians used leeches for a wide range of diseases, from the conventional use for bleeding to systemic ailments such as skin diseases, nervous system abnormalities, urinary and reproductive system problems, inflammation, and dental problems. Recently, extensive research on leech saliva has unveiled the presence of a variety of bioactive peptides and proteins, including antithrombin (hirudin, bufrudin), antiplatelet (calin, saratin), factor Xa inhibitors (lefaxin), antibacterial (theromacin, theromyzin) and other agents. Consequently, the leech has made a comeback as a new remedy for many chronic and life-threatening abnormalities, such as cardiovascular problems, cancer, metastasis, and infectious diseases. In the 20th century, leech therapy established itself in plastic and microsurgery as a protective tool against venous congestion and served to salvage replanted digits and flaps. Many clinics for plastic surgery all over the world have started to use leeches for cosmetic purposes. Despite the efficacious properties of leech therapy, its safety and complications are still controversial. PMID:24019559

  12. Plasmids encoding therapeutic agents

    DOEpatents

    Keener, William K.

    2007-08-07

    Plasmids encoding anti-HIV and anti-anthrax therapeutic agents are disclosed. Plasmid pWKK-500 encodes a fusion protein containing DP178 as a targeting moiety, the ricin A chain, an HIV protease cleavable linker, and a truncated ricin B chain. N-terminal extensions of the fusion protein include the maltose binding protein and a Factor Xa protease site. C-terminal extensions include a hydrophobic linker, an L domain motif peptide, a KDEL ER retention signal, another Factor Xa protease site, an out-of-frame buforin II coding sequence, the lacZα peptide, and a polyhistidine tag. More than twenty derivatives of plasmid pWKK-500 are described. Plasmids pWKK-700 and pWKK-800 are similar to pWKK-500 wherein the DP178-encoding sequence is substituted by RANTES- and SDF-1-encoding sequences, respectively. Plasmid pWKK-900 is similar to pWKK-500 wherein the HIV protease cleavable linker is substituted by a lethal factor (LF) peptide-cleavable linker.

  13. Therapeutic antibody technology 97.

    PubMed

    Larrick, J W; Gavilondo, J

    1998-01-01

    Almost 200 antibody aficionados attended the Therapeutic Antibody Technology 97 meeting, held September 21-24, 1997 at the Holiday Inn, Union Square in the heart of San Francisco, CA. The meeting was sponsored by the Palo Alto Institute of Molecular Medicine and organized by James W. Larrick (PAIMM) and Dennis R. Burton (Scripps Research Institute). The meeting featured excellent discussions on many interesting talks and a number of poster presentations. It is likely that another meeting will be organized in 2 years, however in the meantime, an effort is underway to organize a 'Virtual Antibody Society' to be set up on the web server at Scripps Research Institute in La Jolla, CA (Questions and comments on this project can be sent to: Jwlarrick@aol.com or Burton@scripps.edu). Richard Lerner (Scripps) gave the keynote address on 'Catalytic Antibodies', describing recent work with Carlos Barbas on so-called reactive immunization to generate a high activity aldolase catalytic antibody. This antibody, soon to be described in an article in Science, is the first commercially available catalytic antibody.

  14. Phytonutrients as therapeutic agents.

    PubMed

    Gupta, Charu; Prakash, Dhan

    2014-09-01

    Nutrients present in various foods play an important role in maintaining the normal functions of the human body. The major nutrients present in foods include carbohydrates, proteins, lipids, vitamins, and minerals. Besides these, there are some bioactive food components known as "phytonutrients" that play an important role in human health. They have a tremendous impact on the health care system and may provide medical health benefits, including the prevention and/or treatment of disease and various physiological disorders. Phytonutrients play a positive role by maintaining and modulating immune function to prevent specific diseases. Being natural products, they hold great promise in clinical therapy as they possess none of the side effects that are usually associated with chemotherapy or radiotherapy. They are also comparatively cheap and thus significantly reduce health care costs. Phytonutrients are plant nutrients with specific biological activities that support human health. Some of the important bioactive phytonutrients include polyphenols, terpenoids, resveratrol, flavonoids, isoflavonoids, carotenoids, limonoids, glucosinolates, phytoestrogens, phytosterols, anthocyanins, ω-3 fatty acids, and probiotics. They exert specific pharmacological effects in human health, such as anti-microbial, anti-oxidant, anti-inflammatory, anti-allergic, anti-spasmodic, anti-cancer, anti-aging, hepatoprotective, hypolipidemic, neuroprotective, hypotensive, anti-diabetic, anti-osteoporosis, CNS stimulant, analgesic, protection from UVB-induced carcinogenesis, immuno-modulatory, and carminative effects. This mini-review attempts to summarize the major important types of phytonutrients and their role in promoting human health and as therapeutic agents, along with the current market trend and commercialization.

  15. Designing phage therapeutics.

    PubMed

    Goodridge, Lawrence D

    2010-01-01

    Phage therapy is the application of phages to bodies, substances, or environments to effect the biocontrol of pathogenic or nuisance bacteria. To be effective, phages, minimally, must be capable of attaching to bacteria (adsorption), killing those bacteria (usually associated with phage infection), and otherwise surviving (resisting decay) until they achieve attachment and subsequent killing. While a strength of phage therapy is that phages that possess appropriate properties can be chosen from a large diversity of naturally occurring phages, a more rational approach to phage therapy also can include post-isolation manipulation of phages genetically, phenotypically, or in terms of combining different products into a single formulation. Genetic manipulation, especially in these modern times, can involve genetic engineering, though a more traditional approach involves the selection of spontaneously occurring phage mutants during serial transfer protocols. While genetic modification typically is done to give rise to phenotypic changes in phages, phage phenotype alone can also be modified in vitro, prior to phage application for therapeutic purposes, as for the sake of improving phage lethality (such as by linking phage virions to antibacterial chemicals such as chloramphenicol) or survival capabilities (e.g., via virion PEGylation). Finally, phages, both naturally occurring isolates or otherwise modified constructs, can be combined into cocktails which provide collectively enhanced capabilities such as expanded overall host range. Generally these strategies represent different routes towards improving phage therapy formulations and thereby efficacy through informed design.

  16. Therapeutic cloning: The ethical limits

    SciTech Connect

    Whittaker, Peter A. . E-mail: p.whittaker@lancaster.ac.uk

    2005-09-01

    A brief outline of stem cells, stem cell therapy and therapeutic cloning is given. The position of therapeutic cloning with regard to other embryonic manipulations - IVF-based reproduction, embryonic stem cell formation from IVF embryos and reproductive cloning - is indicated. The main ethically challenging stages in therapeutic cloning are considered to be the nuclear transfer process, including the source of eggs for it, and the destruction of an embryo to provide stem cells for therapeutic use. The extremely polarised nature of the debate regarding the status of an early human embryo is noted, and some potential alternative strategies for preparing immunocompatible pluripotent stem cells are indicated.

  17. Therapeutic cloning in the mouse

    PubMed Central

    Mombaerts, Peter

    2003-01-01

    Nuclear transfer technology can be applied to produce autologous differentiated cells for therapeutic purposes, a concept termed therapeutic cloning. Countless articles have been published on the ethics and politics of human therapeutic cloning, reflecting the high expectations from this new opportunity for rejuvenation of the aging or diseased body. Yet the research literature on therapeutic cloning, strictly speaking, is comprised of only four articles, all in the mouse. The efficiency of derivation of embryonic stem cell lines via nuclear transfer is remarkably consistent among these reports. However, the efficiency is so low that, in its present form, the concept is unlikely to become widespread in clinical practice. PMID:12949262

  18. Clinical applications of therapeutic phlebotomy

    PubMed Central

    Kim, Kyung Hee; Oh, Ki Young

    2016-01-01

    Phlebotomy is the removal of blood from the body, and therapeutic phlebotomy is the preferred treatment for blood disorders in which the removal of red blood cells or serum iron is the most efficient method for managing the symptoms and complications. Therapeutic phlebotomy is currently indicated for the treatment of hemochromatosis, polycythemia vera, porphyria cutanea tarda, sickle cell disease, and nonalcoholic fatty liver disease with hyperferritinemia. This review discusses therapeutic phlebotomy and the related disorders and also offers guidelines for establishing a therapeutic phlebotomy program. PMID:27486346

  19. Error Correction: Report on a Study

    ERIC Educational Resources Information Center

    Dabaghi, Azizollah

    2006-01-01

    This article reports on a study which investigated the effects of correction of learners' grammatical errors on acquisition. Specifically, it compared the effects of timing of correction (immediate versus delayed correction) and manner of correction (explicit versus implicit correction). It also investigated the relative effects of correction of…

  20. Therapeutic Devices for Epilepsy

    PubMed Central

    Fisher, Robert S.

    2011-01-01

    Therapeutic devices provide new options for treating drug-resistant epilepsy. These devices act by a variety of mechanisms to modulate neuronal activity. Only vagus nerve stimulation, which continues to develop new technology, is approved for use in the United States. Deep brain stimulation (DBS) of anterior thalamus for partial epilepsy recently was approved in Europe and several other countries. Responsive neurostimulation, which delivers stimuli to one or two seizure foci in response to a detected seizure, recently completed a successful multicenter trial. Several other trials of brain stimulation are in planning or underway. Transcutaneous magnetic stimulation (TMS) may provide a noninvasive method to stimulate cortex. Controlled studies of TMS are split on efficacy, which may depend on whether a seizure focus is near a possible region for stimulation. Seizure detection devices in the form of "shake" detectors via portable accelerometers can provide notification of an ongoing tonic-clonic seizure, or peace of mind in the absence of notification. Prediction of seizures from various aspects of EEG is in its early stages. Prediction appears to be possible in a subpopulation of people with refractory seizures, and a clinical trial of an implantable prediction device is underway. Cooling of neocortex or hippocampus can reversibly attenuate epileptiform EEG activity and seizures, but engineering problems remain in its implementation. Optogenetics is a new technique that can control excitability of specific populations of neurons with light. Inhibition of epileptiform activity has been demonstrated in hippocampal slices, but use in humans will require more work. In general, devices provide useful palliation for otherwise uncontrollable seizures, but with a different risk profile than with most drugs. Optimizing the place of devices in therapy for epilepsy will require further development and clinical experience. PMID:22367987

  1. Epilepsy: Novel therapeutic targets

    PubMed Central

    Anovadiya, Ashish P.; Sanmukhani, Jayesh J.; Tripathi, C. B.

    2012-01-01

    Despite established and effective therapy for epilepsy, 20–25% of patients develop therapeutic failure; this encourages the search for newer drugs. Novel approaches target receptors which remain unaffected by conventional therapy or inhibit epileptogenesis. AMPA receptor antagonists have shown faster and more complete protection compared to diazepam. Protein kinase (PK) plays an important role in the development of epilepsy. PK inhibitors such as K252a, VID-82925, and Herbimycin A have been found effective in inhibiting the spread of epileptiform activity and epileptogenesis. Metabotropic glutamate receptors (mGluRs) are G protein-coupled receptors classified into three groups. Group 1 mGluR antagonists and Groups 2 and 3 mGluR agonists inhibited pentylenetetrazole-induced kindled seizures. Combined use of these agents has also shown favorable results. Mammalian target of rapamycin (mTOR) plays a central role in multiple mechanisms of epileptogenesis. mTOR causes transcription, induction of proapoptotic proteins, and autophagy inhibition. Rapamycin was effective in suppression of recurrent seizures as well as in tuberous sclerosis and acute brain injury models. 5% CO2 showed potent effects on cortical epileptiform activity and convulsions in animal epilepsy models and in humans with drug-resistant partial epilepsy. It is found to be rapidly acting, safe and cheap; thus it can be a good option in emergencies for suppression of seizures. Neurosteroids are considered fourth-generation neuromessengers; they act as positive allosteric modulators of γ-aminobutyric acid (GABAA) receptors. A clinical trial of ganaxolone, an allopregnanolone analogue, has shown a beneficial role in pharmacoresistant epilepsy. However, most of these drugs have been tested only in early phases of development, and their possible use and safety in epilepsy have to be proven in clinical trials. PMID:22629084

  2. Purinergic Signalling: Therapeutic Developments

    PubMed Central

    Burnstock, Geoffrey

    2017-01-01

    Purinergic signalling, i.e., the role of nucleotides as extracellular signalling molecules, was proposed in 1972. However, this concept was not well accepted until the early 1990s, when receptor subtypes for purines and pyrimidines were cloned and characterised; these include four subtypes of the P1 (adenosine) receptor, seven subtypes of P2X ion channel receptors and eight subtypes of the P2Y G protein-coupled receptor. Early studies were largely concerned with the physiology, pharmacology and biochemistry of purinergic signalling. More recently, the focus has been on the pathophysiology and therapeutic potential. There was early recognition of the use of P1 receptor agonists for the treatment of supraventricular tachycardia, and A2A receptor antagonists are promising for the treatment of Parkinson's disease. Clopidogrel, a P2Y12 antagonist, is widely used for the treatment of thrombosis and stroke, blocking P2Y12 receptor-mediated platelet aggregation. Diquafosol, a long acting P2Y2 receptor agonist, is being used for the treatment of dry eye. P2X3 receptor antagonists have been developed that are orally bioavailable and stable in vivo and are currently in clinical trials for the treatment of chronic cough, bladder incontinence, visceral pain and hypertension. Antagonists to P2X7 receptors are being investigated for the treatment of inflammatory disorders, including neurodegenerative diseases. Other investigations are in progress for the use of purinergic agents for the treatment of osteoporosis, myocardial infarction, irritable bowel syndrome, epilepsy, atherosclerosis, depression, autism, diabetes, and cancer.

  3. Therapeutics in Huntington's Disease.

    PubMed

    Killoran, Annie; Biglan, Kevin M

    2012-02-08

    OPINION STATEMENT: There is no specific treatment for Huntington's disease (HD). Its many symptoms of motor, psychiatric, and cognitive deterioration are managed with symptomatic relief, rehabilitation, and support. The only drug approved by the US Food and Drug Administration (FDA) for the treatment of HD is an antichoreic agent, tetrabenazine, but this drug is used sparingly because of uneasiness regarding its propensity to cause depression and suicidality in this population, which is already at risk for these complications. Neuroleptics are still first-line treatments for chorea accompanied by comorbid depression and/or behavioral or psychotic symptoms, as is often the case. Psychiatric features, which have a significant impact on a patient's professional and personal life, often become the major focus of management. In addition to neuroleptics, commonly used medications include antidepressants, mood stabilizers, anxiolytics, and psychostimulants. In contrast, few treatment options are available for cognitive impairment in HD; this remains an important and largely unmet therapeutic need. HD patients typically lack insight into their disease manifestations, failing to recognize their need for treatment, and possibly even arguing against it. Multipurpose medications are employed advantageously to simplify the medication regimen, so as to facilitate compliance and not overwhelm the patient. For example, haloperidol can be prescribed for a patient with chorea, agitation, and anorexia, rather than targeting each symptom with a different drug. This approach also limits the potential for adverse effects, which can be difficult to distinguish from the features of the disease itself. With HD's complexity, it is best managed with a multidisciplinary approach that includes a movement disorders specialist, a genetic counselor, a mental health professional, a physical therapist, and a social worker for support and coordination of services. As the disease progresses, there

  4. [Correction of severe alar retraction with alar rotation flap].

    PubMed

    Hong, Chun; Zheng, Dongxue; Lu, Lixin

    2015-01-01

    To investigate the therapeutic effect of the alar rotation flap for severe alar retraction. Patients with severely retracted alae underwent alar reconstruction using alar rotation flaps and autogenous cartilage batten grafts. First, costal cartilage was used to reshape the nasal tip and dorsum. A cartilage patch was then used to extend and thicken the retracted ala, and the alar rotation flap was transferred to correct the retraction. Fourteen patients with severe alar retraction underwent alar reconstruction with alar rotation flaps and alar batten grafts. The alar retraction was corrected in all cases, with functional and aesthetic improvement. No recurrence of alar retraction was noted. The incisions healed with acceptable cosmetic results; an obvious scar remained in only one patient (one side). The alar rotation flap is an effective and reliable surgical option for correcting severe alar retraction. The scar can be kept inconspicuous by placing the incision precisely at the junction of the ala and the nasal dorsum, following the principles of the aesthetic nasal subunits.

  5. When not to trust therapeutic drug monitoring

    PubMed Central

    Westergreen-Thorne, Mathew; Lee, Sook Yan; Shah, Nilesh; Dodd, Alan

    2016-01-01

    Therapeutic drug monitoring (TDM) is the measurement of serum or plasma drug concentration to allow the individualization of dosing. We describe the case of a patient who was prescribed inappropriately large doses of vancomycin due to inaccurate TDM. Specifically, our laboratory reported progressively lower vancomycin concentrations despite dose increases. Eventually, when duplicate samples were sent to a different laboratory, vancomycin concentrations were found to be in the toxic range. We hypothesize that this was due to the patient generating immunoglobulin antibodies against her infection that interfered with the original TDM immunoassay. Immunogenic TDM interference is known to occur, rarely, in patients with immune-related comorbidities; however, if we are correct, this is a unique case, as this patient did not have such a background. This case illustrates the importance of using clinical judgement when interpreting TDM as, in this case, substantial harm to the patient was likely only narrowly avoided. PMID:27606069

  6. Potential therapeutic interventions for fragile X syndrome

    PubMed Central

    Levenga, Josien; de Vrij, Femke M.S.; Oostra, Ben A.; Willemsen, Rob

    2010-01-01

    Fragile X syndrome (FXS) is caused by a lack of the fragile X mental retardation protein (FMRP); FMRP deficiency in neurons of patients with FXS causes intellectual disability (IQ<70) and several behavioural problems, including hyperactivity and autistic-like features. In the brain, no gross morphological malformations have been found, although subtle spine abnormalities have been reported. FXS has been linked to altered group I metabotropic glutamate receptor (mGluR)-dependent and -independent forms of synaptic plasticity. Here, we discuss potential targeted therapeutic strategies developed to specifically correct disturbances in the excitatory mGluR and the inhibitory gamma-aminobutyric acid (GABA) receptor pathways that have been tested in animal models and/or in clinical trials with patients with FXS. PMID:20864408

  7. [Lithiasis and ectopic pelvic kidney. Therapeutic aspects].

    PubMed

    Aboutaieb, R; Rabii, R; el Moussaoui, A; Joual, A; Sarf, I; el Mrini, M; Benjelloun, S

    1996-01-01

    A kidney in an ectopic position is dysplastic and associated with other malformations. The development of lithiasis under these conditions raises questions about the therapeutic options. We report five cases of pelvic ectopic kidney with urinary lithiasis. Patients were aged 16 to 42 years. The kidney was non-functional in two cases; in the others it had a normal appearance and measured 10 to 12 cm. We performed total nephrectomy in two cases and pyelolithotomy in the others. The surgical approach was subperitoneal, via an iliac route. A dismembered pyeloplasty was performed in one case. All patients did well. Radiological follow-up at 6 and 12 months showed no recurrence in a well-functioning kidney. Surgical lithotomy is advocated for the treatment of urinary lithiasis in an ectopic kidney. It is a straightforward procedure that also permits correction of associated malformations.

  8. BP artificial neural network based wave front correction for sensor-less free space optics communication

    NASA Astrophysics Data System (ADS)

    Li, Zhaokun; Zhao, Xiaohui

    2017-02-01

    Sensor-less adaptive optics (AO) is one of the most promising methods to compensate for strong wavefront disturbance in free space optics communication (FSO). In this study, a back-propagation (BP) artificial neural network is applied to a sensor-less AO system to design a distortion-correction scheme. Compared with other model-based approaches, the method needs only one or a few online measurements to correct the wavefront distortion, which enhances the real-time capacity of the system and largely improves the Strehl ratio (SR). Numerical-simulation comparisons with the model-based and model-free correction methods proposed in Refs. [6,8,9,10] show the validity and advantage of the proposed method.
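
    The abstract does not specify the network architecture, so the sketch below only illustrates the idea: a small two-layer network trained with hand-written back-propagation to map intensity-style measurements onto correction coefficients. The layer sizes, the tanh stand-in relationship, and all training data are synthetic assumptions, not taken from the paper.

      # Minimal back-propagation network mapping intensity measurements to
      # wavefront-correction coefficients. Everything here is synthetic.
      import numpy as np

      rng = np.random.default_rng(0)
      n_in, n_hid, n_out = 16, 32, 5        # hypothetical layer sizes

      # Synthetic training pairs: intensity features -> correction coefficients
      X = rng.normal(size=(1000, n_in))
      Y = np.tanh(X @ rng.normal(size=(n_in, n_out)))   # stand-in relationship

      W1 = rng.normal(scale=0.1, size=(n_in, n_hid))
      W2 = rng.normal(scale=0.1, size=(n_hid, n_out))
      lr = 0.05

      for epoch in range(500):
          H = np.tanh(X @ W1)               # hidden activations
          P = H @ W2                        # predicted coefficients
          err = P - Y
          # Back-propagate the squared error through both layers
          gW2 = H.T @ err / len(X)
          gW1 = X.T @ ((err @ W2.T) * (1 - H**2)) / len(X)
          W2 -= lr * gW2
          W1 -= lr * gW1

      print("final MSE:", float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2)))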

  9. Effects of empirical versus model-based reflectance calibration on automated analysis of imaging spectrometer data: a case study from the Drum Mountains, Utah

    USGS Publications Warehouse

    Dwyer, John L.; Kruse, Fred A.; Lefkoff, Adam B.

    1995-01-01

    Data collected by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) have been calibrated to surface reflectance using an empirical method and an atmospheric model-based method. Single spectra extracted from both calibrated data sets for locations with known mineralogy compared favorably with laboratory and field spectral measurements of samples from the same locations. Generally, spectral features were somewhat subdued in data calibrated using the model-based method when compared with those calibrated using the empirical method. Automated feature extraction and expert system analysis techniques have been successfully applied to both data sets to produce similar endmember probability images and spectral endmember libraries. Linear spectral unmixing procedures applied to both calibrated data sets produced similar image maps. These comparisons demonstrated the utility of the model-based approach for atmospherically correcting imaging spectrometer data prior to extraction of scientific information. The results indicated that imaging spectrometer data can be calibrated and analyzed without a priori knowledge of the remote target.
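
    The empirical calibration compared in studies like this one is commonly an empirical line fit: per-band gains and offsets derived from pixels of ground targets with field-measured reflectance. The sketch below shows that fit on synthetic numbers; none of the values come from the actual AVIRIS data.

      # Empirical-line calibration sketch: fit reflectance = gain * radiance
      # + offset per band from two known targets, then apply to the cube.
      import numpy as np

      n_bands = 224                               # AVIRIS band count
      # At-sensor radiance of a bright and a dark calibration target
      rad = np.array([[80.0] * n_bands, [12.0] * n_bands])
      # Their field-measured reflectances
      ref = np.array([[0.60] * n_bands, [0.05] * n_bands])

      gain = (ref[0] - ref[1]) / (rad[0] - rad[1])   # exact two-point fit
      offset = ref[0] - gain * rad[0]

      cube = np.full((100, 100, n_bands), 45.0)      # synthetic radiance cube
      reflectance = gain * cube + offset             # broadcast over pixels
      print("sample reflectance:", reflectance[0, 0, 0])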

  10. Model based control of polymer composite manufacturing processes

    NASA Astrophysics Data System (ADS)

    Potaraju, Sairam

    2000-10-01

    The objective of this research is to develop tools that help process engineers design, analyze and control polymeric composite manufacturing processes to achieve higher productivity and cost reduction. Current techniques for process design and control of composite manufacturing suffer from the paucity of good process models that can accurately represent these non-linear systems. Existing models developed by researchers in the past are process- and operation-specific, so generating new simulation models is time-consuming and requires significant effort. To address this issue, an Object Oriented Design (OOD) approach is used to develop a component-based model building framework. Process models for two commonly used industrial processes (injected pultrusion and autoclave curing) are developed using this framework to demonstrate its flexibility. Steady-state and dynamic validation of the simulator is performed using a bench-scale injected pultrusion process. This simulator could not be implemented online for control due to computational constraints. Models that are fast enough for online implementation, with nearly the same degree of accuracy, are developed using a two-tier scheme. First, lower-dimensional models are formulated that capture the essential resin flow, heat transfer and cure kinetics important from a process monitoring and control standpoint. The second step is to reduce these low-dimensional models to Reduced Order Models (ROMs) suited for online model-based estimation, control and optimization. Model reduction is carried out using the Proper Orthogonal Decomposition (POD) technique in conjunction with a Galerkin formulation procedure. Subsequently, a nonlinear model-based estimation and inferential control scheme based on the ROM is implemented. In particular, this research work contributes in the following general areas: (1) Design and implementation of versatile frameworks for modeling and simulation of manufacturing processes using object
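
    To make the POD/Galerkin reduction step concrete, here is a minimal sketch: snapshots of a synthetic linear diffusion system are compressed with the SVD, and the operator is Galerkin-projected onto the leading modes. The system and its sizes are invented and far simpler than the resin-flow and cure-kinetics models described above.

      # POD model reduction sketch on a toy 1-D diffusion system.
      import numpy as np

      n, r = 200, 5                          # full and reduced dimensions
      A = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)   # diffusion stencil

      # Collect snapshots from a full-order explicit-Euler simulation
      x = np.sin(np.linspace(0, np.pi, n))
      dt, snaps = 1e-3, [x.copy()]
      for _ in range(300):
          x = x + dt * (A @ x)
          snaps.append(x.copy())
      S = np.array(snaps).T                  # snapshot matrix, n x 301

      U, _, _ = np.linalg.svd(S, full_matrices=False)
      Phi = U[:, :r]                         # leading POD modes

      Ar = Phi.T @ A @ Phi                   # Galerkin-projected operator
      z = Phi.T @ S[:, 0]                    # reduced initial condition
      for _ in range(300):
          z = z + dt * (Ar @ z)
      print("reduction error:", np.linalg.norm(Phi @ z - S[:, -1]))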

  11. Model-based feature construction for multivariate decoding

    PubMed Central

    Brodersen, Kay H.; Haiss, Florent; Ong, Cheng Soon; Jung, Fabienne; Tittgemeyer, Marc; Buhmann, Joachim M.; Weber, Bruno; Stephan, Klaas E.

    2011-01-01

    Conventional decoding methods in neuroscience aim to predict discrete brain states from multivariate correlates of neural activity. This approach faces two important challenges. First, a small number of examples are typically represented by a much larger number of features, making it hard to select the few informative features that allow for accurate predictions. Second, accuracy estimates and information maps often remain descriptive and can be hard to interpret. In this paper, we propose a model-based decoding approach that addresses both challenges from a new angle. Our method involves (i) inverting a dynamic causal model of neurophysiological data in a trial-by-trial fashion; (ii) training and testing a discriminative classifier on a strongly reduced feature space derived from trial-wise estimates of the model parameters; and (iii) reconstructing the separating hyperplane. Since the approach is model-based, it provides a principled dimensionality reduction of the feature space; in addition, if the model is neurobiologically plausible, decoding results may offer a mechanistically meaningful interpretation. The proposed method can be used in conjunction with a variety of modelling approaches and brain data, and supports decoding of either trial or subject labels. Moreover, it can supplement evidence-based approaches for model-based decoding and enable structural model selection in cases where Bayesian model selection cannot be applied. Here, we illustrate its application using dynamic causal modelling (DCM) of electrophysiological recordings in rodents. We demonstrate that the approach achieves significant above-chance performance and, at the same time, allows for a neurobiological interpretation of the results. PMID:20406688
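
    A toy version of this pipeline, with an ordinary least-squares fit standing in for the trial-wise model inversion of steps (i)-(ii) and a logistic regression as the discriminative classifier of step (iii), might look as follows; the design matrix and trial data are synthetic.

      # Model-based decoding sketch: classify trials on fitted model
      # parameters instead of raw data. A linear model stands in for DCM.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n_trials, n_samples = 120, 200
      t = np.linspace(0, 1, n_samples)
      design = np.column_stack([np.sin(2*np.pi*3*t), np.cos(2*np.pi*3*t)])

      labels = rng.integers(0, 2, n_trials)
      theta = np.where(labels[:, None] == 0, [1.0, 0.2], [0.2, 1.0])
      trials = theta @ design.T + 0.5 * rng.normal(size=(n_trials, n_samples))

      # (i)-(ii): per-trial "model inversion" = least-squares parameter fit
      theta_hat = np.linalg.lstsq(design, trials.T, rcond=None)[0].T

      # (iii): classifier on the strongly reduced (2-D) parameter space
      acc = cross_val_score(LogisticRegression(), theta_hat, labels, cv=5)
      print("decoding accuracy:", acc.mean())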

  12. The Evolution of Therapeutic Recreation.

    ERIC Educational Resources Information Center

    Riley, Bob; Skalko, Thomas K.

    1998-01-01

    Reviews elements that impact the delivery of therapeutic recreation services, emphasizing elements that are external to the discipline and influence practice and elements that are internal to the discipline and must be addressed if therapeutic recreation is to continue its evolution as a competitive health and human service discipline.…

  13. Language Patterns and Therapeutic Change.

    ERIC Educational Resources Information Center

    Phoenix, Valdemar G.; Lindeman, Mary L.

    Noting that the mental health practitioner needs highly developed linguistic and communicative skills in order to precipitate therapeutic changes, this paper discusses the nature of the contexts of therapeutic interaction. It examines verb tense as a linguistic context marker and shows how various schools of therapy can use it. In addition, it…

  14. Toward Constructing the Therapeutic System.

    ERIC Educational Resources Information Center

    Andolfi, Maurizio; Angelo, Claudio

    1988-01-01

    Describes the therapist as an active participant in the construction of the therapeutic system, explaining how the therapist constructs complex relationships within the evolving therapeutic process. Reevaluates the importance of the individual in the family as an agent of change and as a mediator of triangular relational messages. (Author/NB)

  15. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. Using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.
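
    To give a flavour of an executable device prototype, here is a hypothetical behavioural model of a PCA pump as a small state machine with one safety invariant (never exceed an hourly dose limit). The states, limits, and events are illustrative only, not taken from the paper or any real device.

      # Hypothetical PCA-pump behavioural model with a checkable invariant.
      from dataclasses import dataclass, field

      @dataclass
      class PCAPumpModel:
          hourly_limit_mg: float = 10.0
          bolus_mg: float = 1.0
          delivered_this_hour: float = 0.0
          state: str = "IDLE"
          log: list = field(default_factory=list)

          def press_button(self):
              if self.state != "IDLE":
                  return                          # locked out: ignore requests
              if self.delivered_this_hour + self.bolus_mg > self.hourly_limit_mg:
                  self.state = "LOCKED_OUT"       # safety requirement
                  self.log.append("bolus refused: hourly limit")
              else:
                  self.delivered_this_hour += self.bolus_mg
                  self.log.append("bolus delivered")

          def hour_tick(self):
              self.delivered_this_hour = 0.0
              self.state = "IDLE"

      pump = PCAPumpModel()
      for _ in range(12):                         # patient presses repeatedly
          pump.press_button()
      assert pump.delivered_this_hour <= pump.hourly_limit_mg   # invariant holds
      print(pump.state, pump.delivered_this_hour)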

  16. Purely optical navigation with model-based state prediction

    NASA Astrophysics Data System (ADS)

    Sendobry, Alexander; Graber, Thorsten; Klingauf, Uwe

    2010-10-01

    State-of-the-art Inertial Navigation Systems (INS) based on Micro-Electro-Mechanical Systems (MEMS) lack precision, especially in GPS-denied environments such as urban canyons or pure indoor missions. The proposed Optical Navigation System (ONS) provides bias-free ego-motion estimates using triple-redundant sensor information. In combination with a model-based state prediction, our system is able to estimate the velocity, position and attitude of an arbitrary aircraft. Simulating a high-performance flow-field estimator, the algorithm can compete with conventional low-cost INS. Because measured velocities are used instead of accelerations, the drift of the system states is much less pronounced than for an INS.
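
    A minimal sketch of the model-based prediction idea: a 1-D Kalman filter whose measurement update consumes velocity, as an ONS would provide, rather than integrated acceleration. The dynamics and noise levels are invented for illustration.

      # Constant-velocity model prediction corrected by velocity measurements.
      import numpy as np

      dt = 0.1
      F = np.array([[1.0, dt], [0.0, 1.0]])      # state: [position, velocity]
      H = np.array([[0.0, 1.0]])                 # only velocity is observed
      Q = np.diag([1e-4, 1e-3])                  # process noise
      R = np.array([[0.05**2]])                  # velocity measurement variance

      rng = np.random.default_rng(2)
      x_true = np.array([0.0, 1.0])
      x, P = np.zeros(2), np.eye(2)

      for _ in range(100):
          x_true = F @ x_true                    # true constant-velocity motion
          z = H @ x_true + rng.normal(0, 0.05, 1)
          x = F @ x                              # predict with the motion model
          P = F @ P @ F.T + Q
          S = H @ P @ H.T + R                    # correct with measured velocity
          K = P @ H.T @ np.linalg.inv(S)
          x = x + K @ (z - H @ x)
          P = (np.eye(2) - K @ H) @ P

      print("estimate:", x, " truth:", x_true)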

  17. A Cyber-Attack Detection Model Based on Multivariate Analyses

    NASA Astrophysics Data System (ADS)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and cluster analysis. We quantify the observed qualitative audit event sequence via quantification method IV and group similar audit event sequences based on the cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in some realistic cases where normal and attack activities are intermingled.
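
    The quantification step is not reproduced here, but the clustering stage can be illustrated roughly: a simple event-frequency encoding stands in for the quantified sequences, and a plain k-means stands in for the cluster analysis. All sessions are synthetic.

      # Group audit sessions by k-means on event-frequency vectors.
      import numpy as np

      rng = np.random.default_rng(3)
      # Synthetic sessions over events (login, read, write, exec):
      # normal sessions are read-heavy, attack sessions are exec-heavy.
      normal = rng.multinomial(50, [0.3, 0.5, 0.15, 0.05], size=40)
      attack = rng.multinomial(50, [0.1, 0.1, 0.2, 0.6], size=10)
      X = np.vstack([normal, attack]).astype(float)

      k = 2
      centers = X[[0, len(X) - 1]].copy()        # one seed from each regime
      for _ in range(20):                        # plain k-means iterations
          dists = np.linalg.norm(X[:, None] - centers[None], axis=2)
          assign = dists.argmin(axis=1)
          centers = np.array([X[assign == j].mean(axis=0) for j in range(k)])

      print("cluster sizes:", np.bincount(assign, minlength=k))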

  18. Stabilization of model-based networked control systems

    SciTech Connect

    Miranda, Francisco; Abreu, Carlos; Mendes, Paulo M.

    2016-06-08

    A class of networked control systems called Model-Based Networked Control Systems (MB-NCSs) is considered. Stabilization of MB-NCSs is studied using feedback controls, and stabilization is simulated for different feedbacks with the purpose of reducing network traffic. The feedback control input is applied to a compensated model of the plant that approximates the plant dynamics and stabilizes the plant even under slow network conditions. Conditions are derived for global exponential stabilizability and for choosing a feedback control input for a given constant time between the information update moments of the network. An optimal control problem to obtain an optimal feedback control is also presented.
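
    The scheme lends itself to a short simulation: the controller acts on a local model of the plant, and the model state is re-synchronised with the true plant state only at periodic network update instants. The plant, the model mismatch, and the gain below are illustrative, not taken from the paper.

      # Model-based networked control: feedback computed from a local model,
      # re-synchronised with the plant every h seconds over the "network".
      import numpy as np

      A = np.array([[0.0, 1.0], [2.0, -1.0]])       # unstable plant
      A_hat = A + 0.05                               # slightly mismatched model
      B = np.array([[0.0], [1.0]])
      K = np.array([[-12.0, -4.0]])                  # stabilizing feedback gain

      dt, h = 0.001, 0.2                             # step size, update interval
      x = np.array([[1.0], [0.0]])                   # plant state
      x_hat = x.copy()                               # model state at controller

      for step in range(int(10 / dt)):
          u = K @ x_hat                              # control from the model
          x = x + dt * (A @ x + B @ u)               # plant evolves
          x_hat = x_hat + dt * (A_hat @ x_hat + B @ u)
          if step % int(h / dt) == 0:                # network transmission
              x_hat = x.copy()                       # re-synchronise the model

      print("plant state after 10 s:", x.ravel())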

  19. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared with expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
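
    The detection logic as described can be sketched in a few lines: form the modeling error against the expected value, test its significance, and flag an anomaly only when the significance persists. The thresholds and data below are invented.

      # Residual significance with a persistence gate before flagging.
      import numpy as np

      rng = np.random.default_rng(4)
      expected = np.zeros(300)                      # model-predicted response
      measured = rng.normal(0.0, 1.0, 300)
      measured[200:] += 6.0                         # structural change at t=200

      sigma, z_crit = 1.0, 3.0                      # residual std, significance
      persistence_needed, run = 20, 0

      for t, (m, e) in enumerate(zip(measured, expected)):
          z = abs(m - e) / sigma                    # modeling-error significance
          run = run + 1 if z > z_crit else 0        # persistence counter
          if run >= persistence_needed:
              print(f"structural anomaly flagged at sample {t}")
              break
      else:
          print("no anomaly detected")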

  20. Temporal and contextual knowledge in model-based expert systems

    NASA Technical Reports Server (NTRS)

    Toth-Fejel, Tihamer; Heher, Dennis

    1987-01-01

    A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP which, when executed, verifies that the domain expert did not make any mistakes. The Achilles' heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.