Science.gov

Sample records for model-based therapeutic correction

  1. Correction of placement error in EBL using model based method

    NASA Astrophysics Data System (ADS)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-10-01

    The main source of placement error in maskmaking using electron beam lithography is charging. The DISPLACE software provides a method to correct placement errors for any layout based on a physical model. The charging of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effect correction. The output of the software is the data for placement correction. Unknown physical parameters such as fogging can be found from calibration experiments. A test layout on a single calibration mask was used to calibrate the physical parameters of the correction model, and the extracted model parameters were used to verify the correction. As an ultimate test, a sophisticated layout very different from the calibration mask was used for verification. The placement correction results were predicted by DISPLACE, and the mask was fabricated and measured. Good correlation between the measured and predicted correction values over the entire mask with this complex pattern confirmed the high accuracy of the charging placement error correction.

  2. Model-based correction of the influence of body position on continuous segmental and hand-to-foot bioimpedance measurements.

    PubMed

    Medrano, Guillermo; Eitner, Frank; Walter, Marian; Leonhardt, Steffen

    2010-06-01

    Bioimpedance spectroscopy (BIS) is suitable for continuous monitoring of body water content. The combination of body posture and time is a well-known source of error, which limits the accuracy and therapeutic validity of BIS measurements. This study evaluates a model-based correction as a possible solution. For this purpose, an 11-cylinder model representing body impedance distribution is used. Each cylinder contains a nonlinear two-pool model to describe fluid redistribution due to changing body position and its influence on segmental and hand-to-foot (HF) bioimpedance measurements. A model-based correction of segmental (thigh) and HF measurements (Xitron Hydra 4200) in nine healthy human subjects (following a sequence of 7 min supine, 20 min standing, 40 min supine) has been evaluated. The model-based compensation algorithm represents a compromise between accuracy and simplicity, and reduces the influence of changes in body position on the measured extracellular resistance and extracellular fluid by up to 75 and 70%, respectively.

  3. Model-based motion correction of reduced field of view diffusion MRI data

    NASA Astrophysics Data System (ADS)

    Hering, Jan; Wolf, Ivo; Meinzer, Hans-Peter; Maier-Hein, Klaus H.

    2014-03-01

    In clinical settings, application of the most recent modelling techniques is usually unfeasible due to the limited acquisition time. Localised acquisitions enclosing only the object of interest by reducing the field-of-view (FOV) counteract the time limitation but pose new challenges to the subsequent processing steps like motion correction. We use datasets from the Human Connectome Project (HCP) to simulate head motion distorted reduced FOV acquisitions and present an evaluation of head motion correction approaches: the commonly used affine registration onto an unweighted reference image guided by the mutual information (MI) metric and a model-based approach, which uses reference images computed from approximated tensor data to improve the performance of the MI metric. While the standard approach using the MI metric yields up to 15% outliers (error>5 mm) and a mean spatial error above 1.5 mm, the model-based approach reduces the number of outliers (1%) and the spatial error significantly (p<0.01). The behavior is also reflected by the visual analysis of the MI metric. The evaluation shows that the MI metric is of very limited use for reduced FOV data post-processing. The model-based approach has proven more suitable in this context.

  4. Dynamic aberration correction for conformal optics using model-based wavefront sensorless adaptive optics

    NASA Astrophysics Data System (ADS)

    Han, Xinli; Dong, Bing; Li, Yan; Wang, Rui; Hu, Bin

    2016-10-01

    For missiles and aircraft flying at high Mach numbers, a traditional spherical or flat window causes considerable aerodynamic drag. A conformal window that follows the general contour of the surrounding surface can substantially decrease drag and extend the operational range. However, the local shape of a conformal window changes across the Field Of Regard (FOR), leading to time-varying, FOR-dependent wavefront aberration and degraded images, so correction of the dynamic aberration is necessary. In this paper, a model-based Wavefront Sensorless Adaptive Optics (WSAO) algorithm is investigated by both simulation and experiment for a centrally obscured pupil. The algorithm proved effective, and correction using DM modes achieved higher accuracy than Lukosz modes. For the dynamic aberration in our system, the Strehl ratio (SR) can exceed 0.8 when the change in look angle is less than 2° over the time delay of the control system.

  5. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics.

    PubMed

    Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin

    2016-09-02

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberration varying with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and the low spatial frequency content of the image spectral density is used as the metric function. Simulations show that aberrations induced by the conformal window are dominated by a few low-order Lukosz modes. To optimize the dynamic correction, only the dominant Lukosz modes need to be corrected, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window with a scanning rate of 10 degrees per second, and a 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10⁻⁵ with optimized correction and 1.427 × 10⁻⁵ with un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method.
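
    As a concrete illustration of the metric described above, the sketch below computes a low-spatial-frequency image-quality metric from the image power spectrum; this is the kind of quantity a model-based WSLAO loop would maximize while adjusting mode coefficients. It is purely illustrative (the cutoff fraction and test image are invented), not the authors' implementation.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def low_frequency_metric(image, cutoff_fraction=0.1):
        """Low-spatial-frequency content of the image power spectrum,
        normalised by the DC term (usable as a WSLAO sharpness metric)."""
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
        ny, nx = image.shape
        y, x = np.ogrid[-(ny // 2):ny - ny // 2, -(nx // 2):nx - nx // 2]
        radius = np.hypot(y / (ny / 2.0), x / (nx / 2.0))   # normalised spatial frequency
        band = (radius > 0) & (radius < cutoff_fraction)    # exclude DC, keep low frequencies
        return spectrum[band].sum() / spectrum[ny // 2, nx // 2]

    # A sharp target should score higher than a blurred (aberrated) copy of itself.
    target = np.zeros((128, 128))
    target[48:80, 48:80] = 1.0
    print(low_frequency_metric(target) > low_frequency_metric(gaussian_filter(target, 5.0)))
    ```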

  6. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics

    PubMed Central

    Dong, Bing; Li, Yan; Han, Xin-li; Hu, Bin

    2016-01-01

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberration varying with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and the low spatial frequency content of the image spectral density is used as the metric function. Simulations show that aberrations induced by the conformal window are dominated by a few low-order Lukosz modes. To optimize the dynamic correction, only the dominant Lukosz modes need to be corrected, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window with a scanning rate of 10 degrees per second, and a 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10⁻⁵ with optimized correction and 1.427 × 10⁻⁵ with un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method. PMID:27598161

  7. An efficient method for transfer cross coefficient approximation in model based optical proximity correction

    NASA Astrophysics Data System (ADS)

    Sabatier, Romuald; Fossati, Caroline; Bourennane, Salah; Di Giacomo, Antonio

    2008-10-01

    Model-Based Optical Proximity Correction (MBOPC) has for a decade been a widely used technique that makes it possible to achieve resolutions on silicon smaller than the wavelength used in commercially available photolithography tools. This is an important point, because mask dimensions are continuously shrinking. For current masks, several billion segments have to be moved, and several iterations are needed to reach convergence. Therefore, fast and accurate algorithms are mandatory to perform OPC on a mask in a reasonably short time for industrial purposes. As imaging with an optical lithography system is similar to microscopy, the theory used in MBOPC is drawn from work originally conducted for the theory of microscopy. Fourier optics was first developed by Abbe to describe the image formed by a microscope and is often referred to as the Abbe formulation. This is one of the best methods for optimizing illumination and is used in most commercially available lithography simulation packages. The Hopkins method, developed later in 1951, is the best method for mask optimization. Consequently, the Hopkins formulation, widely used for partially coherent illumination, and thus for lithography, is present in most commercially available OPC tools. This formulation has the advantage of a four-dimensional transmission function that is independent of the mask layout. The values of this function, called the Transfer Cross Coefficients (TCC), describe the illumination and projection pupils. Commonly used algorithms that employ the TCC of the Hopkins formulation to compute aerial images during MBOPC are based on decomposing the TCC into its eigenvectors using matricization and the well-known Singular Value Decomposition (SVD). These techniques, which rely on numerical approximation and an empirical choice of the number of retained eigenvectors, lose information and may not match reality. They also remain highly runtime consuming. We propose an
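
    For orientation, the SOCS (sum of coherent systems) decomposition referred to above expands the Hopkins TCC into eigenvalue/eigenvector pairs and forms the aerial image as a weighted sum of coherent images. A minimal sketch of that image-formation step follows; the Gaussian "kernels" are invented stand-ins for real TCC eigenvectors, so this is illustrative only.

    ```python
    import numpy as np

    def aerial_image_socs(mask, kernels, weights):
        """Aerial intensity I = sum_k w_k * |(h_k * mask)|^2, where h_k are the
        TCC eigenvectors (SOCS kernels) and w_k the corresponding eigenvalues.
        Convolutions are evaluated as products in the Fourier domain."""
        mask_ft = np.fft.fft2(mask)
        image = np.zeros(mask.shape, dtype=float)
        for w, h in zip(weights, kernels):
            field = np.fft.ifft2(mask_ft * np.fft.fft2(h))
            image += w * np.abs(field) ** 2
        return image

    # Toy example: two Gaussian kernels standing in for the leading eigenvectors.
    n = 64
    yy, xx = np.mgrid[:n, :n]
    gauss = lambda s: np.fft.ifftshift(np.exp(-((xx - n / 2) ** 2 + (yy - n / 2) ** 2) / (2 * s ** 2)))
    mask = np.zeros((n, n)); mask[:, 28:36] = 1.0        # a single line feature
    print(aerial_image_socs(mask, [gauss(2.0), gauss(4.0)], [1.0, 0.3]).shape)
    ```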

  8. Model based correction of placement error in EBL and its verification

    NASA Astrophysics Data System (ADS)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-05-01

    In maskmaking, the main source of error contributing to placement error is charging. DISPLACE software corrects the placement error for any layout, based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effects correction. The output of the software is the data for placement correction. One important step is the calibration of physical model. A test layout on a single calibration mask was used for calibration. The extracted model parameters were used to verify the correction. As an ultimate test for the correction, a sophisticated layout was used for the verification that was very different from the calibration mask. The placement correction results were predicted by DISPLACE. A good correlation of the measured and predicted values of the correction confirmed the high accuracy of the charging placement error correction.

  9. Sandmeier model based topographic correction to lunar spectral profiler (SP) data from KAGUYA satellite.

    PubMed

    Chen, Sheng-Bo; Wang, Jing-Ran; Guo, Peng-Ju; Wang, Ming-Chang

    2014-09-01

    The Moon may be considered the frontier base for deep space exploration. Spectral analysis is one of the key techniques for determining lunar surface rock and mineral compositions, but lunar topographic relief is more pronounced than that of the Earth, so it is necessary to apply a topographic correction to lunar spectral data before they are used to retrieve compositions. In the present paper, a lunar Sandmeier model is proposed that accounts for the radiance effects of macro and ambient topographic relief, and a reflectance correction model is derived based on the Sandmeier model. Spectral Profiler (SP) data from the KAGUYA satellite over the Sinus Iridum quadrangle are taken as an example, and digital elevation data from the Lunar Orbiter Laser Altimeter are used to calculate the slope, aspect, incidence and emergence angles, and terrain-viewing factor for the topographic correction. The lunar surface reflectance from the SP data was then corrected by the proposed model after the direct component of irradiance on a horizontal surface was derived. As a result, the high reflectance of sun-facing slopes is decreased and the low reflectance of slopes facing away from the sun is compensated. The histogram of corrected-reflectance pixel counts follows a Gaussian distribution. The model is therefore robust for correcting the lunar topographic effect and estimating lunar surface reflectance.
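
    The full Sandmeier-type model separates direct and ambient irradiance over tilted terrain; as a much simpler point of reference, the sketch below applies the basic cosine topographic correction, in which the local solar incidence angle is computed from slope, aspect and solar geometry. This is illustrative only and is not the paper's model; all numbers are invented.

    ```python
    import numpy as np

    def cosine_topographic_correction(reflectance, slope, aspect,
                                      solar_zenith, solar_azimuth):
        """Simple cosine correction: rho_flat = rho_obs * cos(theta_s) / cos(i),
        where i is the local solar incidence angle on the tilted facet.
        All angles are in radians; arrays broadcast elementwise."""
        cos_i = (np.cos(solar_zenith) * np.cos(slope) +
                 np.sin(solar_zenith) * np.sin(slope) *
                 np.cos(solar_azimuth - aspect))
        cos_i = np.clip(cos_i, 1e-3, 1.0)        # avoid blow-up near grazing incidence
        return reflectance * np.cos(solar_zenith) / cos_i

    # A sun-facing slope has its reflectance reduced, a slope facing away from the
    # sun is compensated, matching the behaviour described in the abstract.
    print(cosine_topographic_correction(0.2, np.radians(20), np.radians(180),
                                        np.radians(40), np.radians(180)))
    print(cosine_topographic_correction(0.2, np.radians(20), np.radians(0),
                                        np.radians(40), np.radians(180)))
    ```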

  10. Shape-dependent dose margin correction using model-based mask data preparation

    NASA Astrophysics Data System (ADS)

    Kimura, Yasuki; Yamamoto, Ryuuji; Kubota, Takao; Kouno, Kenji; Matsushita, Shohei; Hagiwara, Kazuyuki; Hara, Daisuke

    2012-11-01

    Dose margin has always been known to be a critical factor in mask making. This paper describes why the issue is more critical than ever with the 20-nm logic node and beyond using ArF immersion lithography. Model-Based Mask Data Preparation (MB-MDP) has previously been presented [references] as a way to reduce shot count for these complex masks. This paper shows that MB-MDP also improves the dose margin. The improvement predicted by theoretical simulation with D2S is confirmed by the results of a real mask written by HOYA on a JEOL JBX-3200MV.

  11. Correcting encoder interpolation error on the Green Bank Telescope using an iterative model based identification algorithm

    NASA Astrophysics Data System (ADS)

    Franke, Timothy; Weadon, Tim; Ford, John; Garcia-Sanz, Mario

    2015-10-01

    Various forms of measurement error limit telescope tracking performance in practice. A new method for identifying the correction coefficients for encoder interpolation error is developed. The algorithm corrects the encoder measurement by identifying a harmonic model of the system and using that model to compute the necessary correction parameters. The approach improves upon others by explicitly modeling the unknown dynamics of the structure and controller and by not requiring a separate system identification to be performed. Experience gained from pinpointing the source of encoder error on the Green Bank Radio Telescope (GBT) is presented. Several tell-tale indicators of encoder error are discussed. Experimental data from the telescope, tested with two different encoders, are presented. Demonstration of the identification methodology on the GBT as well as details of its implementation are discussed. The root mean square tracking error was reduced from 0.68 to 0.21 arcsec by changing encoders, and further to 0.10 arcsec with the calibration algorithm. In particular, the ubiquity of this error source is shown, as well as how careful correction makes it possible to go beyond the advertised accuracy of an encoder.
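
    The final correction step described above amounts to subtracting a harmonic model of the interpolation error from the raw encoder reading. A minimal sketch of that step (illustrative only; the coefficients below are invented, and identifying them is the substantive part of the paper):

    ```python
    import numpy as np

    def correct_encoder(theta_measured, harmonics):
        """Subtract a harmonic model of interpolation error from the raw reading.

        harmonics: iterable of (order k, a_k, b_k); the error is modelled as
        sum_k a_k * sin(k * theta) + b_k * cos(k * theta).
        """
        error = sum(a * np.sin(k * theta_measured) + b * np.cos(k * theta_measured)
                    for k, a, b in harmonics)
        return theta_measured - error

    # Hypothetical coefficients (radians) identified from tracking data.
    coeffs = [(1, 2.0e-6, -1.0e-6), (2, 5.0e-7, 3.0e-7)]
    print(correct_encoder(np.linspace(0.0, 2.0 * np.pi, 5), coeffs))
    ```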

  12. Model-based correction of velocity measurements in navigated 3-D ultrasound imaging during neurosurgical interventions.

    PubMed

    Iversen, Daniel Hoyer; Lindseth, Frank; Unsgaard, Geirmund; Torp, Hans; Lovstakken, Lasse

    2013-09-01

    In neurosurgery, information on blood flow is important to identify and avoid damage to important vessels. Three-dimensional intraoperative ultrasound color-Doppler imaging has proven useful in this respect. However, due to Doppler angle dependencies and the complexity of the vascular architecture, clinically valuable 3-D information on flow direction and velocity is currently not available. In this work, we aim to correct for angle dependencies in 3-D flow images based on a geometric model of the neurovascular tree generated on-the-fly from free-hand 2-D imaging and an accurate position sensor system. The 3-D vessel model acts as a priori information of vessel orientation used to angle-correct the Doppler measurements, as well as to provide an estimate of the average flow direction. Based on the flow direction we were also able to perform aliasing correction to approximately double the measurable velocity range. In vitro experiments revealed a high accuracy and robustness for estimating the mean direction of flow. Accurate angle correction of axial velocities was possible given a sufficient beam-to-flow angle for at least parts of a vessel segment. In vitro experiments showed an absolute relative bias of 9.5% for a challenging low-flow scenario. The method also showed promising results in vivo, improving the depiction of flow in the distal branches of intracranial aneurysms and the feeding arteries of an arteriovenous malformation. Careful inspection by an experienced surgeon confirmed the correct flow direction for all in vivo examples.
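
    The core angle correction referred to above divides the measured axial velocity by the cosine of the beam-to-flow angle obtained from the vessel model. A minimal sketch (illustrative; the vessel direction, beam direction and rejection threshold are assumptions, not taken from the paper):

    ```python
    import numpy as np

    def angle_correct_velocity(v_axial, beam_dir, vessel_dir, max_angle_deg=70.0):
        """Correct an axial Doppler velocity using an estimated vessel direction.

        v_true = v_axial / cos(angle between beam and vessel). Estimates at
        near-perpendicular geometry (angle > max_angle_deg) are rejected,
        mirroring the 'sufficient beam-to-flow angle' requirement above.
        """
        beam = np.asarray(beam_dir, float) / np.linalg.norm(beam_dir)
        vessel = np.asarray(vessel_dir, float) / np.linalg.norm(vessel_dir)
        cos_angle = abs(np.dot(beam, vessel))
        if cos_angle < np.cos(np.radians(max_angle_deg)):
            return np.nan                  # geometry too unfavourable to correct
        return v_axial / cos_angle

    # 45 degrees between beam and vessel: a 0.30 m/s axial estimate becomes ~0.42 m/s.
    print(angle_correct_velocity(0.30, beam_dir=[0, 0, 1], vessel_dir=[0, 1, 1]))
    ```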

  13. Effects of model-based physiological noise correction on default mode network anti-correlations and correlations.

    PubMed

    Chang, Catie; Glover, Gary H

    2009-10-01

    Previous studies have reported that the spontaneous, resting-state time course of the default-mode network is negatively correlated with that of the "task-positive network", a collection of regions commonly recruited in demanding cognitive tasks. However, all studies of negative correlations between the default-mode and task-positive networks have employed some form of normalization or regression of the whole-brain average signal ("global signal"); these processing steps alter the time series of voxels in an uninterpretable manner as well as introduce spurious negative correlations. Thus, the extent of negative correlations with the default mode network without global signal removal has not been well characterized, and it has recently been hypothesized that the apparent negative correlations in many of the task-positive regions could be artifactually induced by global signal pre-processing. The present study aimed to examine negative and positive correlations with the default-mode network when model-based corrections for respiratory and cardiac noise are applied in lieu of global signal removal. Physiological noise correction consisted of (1) removal of time-locked cardiac and respiratory artifacts using RETROICOR (Glover, G.H., Li, T.Q., Ress, D., 2000. Image-based method for retrospective correction of physiological motion effects in fMRI: RETROICOR. Magn. Reson. Med. 44, 162-167), and (2) removal of low-frequency respiratory and heart rate variations by convolving these waveforms with pre-determined transfer functions (Birn et al., 2008; Chang et al., 2009) and projecting the resulting two signals out of the data. It is demonstrated that negative correlations between the default-mode network and regions of the task-positive network are present in the majority of individual subjects both with and without physiological noise correction. Physiological noise correction increased the spatial extent and magnitude of negative correlations, yielding negative
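
    RETROICOR, cited above, removes cardiac- and respiration-locked artifacts by regressing out low-order Fourier expansions of the cardiac and respiratory phases. A minimal sketch of building such regressors and projecting them out of one voxel time series (illustrative only; it assumes the phase traces are already available and uses synthetic rates):

    ```python
    import numpy as np

    def retroicor_regressors(cardiac_phase, resp_phase, order=2):
        """Fourier-series regressors in cardiac and respiratory phase (RETROICOR-style)."""
        cols = []
        for m in range(1, order + 1):
            for phase in (cardiac_phase, resp_phase):
                cols += [np.cos(m * phase), np.sin(m * phase)]
        return np.column_stack(cols)

    def remove_physio(timeseries, regressors):
        """Project the physiological regressors (plus a constant) out of a time series."""
        X = np.column_stack([np.ones(len(timeseries)), regressors])
        beta, *_ = np.linalg.lstsq(X, timeseries, rcond=None)
        return timeseries - X @ beta + timeseries.mean()     # keep the original mean

    # Synthetic example: a slow signal contaminated by a cardiac-locked oscillation.
    t = np.arange(200) * 2.0                                  # TR = 2 s (assumed)
    cardiac = (2 * np.pi * 1.1 * t) % (2 * np.pi)             # ~1.1 Hz cardiac phase
    resp = (2 * np.pi * 0.2 * t) % (2 * np.pi)                # ~0.2 Hz respiratory phase
    signal = np.sin(2 * np.pi * 0.01 * t) + 0.5 * np.cos(cardiac)
    cleaned = remove_physio(signal, retroicor_regressors(cardiac, resp))
    print(round(np.std(signal), 3), round(np.std(cleaned), 3))   # variance drops after correction
    ```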

  14. A model-based scatter artifacts correction for cone beam CT

    PubMed Central

    Zhao, Wei; Vernekohl, Don; Zhu, Jun; Wang, Luyao; Xing, Lei

    2016-01-01

    Purpose: Due to the increased axial coverage of multislice computed tomography (CT) and the introduction of flat detectors, the size of x-ray illumination fields has grown dramatically, causing an increase in scatter radiation. For CT imaging, scatter is a significant issue that introduces shading artifacts, streaks, and reduced contrast and Hounsfield unit (HU) accuracy. The purpose of this work is to provide a fast and accurate scatter artifacts correction algorithm for cone beam CT (CBCT) imaging. Methods: The method starts with an estimation of coarse scatter profiles for a set of CBCT data in either the image domain or the projection domain. A denoising algorithm designed specifically for Poisson signals is then applied to derive the final scatter distribution. Qualitative and quantitative evaluations using thorax and abdomen phantoms with Monte Carlo (MC) simulations, experimental Catphan phantom data, and in vivo human data acquired for a clinical image guided radiation therapy were performed. Scatter correction in both projection domain and image domain was conducted and the influences of segmentation method, mismatched attenuation coefficients, and spectrum model as well as parameter selection were also investigated. Results: Results show that the proposed algorithm can significantly reduce scatter artifacts and recover the correct HU in either projection domain or image domain. For the MC thorax phantom study, four-component segmentation yields the best results, while the results of three-component segmentation are still acceptable. The parameters (iteration number K and weight β) affect the accuracy of the scatter correction and the results improve as K and β increase. It was found that variations in attenuation coefficient accuracy only slightly impact the performance of the proposed processing. For the Catphan phantom data, the mean value over all pixels in the residual image is reduced from −21.8 to −0.2 HU and 0.7 HU for projection
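
    Once a denoised scatter estimate is available, projection-domain correction reduces to subtracting it from the measured intensities before the log transform. The sketch below shows only that last step with invented numbers; it is not the paper's scatter-estimation or Poisson-denoising algorithm.

    ```python
    import numpy as np

    def scatter_corrected_projection(measured, scatter_estimate, air_norm, eps=1.0):
        """Subtract a scatter estimate from the measured intensity, then form the
        line integral p = -ln(I_primary / I_0) used for CBCT reconstruction."""
        primary = np.clip(measured - scatter_estimate, eps, None)   # keep counts positive
        return -np.log(primary / air_norm)

    # Toy numbers: scatter inflates the measured intensity, biasing p low
    # (the shading and HU loss mentioned above); subtraction restores it.
    measured, scatter, air = 1500.0, 400.0, 10000.0
    print(-np.log(measured / air), scatter_corrected_projection(measured, scatter, air))
    ```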

  15. Probabilistic model based error correction in a set of various mutant sequences analyzed by next-generation sequencing.

    PubMed

    Aita, Takuyo; Ichihashi, Norikazu; Yomo, Tetsuya

    2013-12-01

    To analyze the evolutionary dynamics of a mutant population in an evolutionary experiment, it is necessary to sequence a vast number of mutants by high-throughput (next-generation) sequencing technologies, which enable rapid and parallel analysis of multikilobase sequences. However, the observed sequences include many errors of base call. Therefore, if next-generation sequencing is applied to analysis of a heterogeneous population of various mutant sequences, it is necessary to discriminate between true bases as point mutations and errors of base call in the observed sequences, and to subject the sequences to error-correction processes. To address this issue, we have developed a novel method of error correction based on the Potts model and a maximum a posteriori probability (MAP) estimate of its parameters corresponding to the "true sequences". Our method of error correction utilizes (1) the "quality scores" which are assigned to individual bases in the observed sequences and (2) the neighborhood relationship among the observed sequences mapped in sequence space. The computer experiments of error correction of artificially generated sequences supported the effectiveness of our method, showing that 50-90% of errors were removed. Interestingly, this method is analogous to a probabilistic model based method of image restoration developed in the field of information engineering.
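
    As a loose illustration of the two ingredients named above (per-base quality scores and agreement among neighbouring reads), the toy sketch below replaces a base call only when it is low-confidence and the neighbours strongly agree on a different base. It does not reproduce the paper's Potts-model MAP estimate; thresholds and inputs are invented.

    ```python
    from collections import Counter

    def correct_base(observed, phred_quality, neighbor_bases, min_support=0.8):
        """Toy quality-plus-neighbourhood correction for a single base call.

        observed: called base; phred_quality: its Phred score (error prob 10^(-Q/10));
        neighbor_bases: bases observed at the same position in similar reads.
        """
        p_error = 10 ** (-phred_quality / 10.0)
        consensus, count = Counter(neighbor_bases).most_common(1)[0]
        support = count / len(neighbor_bases)
        if consensus != observed and support >= min_support and p_error > 0.01:
            return consensus
        return observed

    print(correct_base("A", 12, list("GGGGGGGGGA")))   # low-quality call, strong G consensus -> G
    print(correct_base("A", 35, list("GGGGGGGGGA")))   # high-quality call is kept -> A
    ```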

  16. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror.

    PubMed

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-12-10

    In this paper, the actuation behavior of a two-axis electrothermal MEMS (Microelectromechanical Systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that an actuation characteristics difference of as much as 4.0% exists among the four actuators due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror solely depends on the model prediction and does not need the real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system.
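
    In spirit, the open-loop scheme above inverts the identified voltage-to-position model to obtain the four drive voltages for a desired actuator-tip pattern. A linearised sketch follows; the gain matrix is invented and the real model is nonlinear and thermally coupled, so this is illustrative only.

    ```python
    import numpy as np

    # Hypothetical small-signal gain matrix: vertical tip displacement (um) of each
    # of the four actuators per volt applied to each actuator. Off-diagonal terms
    # stand in for thermal-mechanical coupling; all values are invented.
    G = np.array([[5.0, 0.3, 0.1, 0.3],
                  [0.3, 5.2, 0.3, 0.1],
                  [0.1, 0.3, 4.9, 0.3],
                  [0.3, 0.1, 0.3, 5.1]])

    def voltages_for_pose(target_displacements, gain=G):
        """Open-loop drive: solve gain @ V = target tip displacements for V."""
        return np.linalg.solve(gain, target_displacements)

    # Tip/tilt the mirror plate by requesting an asymmetric displacement pattern (um).
    target = np.array([10.0, 10.0, 2.0, 2.0])
    V = voltages_for_pose(target)
    print(V, G @ V)   # G @ V reproduces the requested displacements
    ```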

  17. A fully model-based MPC solution including VSB shot dose assignment and shape correction

    NASA Astrophysics Data System (ADS)

    Bork, Ingo; Buck, Peter; Reddy, Murali; Durvasula, Bhardwaj

    2015-10-01

    The value of using multiple dose levels for individual shots on VSB (Variable Shaped Beam) mask writers has been demonstrated earlier [1][2]. The main advantage of modulating dose on a per shot basis is the fact that higher dose levels can be used selectively for critical features while other areas of the mask with non-critical feature types can be exposed at lower dose levels. This reduces the amount of backscattering and mask write time penalty compared to a global overdose-undersize approach. While dose assignment to certain polygons or parts of polygons (VSB shots) can easily be accomplished via DRC rules on layers with limited shape variations like contact or VIA layers, it can be challenging to come up with consistent rules for layers consisting of a very broad range of shapes, generally found on metal layers. This work introduces a method for fully model-based modulation of shot dose for VSB machines supporting between two and eight dose levels and demonstrates results achieved with this method.

  18. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror

    PubMed Central

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-01-01

    In this paper, the actuation behavior of a two-axis electrothermal MEMS (Microelectromechanical Systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that an actuation characteristics difference of as much as 4.0% exists among the four actuators due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror solely depends on the model prediction and does not need the real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system. PMID:26690432

  19. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking

    PubMed Central

    Lee, Min Su; Ju, Hojin; Song, Jin Woo; Park, Chan Gook

    2015-01-01

    In this paper, we present a method for finding the enhanced heading and position of pedestrians by fusing Zero velocity UPdaTe (ZUPT)-based pedestrian dead reckoning (PDR) with the kinematic constraints of the lower human body. ZUPT is a well-known algorithm for PDR and provides a sufficiently accurate position solution over short periods, but it cannot guarantee a stable and reliable heading because it suffers from magnetic disturbance when determining heading angles, which degrades the overall position accuracy as time passes. The basic idea of the proposed algorithm is to integrate the left and right foot positions obtained by ZUPTs with the heading and position information from an IMU mounted on the waist. To integrate this information, a kinematic model of the lower human body, calculated using orientation sensors mounted on both thighs and calves, is adopted. We note that the positions of the left and right feet cannot drift apart because of the kinematic constraints of the body, so the kinematic model generates new measurements for the waist position. An Extended Kalman Filter (EKF) on the waist data estimates and corrects error states using these measurements together with magnetic heading measurements, which enhances the heading accuracy. The updated position information is fed back to the foot-mounted sensors, and reupdate processes are performed to correct the position error of each foot. The proposed update-reupdate technique consequently ensures improved observability of error states and position accuracy. Moreover, the proposed method provides all the information about the lower human body, so it can be applied more effectively to motion tracking. The effectiveness of the proposed algorithm is verified via experimental results, which show that a 1.25% Return Position Error (RPE) with respect to walking distance is achieved. PMID:26561814

  20. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking.

    PubMed

    Lee, Min Su; Ju, Hojin; Song, Jin Woo; Park, Chan Gook

    2015-11-06

    In this paper, we present a method for finding the enhanced heading and position of pedestrians by fusing Zero velocity UPdaTe (ZUPT)-based pedestrian dead reckoning (PDR) with the kinematic constraints of the lower human body. ZUPT is a well-known algorithm for PDR and provides a sufficiently accurate position solution over short periods, but it cannot guarantee a stable and reliable heading because it suffers from magnetic disturbance when determining heading angles, which degrades the overall position accuracy as time passes. The basic idea of the proposed algorithm is to integrate the left and right foot positions obtained by ZUPTs with the heading and position information from an IMU mounted on the waist. To integrate this information, a kinematic model of the lower human body, calculated using orientation sensors mounted on both thighs and calves, is adopted. We note that the positions of the left and right feet cannot drift apart because of the kinematic constraints of the body, so the kinematic model generates new measurements for the waist position. An Extended Kalman Filter (EKF) on the waist data estimates and corrects error states using these measurements together with magnetic heading measurements, which enhances the heading accuracy. The updated position information is fed back to the foot-mounted sensors, and reupdate processes are performed to correct the position error of each foot. The proposed update-reupdate technique consequently ensures improved observability of error states and position accuracy. Moreover, the proposed method provides all the information about the lower human body, so it can be applied more effectively to motion tracking. The effectiveness of the proposed algorithm is verified via experimental results, which show that a 1.25% Return Position Error (RPE) with respect to walking distance is achieved.
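
    A core building block of the ZUPT-based PDR described above is stance-phase detection, after which a zero-velocity measurement update bounds the drift of the foot-mounted inertial solution. A minimal detector sketch (illustrative; the thresholds are invented, and real detectors typically also use windowed variance tests):

    ```python
    import numpy as np

    def detect_zero_velocity(accel, gyro, g=9.81, accel_tol=0.4, gyro_tol=0.3):
        """Flag samples where the foot is (approximately) stationary.

        accel, gyro: (N, 3) specific force [m/s^2] and angular rate [rad/s].
        A sample is a ZUPT candidate when the acceleration magnitude is close to
        gravity and the angular rate is small (thresholds are illustrative).
        """
        accel_ok = np.abs(np.linalg.norm(accel, axis=1) - g) < accel_tol
        gyro_ok = np.linalg.norm(gyro, axis=1) < gyro_tol
        return accel_ok & gyro_ok

    # During detected stance phases an EKF applies a zero-velocity update,
    # resetting velocity error and bounding position drift between steps.
    accel = np.array([[0.1, 0.0, 9.80], [3.0, 1.0, 12.0], [0.0, 0.2, 9.75]])
    gyro = np.array([[0.05, 0.0, 0.02], [1.5, 0.8, 0.4], [0.01, 0.03, 0.00]])
    print(detect_zero_velocity(accel, gyro))   # [ True False  True]
    ```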

  1. A model-based correction for outcome reporting bias in meta-analysis.

    PubMed

    Copas, John; Dwan, Kerry; Kirkham, Jamie; Williamson, Paula

    2014-04-01

    It is often suspected (or known) that outcomes published in medical trials are selectively reported. A systematic review for a particular outcome of interest can only include studies where that outcome was reported, and so may omit, for example, a study that has considered several outcome measures but only reports those giving significant results. Using the methodology of the Outcome Reporting Bias (ORB) in Trials study (Kirkham and others, 2010. The impact of outcome reporting bias in randomised controlled trials on a cohort of systematic reviews. British Medical Journal 340, c365), we suggest a likelihood-based model for estimating the effect of ORB on confidence intervals and p-values in meta-analysis. Correcting for bias has the effect of moving estimated treatment effects toward the null, and hence yields more cautious assessments of significance. The bias can be very substantial, sometimes sufficient to completely overturn previous claims of significance. We re-analyze two contrasting examples and derive a simple fixed-effects approximation that can be used to give an initial estimate of the effect of ORB in practice.

  2. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities

    ERIC Educational Resources Information Center

    Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining high quality social climates in prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms, and peer correction of behavior contrary to TC norms, will lead to…

  3. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    PubMed

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  4. Efficient model-based dummy-fill OPC correction flow for deep sub-micron technology nodes

    NASA Astrophysics Data System (ADS)

    Hamouda, Ayman; Salama, Mohamed

    2014-09-01

    Dummy fill insertion is a necessary step in modern semiconductor technologies to achieve homogeneous pattern density per layer. This benefits several fabrication process steps including but not limited to Chemical Mechanical Polishing (CMP), Etching, and Packaging. As the technology keeps shrinking, fill shapes become more challenging to pattern and require aggressive model based optical proximity correction (MBOPC) to achieve better design fidelity. MBOPC on Fill is a challenge to mask data prep runtime and final mask shot count which would affect the total turnaround time (TAT) and mask cost. In our work, we introduce a novel flow that achieves a robust and computationally efficient fill handling methodology during mask data prep, which will keep both the runtime and shot count within their acceptable levels. In this flow, fill shapes undergo a smart MBOPC step which improves the final wafer printing quality and topography uniformity without degrading the final shot count or the OPC cycle runtime. This flow is tested on both front end of line (FEOL) layers and backend of line (BEOL) layers, and results in an improved final printing of the fill patterns while consuming less than 2% of the full MBOPC flow runtime.

  5. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities

    PubMed Central

    Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining a high-quality social climate on prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms and peer correction of behavior contrary to TC norms will lead to increased resident prosocial behavior. Laboratory experiments have demonstrated that such peer monitoring can lead to cooperation, but there has been no quantitative test of this hypothesis in an actual TC. In this article we test this assumption by using the affirmations that residents of three different TCs send as a measure of prosocial behavior following the reception of peer affirmations and corrections. At all three facilities residents send more affirmations following the reception of both affirmations and corrections, with this relationship being stronger and longer lasting after receiving affirmations. No other variable consistently predicts the number of affirmations that residents send to peers. These findings imply that mutual monitoring among TC residents can lead to increased levels of prosocial behavior within the facility, and that prosocial behavior in response to peer affirmations plays a key role. PMID:23935258

  6. Evaluation of model-based deformation correction in image-guided liver surgery via tracked intraoperative ultrasound

    PubMed Central

    Clements, Logan W.; Collins, Jarrod A.; Weis, Jared A.; Simpson, Amber L.; Adams, Lauryn B.; Jarnagin, William R.; Miga, Michael I.

    2016-01-01

    Soft-tissue deformation represents a significant error source in current surgical navigation systems used for open hepatic procedures. While numerous algorithms have been proposed to rectify the tissue deformation that is encountered during open liver surgery, clinical validation of the proposed methods has been limited to surface-based metrics, and subsurface validation has largely been performed via phantom experiments. The proposed method involves the analysis of two deformation-correction algorithms for open hepatic image-guided surgery systems via subsurface targets digitized with tracked intraoperative ultrasound (iUS). Intraoperative surface digitizations were acquired via a laser range scanner and an optically tracked stylus for the purposes of computing the physical-to-image space registration and for use in retrospective deformation-correction algorithms. Upon completion of surface digitization, the organ was interrogated with a tracked iUS transducer where the iUS images and corresponding tracked locations were recorded. Mean closest-point distances between the feature contours delineated in the iUS images and corresponding three-dimensional anatomical model generated from preoperative tomograms were computed to quantify the extent to which the deformation-correction algorithms improved registration accuracy. The results for six patients, including eight anatomical targets, indicate that deformation correction can facilitate reduction in target error of ∼52%. PMID:27081664

  7. Model-based mask data preparation (MB-MDP) for ArF and EUV mask process correction

    NASA Astrophysics Data System (ADS)

    Hagiwara, Kazuyuki; Bork, Ingo; Fujimura, Aki

    2011-05-01

    Using Model-Based Mask Data Preparation (MB-MDP) complex masks with complex sub-resolution assist features (SRAFs) can be written in practical write times with today's leading-edge production VSB machines by allowing overlapping VSB shots. This simulation-based approach reduces shot count by taking advantage of the added flexibility in being able to overlap shots. The freedom to overlap shots, it turns out, also increases mask fidelity, CDU on the mask, and CDU on the wafer by writing sub-100nm mask features more accurately, and with better dose margin. This paper describes how overlapping shots enhance mask and wafer quality for various sub-100nm features on ArF masks. In addition, this paper describes how EUV mask accuracy can be enhanced uniquely by allowing overlapping shots.

  8. Better numerical model for shape-dependent dose margin correction using model-based mask data preparation

    NASA Astrophysics Data System (ADS)

    Kimura, Yasuki; Kubota, Takao; Kouno, Kenji; Hagiwara, Kazuyuki; Matsushita, Shohei; Hara, Daisuke

    2013-06-01

    For the mask-making community, maintaining an acceptable dose margin has been recognized as a critical factor in the mask-making process, and this is expected to be even more critical for 20-nm logic node masks and beyond. To deal with this issue, model-based mask data preparation (MB-MDP) has previously been presented as a useful method to obtain sufficient dose margin for these complex masks, in addition to reducing shot count. When the MB-MDP approach is applied in actual mask production, prediction of the dose margin and the CD of the finished mask is essential. This paper describes an improved model of the mask process which predicts dose margin and CD in finished masks better than the single Gaussian model presented in previous work. The improved predictions of this simple numerical model are confirmed by simulation with D2S and by an actual mask written by HOYA using a JEOL JBX-3200MV.

  9. SU-E-T-226: Correction of a Standard Model-Based Dose Calculator Using Measurement Data

    SciTech Connect

    Chen, M; Jiang, S; Lu, W

    2015-06-15

    Purpose: To propose a hybrid method that combines the advantages of model-based and measurement-based methods for independent dose calculation. Model-based dose calculation, such as collapsed-cone-convolution/superposition (CCCS) or the Monte Carlo method, models dose deposition in the patient body accurately; however, due to a lack of detailed knowledge about the linear accelerator (LINAC) head, commissioning for an arbitrary machine is tedious and challenging in case of hardware changes. On the contrary, the measurement-based method characterizes the beam accurately but lacks the capability of dose deposition modeling in heterogeneous media. Methods: We used a standard CCCS calculator, commissioned with published data, as the standard model calculator. For a given machine, water phantom measurements were acquired. A set of dose distributions was also calculated using the CCCS for the same setup. The differences between the measurements and the CCCS results were tabulated and used as the commissioning data for a measurement-based calculator, here a direct-ray-tracing calculator (ΔDRT). The proposed independent dose calculation consists of the following steps: (1) calculate D_model using CCCS; (2) calculate D_ΔDRT using ΔDRT; (3) combine them as D = D_model + D_ΔDRT. Results: The hybrid dose calculation was tested on digital phantoms and patient CT data for standard fields and an IMRT plan. The results were compared to doses calculated by the treatment planning system (TPS). The agreement between the hybrid calculation and the TPS was within 3%/3 mm for over 98% of the volume for the phantom studies and lung patients. Conclusion: The proposed hybrid method uses the same commissioning data as the measurement-based method and can easily be extended to any non-standard LINAC. The results met the accuracy, independence, and simple commissioning criteria for an independent dose calculator.
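
    The combination step in the abstract is purely additive: the model-based dose plus a measurement-driven residual term recomputed on the patient geometry. A schematic sketch with invented 1-D numbers (illustrative only; the CCCS and ray-tracing engines themselves are not reproduced here):

    ```python
    import numpy as np

    def commission_residual(measured_water_dose, cccs_water_dose):
        """Commissioning data for the measurement-based term: tabulated differences
        between water-phantom measurements and the standard CCCS calculation."""
        return measured_water_dose - cccs_water_dose

    def hybrid_dose(d_model, d_residual):
        """Hybrid independent dose: D = D_model (CCCS) + D_deltaDRT (residual term
        recalculated on the patient geometry by the ray-tracing calculator)."""
        return d_model + d_residual

    # Toy 1-D depth-dose illustration with invented numbers.
    measured = np.array([1.00, 0.95, 0.88, 0.80])
    cccs_water = np.array([0.99, 0.96, 0.87, 0.79])
    residual = commission_residual(measured, cccs_water)
    print(hybrid_dose(cccs_water, residual))   # recovers the measurement in water
    ```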

  10. Therapeutic strategies to correct proteostasis-imbalance in chronic obstructive lung diseases.

    PubMed

    Bodas, M; Tran, I; Vij, N

    2012-08-01

    Proteostasis is a critical cellular homeostasis mechanism that regulates the concentration of all cellular proteins by controlling protein synthesis, processing and degradation. This includes protein conformation, binding interactions and sub-cellular localization. Environmental, genetic or age-related pathogenetic factors can modulate proteostasis (proteostasis imbalance) through transcriptional, translational and post-translational changes that trigger the development of several complex diseases. Although these factors are known to be involved in the pathogenesis of chronic obstructive pulmonary disease (COPD), the role of proteostasis mechanisms in COPD is scarcely investigated. As a proof of concept, our recent data reveal a novel role of proteostasis imbalance in COPD pathogenesis. Briefly, cigarette- and biomass-smoke-induced proteostasis imbalance may aggravate chronic inflammatory-oxidative stress and/or protease-anti-protease imbalance, resulting in the pathogenesis of severe emphysema. In contrast, the pathogenesis of other chronic lung diseases like ΔF508-cystic fibrosis (CF), α1-anti-trypsin deficiency (α-1 ATD) and pulmonary fibrosis (PF) is regulated by other proteostatic mechanisms, involving the degradation of misfolded proteins (ΔF508-CFTR/α1-AT Z variant) or regulating the concentration of signaling proteins (such as TGF-β1) by the ubiquitin-proteasome system (UPS). The therapeutic strategies to correct proteostasis imbalance in misfolded protein disorders such as ΔF508-CF have been relatively well studied and involve strategies that rescue functional CFTR protein to treat the underlying cause of the disease. In the case of COPD-emphysema and/or PF, by contrast, identification of novel proteostasis regulators that can control inflammatory-oxidative stress and/or protease-anti-protease balance is warranted.

  11. Model-based correction of tissue compression for tracked ultrasound in soft tissue image-guided surgery.

    PubMed

    Pheiffer, Thomas S; Thompson, Reid C; Rucker, Daniel C; Simpson, Amber L; Miga, Michael I

    2014-04-01

    Acquisition of ultrasound data negatively affects image registration accuracy during image-guided therapy because of tissue compression by the probe. We present a novel compression correction method that models sub-surface tissue displacement resulting from application of a tracked probe to the tissue surface. Patient landmarks are first used to register the probe pose to pre-operative imaging. The ultrasound probe geometry is used to provide boundary conditions to a biomechanical model of the tissue. The deformation field solution of the model is inverted to non-rigidly transform the ultrasound images to an estimation of the tissue geometry before compression. Experimental results with gel phantoms indicated that the proposed method reduced the tumor margin modified Hausdorff distance (MHD) from 5.0 ± 1.6 to 1.9 ± 0.6 mm, and reduced tumor centroid alignment error from 7.6 ± 2.6 to 2.0 ± 0.9 mm. The method was applied to a clinical case and reduced the tumor margin MHD error from 5.4 ± 0.1 to 2.6 ± 0.1 mm and the centroid alignment error from 7.2 ± 0.2 to 3.5 ± 0.4 mm.
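
    Once the biomechanical model has produced a displacement field, the final step above (non-rigidly mapping the ultrasound image back to the uncompressed geometry) is a resampling operation. A minimal 2-D sketch follows; the displacement field here is synthetic rather than a model solution, so this is illustrative only.

    ```python
    import numpy as np
    from scipy.ndimage import map_coordinates

    def decompress_image(image, disp_y, disp_x):
        """Resample a compressed ultrasound image onto the pre-compression geometry.

        disp_y, disp_x: for each output (uncompressed) pixel, the offset in pixels
        at which the model says that piece of tissue appears in the compressed
        image (i.e. the inverted deformation field mentioned in the abstract).
        """
        yy, xx = np.mgrid[0:image.shape[0], 0:image.shape[1]].astype(float)
        return map_coordinates(image, [yy + disp_y, xx + disp_x], order=1, mode='nearest')

    # Synthetic example: probe compression made a feature appear a few pixels too
    # shallow; the correction pushes it back away from the transducer (row 0).
    img = np.zeros((64, 64)); img[23:31, 30:38] = 1.0
    depth = np.mgrid[0:64, 0:64][0]
    disp_y = -3.0 * (1.0 - np.exp(-depth / 20.0))     # invented displacement field
    corrected = decompress_image(img, disp_y, np.zeros_like(disp_y))
    print(np.nonzero(img[:, 34])[0].min(), np.nonzero(corrected[:, 34] > 0.5)[0].min())
    ```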

  12. A Correction for the IRI Topside Electron Density Model Based on Alouette/ISIS Topside Sounder Data

    NASA Technical Reports Server (NTRS)

    Bilitza, D.

    2004-01-01

    The topside segment of the International Reference Ionosphere (IRI) electron density model (and also of the Bent model) is based on the limited amount of topside data available at the time (40,000 Alouette 1 profiles). Being established from such a small database, it is not surprising that the models have well-known shortcomings, for example, at high solar activities. Meanwhile, a large database of close to 200,000 topside profiles from Alouette 1, 2 and ISIS 1, 2 has become available online. A program of automated scaling and inversion of a large volume of digitized ionograms adds continuously to this data pool. We have used the currently available ISIS/Alouette topside profiles to evaluate the IRI topside model and to investigate ways of improving the model. The IRI model performs generally well at middle latitudes and shows discrepancies at low and high latitudes, and these discrepancies are largest during high solar activity. In the upper topside, IRI consistently overestimates the measurements. Based on averages of the data-model ratios, we have established correction factors for the IRI model. These factors vary with altitude, modified dip latitude, and local time.
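
    In essence, the correction described above bins measurement-to-model ratios and applies the binned factor back to the model output. A one-dimensional sketch over altitude only (illustrative; the real factors also vary with modified dip latitude and local time, and all numbers below are invented):

    ```python
    import numpy as np

    def build_correction_factors(model_density, measured_density, altitude, alt_bins):
        """Average measurement/model ratios in altitude bins (one dimension of the
        multi-dimensional binning described in the abstract)."""
        ratios = measured_density / model_density
        idx = np.digitize(altitude, alt_bins)
        return np.array([ratios[idx == i].mean() if np.any(idx == i) else 1.0
                         for i in range(len(alt_bins) + 1)])

    def corrected_model(model_density, altitude, alt_bins, factors):
        """Apply the binned correction factor to the model prediction."""
        return model_density * factors[np.digitize(altitude, alt_bins)]

    # Toy example: the model overestimates density in the upper topside.
    alt = np.array([600.0, 800.0, 1200.0, 1500.0])          # km
    model = np.array([1e5, 6e4, 2e4, 1e4])                  # electrons per cm^3
    data = np.array([9.5e4, 5.8e4, 1.5e4, 0.7e4])
    bins = np.array([1000.0])                                # below / above 1000 km
    factors = build_correction_factors(model, data, alt, bins)
    print(factors, corrected_model(model, alt, bins, factors))
    ```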

  13. Kidney Stone Volume Estimation from Computerized Tomography Images Using a Model Based Method of Correcting for the Point Spread Function

    PubMed Central

    Duan, Xinhui; Wang, Jia; Qu, Mingliang; Leng, Shuai; Liu, Yu; Krambeck, Amy; McCollough, Cynthia

    2014-01-01

    Purpose: We propose a method to improve the accuracy of volume estimation of kidney stones from computerized tomography images. Materials and Methods: The proposed method consisted of 2 steps. A threshold equal to the average of the computerized tomography number of the object and the background was first applied to determine full width at half maximum volume. Correction factors were then applied, which were precalculated based on a model of a sphere and a 3-dimensional Gaussian point spread function. The point spread function was measured in a computerized tomography scanner to represent the response of the scanner to a point-like object. Method accuracy was validated using 6 small cylindrical phantoms with 2 volumes of 21.87 and 99.9 mm3, and 3 attenuations, respectively, and 76 kidney stones with a volume range of 6.3 to 317.4 mm3. Volumes estimated by the proposed method were compared with full width at half maximum volumes. Results: The proposed method was significantly more accurate than full width at half maximum volume (p <0.0001). The magnitude of improvement depended on stone volume, with smaller stones benefiting more from the method. For kidney stones 10 to 20 mm3 in volume the average improvement in accuracy was the greatest at 19.6%. Conclusions: The proposed method achieved significantly improved accuracy compared with threshold methods. This may lead to more accurate stone management. PMID:22819107
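
    The first step described above, the full-width-at-half-maximum volume, thresholds the image at the mean of the object and background CT numbers; the model-derived correction factor is then a simple multiplier. A minimal sketch of the thresholding step with a synthetic image (illustrative only; the sphere-plus-PSF correction-factor table is not reproduced):

    ```python
    import numpy as np

    def fwhm_volume(ct_image, object_mask, background_mask, voxel_volume_mm3):
        """Full-width-at-half-maximum volume: threshold at the average of the object
        and background CT numbers, then count the voxels above the threshold."""
        threshold = 0.5 * (ct_image[object_mask].mean() + ct_image[background_mask].mean())
        return np.count_nonzero(ct_image > threshold) * voxel_volume_mm3

    def corrected_volume(fwhm_vol_mm3, correction_factor):
        """Apply the precalculated (sphere + 3-D Gaussian PSF) correction factor."""
        return fwhm_vol_mm3 * correction_factor

    # Synthetic example: a bright 'stone' inside a uniform soft-tissue background.
    img = np.full((20, 20, 20), 40.0)                  # background ~40 HU
    img[8:12, 8:12, 8:12] = 800.0                      # stone ~800 HU
    stone_seed = np.zeros(img.shape, bool); stone_seed[9:11, 9:11, 9:11] = True
    background = img < 100
    print(fwhm_volume(img, stone_seed, background, voxel_volume_mm3=0.25))
    ```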

  14. Therapeutic NOTCH3 cysteine correction in CADASIL using exon skipping: in vitro proof of concept.

    PubMed

    Rutten, Julie W; Dauwerse, Hans G; Peters, Dorien J M; Goldfarb, Andrew; Venselaar, Hanka; Haffner, Christof; van Ommen, Gert-Jan B; Aartsma-Rus, Annemieke M; Lesnik Oberstein, Saskia A J

    2016-04-01

    Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy, or CADASIL, is a hereditary cerebral small vessel disease caused by characteristic cysteine-altering missense mutations in the NOTCH3 gene. NOTCH3 mutations in CADASIL result in an uneven number of cysteine residues in one of the 34 epidermal growth factor-like repeat (EGFr) domains of the NOTCH3 protein. The consequence of an unpaired cysteine residue in an EGFr domain is an increased multimerization tendency of mutant NOTCH3, leading to toxic accumulation of the protein in the (cerebro)vasculature, and ultimately reduced cerebral blood flow, recurrent stroke and vascular dementia. There is no therapy to delay or alleviate symptoms in CADASIL. We hypothesized that exclusion of the mutant EGFr domain from NOTCH3 would abolish the detrimental effect of the unpaired cysteine and thus prevent toxic NOTCH3 accumulation and the negative cascade of events leading to CADASIL. To accomplish this NOTCH3 cysteine correction by EGFr domain exclusion, we used pre-mRNA antisense-mediated skipping of specific NOTCH3 exons. Selection of these exons was achieved using in silico studies and based on the criterion that skipping of a particular exon or exon pair would modulate the protein in such a way that the mutant EGFr domain is eliminated, without otherwise corrupting NOTCH3 structure and function. Remarkably, we found that this strategy closely mimics evolutionary events, where the elimination and fusion of NOTCH EGFr domains led to the generation of four functional NOTCH homologues. We modelled a selection of exon skip strategies using cDNA constructs and show that the skip proteins retain normal protein processing, can bind ligand and be activated by ligand. We then determined the technical feasibility of targeted NOTCH3 exon skipping by designing antisense oligonucleotides targeting exons 2-3, 4-5 and 6, which together harbour the majority of distinct CADASIL-causing mutations.

  15. C60 Fullerene as Promising Therapeutic Agent for the Prevention and Correction of Skeletal Muscle Functioning at Ischemic Injury

    NASA Astrophysics Data System (ADS)

    Nozdrenko, D. M.; Zavodovskyi, D. O.; Matvienko, T. Yu.; Zay, S. Yu.; Bogutska, K. I.; Prylutskyy, Yu. I.; Ritter, U.; Scharff, P.

    2017-02-01

    The therapeutic effect of a pristine C60 fullerene aqueous colloid solution (C60FAS) on the functioning of the rat soleus muscle after ischemic injury was investigated as a function of the time course of the pathology of the muscular system and of the method of C60FAS administration in vivo. It was found that intravenous administration of C60FAS is optimal for correcting the speed-related macroparameters of contraction after ischemic muscle damage. At the same time, intramuscular administration of C60FAS shows a pronounced protective effect in movements associated with the generation of maximum force responses or prolonged contractions, which increase the level of muscle fatigue. Analysis of the concentrations of the enzymes creatine phosphokinase and lactate dehydrogenase in the blood of the experimental animals directly indicates that C60FAS may be a promising therapeutic agent for the prevention and correction of ischemia-damaged skeletal muscle function.

  16. Optimal Model-Based Fault Estimation and Correction for Particle Accelerators and Industrial Plants Using Combined Support Vector Machines and First Principles Models

    SciTech Connect

    Sayyar-Rodsari, Bijan; Schweiger, Carl; /SLAC /Pavilion Technologies, Inc., Austin, TX

    2010-08-25

    Timely estimation of deviations from optimal performance in complex systems and the ability to identify corrective measures in response to the estimated parameter deviations have been the subject of extensive research over the past four decades. The implications in terms of lost revenue from costly industrial processes and the operation of large-scale public works projects, as well as the volume of the published literature on this topic, clearly indicate the significance of the problem. Applications range from manufacturing industries (integrated circuits, automotive, etc.), to large-scale chemical plants, pharmaceutical production, power distribution grids, and avionics. In this project we investigated a new framework for building parsimonious models that are suited for diagnosis and fault estimation of complex technical systems. We used Support Vector Machines (SVMs) to model potentially time-varying parameters of a First-Principles (FP) description of the process. The combined SVM & FP model was built (i.e. model parameters were trained) using constrained optimization techniques. We used the trained models to estimate faults affecting simulated beam lifetime. In the case where a large number of process inputs are required for model-based fault estimation, the proposed framework performs an optimal nonlinear principal component analysis of the large-scale input space, and creates a lower dimension feature space in which fault estimation results can be effectively presented to the operation personnel. To fulfill the main technical objectives of the Phase I research, our Phase I efforts have focused on: (1) SVM Training in a Combined Model Structure - We developed the software for the constrained training of the SVMs in a combined model structure, and successfully modeled the parameters of a first-principles model for beam lifetime with support vectors. (2) Higher-order Fidelity of the Combined Model - We used constrained training to ensure that the output of the SVM (i.e. the

  17. [Beat therapeutic inertia in dyslipidemic patient management: A challenge in daily clinical practice] [corrected].

    PubMed

    Morales, Clotilde; Mauri, Marta; Vila, Lluís

    2014-01-01

    Beat therapeutic inertia in dyslipidemic patient management: a challenge in daily clinical practice. In patients with dyslipidemia, the therapeutic goals need to be reached in order to obtain the maximum benefit in reducing the risk of cardiovascular events, especially myocardial infarction. Even with guidelines and powerful hypolipidemic drugs available, the goals for low-density lipoprotein cholesterol (LDL-c) are often not reached, a problem of special concern in patients at high cardiovascular risk. One of the causes is therapeutic inertia. Tools exist to plan the treatment and make decisions easier. One of the challenges in everyday clinical practice is knowing the percentage reduction in LDL-c that is needed. Moreover, it is hard to know which treatment to use at the start of therapy, and which to use when the desired objective is not reached. This article proposes a practical method that can help solve these questions.

  18. Model-based correction for scatter and tailing effects in simultaneous 99mTc and 123I imaging for a CdZnTe cardiac SPECT camera

    NASA Astrophysics Data System (ADS)

    Holstensson, M.; Erlandsson, K.; Poludniowski, G.; Ben-Haim, S.; Hutton, B. F.

    2015-04-01

    An advantage of semiconductor-based dedicated cardiac single photon emission computed tomography (SPECT) cameras when compared to conventional Anger cameras is superior energy resolution. This provides the potential for improved separation of the photopeaks in dual radionuclide imaging, such as combined use of 99mTc and 123I. There is, however, the added complexity of tailing effects in the detectors that must be accounted for. In this paper we present a model-based correction algorithm which extracts the useful primary counts of 99mTc and 123I from projection data. Equations describing the in-patient scatter and tailing effects in the detectors are iteratively solved for both radionuclides simultaneously using a maximum a posteriori probability algorithm with one-step-late evaluation. Energy window-dependent parameters for the equations describing in-patient scatter are estimated using Monte Carlo simulations. Parameters for the equations describing tailing effects are estimated using virtually scatter-free experimental measurements on a dedicated cardiac SPECT camera with CdZnTe-detectors. When applied to a phantom study with both 99mTc and 123I, results show that the estimated spatial distribution of events from 99mTc in the 99mTc photopeak energy window is very similar to that measured in a single 99mTc phantom study. The extracted images of primary events display increased cold lesion contrasts for both 99mTc and 123I.

  19. Model-based correction for scatter and tailing effects in simultaneous 99mTc and 123I imaging for a CdZnTe cardiac SPECT camera.

    PubMed

    Holstensson, M; Erlandsson, K; Poludniowski, G; Ben-Haim, S; Hutton, B F

    2015-04-21

    An advantage of semiconductor-based dedicated cardiac single photon emission computed tomography (SPECT) cameras when compared to conventional Anger cameras is superior energy resolution. This provides the potential for improved separation of the photopeaks in dual radionuclide imaging, such as combined use of (99m)Tc and (123)I. There is, however, the added complexity of tailing effects in the detectors that must be accounted for. In this paper we present a model-based correction algorithm which extracts the useful primary counts of (99m)Tc and (123)I from projection data. Equations describing the in-patient scatter and tailing effects in the detectors are iteratively solved for both radionuclides simultaneously using a maximum a posteriori probability algorithm with one-step-late evaluation. Energy window-dependent parameters for the equations describing in-patient scatter are estimated using Monte Carlo simulations. Parameters for the equations describing tailing effects are estimated using virtually scatter-free experimental measurements on a dedicated cardiac SPECT camera with CdZnTe-detectors. When applied to a phantom study with both (99m)Tc and (123)I, results show that the estimated spatial distribution of events from (99m)Tc in the (99m)Tc photopeak energy window is very similar to that measured in a single (99m)Tc phantom study. The extracted images of primary events display increased cold lesion contrasts for both (99m)Tc and (123)I.
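
    The sketch below illustrates only the underlying window-crosstalk idea behind such a correction: per-pixel linear unmixing of the two photopeak windows given assumed tailing fractions. It is not the paper's iterative maximum a posteriori (one-step-late) algorithm, and the crosstalk numbers are invented.

    ```python
    # Per-pixel crosstalk unmixing for dual-isotope (99mTc / 123I) windows (illustrative).
    import numpy as np

    # counts measured in the Tc and I photopeak windows for three example pixels
    measured = np.array([[120.0, 80.0],
                         [300.0, 40.0],
                         [ 50.0, 90.0]])

    # mixing matrix: rows = energy windows (Tc, I), columns = true primaries (Tc, I);
    # off-diagonal terms stand in for tailing of 123I counts into the Tc window and vice versa
    A = np.array([[0.90, 0.25],
                  [0.05, 0.80]])

    primaries = np.linalg.solve(A, measured.T).T     # estimated primary Tc and I counts
    print(primaries)
    ```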

  20. Corrections.

    PubMed

    2015-07-01

    Lai Y-S, Biedermann P, Ekpo UF, et al. Spatial distribution of schistosomiasis and treatment needs in sub-Saharan Africa: a systematic review and geostatistical analysis. Lancet Infect Dis 2015; published online May 22. http://dx.doi.org/10.1016/S1473-3099(15)00066-3—Figure 1 of this Article should have contained a box stating ‘100 references added’ with an arrow pointing inwards, rather than a box stating ‘199 records excluded’, and an asterisk should have been added after ‘1473 records extracted into GNTD’. Additionally, the positioning of the ‘§’ and ‘†’ footnotes has been corrected in table 1. These corrections have been made to the online version as of June 4, 2015.

  1. Correction.

    PubMed

    2016-02-01

    In the article by Guessous et al (Guessous I, Pruijm M, Ponte B, Ackermann D, Ehret G, Ansermot N, Vuistiner P, Staessen J, Gu Y, Paccaud F, Mohaupt M, Vogt B, Pechère-Bertschi A, Martin PY, Burnier M, Eap CB, Bochud M. Associations of ambulatory blood pressure with urinary caffeine and caffeine metabolite excretions. Hypertension. 2015;65:691–696. doi: 10.1161/HYPERTENSIONAHA.114.04512), which published online ahead of print December 8, 2014, and appeared in the March 2015 issue of the journal, a correction was needed. One of the author surnames was misspelled. Antoinette Pechère-Berstchi has been corrected to read Antoinette Pechère-Bertschi. The authors apologize for this error.

  2. Correction

    NASA Astrophysics Data System (ADS)

    1998-12-01

    Alleged mosasaur bite marks on Late Cretaceous ammonites are limpet (patellogastropod) home scars, Geology, v. 26, p. 947-950 (October 1998). This article had the following printing errors: p. 947, Abstract, line 11, “sepia” should be “septa”; p. 947, 1st paragraph under Introduction, line 2, “creep” should be “deep”; p. 948, column 1, 2nd paragraph, line 7, “creep” should be “deep”; p. 949, column 1, 1st paragraph, line 1, “creep” should be “deep”; p. 949, column 1, 1st paragraph, line 5, “19774” should be “1977)”; p. 949, column 1, 4th paragraph, line 7, “in particular” should be “In particular”. CORRECTION: Mammalian community response to the latest Paleocene thermal maximum: An isotaphonomic study in the northern Bighorn Basin, Wyoming, Geology, v. 26, p. 1011-1014 (November 1998). An error appeared in the References Cited. The correct reference appears below: Fricke, H. C., Clyde, W. C., O'Neil, J. R., and Gingerich, P. D., 1998, Evidence for rapid climate change in North America during the latest Paleocene thermal maximum: Oxygen isotope compositions of biogenic phosphate from the Bighorn Basin (Wyoming): Earth and Planetary Science Letters, v. 160, p. 193-208.

  3. Gene transfer corrects acute GM2 gangliosidosis--potential therapeutic contribution of perivascular enzyme flow.

    PubMed

    Cachón-González, M Begoña; Wang, Susan Z; McNair, Rosamund; Bradley, Josephine; Lunn, David; Ziegler, Robin; Cheng, Seng H; Cox, Timothy M

    2012-08-01

    The GM2 gangliosidoses are fatal lysosomal storage diseases principally affecting the brain. Absence of β-hexosaminidase A and B activities in the Sandhoff mouse causes neurological dysfunction and recapitulates the acute Tay-Sachs (TSD) and Sandhoff diseases (SD) in infants. Intracranial coinjection of recombinant adeno-associated viral vectors (rAAV), serotype 2/1, expressing human β-hexosaminidase α (HEXA) and β (HEXB) subunits into 1-month-old Sandhoff mice gave unprecedented survival to 2 years and prevented disease throughout the brain and spinal cord. Classical manifestations of disease, including spasticity, as opposed to tremor-ataxia, were resolved by localized gene transfer to the striatum or cerebellum, respectively. Abundant biosynthesis of β-hexosaminidase isozymes and their global distribution via axonal, perivascular, and cerebrospinal fluid (CSF) spaces, as well as diffusion, account for the sustained phenotypic rescue; long-term protein expression by transduced brain parenchyma, choroid plexus epithelium, and dorsal root ganglia neurons supplies the corrective enzyme. Prolonged survival permitted expression of cryptic disease in organs not accessed by intracranial vector delivery. We contend that infusion of rAAV into the CSF space and intraparenchymal administration by convection-enhanced delivery at a few strategic sites will optimally treat neurodegeneration in many diseases affecting the nervous system.

  4. Tafenoquine at therapeutic concentrations does not prolong Fridericia-corrected QT interval in healthy subjects

    PubMed Central

    Green, Justin A; Patel, Apurva K; Patel, Bela R; Hussaini, Azra; Harrell, Emma J; McDonald, Mirna J; Carter, Nick; Mohamed, Khadeeja; Duparc, Stephan; Miller, Ann K

    2014-01-01

    Tafenoquine is being developed for relapse prevention in Plasmodium vivax malaria. This Phase I, single-blind, randomized, placebo- and active-controlled parallel group study investigated whether tafenoquine at supratherapeutic and therapeutic concentrations prolonged cardiac repolarization in healthy volunteers. Subjects aged 18–65 years were randomized to one of five treatment groups (n = 52 per group) to receive placebo, tafenoquine 300, 600, or 1200 mg, or moxifloxacin 400 mg (positive control). Lack of effect was demonstrated if the upper 90% CI of the change from baseline in QTcF following supratherapeutic tafenoquine 1200 mg versus placebo (ΔΔQTcF) was <10 milliseconds for all pre-defined time points. The maximum ΔΔQTcF with tafenoquine 1200 mg (n = 50) was 6.39 milliseconds (90% CI 2.85, 9.94) at 72 hours post-final dose; that is, lack of effect on prolongation of cardiac repolarization was demonstrated. Tafenoquine 300 mg (n = 48) or 600 mg (n = 52) had no effect on ΔΔQTcF. Pharmacokinetic/pharmacodynamic modeling of the tafenoquine–QTcF concentration–effect relationship demonstrated a shallow slope (0.5 ms per μg/mL) over a wide concentration range. For moxifloxacin (n = 51), maximum ΔΔQTcF was 8.52 milliseconds (90% CI 5.00, 12.04), demonstrating assay sensitivity. In this thorough QT/QTc study, tafenoquine did not have a clinically meaningful effect on cardiac repolarization. PMID:24700490

  5. Travel cost demand model based river recreation benefit estimates with on-site and household surveys: Comparative results and a correction procedure

    NASA Astrophysics Data System (ADS)

    Loomis, John

    2003-04-01

    Past recreation studies have noted that on-site or visitor intercept surveys are subject to over-sampling of avid users (i.e., endogenous stratification) and have offered econometric solutions to correct for this. However, past papers neither estimate the empirical magnitude of the bias in benefit estimates with a real data set nor compare the corrected estimates to benefit estimates derived from a population sample. This paper empirically examines the magnitude of the per-trip recreation benefit bias by comparing estimates from an on-site river visitor intercept survey to a household survey. The difference in average benefits is quite large, with the on-site visitor survey yielding $24 per day trip, while the household survey yields $9.67 per day trip. A simple econometric correction for endogenous stratification in our count data model lowers the benefit estimate to $9.60 per day trip, a mean value nearly identical to and not statistically different from the household survey estimate.
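
    A hedged sketch of the standard count-data fix for endogenous stratification alluded to above: for an on-site Poisson recreation demand model, regressing (trips - 1) on travel cost with ordinary Poisson maximum likelihood removes the avid-user oversampling, and consumer surplus per trip follows as -1/beta_cost. The data are synthetic and the specification is illustrative, not the paper's exact model.

    ```python
    # On-site travel cost model with the (trips - 1) endogenous-stratification correction.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500
    travel_cost = rng.uniform(5, 60, n)
    lam = np.exp(1.5 - 0.04 * travel_cost)           # true demand intensity
    trips = rng.poisson(lam) + 1                     # on-site sample: every respondent took >= 1 trip

    X = sm.add_constant(travel_cost)
    fit = sm.GLM(trips - 1, X, family=sm.families.Poisson()).fit()   # the correction: use trips - 1
    beta_cost = fit.params[1]
    print("consumer surplus per trip: %.2f" % (-1.0 / beta_cost))
    ```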

  6. Fiducial marker-based correction for involuntary motion in weight-bearing C-arm CT scanning of knees. Part I. Numerical model-based optimization

    PubMed Central

    Choi, Jang-Hwan; Fahrig, Rebecca; Keil, Andreas; Besier, Thor F.; Pal, Saikat; McWalter, Emily J.; Beaupré, Gary S.; Maier, Andreas

    2013-01-01

    Purpose: Human subjects in standing positions are apt to show much more involuntary motion than in supine positions. The authors aimed to simulate a complicated realistic lower body movement using the four-dimensional (4D) digital extended cardiac-torso (XCAT) phantom. The authors also investigated fiducial marker-based motion compensation methods in two-dimensional (2D) and three-dimensional (3D) space. The level of involuntary movement-induced artifacts and image quality improvement were investigated after applying each method. Methods: An optical tracking system with eight cameras and seven retroreflective markers enabled us to track involuntary motion of the lower body of nine healthy subjects holding a squat position at 60° of flexion. The XCAT-based knee model was developed using the 4D XCAT phantom and the optical tracking data acquired at 120 Hz. The authors divided the lower body in the XCAT into six parts and applied unique affine transforms to each so that the motion (6 degrees of freedom) could be synchronized with the optical markers’ location at each time frame. The control points of the XCAT were tessellated into triangles and 248 projection images were created based on intersections of each ray and monochromatic absorption. The tracking data sets with the largest motion (Subject 2) and the smallest motion (Subject 5) among the nine data sets were used to animate the XCAT knee model. The authors defined eight skin control points well distributed around the knees as pseudo-fiducial markers which functioned as a reference in motion correction. Motion compensation was done in the following ways: (1) simple projection shifting in 2D, (2) deformable projection warping in 2D, and (3) rigid body warping in 3D. Graphics hardware accelerated filtered backprojection was implemented and combined with the three correction methods in order to speed up the simulation process. Correction fidelity was evaluated as a function of number of markers used (4–12) and
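
    A minimal sketch of the simplest of the three strategies listed above (2D projection shifting): each projection is translated back by the mean in-plane displacement of the projected fiducial markers relative to their motion-free reference positions. The marker coordinates and the interpolation scheme are assumptions for illustration.

    ```python
    # 2D projection shifting from fiducial marker displacements (illustrative).
    import numpy as np
    from scipy.ndimage import shift as nd_shift

    def correct_projection(projection, markers_detected, markers_reference):
        d = (markers_detected - markers_reference).mean(axis=0)   # mean (du, dv) in pixels
        # translate the projection back by that displacement (row = v, column = u)
        return nd_shift(projection, shift=(-d[1], -d[0]), order=1, mode="nearest")

    proj = np.random.default_rng(6).normal(size=(128, 128))
    ref = np.array([[30.0, 40.0], [90.0, 50.0], [60.0, 100.0]])
    det = ref + np.array([2.0, -1.5])                 # markers moved by (2, -1.5) pixels
    corrected = correct_projection(proj, det, ref)
    ```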

  7. Winner's Curse Correction and Variable Thresholding Improve Performance of Polygenic Risk Modeling Based on Genome-Wide Association Study Summary-Level Data

    PubMed Central

    Shi, Jianxin; Duan, Jubao; Berndt, Sonja T.; Moy, Winton; Yu, Kai; Song, Lei; Wheeler, William; Hua, Xing; Silverman, Debra; Garcia-Closas, Montserrat; Hsiung, Chao Agnes; Figueroa, Jonine D.; Cortessis, Victoria K.; Malats, Núria; Karagas, Margaret R.; Vineis, Paolo; Chang, I-Shou; Lin, Dongxin; Zhou, Baosen; Seow, Adeline; Hong, Yun-Chul; Caporaso, Neil E.; Wolpin, Brian; Jacobs, Eric; Petersen, Gloria M.; Klein, Alison P.; Li, Donghui; Risch, Harvey; Sanders, Alan R.; Hsu, Li; Schoen, Robert E.; Brenner, Hermann; Stolzenberg-Solomon, Rachael; Gejman, Pablo; Lan, Qing; Rothman, Nathaniel; Amundadottir, Laufey T.; Landi, Maria Teresa; Levinson, Douglas F.; Chanock, Stephen J.; Chatterjee, Nilanjan

    2016-01-01

    Recent heritability analyses have indicated that genome-wide association studies (GWAS) have the potential to improve genetic risk prediction for complex diseases based on polygenic risk score (PRS), a simple modelling technique that can be implemented using summary-level data from the discovery samples. We herein propose modifications to improve the performance of PRS. We introduce threshold-dependent winner’s-curse adjustments for marginal association coefficients that are used to weight the single-nucleotide polymorphisms (SNPs) in PRS. Further, as a way to incorporate external functional/annotation knowledge that could identify subsets of SNPs highly enriched for associations, we propose variable thresholds for SNPs selection. We applied our methods to GWAS summary-level data of 14 complex diseases. Across all diseases, a simple winner’s curse correction uniformly led to enhancement of performance of the models, whereas incorporation of functional SNPs was beneficial only for selected diseases. Compared to the standard PRS algorithm, the proposed methods in combination led to notable gain in efficiency (25–50% increase in the prediction R2) for 5 of 14 diseases. As an example, for GWAS of type 2 diabetes, winner’s curse correction improved prediction R2 from 2.29% based on the standard PRS to 3.10% (P = 0.0017) and incorporating functional annotation data further improved R2 to 3.53% (P = 2×10−5). Our simulation studies illustrate why differential treatment of certain categories of functional SNPs, even when shown to be highly enriched for GWAS-heritability, does not lead to proportionate improvement in genetic risk-prediction because of non-uniform linkage disequilibrium structure. PMID:28036406

  8. Winner's Curse Correction and Variable Thresholding Improve Performance of Polygenic Risk Modeling Based on Genome-Wide Association Study Summary-Level Data.

    PubMed

    Shi, Jianxin; Park, Ju-Hyun; Duan, Jubao; Berndt, Sonja T; Moy, Winton; Yu, Kai; Song, Lei; Wheeler, William; Hua, Xing; Silverman, Debra; Garcia-Closas, Montserrat; Hsiung, Chao Agnes; Figueroa, Jonine D; Cortessis, Victoria K; Malats, Núria; Karagas, Margaret R; Vineis, Paolo; Chang, I-Shou; Lin, Dongxin; Zhou, Baosen; Seow, Adeline; Matsuo, Keitaro; Hong, Yun-Chul; Caporaso, Neil E; Wolpin, Brian; Jacobs, Eric; Petersen, Gloria M; Klein, Alison P; Li, Donghui; Risch, Harvey; Sanders, Alan R; Hsu, Li; Schoen, Robert E; Brenner, Hermann; Stolzenberg-Solomon, Rachael; Gejman, Pablo; Lan, Qing; Rothman, Nathaniel; Amundadottir, Laufey T; Landi, Maria Teresa; Levinson, Douglas F; Chanock, Stephen J; Chatterjee, Nilanjan

    2016-12-01

    Recent heritability analyses have indicated that genome-wide association studies (GWAS) have the potential to improve genetic risk prediction for complex diseases based on polygenic risk score (PRS), a simple modelling technique that can be implemented using summary-level data from the discovery samples. We herein propose modifications to improve the performance of PRS. We introduce threshold-dependent winner's-curse adjustments for marginal association coefficients that are used to weight the single-nucleotide polymorphisms (SNPs) in PRS. Further, as a way to incorporate external functional/annotation knowledge that could identify subsets of SNPs highly enriched for associations, we propose variable thresholds for SNPs selection. We applied our methods to GWAS summary-level data of 14 complex diseases. Across all diseases, a simple winner's curse correction uniformly led to enhancement of performance of the models, whereas incorporation of functional SNPs was beneficial only for selected diseases. Compared to the standard PRS algorithm, the proposed methods in combination led to notable gain in efficiency (25-50% increase in the prediction R2) for 5 of 14 diseases. As an example, for GWAS of type 2 diabetes, winner's curse correction improved prediction R2 from 2.29% based on the standard PRS to 3.10% (P = 0.0017) and incorporating functional annotation data further improved R2 to 3.53% (P = 2×10-5). Our simulation studies illustrate why differential treatment of certain categories of functional SNPs, even when shown to be highly enriched for GWAS-heritability, does not lead to proportionate improvement in genetic risk-prediction because of non-uniform linkage disequilibrium structure.
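
    The sketch below shows a generic threshold-aware winner's curse correction for a single SNP: the marginal estimate is modeled as Normal and conditioned on having passed the selection threshold, and the corrected effect maximizes that truncated likelihood. This follows the general conditional-likelihood idea rather than the paper's exact estimator; the numbers are illustrative.

    ```python
    # Conditional-likelihood winner's curse correction for one selected SNP (illustrative).
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import norm

    def corrected_beta(beta_hat, se, c):
        """Shrink beta_hat given that it was selected because |beta_hat / se| > c."""
        def neg_loglik(beta):
            z = (beta_hat - beta) / se
            # probability of passing the threshold if the true effect were beta
            p_select = norm.sf(c - beta / se) + norm.cdf(-c - beta / se)
            return -(norm.logpdf(z) - np.log(p_select))
        res = minimize_scalar(neg_loglik,
                              bounds=(beta_hat - 10 * se, beta_hat + 10 * se),
                              method="bounded")
        return res.x

    # a SNP just past a |z| > 4 selection threshold is shrunk toward zero
    print(corrected_beta(beta_hat=0.045, se=0.010, c=4.0))
    ```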

  9. Influence of the partial volume correction method on 18F-fluorodeoxyglucose brain kinetic modelling from dynamic PET images reconstructed with resolution model based OSEM

    NASA Astrophysics Data System (ADS)

    Bowen, Spencer L.; Byars, Larry G.; Michel, Christian J.; Chonde, Daniel B.; Catana, Ciprian

    2013-10-01

    Kinetic parameters estimated from dynamic 18F-fluorodeoxyglucose (18F-FDG) PET acquisitions have been used frequently to assess brain function in humans. Neglecting partial volume correction (PVC) for a dynamic series has been shown to produce significant bias in model estimates. Accurate PVC requires a space-variant model describing the reconstructed image spatial point spread function (PSF) that accounts for resolution limitations, including non-uniformities across the field of view due to the parallax effect. For ordered subsets expectation maximization (OSEM), image resolution convergence is local and influenced significantly by the number of iterations, the count density, and background-to-target ratio. As both count density and background-to-target values for a brain structure can change during a dynamic scan, the local image resolution may also concurrently vary. When PVC is applied post-reconstruction the kinetic parameter estimates may be biased when neglecting the frame-dependent resolution. We explored the influence of the PVC method and implementation on kinetic parameters estimated by fitting 18F-FDG dynamic data acquired on a dedicated brain PET scanner and reconstructed with and without PSF modelling in the OSEM algorithm. The performance of several PVC algorithms was quantified with a phantom experiment, an anthropomorphic Monte Carlo simulation, and a patient scan. Using the last frame reconstructed image only for regional spread function (RSF) generation, as opposed to computing RSFs for each frame independently, and applying perturbation geometric transfer matrix PVC with PSF based OSEM produced the lowest magnitude bias kinetic parameter estimates in most instances, although at the cost of increased noise compared to the PVC methods utilizing conventional OSEM. Use of the last frame RSFs for PVC with no PSF modelling in the OSEM algorithm produced the lowest bias in cerebral metabolic rate of glucose estimates, although by less than 5% in most
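
    A small sketch of the geometric transfer matrix (GTM) step at the heart of such partial volume correction: region masks are blurred with the scanner PSF, the observed regional means are written as a linear mixture of the true means, and the resulting small linear system is solved. The 1D geometry, Gaussian PSF, and activity values are assumptions; the paper's space-variant, frame-dependent RSF model is considerably more elaborate.

    ```python
    # Geometric transfer matrix (GTM) partial volume correction in 1D (illustrative).
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    n = 200
    masks = np.zeros((3, n))
    masks[0, 40:80] = 1.0           # region 1
    masks[1, 80:120] = 1.0          # region 2
    masks[2, 120:190] = 1.0         # background region

    psf_sigma = 4.0                 # assumed PSF width in voxels
    true_means = np.array([3.0, 1.0, 0.2])
    image = gaussian_filter1d(true_means @ masks, psf_sigma)   # simulated observed image

    # GTM[i, j]: mean of the blurred region-j mask inside region i
    blurred = np.array([gaussian_filter1d(m, psf_sigma) for m in masks])
    observed = np.array([image[m > 0].mean() for m in masks])
    GTM = np.array([[blurred[j][masks[i] > 0].mean() for j in range(3)] for i in range(3)])

    print(np.linalg.solve(GTM, observed))   # recovers ~[3.0, 1.0, 0.2]
    ```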

  10. A Budget Impact Analysis of Newly Available Hepatitis C Therapeutics and the Financial Burden on a State Correctional System.

    PubMed

    Nguyen, John T; Rich, Josiah D; Brockmann, Bradley W; Vohr, Fred; Spaulding, Anne; Montague, Brian T

    2015-08-01

    Hepatitis C virus (HCV) infection continues to disproportionately affect incarcerated populations. New HCV drugs present opportunities and challenges to address HCV in corrections. The goal of this study was to evaluate the impact of the treatment costs for HCV infection in a state correctional population through a budget impact analysis comparing differing treatment strategies. Electronic and paper medical records were reviewed to estimate the prevalence of hepatitis C within the Rhode Island Department of Corrections. Three treatment strategies were evaluated as follows: (1) treating all chronically infected persons, (2) treating only patients with demonstrated fibrosis, and (3) treating only patients with advanced fibrosis. Budget impact was computed as the percentage of pharmacy and overall healthcare expenditures accrued by total drug costs assuming entirely interferon-free therapy. Sensitivity analyses assessed potential variance in costs related to variability in HCV prevalence, genotype, estimated variation in market pricing, length of stay for the sentenced population, and uptake of newly available regimens. Chronic HCV prevalence was estimated at 17% of the total population. Treating all sentenced inmates with at least 6 months remaining of their sentence would cost about $34 million, 13 times the pharmacy budget and almost twice the overall healthcare budget. Treating inmates with advanced fibrosis would cost about $15 million. A hypothetical 50% reduction in total drug costs for future therapies could cost $17 million to treat all eligible inmates. With immense costs projected with new treatment, it is unlikely that correctional facilities will have the capacity to treat all those afflicted with HCV. Alternative payment strategies in collaboration with outside programs may be necessary to curb this epidemic. In order to improve care and treatment delivery, drug costs also need to be seriously reevaluated to be more accessible and equitable now that HCV

  11. Concurrent progress of reprogramming and gene correction to overcome therapeutic limitation of mutant ALK2-iPSC

    PubMed Central

    Kim, Bu-Yeo; Jeong, SangKyun; Lee, Seo-Young; Lee, So Min; Gweon, Eun Jeong; Ahn, Hyunjun; Kim, Janghwan; Chung, Sun-Ku

    2016-01-01

    Fibrodysplasia ossificans progressiva (FOP) syndrome is caused by a mutation of the gene ACVR1 encoding a constitutively active bone morphogenetic protein type I receptor (also called ALK2), which induces heterotopic ossification in the patient. To correct it genetically, we attempted to generate mutant ALK2-iPSCs (mALK2-iPSCs) from FOP human dermal fibroblasts. However, mALK2 impairs pluripotency maintenance and reduces clonogenic potential after single-cell dissociation, an unavoidable step when applying gene-correction tools to induced pluripotent stem cells (iPSCs). The current iPSC-based gene therapy approach is therefore not readily applicable to iPSCs carrying the ALK2 mutation. Here we developed a simplified one-step procedure in which reprogramming and gene-editing components are introduced simultaneously into human fibroblasts derived from a patient with FOP syndrome, and the mutation is corrected in the same step. The mixture of reprogramming and gene-editing components comprises reprogramming episomal vectors, CRISPR/Cas9-expressing vectors, and a single-stranded oligodeoxynucleotide harboring the normal base to correct ALK2 c.617G>A. The one-step ALK2 gene-corrected iPSCs restored the global gene expression pattern, as well as mineralization, to the level of normal iPSCs. This procedure not only saves time, labor, and cost, but also opens a new paradigm beyond current gene-editing methodology, which is hampered by pluripotency-maintenance requirements and by the vulnerability of single-cell-dissociated iPSCs. PMID:27256111

  12. Predictive factors for obtaining a correct therapeutic range using antivitamin K anticoagulants: a tertiary center experience of patient adherence to anticoagulant therapy

    PubMed Central

    Jurcuţ, Ruxandra; Militaru, Sebastian; Geavlete, Oliviana; Drăgotoiu, Nic; Sipoş, Sergiu; Roşulescu, Răzvan; Ginghină, Carmen; Jurcuţ, Ciprian

    2015-01-01

    Background Patient adherence is an essential factor in obtaining efficient oral anticoagulation using vitamin K antagonists (VKAs), a situation with a narrow therapeutic window. Therefore, patient education and awareness are crucial for good management. Auditing the current situation would help to identify the magnitude of the problem and to build tailored education programs for these patients. Methods This study included 68 hospitalized chronically anticoagulated patients (mean age 62.6±13.1 years; males, 46%) who responded to a 26-item questionnaire to assess their knowledge of VKA therapy management. Laboratory and clinical data were used to determine the international normalized ratio (INR) at admission, as well as to calculate CHA2DS2-VASC and HAS-BLED scores for patients with atrial fibrillation. Results The majority of patients (62%) were receiving VKA for atrial fibrillation, the others for a mechanical prosthesis and previous thromboembolic disease or stroke. In the atrial fibrillation group, the mean CHA2DS2-VASC score was 3.1±1.5, while the average HAS-BLED score was 1.8±1.2. More than half of the patients (53%) had an INR outside of the therapeutic range at admission, with most of these (43% of all patients) having a low INR. A correct INR value was predicted by education level (higher education) and the diagnostic indication (patients with mechanical prosthesis being best managed). Patients presenting with a therapeutic INR had a trend toward longer treatment duration than those outside the therapeutic range (62±72 months versus 36±35 months, respectively, P=0.06). There was no correlation between INR at admission and the patient’s living conditions, INR monitoring frequency, and bleeding history. Conclusion In a tertiary cardiology center, more than half of patients receiving VKAs are admitted with an INR falling outside the therapeutic range, irrespective of the bleeding or embolic risk. Patients with a mechanical prosthesis and complex antithrombotic regimens

  13. Increasing the Endoplasmic Reticulum Pool of the F508del Allele of the Cystic Fibrosis Transmembrane Conductance Regulator Leads to Greater Folding Correction by Small Molecule Therapeutics

    PubMed Central

    Chung, W. Joon; Goeckeler-Fried, Jennifer L.; Havasi, Viktoria; Chiang, Annette; Rowe, Steven M.; Plyler, Zackery E.; Hong, Jeong S.; Mazur, Marina; Piazza, Gary A.; Keeton, Adam B.; White, E. Lucile; Rasmussen, Lynn; Weissman, Allan M.; Denny, R. Aldrin; Brodsky, Jeffrey L.; Sorscher, Eric J.

    2016-01-01

    Small molecules that correct the folding defects and enhance surface localization of the F508del mutation in the Cystic Fibrosis Transmembrane conductance Regulator (CFTR) comprise an important therapeutic strategy for cystic fibrosis lung disease. However, compounds that rescue the F508del mutant protein to wild type (WT) levels have not been identified. In this report, we consider obstacles to obtaining robust and therapeutically relevant levels of F508del CFTR. For example, markedly diminished steady state amounts of F508del CFTR compared to WT CFTR are present in recombinant bronchial epithelial cell lines, even when much higher levels of mutant transcript are present. In human primary airway cells, the paucity of Band B F508del is even more pronounced, although F508del and WT mRNA concentrations are comparable. Therefore, to augment levels of “repairable” F508del CFTR and identify small molecules that then correct this pool, we developed compound library screening protocols based on automated protein detection. First, cell-based imaging measurements were used to semi-quantitatively estimate distribution of F508del CFTR by high content analysis of two-dimensional images. We evaluated ~2,000 known bioactive compounds from the NIH Roadmap Molecular Libraries Small Molecule Repository in a pilot screen and identified agents that increase the F508del protein pool. Second, we analyzed ~10,000 compounds representing diverse chemical scaffolds for effects on total CFTR expression using a multi-plate fluorescence protocol and describe compounds that promote F508del maturation. Together, our findings demonstrate proof of principle that agents identified in this fashion can augment the level of endoplasmic reticulum (ER) resident “Band B” F508del CFTR suitable for pharmacologic correction. As further evidence in support of this strategy, PYR-41—a compound that inhibits the E1 ubiquitin activating enzyme—was shown to synergistically enhance F508del rescue by C

  14. Increasing the Endoplasmic Reticulum Pool of the F508del Allele of the Cystic Fibrosis Transmembrane Conductance Regulator Leads to Greater Folding Correction by Small Molecule Therapeutics.

    PubMed

    Chung, W Joon; Goeckeler-Fried, Jennifer L; Havasi, Viktoria; Chiang, Annette; Rowe, Steven M; Plyler, Zackery E; Hong, Jeong S; Mazur, Marina; Piazza, Gary A; Keeton, Adam B; White, E Lucile; Rasmussen, Lynn; Weissman, Allan M; Denny, R Aldrin; Brodsky, Jeffrey L; Sorscher, Eric J

    2016-01-01

    Small molecules that correct the folding defects and enhance surface localization of the F508del mutation in the Cystic Fibrosis Transmembrane conductance Regulator (CFTR) comprise an important therapeutic strategy for cystic fibrosis lung disease. However, compounds that rescue the F508del mutant protein to wild type (WT) levels have not been identified. In this report, we consider obstacles to obtaining robust and therapeutically relevant levels of F508del CFTR. For example, markedly diminished steady state amounts of F508del CFTR compared to WT CFTR are present in recombinant bronchial epithelial cell lines, even when much higher levels of mutant transcript are present. In human primary airway cells, the paucity of Band B F508del is even more pronounced, although F508del and WT mRNA concentrations are comparable. Therefore, to augment levels of "repairable" F508del CFTR and identify small molecules that then correct this pool, we developed compound library screening protocols based on automated protein detection. First, cell-based imaging measurements were used to semi-quantitatively estimate distribution of F508del CFTR by high content analysis of two-dimensional images. We evaluated ~2,000 known bioactive compounds from the NIH Roadmap Molecular Libraries Small Molecule Repository in a pilot screen and identified agents that increase the F508del protein pool. Second, we analyzed ~10,000 compounds representing diverse chemical scaffolds for effects on total CFTR expression using a multi-plate fluorescence protocol and describe compounds that promote F508del maturation. Together, our findings demonstrate proof of principle that agents identified in this fashion can augment the level of endoplasmic reticulum (ER) resident "Band B" F508del CFTR suitable for pharmacologic correction. As further evidence in support of this strategy, PYR-41, a compound that inhibits the E1 ubiquitin activating enzyme, was shown to synergistically enhance F508del rescue by C18, a small

  15. Disseminated oligodendroglial-like leptomeningeal tumors: preliminary diagnostic and therapeutic results for a novel tumor entity [corrected].

    PubMed

    Preuss, Matthias; Christiansen, Holger; Merkenschlager, Andreas; Hirsch, Franz Wolfgang; Kiess, Wieland; Müller, Wolf; Kästner, Stefanie; Henssler, Andreas; Pekrun, Arnulf; Hauch, Holger; Nathrath, Michaela; Meixensberger, Jürgen; Pietsch, Torsten; Kuchelmeister, Klaus

    2015-08-01

    Pediatric tumors of the central nervous system composed of oligoid tumor cells showing diffuse leptomeningeal spread without a primary mass lesion seem to represent a novel tumor entity. The terms "diffuse leptomeningeal glioneural tumor" or, preferably, "disseminated oligodendroglial-like leptomeningeal tumor of childhood" (DOGLT) were proposed. Four patients were identified with clinico-neuropathologic findings compatible with DOGLT, and a mean follow-up time of 54 months was determined. Seven different biopsies obtained from the four patients were histologically evaluated. Clinical course, diagnostic measures, histopathologic and radiologic features, and treatment suggestions were recorded, on the basis of which a diagnostic and therapeutic algorithm was proposed. Patients with DOGLT presented with hydrocephalus as the first symptom, requiring neurosurgical therapy. Open arachnoid biopsy was necessary to confirm the diagnosis. The oligoid cells in a desmoplastic or focally myxoid matrix showed OLIG2-, MAP2-, S-100- and rare HuC/HuD protein-immunopositivity. IDH1 (R132H)- and CD99-immunohistochemistry was negative in all patients. None of the evaluable biopsies from three patients showed chromosome 1p/19q deletion, either as an isolated or as a combined allelic loss. Chemotherapy according to the SIOP-LGG 2004 standard induction and consolidation protocol resulted in complete response and partial response, respectively, in 50% of the patients. However, after discontinuation of chemotherapy, two patients experienced tumor progression and one of them succumbed to the disease after 19 months. Radiological criteria as well as preliminary treatment results are presented after observation of four clinical cases. Prognosis and long-term clinical courses remain to be observed.

  16. CORRECTED ERROR VIDEO VERSUS A PHYSICAL THERAPIST INSTRUCTED HOME EXERCISE PROGRAM: ACCURACY OF PERFORMING THERAPEUTIC SHOULDER EXERCISES

    PubMed Central

    Krishnamurthy, Kamesh; Hopp, Jennifer; Stanley, Laura; Spores, Ken; Braunreiter, David

    2016-01-01

    Background and Purpose The accurate performance of physical therapy exercises can be difficult. In this evolving healthcare climate it is important to continually look for better methods to educate patients. The use of handouts, in-person demonstration, and video instruction are all potential avenues used to teach proper exercise form. The purpose of this study was to examine if a corrected error video (CEV) would be as effective as a single visit with a physical therapist (PT) to teach healthy subjects how to properly perform four different shoulder rehabilitation exercises. Study Design This was a prospective, single-blinded interventional trial. Methods Fifty-eight subjects with no shoulder complaints were recruited from two institutions and randomized into one of two groups: the CEV group (30 subjects) was given a CEV comprised of four shoulder exercises, while the physical therapy group (28 subjects) had one session with a PT as well as a handout of how to complete the exercises. Each subject practiced the exercises for one week and was then videotaped performing them during a return visit. Videos were scored with the shoulder exam assessment tool (SEAT) created by the authors. Results There was no difference between the groups on total SEAT score (13.66 ± 0.29 vs 13.46 ± 0.30 for CEV vs PT, p = 0.64, 95% CI [−0.06, 0.037]). Average scores for individual exercises also showed no significant difference. Conclusion/Clinical Relevance These results demonstrate that the inexpensive and accessible CEV is as beneficial as direct instruction in teaching subjects to properly perform shoulder rehabilitation exercises. Level of Evidence 1b PMID:27757288

  17. Correction of Murine Rag2 Severe Combined Immunodeficiency by Lentiviral Gene Therapy Using a Codon-optimized RAG2 Therapeutic Transgene

    PubMed Central

    van Til, Niek P; de Boer, Helen; Mashamba, Nomusa; Wabik, Agnieszka; Huston, Marshall; Visser, Trudi P; Fontana, Elena; Poliani, Pietro Luigi; Cassani, Barbara; Zhang, Fang; Thrasher, Adrian J; Villa, Anna; Wagemaker, Gerard

    2012-01-01

    Recombination activating gene 2 (RAG2) deficiency results in severe combined immunodeficiency (SCID) with complete lack of T and B lymphocytes. Initial gammaretroviral gene therapy trials for other types of SCID proved effective, but also revealed the necessity of safe vector design. We report the development of lentiviral vectors with the spleen focus forming virus (SF) promoter driving codon-optimized human RAG2 (RAG2co), which improved phenotype amelioration compared with native RAG2 in Rag2−/− mice. With the RAG2co therapeutic transgene, the T-cell receptor (TCR) and immunoglobulin repertoires, T-cell mitogen responses, plasma immunoglobulin levels, and T-cell-dependent and -independent specific antibody responses were restored. However, the thymic double-positive T-cell population remained subnormal, possibly because the SF virus-derived element is sensitive to methylation/silencing in the thymus. Replacing the SF promoter with the previously reported silencing-resistant ubiquitous chromatin opening element (UCOE) prevented this silencing and also improved B-cell reconstitution to eventually near-normal levels. Weak cellular promoters were effective in T-cell reconstitution but deficient in B-cell reconstitution. We conclude that immune functions are corrected in Rag2−/− mice by genetic modification of stem cells using the UCOE-driven codon-optimized RAG2, providing a valid optional vector for clinical implementation. PMID:22692499

  18. Model based manipulator control

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1989-01-01

    The feasibility of using model based control (MBC) for robotic manipulators was investigated. A double inverted pendulum system was constructed as the experimental system for a general study of dynamically stable manipulation. The original interest in dynamically stable systems was driven by the objective of high vertical reach (balancing), and the planning of inertially favorable trajectories for force and payload demands. The model-based control approach is described and the results of experimental tests are summarized. Results directly demonstrate that MBC can provide stable control at all speeds of operation and support operations requiring dynamic stability such as balancing. The application of MBC to systems with flexible links is also discussed.
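
    For illustration, the textbook model-based (computed-torque) control law is sketched below on a single-link pendulum: the model is inverted so that a commanded acceleration, formed from PD feedback, becomes a joint torque. The dynamics, gains, and setpoint are toy assumptions, not the paper's double-inverted-pendulum controller.

    ```python
    # Computed-torque (inverse dynamics) control of a single-link pendulum (illustrative).
    import numpy as np

    m, l, g, I = 1.0, 0.5, 9.81, 1.0 * 0.5 ** 2       # assumed link parameters

    def dynamics(q, qd, tau):
        return (tau - m * g * l * np.sin(q)) / I       # model: angular acceleration from torque

    def computed_torque(q, qd, q_des, qd_des, qdd_des, kp=100.0, kd=20.0):
        qdd_cmd = qdd_des + kd * (qd_des - qd) + kp * (q_des - q)
        return I * qdd_cmd + m * g * l * np.sin(q)     # invert the model

    q, qd, dt = 0.3, 0.0, 1e-3
    for _ in range(3000):                              # track q_des = 0 for 3 s
        tau = computed_torque(q, qd, q_des=0.0, qd_des=0.0, qdd_des=0.0)
        qd += dynamics(q, qd, tau) * dt
        q += qd * dt
    print("final angle error:", q)
    ```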

  19. Model-Based Systems

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    2007-01-01

    Engineers who design systems using text specification documents focus their work upon the completed system to meet performance, time, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system, and more difficult to maintain as several systems are combined into higher-level systems, maintained over decades, and evolved technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in time and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data, interrelating it to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. Because the consistency and integrity of the model are assured, the consistency and integrity of the various specification documents are ensured as well. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed, and how these needs are being addressed by international standards writing teams.

  20. Model Based Definition

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  1. Principles of models based engineering

    SciTech Connect

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  2. Model-Based Fault Tolerant Control

    NASA Technical Reports Server (NTRS)

    Kumar, Aditya; Viassolo, Daniel

    2008-01-01

    The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. Identifying the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in their presence. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
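
    A toy sketch of the multiple-hypothesis idea mentioned above: each fault hypothesis carries an expected residual signature, and the observed filter residual is scored against every signature with a Gaussian likelihood. The signatures, noise level, and fault classes are invented for illustration and do not come from the engine model or Kalman filter used in MBFTC.

    ```python
    # Multiple-hypothesis fault isolation from filter residuals (illustrative).
    import numpy as np

    signatures = {
        "no fault":        np.array([0.0, 0.0, 0.0]),
        "sensor bias":     np.array([1.5, 0.0, 0.0]),
        "actuator fault":  np.array([0.4, -0.8, 0.3]),
        "gas-path damage": np.array([0.2, 0.5, -1.0]),
    }
    sigma = 0.3                                        # assumed residual noise std

    def classify(residual):
        scores = {name: -np.sum((residual - s) ** 2) / (2 * sigma ** 2)
                  for name, s in signatures.items()}
        return max(scores, key=scores.get)

    print(classify(np.array([0.35, -0.7, 0.25])))      # -> "actuator fault"
    ```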

  3. Model-based tomographic reconstruction

    DOEpatents

    Chambers, David H; Lehman, Sean K; Goodman, Dennis M

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
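
    As a simple illustration of fitting a wall model by minimizing the mean-square error between a predicted signal and the data, the sketch below estimates one wall's range and thickness from a noisy time-gated trace. The two-echo signal model, pulse shape, and parameter values are assumptions and do not reproduce the patented layer-stripping and stacking processing.

    ```python
    # Model-based estimation of wall range and thickness from a radar trace (illustrative).
    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0, 20e-9, 400)                     # fast-time axis (s)

    def pulse(t0, width=0.6e-9):
        return np.exp(-((t - t0) / width) ** 2)

    def wall_signal(range_m, thickness_m, eps_r=4.0, c=3e8):
        t_front = 2 * range_m / c                                    # front-face echo
        t_back = t_front + 2 * thickness_m * np.sqrt(eps_r) / c      # back-face echo
        return pulse(t_front) - 0.5 * pulse(t_back)

    data = wall_signal(1.8, 0.20) + 0.05 * np.random.default_rng(2).normal(size=t.size)

    def mse(p):
        return np.mean((wall_signal(*p) - data) ** 2)

    fit = minimize(mse, x0=[1.5, 0.15], method="Nelder-Mead")
    print("estimated range and thickness:", fit.x)
    ```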

  4. Corrective work.

    ERIC Educational Resources Information Center

    Hill, Leslie A.

    1978-01-01

    Discusses some general principles for planning corrective instruction and exercises in English as a second language, and follows with examples from the areas of phonemics, phonology, lexicon, idioms, morphology, and syntax. (IFS/WGA)

  5. Model-based machine learning.

    PubMed

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
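
    A minimal illustration of the "write the model, then let inference follow" viewpoint: a bespoke probabilistic model (Bayesian linear regression with a Gaussian prior) is specified and its posterior over parameters is computed directly from the model and the data. This generic NumPy sketch stands in for the approach in spirit only; it is not Infer.NET or its factor-graph machinery.

    ```python
    # Bayesian linear regression as a tiny model-based machine learning example (illustrative).
    import numpy as np

    rng = np.random.default_rng(3)
    X = np.column_stack([np.ones(50), rng.uniform(-1, 1, 50)])   # design matrix [1, x]
    w_true = np.array([0.5, -2.0])
    y = X @ w_true + 0.1 * rng.normal(size=50)

    alpha, noise_var = 1.0, 0.1 ** 2           # assumed prior precision and observation noise
    S = np.linalg.inv(alpha * np.eye(2) + X.T @ X / noise_var)   # posterior covariance
    m = S @ X.T @ y / noise_var                                  # posterior mean

    print("posterior mean:", m)
    print("posterior std :", np.sqrt(np.diag(S)))
    ```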

  6. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  7. Model-based machine learning

    PubMed Central

    Bishop, Christopher M.

    2013-01-01

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications. PMID:23277612

  8. Therapeutic Recreation

    ERIC Educational Resources Information Center

    Parks and Recreation, 1971

    1971-01-01

    Graphic profiles of (1) the professional membership of the National Therapeutic Recreation Society, (2) state-level employment opportunities in the field, and (3) educational opportunities at U.S. colleges and universities. (MB)

  9. The Challenge of Configuring Model-Based Space Mission Planners

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy D.; Clement, Bradley J.; Chachere, John M.; Smith, Tristan B.; Swanson, Keith J.

    2011-01-01

    Mission planning is central to space mission operations, and has benefited from advances in model-based planning software. Constraints arise from many sources, including simulators and engineering specification documents, and ensuring that constraints are correctly represented in the planner is a challenge. As mission constraints evolve, planning domain modelers need help with modeling constraints efficiently using the available source data, catching errors quickly, and correcting the model. This paper describes the current state of the practice in designing model-based mission planning tools, the challenges facing model developers, and a proposed Interactive Model Development Environment (IMDE) to configure mission planning systems. We describe current and future technology developments that can be integrated into an IMDE.

  10. A tool for model based diagnostics of the AGS Booster

    SciTech Connect

    Luccio, A.

    1993-12-31

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on an HP-Apollo workstation system, has proved very general and of immediate physical interpretation.
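
    A hedged sketch of the same idea in miniature: a model response matrix maps candidate error kicks to orbit readings, and a least-squares fit to the measured orbit localizes the faulty element. The response matrix here comes from a toy closed-orbit formula with unit beta functions rather than MAD transfer matrices, and all lattice values are made up.

    ```python
    # Localizing a dipole error from orbit data with a model response matrix (illustrative).
    import numpy as np

    nu = 4.8                                           # assumed betatron tune
    n_bpm = 24
    phase = np.linspace(0, 2 * np.pi * nu, n_bpm, endpoint=False)

    # R[i, j]: orbit shift at BPM i per unit kick at location j (closed-orbit formula, beta = 1)
    R = np.cos(np.abs(phase[:, None] - phase[None, :]) - np.pi * nu) / (2 * np.sin(np.pi * nu))

    true_kicks = np.zeros(n_bpm)
    true_kicks[7] = 0.3e-3                             # one misaligned magnet (rad)
    orbit = R @ true_kicks + 1e-5 * np.random.default_rng(4).normal(size=n_bpm)

    est_kicks, *_ = np.linalg.lstsq(R, orbit, rcond=None)
    print("largest fitted kick at index", int(np.argmax(np.abs(est_kicks))))
    ```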

  11. Jitter Correction

    NASA Technical Reports Server (NTRS)

    Waegell, Mordecai J.; Palacios, David M.

    2011-01-01

    Jitter_Correct.m is a MATLAB function that automatically measures and corrects inter-frame jitter in an image sequence to a user-specified precision. In addition, the algorithm dynamically adjusts the image sample size to increase the accuracy of the measurement. The Jitter_Correct.m function takes an image sequence with unknown frame-to-frame jitter and computes the translations of each frame (column and row, in pixels) relative to a chosen reference frame with sub-pixel accuracy. The translations are measured using a Cross Correlation Fourier transformation method in which the relative phase of the two transformed images is fit to a plane. The measured translations are then used to correct the inter-frame jitter of the image sequence. The function also dynamically expands the image sample size over which the cross-correlation is measured to increase the accuracy of the measurement. This increases the robustness of the measurement to variable magnitudes of inter-frame jitter.
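
    The sketch below reproduces the general idea described for Jitter_Correct.m in NumPy: the inter-frame shift is recovered by fitting a plane to the phase of the cross-power spectrum of two frames. It is an illustration under simplifying assumptions (a fixed low-frequency mask, no dynamic growth of the sample window), not the MATLAB function itself.

    ```python
    # Sub-pixel shift estimation by plane-fitting the cross-spectrum phase (illustrative).
    import numpy as np

    def subpixel_shift(ref, frame):
        F1, F2 = np.fft.fft2(ref), np.fft.fft2(frame)
        cross_phase = np.angle(F1 * np.conj(F2))
        ny, nx = ref.shape
        ky = np.fft.fftfreq(ny)[:, None] * np.ones((1, nx))
        kx = np.fft.fftfreq(nx)[None, :] * np.ones((ny, 1))
        mask = (np.abs(kx) < 0.1) & (np.abs(ky) < 0.1)   # low frequencies: phase not wrapped
        A = np.column_stack([kx[mask], ky[mask]])
        coeffs, *_ = np.linalg.lstsq(A, cross_phase[mask], rcond=None)
        return coeffs / (2 * np.pi)                      # (column shift, row shift) in pixels

    rng = np.random.default_rng(5)
    ref = rng.normal(size=(64, 64))
    shifted = np.roll(ref, shift=(0, 3), axis=(0, 1))    # integer shift for the demo
    print(subpixel_shift(ref, shifted))                  # ~ [3, 0]
    ```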

  12. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  13. MACROMOLECULAR THERAPEUTICS

    PubMed Central

    Yang, Jiyuan; Kopeček, Jindřich

    2014-01-01

    This review covers water-soluble polymer-drug conjugates and macromolecules that possess biological activity without attached low molecular weight drugs. The main design principles of traditional and backbone degradable polymer-drug conjugates as well as the development of a new paradigm in nanomedicines – (low molecular weight) drug-free macromolecular therapeutics are discussed. To address the biological features of cancer, macromolecular therapeutics directed to stem/progenitor cells and the tumor microenvironment are deliberated. Finally, the future perspectives of the field are briefly debated. PMID:24747162

  14. Macromolecular therapeutics.

    PubMed

    Yang, Jiyuan; Kopeček, Jindřich

    2014-09-28

    This review covers water-soluble polymer-drug conjugates and macromolecules that possess biological activity without attached low molecular weight drugs. The main design principles of traditional and backbone degradable polymer-drug conjugates as well as the development of a new paradigm in nanomedicines - (low molecular weight) drug-free macromolecular therapeutics are discussed. To address the biological features of cancer, macromolecular therapeutics directed to stem/progenitor cells and the tumor microenvironment are deliberated. Finally, the future perspectives of the field are briefly debated.

  15. Therapeutic Nanodevices

    NASA Astrophysics Data System (ADS)

    Lee, Stephen C.; Ruegsegger, Mark; Barnes, Philip D.; Smith, Bryan R.; Ferrari, Mauro

    Therapeutic nanotechnology offers minimally invasive therapies with high densities of function concentrated in small volumes, features that may reduce patient morbidity and mortality. Unlike other areas of nanotechnology, novel physical properties associated with nanoscale dimensionality are not the raison d'être of therapeutic nanotechnology, whereas the aggregation of multiple biochemical (or comparably precise) functions into controlled nanoarchitectures is. Multifunctionality is a hallmark of emerging nanotherapeutic devices, and multifunctionality can allow nanotherapeutic devices to perform multi-step work processes, with each functional component contributing to one or more nanodevice subroutine such that, in aggregate, subroutines sum to a cogent work process. Canonical nanotherapeutic subroutines include tethering (targeting) to sites of disease, dispensing measured doses of drug (or bioactive compound), detection of residual disease after therapy and communication with an external clinician/operator. Emerging nanotherapeutics thus blur the boundaries between medical devices and traditional pharmaceuticals. Assembly of therapeutic nanodevices generally exploits either (bio)material self-assembly properties or chemoselective bioconjugation techniques, or both. Given the complexity, composition, and the necessity for their tight chemical and structural definition inherent in the nature of nanotherapeutics, their cost of goods (COGs) might exceed that of (already expensive) biologics. Early therapeutic nanodevices will likely be applied to disease states which exhibit significant unmet patient need (cancer and cardiovascular disease), while application to other disease states well-served by conventional therapy may await perfection of nanotherapeutic design and assembly protocols.

  16. Therapeutic Nanodevices

    NASA Astrophysics Data System (ADS)

    Lee, Stephen; Ruegsegger, Mark; Barnes, Philip; Smith, Bryan; Ferrari, Mauro

    Therapeutic nanotechnology offers minimally invasive therapies with high densities of function concentrated in small volumes, features that may reduce patient morbidity and mortality. Unlike other areas of nanotechnology, novel physical properties associated with nanoscale dimensionality are not the raison d'être of therapeutic nanotechnology, whereas the aggregation of multiple biochemical (or comparably precise) functions into controlled nanoarchitectures is. Multifunctionality is a hallmark of emerging nanotherapeutic devices, and multifunctionality can allow nanotherapeutic devices to perform multistep work processes, with each functional component contributing to one or more nanodevice subroutine such that, in aggregate, subroutines sum to a cogent work process. Canonical nanotherapeutic subroutines include tethering (targeting) to sites of disease, dispensing measured doses of drug (or bioactive compound), detection of residual disease after therapy and communication with an external clinician/operator. Emerging nanotherapeutics thus blur the boundaries between medical devices and traditional pharmaceuticals. Assembly of therapeutic nanodevices generally exploits either (bio)material self-assembly properties or chemoselective bioconjugation techniques, or both. Given the complexity, composition, and the necessity for their tight chemical and structural definition inherent in the nature of nanotherapeutics, their cost of goods (COGs) might exceed that of (already expensive) biologics. Early therapeutic nanodevices will likely be applied to disease states which exhibit significant unmet patient need (cancer and cardiovascular disease), while application to other disease states well-served by conventional therapy may await perfection of nanotherapeutic design and assembly protocols.

  17. Model-based phase-shifting interferometer

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A model-based phase-shifting interferometer (MPI) is developed, in which a novel calculation technique is proposed instead of the traditional complicated system structure, to achieve versatile, high-precision, quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With a set of alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing construction (ROR) method is employed for the retrace error correction of the non-null test, as well as for figure error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical tests are presented to validate the flexibility and accuracy. The test results are compared with those of a ZYGO interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI holds large potential in modern optical shop testing.

  18. Therapeutic proteins.

    PubMed

    Dimitrov, Dimiter S

    2012-01-01

    Protein-based therapeutics are highly successful in the clinic and currently enjoy unprecedented recognition of their potential. More than 100 genuine and a similar number of modified therapeutic proteins are approved for clinical use in the European Union and the USA, with 2010 sales of US$108 bln; monoclonal antibodies (mAbs) accounted for almost half (48%) of the sales. Based on their pharmacological activity, they can be divided into five groups: (a) replacing a protein that is deficient or abnormal; (b) augmenting an existing pathway; (c) providing a novel function or activity; (d) interfering with a molecule or organism; and (e) delivering other compounds or proteins, such as a radionuclide, cytotoxic drug, or effector proteins. Therapeutic proteins can also be grouped based on their molecular types, which include antibody-based drugs, Fc fusion proteins, anticoagulants, blood factors, bone morphogenetic proteins, engineered protein scaffolds, enzymes, growth factors, hormones, interferons, interleukins, and thrombolytics. They can also be classified based on their molecular mechanism of activity as (a) binding non-covalently to target, e.g., mAbs; (b) affecting covalent bonds, e.g., enzymes; and (c) exerting activity without specific interactions, e.g., serum albumin. Most protein therapeutics currently on the market are recombinant, and hundreds of them are in clinical trials for therapy of cancers, immune disorders, infections, and other diseases. New engineered proteins, including bispecific mAbs and multispecific fusion proteins, mAbs conjugated with small molecule drugs, and proteins with optimized pharmacokinetics, are currently under development. However, in the last several decades, there have been no conceptually new methodological developments comparable, e.g., to genetic engineering leading to the development of recombinant therapeutic proteins. It appears that a paradigm change in methodologies and understanding of mechanisms is needed to overcome major

  19. Platelet-delivered therapeutics.

    PubMed

    Lyde, R; Sabatino, D; Sullivan, S K; Poncz, M

    2015-06-01

    We have proposed that modified platelets could potentially be used to correct intrinsic platelet defects as well as for targeted delivery of therapeutic molecules to sites of vascular injury. Ectopic expression of proteins within α-granules prior to platelet activation has been achieved for several proteins, including urokinase, factor (F) VIII, and partially for FIX. Potential uses of platelet-directed therapeutics will be discussed, focusing on targeted delivery of urokinase as a thromboprophylactic agent and FVIII for the treatment of hemophilia A patients with intractable inhibitors. This presentation will discuss new strategies that may be useful in the care of patients with vascular injury as well as remaining challenges and limitations of these approaches.

  20. Model-based Utility Functions

    NASA Astrophysics Data System (ADS)

    Hibbard, Bill

    2012-05-01

    Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that if provided with the possibility to modify their utility functions agents will not choose to do so, under some usual assumptions.

  1. A CORRECTION.

    PubMed

    Johnson, D

    1940-03-22

    In a recently published volume on "The Origin of Submarine Canyons" the writer inadvertently credited to A. C. Veatch an excerpt from a submarine chart actually contoured by P. A. Smith, of the U. S. Coast and Geodetic Survey. The chart in question is Chart IVB of Special Paper No. 7 of the Geological Society of America, entitled "Atlantic Submarine Valleys of the United States and the Congo Submarine Valley," by A. C. Veatch and P. A. Smith, and the excerpt appears as Plate III of the volume first cited above. In view of the heavy labor involved in contouring the charts accompanying the paper by Veatch and Smith and the beauty of the finished product, it would be unfair to Mr. Smith to permit the error to go uncorrected. Excerpts from two other charts are correctly ascribed to Dr. Veatch.

  2. 77 FR 72199 - Technical Corrections; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    ... COMMISSION 10 CFR Part 171 RIN 3150-AJ16 Technical Corrections; Correction AGENCY: Nuclear Regulatory... corrections, including updating the street address for the Region I office, correcting authority citations and... rule. DATES: The correction is effective on December 5, 2012. FOR FURTHER INFORMATION CONTACT:...

  3. Model-based fault detection and diagnosis in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Carrasco, Rodrigo A.

    2016-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) observatory, with its 66 individual telescopes and other central equipment, generates a massive set of monitoring data every day, collecting information on the performance of a variety of critical and complex electrical, electronic and mechanical components. This data is crucial for most troubleshooting efforts performed by engineering teams. More than 5 years of accumulated data and expertise allow for a more systematic approach to fault detection and diagnosis. This paper presents model-based fault detection and diagnosis techniques to support corrective and predictive maintenance in a 24/7 minimum-downtime observatory.

  4. 78 FR 75449 - Miscellaneous Corrections; Corrections

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ..., 50, 52, and 70 RIN 3150-AJ23 Miscellaneous Corrections; Corrections AGENCY: Nuclear Regulatory... final rule in the Federal Register on June 7, 2013, to make miscellaneous corrections to its regulations... miscellaneous corrections to its regulations in chapter I of Title 10 of the Code of Federal Regulations (10...

  5. Kitaev models based on unitary quantum groupoids

    SciTech Connect

    Chang, Liang

    2014-04-15

    We establish a generalization of Kitaev models based on unitary quantum groupoids. In particular, when inputting a Kitaev-Kong quantum groupoid H_C, we show that the ground state manifold of the generalized model is canonically isomorphic to that of the Levin-Wen model based on a unitary fusion category C. Therefore, the generalized Kitaev models provide realizations of the target space of the Turaev-Viro topological quantum field theory based on C.

  6. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  7. Model-Based Prognostics of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhury, Indranil; Bregon, Anibal

    2015-01-01

    Model-based prognostics has become a popular approach to solving the prognostics problem. However, almost all work has focused on prognostics of systems with continuous dynamics. In this paper, we extend the model-based prognostics framework to hybrid systems models that combine both continuous and discrete dynamics. In general, most systems are hybrid in nature, including those that combine physical processes with software. We generalize the model-based prognostics formulation to hybrid systems, and describe the challenges involved. We present a general approach for modeling hybrid systems, and overview methods for solving estimation and prediction in hybrid systems. As a case study, we consider the problem of conflict (i.e., loss of separation) prediction in the National Airspace System, in which the aircraft models are hybrid dynamical systems.

  8. 77 FR 2435 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-18

    ...- Free Treatment Under the Generalized System of Preferences and for Other Purposes Correction In... following correction: On page 407, the date following the proclamation number should read ``December...

  9. 78 FR 2193 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-10

    ... United States-Panama Trade Promotion Agreement and for Other Purposes Correction In Presidential document... correction: On page 66507, the proclamation identification heading on line one should read...

  10. Multimode model based defect characterization in composites

    NASA Astrophysics Data System (ADS)

    Roberts, R.; Holland, S.; Gregory, E.

    2016-02-01

    A newly-initiated research program for model-based defect characterization in CFRP composites is summarized. The work utilizes computational models of the interaction of NDE probing energy fields (ultrasound and thermography), to determine 1) the measured signal dependence on material and defect properties (forward problem), and 2) an assessment of performance-critical defect properties from analysis of measured NDE signals (inverse problem). Work is reported on model implementation for inspection of CFRP laminates containing delamination and porosity. Forward predictions of measurement response are presented, as well as examples of model-based inversion of measured data for the estimation of defect parameters.

  11. [Spigelian hernia: clinical, diagnostic and therapeutical aspects].

    PubMed

    Versaci, A; Rossitto, M; Centorrino, T; Barbera, A; Fonti, M T; Broccio, M; Ciccolo, A

    1998-01-01

    The authors, describing an observed case of Spigelian hernia, point out clinical, diagnostic and therapeutic considerations about this rare pathology of the abdominal wall. They specify the anatomic characteristics of the region and underline that diagnostic difficulties are bypassed by the use of ultrasound (USG) and CT imaging to formulate a correct preoperative diagnosis. They confirm that surgical treatment through a correct access does not differ from a normal hernioplasty and guarantees the long-term surgical outcome.

  12. Reducing Centroid Error Through Model-Based Noise Reduction

    NASA Technical Reports Server (NTRS)

    Lee, Shinhak

    2006-01-01

    A method of processing the digitized output of a charge-coupled device (CCD) image detector has been devised to enable reduction of the error in computed centroid of the image of a point source of light. The method involves model-based estimation of, and correction for, the contributions of bias and noise to the image data. The method could be used to advantage in any of a variety of applications in which there are requirements for measuring precise locations of, and/or precisely aiming optical instruments toward, point light sources. In the present method, prior to normal operations of the CCD, one measures the point-spread function (PSF) of the telescope or other optical system used to project images on the CCD. The PSF is used to construct a database of spot models representing the nominal CCD pixel outputs for a point light source projected onto the CCD at various positions incremented by small fractions of a pixel.
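
    A minimal sketch of the model-based matching step, assuming a precomputed set of spot models (the PSF projected onto the pixel grid at fractional-pixel offsets) and an estimated bias level; the least-squares match and all names are assumptions for illustration, not the paper's exact estimator.

      import numpy as np

      def best_subpixel_position(window, spot_models, offsets, bias):
          """Pick the fractional-pixel offset whose spot model best explains the bias-corrected data.

          window      : 2-D pixel data around the detected spot
          spot_models : list of 2-D model images, one per candidate offset
          offsets     : list of (dx, dy) fractional-pixel offsets matching spot_models
          bias        : estimated detector bias (scalar or 2-D array)
          """
          data = window - bias                                  # model-based bias removal
          errors = [np.sum((data - m) ** 2) for m in spot_models]
          return offsets[int(np.argmin(errors))]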

  13. Model-Based Systems Engineering Approach to Managing Mass Margin

    NASA Technical Reports Server (NTRS)

    Chung, Seung H.; Bayer, Todd J.; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Christopher; Lam, Doris

    2012-01-01

    When designing a flight system from concept through implementation, one of the fundamental systems engineering tasks is managing the mass margin and a mass equipment list (MEL) of the flight system. While generating a MEL and computing a mass margin is conceptually a trivial task, maintaining consistent and correct MELs and mass margins can be challenging due to the current practices of maintaining duplicate information in various forms, such as diagrams and tables, and in various media, such as files and emails. We have overcome this challenge through a model-based systems engineering (MBSE) approach within which we allow only a single-source-of-truth. In this paper we describe the modeling patterns used to capture the single-source-of-truth and the views that have been developed for the Europa Habitability Mission (EHM) project, a mission concept study, at the Jet Propulsion Laboratory (JPL).
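
    A minimal sketch of the margin bookkeeping that such a single-source-of-truth model supports, assuming a MEL represented as a dictionary of current best estimates; the 30% contingency and all names are illustrative assumptions, not the project's actual values.

      def mass_margin(mel_kg, allocation_kg, contingency=0.30):
          """Mass margin as a fraction of the allocation, from a mass equipment list (MEL)."""
          cbe = sum(mel_kg.values())                 # current best estimate
          mev = cbe * (1.0 + contingency)            # maximum expected value with contingency
          return (allocation_kg - mev) / allocation_kg

      margin = mass_margin({"bus": 412.0, "payload": 96.5, "propulsion": 58.2},
                           allocation_kg=750.0)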

  14. Model-based Processing of Microcantilever Sensor Arrays

    SciTech Connect

    Tringe, J W; Clague, D S; Candy, J V; Sinensky, A K; Lee, C L; Rudd, R E; Burnham, A K

    2005-04-27

    We have developed a model-based processor (MBP) for a microcantilever-array sensor to detect target species in solution. We perform a proof-of-concept experiment, fit model parameters to the measured data and use them to develop a Gauss-Markov simulation. We then investigate two cases of interest, averaged deflection data and multi-channel data. For this evaluation we extract model parameters via a model-based estimation, perform a Gauss-Markov simulation, design the optimal MBP and apply it to measured experimental data. The performance of the MBP in the multi-channel case is evaluated by comparison to a "smoother" (averager) typically used for microcantilever signal analysis. It is shown that the MBP not only provides a significant gain (approximately 80 dB) in signal-to-noise ratio (SNR), but also consistently outperforms the smoother by 40-60 dB. Finally, we apply the processor to the smoothed experimental data and demonstrate its capability for chemical detection. The MBP performs quite well, apart from a correctable systematic bias error.
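
    To make the contrast between a model-based processor and a simple smoother concrete, the following is a minimal sketch assuming a scalar Gauss-Markov signal model with known parameters; it is a generic Kalman-style estimator for illustration, not the authors' actual MBP design.

      import numpy as np

      def model_based_processor(z, a, q, r, x0=0.0, p0=1.0):
          """Scalar Gauss-Markov estimator: x_k = a*x_{k-1} + w (var q), z_k = x_k + v (var r)."""
          x, p, estimates = x0, p0, []
          for zk in z:
              x, p = a * x, a * a * p + q            # predict from the signal model
              k = p / (p + r)                        # gain
              x, p = x + k * (zk - x), (1 - k) * p   # update with the measurement
              estimates.append(x)
          return np.array(estimates)

      def smoother(z, n=5):
          """Moving-average 'smoother' of the kind the MBP is compared against."""
          return np.convolve(z, np.ones(n) / n, mode="same")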

  15. Sandboxes for Model-Based Inquiry

    ERIC Educational Resources Information Center

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-01-01

    In this article, we introduce a class of constructionist learning environments that we call "Emergent Systems Sandboxes" ("ESSs"), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual…

  16. Model-Based Inquiries in Chemistry

    ERIC Educational Resources Information Center

    Khan, Samia

    2007-01-01

    In this paper, instructional strategies for sustaining model-based inquiry in an undergraduate chemistry class were analyzed through data collected from classroom observations, a student survey, and in-depth problem-solving sessions with the instructor and students. Analysis of teacher-student interactions revealed a cyclical pattern in which…

  17. Model-based Training of Situated Skills.

    ERIC Educational Resources Information Center

    Khan, Tariq M.; Brown, Keith

    2000-01-01

    Addresses areas of situated knowledge (metacognitive skills and affective skills) that have been ignored in intelligent computer-aided learning systems. Focuses on model-based reasoning, including contextualized and decontextualized knowledge, and examines an instructional method that supports situated knowledge by providing opportunities for…

  18. Mitochondrial diseases: therapeutic approaches.

    PubMed

    DiMauro, Salvatore; Mancuso, Michelangelo

    2007-06-01

    Therapy of mitochondrial encephalomyopathies (defined restrictively as defects of the mitochondrial respiratory chain) is woefully inadequate, despite great progress in our understanding of the molecular bases of these disorders. In this review, we consider sequentially several different therapeutic approaches. Palliative therapy is dictated by good medical practice and includes anticonvulsant medication, control of endocrine dysfunction, and surgical procedures. Removal of noxious metabolites is centered on combating lactic acidosis, but extends to other metabolites. Attempts to bypass blocks in the respiratory chain by administration of electron acceptors have not been successful, but this may be amenable to genetic engineering. Administration of metabolites and cofactors is the mainstay of real-life therapy and is especially important in disorders due to primary deficiencies of specific compounds, such as carnitine or coenzyme Q10. There is increasing interest in the administration of reactive oxygen species scavengers both in primary mitochondrial diseases and in neurodegenerative diseases directly or indirectly related to mitochondrial dysfunction. Aerobic exercise and physical therapy prevent or correct deconditioning and improve exercise tolerance in patients with mitochondrial myopathies due to mitochondrial DNA (mtDNA) mutations. Gene therapy is a challenge because of polyplasmy and heteroplasmy, but interesting experimental approaches are being pursued and include, for example, decreasing the ratio of mutant to wild-type mitochondrial genomes (gene shifting), converting mutated mtDNA genes into normal nuclear DNA genes (allotopic expression), importing cognate genes from other species, or correcting mtDNA mutations with specific restriction endonucleases. Germline therapy raises ethical problems but is being considered for prevention of maternal transmission of mtDNA mutations. Preventive therapy through genetic counseling and prenatal diagnosis is

  19. TPX correction coil studies

    SciTech Connect

    Hanson, J.D.

    1994-11-03

    Error correction coils are planned for the TPX (Tokamak Plasma Experiment) in order to avoid error field induced locked modes and disruption. The FT (Fix Tokamak) code is used to evaluate the ability of these correction coils to remove islands caused by symmetry breaking magnetic field errors. The proposed correction coils are capable of correcting a variety of error fields.

  20. Efficient Model-Based Diagnosis Engine

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Vatan, Farrokh; Barrett, Anthony; James, Mark; Mackey, Ryan; Williams, Colin

    2009-01-01

    An efficient diagnosis engine - a combination of mathematical models and algorithms - has been developed for identifying faulty components in a possibly complex engineering system. This model-based diagnosis engine embodies a twofold approach to reducing, relative to prior model-based diagnosis engines, the amount of computation needed to perform a thorough, accurate diagnosis. The first part of the approach involves a reconstruction of the general diagnostic engine to reduce the complexity of the mathematical-model calculations and of the software needed to perform them. The second part of the approach involves algorithms for computing a minimal diagnosis (the term "minimal diagnosis" is defined below). A somewhat lengthy background discussion is prerequisite to a meaningful summary of the innovative aspects of the present efficient model-based diagnosis engine. In model-based diagnosis, the function of each component and the relationships among all the components of the engineering system to be diagnosed are represented as a logical system denoted the system description (SD). Hence, the expected normal behavior of the engineering system is the set of logical consequences of the SD. Faulty components lead to inconsistencies between the observed behaviors of the system and the SD (see figure). Diagnosis - the task of finding faulty components - is reduced to finding those components, the abnormalities of which could explain all the inconsistencies. The solution of the diagnosis problem should be a minimal diagnosis, which is a minimal set of faulty components. A minimal diagnosis stands in contradistinction to the trivial solution, in which all components are deemed to be faulty, and which, therefore, always explains all inconsistencies.
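
    A minimal sketch of the minimal-diagnosis idea, assuming the conflicts (sets of components that cannot all be behaving normally) have already been derived from the system description and observations; the brute-force enumeration is purely illustrative and ignores the efficiency techniques that are the point of the engine described above.

      from itertools import combinations

      def minimal_diagnoses(components, conflicts):
          """All subset-minimal hitting sets of the conflict sets (candidate diagnoses)."""
          found = []
          for size in range(len(components) + 1):
              for combo in combinations(components, size):
                  candidate = set(combo)
                  if any(prev <= candidate for prev in found):
                      continue                       # a smaller diagnosis is already contained
                  if all(candidate & conflict for conflict in conflicts):
                      found.append(candidate)
          return found

      # toy example with two conflicts over four components
      print(minimal_diagnoses(["M1", "M2", "A1", "A2"],
                              [{"M1", "A1"}, {"M1", "M2", "A2"}]))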

  1. Systems Engineering Interfaces: A Model Based Approach

    NASA Technical Reports Server (NTRS)

    Fosse, Elyse; Delp, Christopher

    2013-01-01

    Currently: Ops Rev developed and maintains a framework that includes interface-specific language, patterns, and Viewpoints. Ops Rev implements the framework to design MOS 2.0 and its 5 Mission Services. Implementation de-couples interfaces and instances of interaction Future: A Mission MOSE implements the approach and uses the model based artifacts for reviews. The framework extends further into the ground data layers and provides a unified methodology.

  2. Model-based 3D SAR reconstruction

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Gunther, Jake; Moon, Todd

    2014-06-01

    Three dimensional scene reconstruction with synthetic aperture radar (SAR) is desirable for target recognition and improved scene interpretability. The vertical aperture, which is critical to reconstruct 3D SAR scenes, is almost always sparsely sampled due to practical limitations, which creates an underdetermined problem. This paper explores 3D scene reconstruction using a convex model-based approach. The approach developed is demonstrated on 3D scenes, but can be extended to SAR reconstruction of sparsely sampled signals in the spatial and/or frequency domains. The model-based approach enables knowledge-aided image formation (KAIF) by incorporating spatial, aspect, and sparsity magnitude terms into the image reconstruction. The incorporation of these terms, which are based on prior scene knowledge, will demonstrate improved results compared to traditional image formation algorithms. The SAR image formation problem is formulated as a second order cone program (SOCP) and the results are demonstrated on 3D scenes using simulated data and data from the GOTCHA data collect [1]. The model-based results are contrasted against traditional backprojected images.
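
    A minimal sketch of casting a sparsity-regularized reconstruction as a second-order cone program, assuming a generic real-valued forward operator and the cvxpy modeling package; the random operator, noise level, and variable names are placeholders and do not reproduce the paper's SAR-specific KAIF terms.

      import numpy as np
      import cvxpy as cp

      rng = np.random.default_rng(1)
      n, m = 64, 32                                  # scene unknowns, sparse measurements
      A = rng.normal(size=(m, n))                    # stand-in forward operator
      x_true = np.zeros(n); x_true[[5, 17, 40]] = [1.0, -0.5, 0.8]
      y = A @ x_true + 0.01 * rng.normal(size=m)

      x = cp.Variable(n)
      problem = cp.Problem(cp.Minimize(cp.norm(x, 1)),           # sparsity magnitude term
                           [cp.norm(A @ x - y, 2) <= 0.1])       # data-fidelity cone constraint
      problem.solve()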

  3. TU-G-210-02: TRANS-FUSIMO - An Integrative Approach to Model-Based Treatment Planning of Liver FUS

    SciTech Connect

    Preusser, T.

    2015-06-15

    Modeling can play a vital role in predicting, optimizing and analyzing the results of therapeutic ultrasound treatments. Simulating the propagating acoustic beam in various targeted regions of the body allows for the prediction of the resulting power deposition and temperature profiles. In this session we will apply various modeling approaches to breast, abdominal organ and brain treatments. Of particular interest is the effectiveness of procedures for correcting for phase aberrations caused by intervening irregular tissues, such as the skull in transcranial applications or inhomogeneous breast tissues. Also described are methods to compensate for motion in targeted abdominal organs such as the liver or kidney. Douglas Christensen – Modeling for Breast and Brain HIFU Treatment Planning; Tobias Preusser – TRANS-FUSIMO – An Integrative Approach to Model-Based Treatment Planning of Liver FUS. Learning Objectives: Understand the role of acoustic beam modeling for predicting the effectiveness of therapeutic ultrasound treatments. Apply acoustic modeling to specific breast, liver, kidney and transcranial anatomies. Determine how to obtain appropriate acoustic modeling parameters from clinical images. Understand the separate role of absorption and scattering in energy delivery to tissues. See how organ motion can be compensated for in ultrasound therapies. Compare simulated data with clinical temperature measurements in transcranial applications. Supported by NIH R01 HL172787 and R01 EB013433 (DC); EU Seventh Framework Programme (FP7/2007-2013) under 270186 (FUSIMO) and 611889 (TRANS-FUSIMO) (TP); and P01 CA159992, GE, FUSF and InSightec (UV)

  4. 75 FR 18747 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... Day: A National Day of Celebration of Greek and American Democracy, 2010 Correction In Presidential... correction: On page 15601, the first line of the heading should read ``Proclamation 8485 of March 24,...

  5. 77 FR 45469 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-01

    ... Respect to the Former Liberian Regime of Charles Taylor Correction In Presidential document 2012-17703 beginning on page 42415 in the issue of Wednesday, July 18, 2012, make the following correction: On...

  6. 78 FR 7255 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-01

    ... Unobligated Funds Under the American Recovery and Reinvestment Act of 2009 Correction In Presidential document... correction: On page 70883, the document identification heading on line one should read ``Notice of...

  7. 75 FR 68413 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... Correction In Presidential document 2010-27676 beginning on page 67019 in the issue of Monday, November 1, 2010, make the following correction: On page 67019, the Presidential Determination number should...

  8. 75 FR 1013 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-08

    ... Correction In Presidential document E9-31418 beginning on page 707 in the issue of Tuesday, January 5, 2010, make the following correction: On page 731, the date line below the President's signature should...

  9. 75 FR 68409 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... Migration Needs Resulting From Flooding In Pakistan Correction In Presidential document 2010-27673 beginning on page 67015 in the issue of Monday, November 1, 2010, make the following correction: On page...

  10. 78 FR 73377 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-06

    ...--Continuation of U.S. Drug Interdiction Assistance to the Government of Colombia Correction In Presidential... correction: On page 51647, the heading of the document was omitted and should read ``Continuation of...

  11. 77 FR 60037 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-02

    ... Commit, Threaten To Commit, or Support Terrorism Correction In Presidential document 2012-22710 beginning on page 56519 in the issue of Wednesday, September 12, 2012, make the following correction: On...

  12. 75 FR 68407 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... Migration Needs Resulting from Violence in Kyrgyzstan Correction In Presidential document 2010-27672 beginning on page 67013 in the issue of Monday, November 1, 2010, make the following correction: On...

  13. Research in Correctional Rehabilitation.

    ERIC Educational Resources Information Center

    Rehabilitation Services Administration (DHEW), Washington, DC.

    Forty-three leaders in corrections and rehabilitation participated in the seminar planned to provide an indication of the status of research in correctional rehabilitation. Papers include: (1) "Program Trends in Correctional Rehabilitation" by John P. Conrad, (2) "Federal Offenders Rehabilitation Program" by Percy B. Bell and Merlyn Mathews, (3)…

  14. Lower-order effects adjustment in quantitative traits model-based multifactor dimensionality reduction.

    PubMed

    Mahachie John, Jestinah M; Cattaert, Tom; Lishout, François Van; Gusareva, Elena S; Steen, Kristel Van

    2012-01-01

    Identifying gene-gene interactions or gene-environment interactions in studies of human complex diseases remains a big challenge in genetic epidemiology. An additional challenge, often forgotten, is to account for important lower-order genetic effects. These may hamper the identification of genuine epistasis. If lower-order genetic effects contribute to the genetic variance of a trait, identified statistical interactions may simply be due to a signal boost of these effects. In this study, we restrict attention to quantitative traits and bi-allelic SNPs as genetic markers. Moreover, our interaction study focuses on 2-way SNP-SNP interactions. Via simulations, we assess the performance of different corrective measures for lower-order genetic effects in Model-Based Multifactor Dimensionality Reduction epistasis detection, using additive and co-dominant coding schemes. Performance is evaluated in terms of power and familywise error rate. Our simulations indicate that empirical power estimates are reduced with correction of lower-order effects, as are familywise error rates. Easy-to-use automatic SNP selection procedures, SNP selection based on "top" findings, or SNP selection based on a p-value criterion for interesting main effects result in reduced power but also almost zero false positive rates. Always accounting for main effects in the SNP-SNP pair under investigation during Model-Based Multifactor Dimensionality Reduction analysis adequately controls false positive epistasis findings. This is particularly true when adopting a co-dominant corrective coding scheme. In conclusion, automatic search procedures to identify lower-order effects to correct for during epistasis screening should be avoided. The same is true for procedures that adjust for lower-order effects prior to Model-Based Multifactor Dimensionality Reduction and involve using residuals as the new trait. We advocate using "on-the-fly" lower-order effects adjusting when screening for SNP-SNP interactions.
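
    A minimal sketch of "on-the-fly" lower-order adjustment for a single SNP pair, assuming additive coding and an ordinary F-test; MB-MDR's actual procedure (dimensionality reduction over genotype cells, co-dominant corrective coding, permutation-based error control) is richer, so this only illustrates keeping main effects in the model while testing the interaction.

      import numpy as np

      def interaction_F(y, snp1, snp2):
          """F statistic for a SNP-SNP interaction with both main effects kept in the model."""
          n = len(y)
          X0 = np.column_stack([np.ones(n), snp1, snp2])         # null model: main effects only
          X1 = np.column_stack([X0, snp1 * snp2])                # full model: plus interaction
          rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
          rss0, rss1 = rss(X0), rss(X1)
          return (rss0 - rss1) / (rss1 / (n - X1.shape[1]))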

  15. Generalizing on Multiple Grounds: Performance Learning in Model-Based Troubleshooting

    DTIC Science & Technology

    1989-02-01

    several well-known advantages over heuristic expert systems. These include correctness of conclusions, explanations of conclusions, ease of modifiability...Introduction Consider a model-based diagnostic engine. Given a structural and behavioral description of a device, and a set of observed measurements at...single-fault candidates. That is, either multiplier M1 or adder A1 alone could, by some misbehavior, account for all of the observed misbehavior of the

  16. Model-based neuroimaging for cognitive computing.

    PubMed

    Poznanski, Roman R

    2009-09-01

    The continuity of the mind is suggested to mean the continuous spatiotemporal dynamics arising from the electrochemical signature of the neocortex: (i) globally through volume transmission in the gray matter as fields of neural activity, and (ii) locally through extrasynaptic signaling between fine distal dendrites of cortical neurons. If the continuity of dynamical systems across spatiotemporal scales defines a stream of consciousness then intentional metarepresentations as templates of dynamic continuity allow qualia to be semantically mapped during neuroimaging of specific cognitive tasks. When interfaced with a computer, such model-based neuroimaging requiring new mathematics of the brain will begin to decipher higher cognitive operations not possible with existing brain-machine interfaces.

  17. Model-based vision using geometric hashing

    NASA Astrophysics Data System (ADS)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

    The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed upon the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm -- invariant under translation, scale, and 3D rotations of the target -- hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture, and thus potentially realtime implementable.
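
    A minimal sketch of the two geometric hashing stages for 2-D point features (e.g. dominant scatterer locations), assuming a similarity-invariant two-point basis and a quantized hash key; parameter values and names are illustrative only.

      import numpy as np
      from collections import defaultdict
      from itertools import permutations

      def basis_coords(p, origin, axis_pt):
          """Coordinates of p in the frame defined by the ordered basis (origin, axis_pt)."""
          e1 = axis_pt - origin
          e2 = np.array([-e1[1], e1[0]])             # perpendicular axis
          return np.array([(p - origin) @ e1, (p - origin) @ e2]) / (e1 @ e1)

      def build_table(model, q=0.25):
          """Offline stage: hash every model point in every ordered two-point basis."""
          table = defaultdict(list)
          for i, j in permutations(range(len(model)), 2):
              for k, p in enumerate(model):
                  if k in (i, j):
                      continue
                  key = tuple(np.round(basis_coords(p, model[i], model[j]) / q).astype(int))
                  table[key].append((i, j))
          return table

      def recognize(scene, table, q=0.25):
          """Recognition stage: vote for model bases consistent with the scene points."""
          votes = defaultdict(int)
          for i, j in permutations(range(len(scene)), 2):
              for k, p in enumerate(scene):
                  if k in (i, j):
                      continue
                  key = tuple(np.round(basis_coords(p, scene[i], scene[j]) / q).astype(int))
                  for basis in table.get(key, []):
                      votes[basis] += 1
          return max(votes, key=votes.get) if votes else None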

  18. Model-based Tomographic Reconstruction Literature Search

    SciTech Connect

    Chambers, D H; Lehman, S K

    2005-11-30

    In the process of preparing a proposal for internal research funding, a literature search was conducted on the subject of model-based tomographic reconstruction (MBTR). The purpose of the search was to ensure that the proposed research would not replicate any previous work. We found that the overwhelming majority of work on MBTR which used parameterized models of the object was theoretical in nature. Only three researchers had applied the technique to actual data. In this note, we summarize the findings of the literature search.

  19. Model-based multiple patterning layout decomposition

    NASA Astrophysics Data System (ADS)

    Guo, Daifeng; Tian, Haitong; Du, Yuelin; Wong, Martin D. F.

    2015-10-01

    As one of the most promising next generation lithography technologies, multiple patterning lithography (MPL) plays an important role in the attempts to keep pace with the 10 nm technology node and beyond. As feature sizes keep shrinking, it has become impossible to print dense layouts within one single exposure. As a result, MPL such as double patterning lithography (DPL) and triple patterning lithography (TPL) has been widely adopted. There is a large volume of literature on DPL/TPL layout decomposition, and the current approach is to formulate the problem as a classical graph-coloring problem: layout features (polygons) are represented by vertices in a graph G and there is an edge between two vertices if and only if the distance between the two corresponding features is less than a minimum distance threshold value dmin. The problem is to color the vertices of G using k colors (k = 2 for DPL, k = 3 for TPL) such that no two vertices connected by an edge are given the same color. This is a rule-based approach, which imposes a geometric distance as a minimum constraint and simply decomposes polygons within that distance into different masks. It is not desired in practice because this criterion cannot completely capture the behavior of the optics. For example, it lacks sufficient information such as the optical source characteristics and the effects between the polygons outside the minimum distance. To remedy the deficiency, a model-based layout decomposition approach that bases the decomposition criteria on simulation results was first introduced at SPIE 2013 [1]. However, that algorithm is based on simplified assumptions about the optical simulation model and therefore its usage on real layouts is limited. Recently, AMSL [2] also proposed a model-based approach to layout decomposition by iteratively simulating the layout, which requires excessive computational resources and may lead to sub-optimal solutions. That approach also potentially generates too many stitches. In this
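
    A minimal sketch of the rule-based baseline that the model-based approaches above aim to improve upon, assuming point features standing in for polygons and a plain backtracking k-coloring; distances, names, and the conflict criterion are illustrative assumptions only.

      from itertools import combinations

      def conflict_graph(features, dmin):
          """Edges join features whose centroids are closer than the threshold dmin."""
          edges = set()
          for (i, a), (j, b) in combinations(enumerate(features), 2):
              if (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 < dmin ** 2:
                  edges.add((i, j))
          return edges

      def decompose(n, edges, k):
          """Backtracking k-coloring (k = 2 for DPL, k = 3 for TPL); None if infeasible."""
          adj = {i: set() for i in range(n)}
          for i, j in edges:
              adj[i].add(j); adj[j].add(i)
          assignment = {}
          def solve(v):
              if v == n:
                  return True
              for c in range(k):
                  if all(assignment.get(u) != c for u in adj[v]):
                      assignment[v] = c
                      if solve(v + 1):
                          return True
                      del assignment[v]
              return False
          return assignment if solve(0) else None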

  20. Model based defect characterization in composites

    NASA Astrophysics Data System (ADS)

    Roberts, R.; Holland, S.

    2017-02-01

    Work is reported on model-based defect characterization in CFRP composites. The work utilizes computational models of the interaction of NDE probing energy fields (ultrasound and thermography), to determine 1) the measured signal dependence on material and defect properties (forward problem), and 2) an assessment of performance-critical defect properties from analysis of measured NDE signals (inverse problem). Work is reported on model implementation for inspection of CFRP laminates containing multi-ply impact-induced delamination, with application in this paper focusing on ultrasound. A companion paper in these proceedings summarizes corresponding activity in thermography. Inversion of ultrasound data is demonstrated showing the quantitative extraction of damage properties.

  1. Student Modeling Based on Problem Solving Times

    ERIC Educational Resources Information Center

    Pelánek, Radek; Jarušek, Petr

    2015-01-01

    Student modeling in intelligent tutoring systems is mostly concerned with modeling correctness of students' answers. As interactive problem solving activities become increasingly common in educational systems, it is useful to focus also on timing information associated with problem solving. We argue that the focus on timing is natural for certain…

  2. Enzyme therapeutics for systemic detoxification.

    PubMed

    Liu, Yang; Li, Jie; Lu, Yunfeng

    2015-08-01

    Life relies on numerous biochemical processes working synergistically and correctly. Certain substances disrupt these processes, inducing a living organism into an abnormal state termed intoxication. Managing intoxication usually requires interventions, which are referred to as detoxification. Decades of development in detoxification reveal the potential of enzymes as ideal therapeutics and antidotes, because their high substrate specificity and catalytic efficiency are essential for clearing intoxicating substances without adverse effects. However, intrinsic shortcomings of enzymes, including low stability and high immunogenicity, are major hurdles, which could be overcome by delivering enzymes with specially designed nanocarriers. Extensive investigations on protein delivery indicate three types of enzyme-nanocarrier architectures that show more promise than others for systemic detoxification, including liposome-wrapped enzymes, polymer-enzyme conjugates, and polymer-encapsulated enzymes. This review highlights recent advances in these nano-architectures and discusses their applications in systemic detoxification. Therapeutic potential of various enzymes as well as associated challenges in achieving effective delivery of therapeutic enzymes will also be discussed.

  3. Model-based Processing of Micro-cantilever Sensor Arrays

    SciTech Connect

    Tringe, J W; Clague, D S; Candy, J V; Lee, C L; Rudd, R E; Burnham, A K

    2004-11-17

    We develop a model-based processor (MBP) for a micro-cantilever array sensor to detect target species in solution. After discussing the generalized framework for this problem, we develop the specific model used in this study. We perform a proof-of-concept experiment, fit the model parameters to the measured data and use them to develop a Gauss-Markov simulation. We then investigate two cases of interest: (1) averaged deflection data, and (2) multi-channel data. In both cases the evaluation proceeds by first performing a model-based parameter estimation to extract the model parameters, next performing a Gauss-Markov simulation, designing the optimal MBP and finally applying it to measured experimental data. The simulation is used to evaluate the performance of the MBP in the multi-channel case and compare it to a "smoother" ("averager") typically used in this application. It was shown that the MBP not only provides a significant gain (approximately 80 dB) in signal-to-noise ratio (SNR), but also consistently outperforms the smoother by 40-60 dB. Finally, we apply the processor to the smoothed experimental data and demonstrate its capability for chemical detection. The MBP performs quite well, though it includes a correctable systematic bias error. The project's primary accomplishment was the successful application of model-based processing to signals from micro-cantilever arrays: 40-60 dB improvement vs. the smoother algorithm was demonstrated. This result was achieved through the development of appropriate mathematical descriptions for the chemical and mechanical phenomena, and incorporation of these descriptions directly into the model-based signal processor. A significant challenge was the development of the framework which would maximize the usefulness of the signal processing algorithms while ensuring the accuracy of the mathematical description of the chemical-mechanical signal. Experimentally, the difficulty was to identify and characterize the non

  4. Feature-driven model-based segmentation

    NASA Astrophysics Data System (ADS)

    Qazi, Arish A.; Kim, John; Jaffray, David A.; Pekar, Vladimir

    2011-03-01

    The accurate delineation of anatomical structures is required in many medical image analysis applications. One example is radiation therapy planning (RTP), where traditional manual delineation is tedious, labor-intensive, and can require hours of a clinician's valuable time. The majority of automated segmentation methods in RTP belong to either model-based or atlas-based approaches. One substantial limitation of model-based segmentation is that its accuracy may be restricted by uncertainties in image content, specifically when segmenting low-contrast anatomical structures, e.g. soft tissue organs in computed tomography images. In this paper, we introduce a non-parametric feature enhancement filter which replaces raw intensity image data with a high-level probabilistic map that guides the deformable model to reliably segment low-contrast regions. The method is evaluated by segmenting the submandibular and parotid glands in the head and neck region and comparing the results to manual segmentations in terms of the volume overlap. Quantitative results show that we are in overall good agreement with expert segmentations, achieving volume overlap of up to 80%. Qualitatively, we demonstrate that we are able to segment low-contrast regions, which otherwise are difficult to delineate with deformable models relying on distinct object boundaries from the original image data.

  5. Sandboxes for Model-Based Inquiry

    NASA Astrophysics Data System (ADS)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-04-01

    In this article, we introduce a class of constructionist learning environments that we call Emergent Systems Sandboxes (ESSs), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual construction environment that support students in creating, exploring, and sharing computational models of dynamic systems that exhibit emergent phenomena. They provide learners with "entity"-level construction primitives that reflect an underlying scientific model. These primitives can be directly "painted" into a sandbox space, where they can then be combined, arranged, and manipulated to construct complex systems and explore the emergent properties of those systems. We argue that ESSs offer a means of addressing some of the key barriers to adopting rich, constructionist model-based inquiry approaches in science classrooms at scale. Situating the ESS in a large-scale science modeling curriculum we are implementing across the USA, we describe how the unique "entity-level" primitive design of an ESS facilitates knowledge system refinement at both an individual and social level, how it supports flexible modeling practices by providing both continuous and discrete modes of executability, and how it offers students a variety of opportunities for validating their qualitative understandings of emergent systems as they develop.

  6. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based, deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  7. Development of explicit diffraction corrections for absolute measurements of acoustic nonlinearity parameters in the quasilinear regime.

    PubMed

    Jeong, Hyunjo; Zhang, Shuzeng; Cho, Sungjong; Li, Xiongbing

    2016-08-01

    In absolute measurements of acoustic nonlinearity parameters, amplitudes of harmonics must be corrected for diffraction effects. In this study, we develop explicit multi-Gaussian beam (MGB) model-based diffraction corrections for the first three harmonics in weakly nonlinear, axisymmetric sound beams. The effects of making diffraction corrections on nonlinearity parameter estimation are investigated by defining "total diffraction correction (TDC)". The results demonstrate that TDC cannot be neglected even for harmonic generation experiments in the nearfield region.

  8. A Cognitive Model Based on Neuromodulated Plasticity

    PubMed Central

    Ruan, Xiaogang

    2016-01-01

    Associative learning, including classical conditioning and operant conditioning, is regarded as the most fundamental type of learning for animals and human beings. Many models have been proposed surrounding classical conditioning or operant conditioning. However, a unified and integrated model to explain the two types of conditioning is much less studied. Here, a model based on neuromodulated synaptic plasticity is presented. The model is bioinspired, including a multistored memory module and simulated VTA dopaminergic neurons to produce a reward signal. The synaptic weights are modified according to the reward signal, which simulates the change of associative strengths in associative learning. The experimental results on real robots prove the suitability and validity of the proposed model. PMID:27872638
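
    A minimal sketch of the kind of reward-gated ("three-factor") weight update such a model relies on, assuming rate-coded pre- and postsynaptic activity vectors and a scalar dopamine-like reward signal; the learning rate and names are illustrative, not the paper's parameters.

      import numpy as np

      def reward_modulated_update(w, pre, post, reward, lr=0.01):
          """Hebbian co-activity gated by the reward signal (simulated dopaminergic input)."""
          return w + lr * reward * np.outer(post, pre)

      # toy usage: 3 presynaptic and 2 postsynaptic units
      w = np.zeros((2, 3))
      w = reward_modulated_update(w, pre=np.array([1.0, 0.0, 0.5]),
                                  post=np.array([0.8, 0.1]), reward=1.0)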

  9. Model-based reasoning in SSF ECLSS

    NASA Technical Reports Server (NTRS)

    Miller, J. K.; Williams, George P. W., Jr.

    1992-01-01

    The interacting processes and reconfigurable subsystems of the Space Station Freedom Environmental Control and Life Support System (ECLSS) present a tremendous technical challenge to Freedom's crew and ground support. ECLSS operation and problem analysis is time-consuming for crew members and difficult for current computerized control, monitoring, and diagnostic software. These challenges can be at least partially mitigated by the use of advanced techniques such as Model-Based Reasoning (MBR). This paper will provide an overview of MBR as it is being applied to Space Station Freedom ECLSS. It will report on work being done to produce intelligent systems to help design, control, monitor, and diagnose Freedom's ECLSS. Specifically, work on predictive monitoring, diagnosability, and diagnosis, with emphasis on the automated diagnosis of the regenerative water recovery and air revitalization processes will be discussed.

  10. Model-based reconfiguration: Diagnosis and recovery

    NASA Technical Reports Server (NTRS)

    Crow, Judy; Rushby, John

    1994-01-01

    We extend Reiter's general theory of model-based diagnosis to a theory of fault detection, identification, and reconfiguration (FDIR). The generality of Reiter's theory readily supports an extension in which the problem of reconfiguration is viewed as a close analog of the problem of diagnosis. Using a reconfiguration predicate 'rcfg' analogous to the abnormality predicate 'ab,' we derive a strategy for reconfiguration by transforming the corresponding strategy for diagnosis. There are two obvious benefits of this approach: algorithms for diagnosis can be exploited as algorithms for reconfiguration and we have a theoretical framework for an integrated approach to FDIR. As a first step toward realizing these benefits we show that a class of diagnosis engines can be used for reconfiguration and we discuss algorithms for integrated FDIR. We argue that integrating recovery and diagnosis is an essential next step if this technology is to be useful for practical applications.

  11. Request for Correction 10003

    EPA Pesticide Factsheets

    Letter from Jeff Rush requesting rescinding and correction online and printed information regarding alleged greenhouse gas emissions reductions resulting from beneficial use of coal combustion waste products.

  12. 78 FR 55169 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-10

    ... Commodities and Services From Any Agency of the United States Government to the Syrian Opposition Coalition (SOC) and the Syrian Opposition's Supreme Military Council (SMC) Correction In Presidential...

  13. A Generative Control Capability for a Model-based Executive

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Nayak, P. Pandurang

    1997-01-01

    This paper describes Burton, a core element of a new generation of goal-directed model-based autonomous executives. This executive makes extensive use of component-based declarative models to analyze novel situations and generate novel control actions both at the goal and hardware levels. It uses an extremely efficient online propositional inference engine to efficiently determine likely states consistent with current observations and optimal target states that achieve high-level goals. It incorporates a flexible generative control sequencing algorithm within the reactive loop to bridge the gap between current and target states. The system is able to detect and avoid damaging and irreversible situations. After every control action it uses its model and sensors to detect anomalous situations and immediately take corrective action. Efficiency is achieved through a series of model compilation and online policy construction methods, and by exploiting general conventions of hardware design that permit a divide and conquer approach to planning. The paper presents a formal characterization of Burton's capability, develops efficient algorithms, and reports on experience with the implementation in the domain of spacecraft autonomy. Burton is being incorporated as one of the key elements of the Remote Agent core autonomy architecture for Deep Space One, the first spacecraft for NASA's New Millennium program.

  14. Suspected myelinolysis following rapid correction of hyponatremia in a dog.

    PubMed

    Churcher, R K; Watson, A D; Eaton, A

    1999-01-01

    A dog developed signs of neurological dysfunction five days after rapid correction of severe electrolyte derangements, including hyponatremia, caused by gastrointestinal parasitism (i.e., trichuriasis). History, laboratory findings, and onset of neurological signs following correction of hyponatremia led to a diagnosis of myelinolysis. Myelinolysis is a noninflammatory, demyelinating brain disease caused by sudden, upward osmotic shifts in central nervous system plasma, often a result of rapid correction of chronic hyponatremia. The pathogenesis is complex, but recovery is possible. Iatrogenic damage due to myelinolysis can be avoided by adherence to therapeutic guidelines for correction of chronic hyponatremia.

  15. [Diagnostic-therapeutic approach for retroperitoneal tumors].

    PubMed

    Cariati, A

    1993-12-01

    After a careful review of the literature, diagnostic and therapeutic strategies for Primary Retroperitoneal Tumours (PRT) are reported. The author analyzes the experience of the Institute of Clinica Chirurgica "R" (Chief: Prof. E. Tosatti) and that of Anatomia Chirurgica (Chief: Prof. E. Cariati), University of Genoa, in the management of PRT, stressing the importance of preoperative staging for a correct surgical approach.

  16. The fast correction coil feedback control system

    SciTech Connect

    Coffield, F.; Caporaso, G.; Zentler, J.M.

    1989-01-01

    A model-based feedback control system has been developed to correct beam displacement errors in the Advanced Test Accelerator (ATA) electron beam accelerator. The feedback control system drives an X/Y dipole steering system that has a 40-MHz bandwidth and can produce ±300 Gauss-cm dipole fields. A simulator was used to develop the control algorithm and to quantify the expected performance in the presence of beam position measurement noise and accelerator timing jitter. The major problem to date has been protecting the amplifiers from the voltage that is inductively coupled to the steering bars by the beam. 3 refs., 8 figs.

  17. Trends in Therapeutic Recreation.

    ERIC Educational Resources Information Center

    Smith, Ralph W.

    1995-01-01

    Discusses the implications of the rapid, dramatic changes taking place in therapeutic recreation for individuals with physical disabilities. The article notes the impact of changes in managed care, examines programming trends in therapeutic recreation (adventure/outdoor education, competitive sports, handcycling, health enhancement activities, and…

  18. Therapeutic Recreation Practicum Manual.

    ERIC Educational Resources Information Center

    Schneegas, Kay

    This manual provides information on the practicum program offered by Moraine Valley Community College (MVCC) for students in its therapeutic recreation program. Sections I and II outline the rationale and goals for providing practical, on-the-job work experiences for therapeutic recreation students. Section III specifies MVCC's responsibilities…

  19. Chicanoizing the Therapeutic Community

    ERIC Educational Resources Information Center

    Aron, William S.; And Others

    1974-01-01

    Focusing on the drug addiction problem and its antecedent conditions in a Chicano population, the article examines several therapeutic interventions suggested by these conditions and indicates how they might be incorporated into a drug addiction Therapeutic Community treatment program designed to meet the needs of Chicano drug addicts. (Author/NQ)

  20. Impact of Therapeutic Camping

    ERIC Educational Resources Information Center

    Shniderman, Craig M.

    1974-01-01

    There has been little interest in, and only slight illumination of, the impact of therapeutic camping for emotionally disturbed children. This study seeks to validate the belief that camping is therapeutic. Subjects were 52 boys, 5 to 11 1/2 years of age. Results support the hypothesis. (Author/HMV)

  1. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. However, these methods can only predict the most probable faulty sensors, and their predictions are subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). It is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).
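
    The following sketch illustrates, under simplifying assumptions, how analytical redundancy relations (ARRs) can implicate faulty sensors by purely logical reasoning over which relations are violated; the names, tolerance, and exoneration rule are illustrative, not the algorithm developed in the work.

```python
def validate_sensors(readings, relations, tol=1e-3):
    """Evaluate analytical redundancy relations (ARRs) and flag suspect sensors.

    relations: list of (residual_fn, involved_sensor_names). A sensor is suspect
    if it appears in every violated relation and in no satisfied one -- a very
    simplified logical inference compared with the paper's algorithm.
    """
    violated, satisfied = [], []
    for fn, sensors in relations:
        (violated if abs(fn(readings)) > tol else satisfied).append(set(sensors))
    if not violated:
        return set()
    suspects = set.intersection(*violated)
    for ok in satisfied:
        suspects -= ok
    return suspects

# Toy example: three sensors that should agree pairwise; sensor "b" is biased.
readings = {"a": 10.0, "b": 12.5, "c": 10.0}
relations = [
    (lambda r: r["a"] - r["b"], ["a", "b"]),
    (lambda r: r["b"] - r["c"], ["b", "c"]),
    (lambda r: r["a"] - r["c"], ["a", "c"]),
]
print(validate_sensors(readings, relations))  # {'b'}
```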

  2. Model-based target and background characterization

    NASA Astrophysics Data System (ADS)

    Mueller, Markus; Krueger, Wolfgang; Heinze, Norbert

    2000-07-01

    Up to now, most approaches to target and background characterization (and exploitation) concentrate solely on the information given by pixels. In many cases this is a complex and unprofitable task. During the development of automatic exploitation algorithms the main goal is the optimization of certain performance parameters. These parameters are measured during test runs while applying one algorithm with one parameter set to images that consist of image domains with very different characteristics (targets and various types of background clutter). Model-based geocoding and registration approaches provide means for utilizing the information stored in GIS (Geographical Information Systems). The geographical information stored in the various GIS layers can define ROE (Regions of Expectation) and may allow for dedicated algorithm parametrization and development. ROI (Region of Interest) detection algorithms (in most cases MMO (Man-Made Object) detection) use implicit target and/or background models. The detection algorithms for ROIs utilize gradient direction models that have to be matched with transformed image domain data. In most cases simple threshold calculations on the match results discriminate target object signatures from the background. The geocoding approaches extract line-like structures (street signatures) from the image domain and match the graph constellation against a vector model extracted from a GIS database. Apart from geocoding, the algorithms can also be used for image-to-image registration (multi-sensor and data fusion) and may be used for the creation and validation of geographical maps.

  3. Model-Based Estimation of Knee Stiffness

    PubMed Central

    Pfeifer, Serge; Vallery, Heike; Hardegger, Michael; Riener, Robert; Perreault, Eric J.

    2013-01-01

    During natural locomotion, the stiffness of the human knee is modulated continuously and subconsciously according to the demands of activity and terrain. Given modern actuator technology, powered transfemoral prostheses could theoretically provide a similar degree of sophistication and function. However, experimentally quantifying knee stiffness modulation during natural gait is challenging. Alternatively, joint stiffness could be estimated in a less disruptive manner using electromyography (EMG) combined with kinetic and kinematic measurements to estimate muscle force, together with models that relate muscle force to stiffness. Here we present the first step in that process, where we develop such an approach and evaluate it in isometric conditions, where experimental measurements are more feasible. Our EMG-guided modeling approach allows us to consider conditions with antagonistic muscle activation, a phenomenon commonly observed in physiological gait. Our validation shows that model-based estimates of knee joint stiffness coincide well with experimental data obtained using conventional perturbation techniques. We conclude that knee stiffness can be accurately estimated in isometric conditions without applying perturbations, which presents an important step towards our ultimate goal of quantifying knee stiffness during gait. PMID:22801482

  4. 3-D model-based vehicle tracking.

    PubMed

    Lou, Jianguang; Tan, Tieniu; Hu, Weiming; Yang, Hao; Maybank, Steven J

    2005-10-01

    This paper aims at tracking vehicles from monocular intensity image sequences and presents an efficient and robust approach to three-dimensional (3-D) model-based vehicle tracking. Under the weak perspective assumption and the ground-plane constraint, the movements of model projection in the two-dimensional image plane can be decomposed into two motions: translation and rotation. They are the results of the corresponding movements of 3-D translation on the ground plane (GP) and rotation around the normal of the GP, which can be determined separately. A new metric based on point-to-line segment distance is proposed to evaluate the similarity between an image region and an instantiation of a 3-D vehicle model under a given pose. Based on this, we provide an efficient pose refinement method to refine the vehicle's pose parameters. An improved EKF is also proposed to track and to predict vehicle motion with a precise kinematics model. Experimental results with both indoor and outdoor data show that the algorithm obtains desirable performance even under severe occlusion and clutter.
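
    A point-to-line-segment distance of the kind the similarity metric is built on can be computed as in the sketch below (illustrative Python with NumPy; the paper's exact formulation of the metric may differ).

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Distance from point p to the line segment a-b (all 2-D numpy arrays).

    A distance of this kind can serve as the similarity term between a projected
    3-D model edge (segment a-b) and an image feature point p, in the spirit of
    the paper's evaluation metric (their exact formulation may differ).
    """
    ab, ap = b - a, p - a
    denom = float(np.dot(ab, ab))
    if denom == 0.0:          # degenerate segment
        return float(np.linalg.norm(ap))
    t = np.clip(np.dot(ap, ab) / denom, 0.0, 1.0)   # clamp projection onto segment
    closest = a + t * ab
    return float(np.linalg.norm(p - closest))

# Example: point beside a horizontal segment
print(point_to_segment_distance(np.array([2.0, 1.0]),
                                np.array([0.0, 0.0]),
                                np.array([4.0, 0.0])))  # 1.0
```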

  5. Model based systems engineering for astronomical projects

    NASA Astrophysics Data System (ADS)

    Karban, R.; Andolfato, L.; Bristow, P.; Chiozzi, G.; Esselborn, M.; Schilling, M.; Schmid, C.; Sommer, H.; Zamparelli, M.

    2014-08-01

    Model Based Systems Engineering (MBSE) is an emerging field of systems engineering for which the System Modeling Language (SysML) is a key enabler for descriptive, prescriptive and predictive models. This paper surveys some of the capabilities, expectations and peculiarities of tool-assisted MBSE experienced in real-life astronomical projects. The examples range in depth and scope across a wide spectrum of applications (for example documentation, requirements, analysis, trade studies) and purposes (addressing a particular development need, or accompanying a project throughout many - if not all - of its lifecycle phases, fostering reuse and minimizing ambiguity). From the beginnings of the Active Phasing Experiment, through VLT instrumentation, VLTI infrastructure and the Telescope Control System for the E-ELT, to Wavefront Control for the E-ELT, we show how stepwise refinements of tools, processes and methods have provided tangible benefits to customary systems engineering activities like requirement flow-down, design trade studies, interface definition, and validation, by means of a variety of approaches (like Model Checking, Simulation, Model Transformation) and methodologies (like OOSEM, State Analysis).

  6. Model-Based Wavefront Control for CCAT

    NASA Technical Reports Server (NTRS)

    Redding, David; Lou, John Z.; Kissil, Andy; Bradford, Matt; Padin, Steve; Woody, David

    2011-01-01

    The 25-m aperture CCAT submillimeter-wave telescope will have a primary mirror that is divided into 162 individual segments, each of which is provided with 3 positioning actuators. CCAT will be equipped with innovative Imaging Displacement Sensors (IDS), inexpensive optical edge sensors capable of accurately measuring all segment relative motions. These measurements are used in a Kalman-filter-based Optical State Estimator to estimate wavefront errors, permitting use of a minimum-wavefront controller without direct wavefront measurement. This controller corrects the optical impact of errors in 6 degrees of freedom per segment, including lateral translations of the segments, using only the 3 actuated degrees of freedom per segment. The global motions of the Primary and Secondary Mirrors are not measured by the edge sensors; these are controlled using a gravity-sag look-up table. Predicted performance is illustrated by simulated response to errors such as gravity sag.
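
    The estimate-then-correct structure described above can be sketched with a linear toy model: a least-squares state estimate from edge-sensor readings standing in for the Kalman-filter Optical State Estimator, followed by an actuator command chosen to minimize the predicted wavefront error. All matrices and dimensions below are illustrative placeholders, not CCAT values.

```python
import numpy as np

# A minimal sketch of the estimate-then-correct idea (dimensions are illustrative;
# the actual CCAT estimator is a Kalman filter and the controller minimizes
# wavefront error over all segment degrees of freedom).
rng = np.random.default_rng(0)
n_state, n_sensors, n_actuators = 6, 9, 3

H = rng.standard_normal((n_sensors, n_state))        # edge-sensor sensitivity to segment state
W = rng.standard_normal((4, n_state))                # wavefront sensitivity to segment state
A = W @ rng.standard_normal((n_state, n_actuators))  # wavefront sensitivity to actuators

x_true = rng.standard_normal(n_state)                # true segment misalignment
y = H @ x_true + 1e-3 * rng.standard_normal(n_sensors)  # noisy edge-sensor readings

x_hat, *_ = np.linalg.lstsq(H, y, rcond=None)        # state estimate (Kalman filter stand-in)
u, *_ = np.linalg.lstsq(A, -W @ x_hat, rcond=None)   # actuator command minimizing wavefront error
residual = W @ x_hat + A @ u                         # predicted residual wavefront after correction
print(np.linalg.norm(residual))
```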

  7. Laser correcting mirror

    DOEpatents

    Sawicki, Richard H.

    1994-01-01

    An improved laser correction mirror (10) for correcting aberrations in a laser beam wavefront having a rectangular mirror body (12) with a plurality of legs (14, 16, 18, 20, 22, 24, 26, 28) arranged into opposing pairs (34, 36, 38, 40) along the long sides (30, 32) of the mirror body (12). Vector force pairs (49, 50, 52, 54) are applied by adjustment mechanisms (42, 44, 46, 48) between members of the opposing pairs (34, 36, 38, 40) for bending a reflective surface (13) of the mirror body (12) into a shape defining a function which can be used to correct for comatic aberrations.

  8. Model Based Autonomy for Robust Mars Operations

    NASA Technical Reports Server (NTRS)

    Kurien, James A.; Nayak, P. Pandurang; Williams, Brian C.; Lau, Sonie (Technical Monitor)

    1998-01-01

    Space missions have historically relied upon a large ground staff, numbering in the hundreds for complex missions, to maintain routine operations. When an anomaly occurs, this small army of engineers attempts to identify and work around the problem. A piloted Mars mission, with its multiyear duration, cost pressures, half-hour communication delays and two-week blackouts, cannot be closely controlled by a battalion of engineers on Earth. Flight crew involvement in routine system operations must also be minimized to maximize science return. It also may be unrealistic to require that the crew have the expertise in each mission subsystem needed to diagnose a system failure and effect a timely repair, as engineers did for Apollo 13. Enter model-based autonomy, which allows complex systems to autonomously maintain operation despite failures or anomalous conditions, contributing to safe, robust, and minimally supervised operation of spacecraft, life support, In Situ Resource Utilization (ISRU) and power systems. Autonomous reasoning is central to the approach. A reasoning algorithm uses a logical or mathematical model of a system to infer how to operate the system, diagnose failures and generate appropriate behavior to repair or reconfigure the system in response. The 'plug and play' nature of the models enables low cost development of autonomy for multiple platforms. Declarative, reusable models capture relevant aspects of the behavior of simple devices (e.g. valves or thrusters). Reasoning algorithms combine device models to create a model of the system-wide interactions and behavior of a complex, unique artifact such as a spacecraft. Rather than requiring engineers to anticipate all possible interactions and failures at design time or to perform analysis during the mission, the reasoning engine generates the appropriate response to the current situation, taking into account its system-wide knowledge, the current state, and even sensor failures or unexpected behavior.

  9. Statistical appearance models based on probabilistic correspondences.

    PubMed

    Krüger, Julia; Ehrhardt, Jan; Handels, Heinz

    2017-04-01

    Model-based image analysis is indispensable in medical image processing. One key aspect of building statistical shape and appearance models is the determination of one-to-one correspondences in the training data set. At the same time, the identification of these correspondences is the most challenging part of such methods. In our earlier work, we developed an alternative method using correspondence probabilities instead of exact one-to-one correspondences for a statistical shape model (Hufnagel et al., 2008). In this work, a new approach for statistical appearance models without one-to-one correspondences is proposed. A sparse image representation is used to build a model that combines point position and appearance information at the same time. Probabilistic correspondences between the derived multi-dimensional feature vectors are used to omit the need for extensive preprocessing of finding landmarks and correspondences as well as to reduce the dependence of the generated model on the landmark positions. Model generation and model fitting can now be expressed by optimizing a single global criterion derived from a maximum a-posteriori (MAP) approach with respect to model parameters that directly affect both shape and appearance of the considered objects inside the images. The proposed approach describes statistical appearance modeling in a concise and flexible mathematical framework. Besides eliminating the demand for costly correspondence determination, the method allows for additional constraints as topological regularity in the modeling process. In the evaluation the model was applied for segmentation and landmark identification in hand X-ray images. The results demonstrate the feasibility of the model to detect hand contours as well as the positions of the joints between finger bones for unseen test images. Further, we evaluated the model on brain data of stroke patients to show the ability of the proposed model to handle partially corrupted data and to

  10. Correcting Hubble Vision.

    ERIC Educational Resources Information Center

    Shaw, John M.; Sheahen, Thomas P.

    1994-01-01

    Describes the theory behind the workings of the Hubble Space Telescope, the spherical aberration in the primary mirror that caused a reduction in image quality, and the corrective device that compensated for the error. (JRH)

  11. Corrected Age for Preemies

    MedlinePlus

  12. Biomimetic Particles as Therapeutics

    PubMed Central

    Green, Jordan J.

    2015-01-01

    In recent years, there have been major advances in the development of novel nanoparticle and microparticle-based therapeutics. An emerging paradigm is the incorporation of biomimetic features into these synthetic therapeutic constructs to enable them to better interface with biological systems. Through the control of size, shape, and material consistency, particle cores have been generated that better mimic natural cells and viruses. In addition, there have been significant advances in biomimetic surface functionalization of particles through the integration of bio-inspired artificial cell membranes and naturally derived cell membranes. Biomimetic technologies enable therapeutic particles to have increased potency to benefit human health. PMID:26277289

  13. Biomimetic particles as therapeutics.

    PubMed

    Meyer, Randall A; Sunshine, Joel C; Green, Jordan J

    2015-09-01

    In recent years, there have been major advances in the development of novel nanoparticle- and microparticle-based therapeutics. An emerging paradigm is the incorporation of biomimetic features into these synthetic therapeutic constructs to enable them to better interface with biological systems. Through the control of size, shape, and material consistency, particle cores have been generated that better mimic natural cells and viruses. In addition, there have been significant advances in biomimetic surface functionalization of particles through the integration of bio-inspired artificial cell membranes and naturally derived cell membranes. Biomimetic technologies enable therapeutic particles to have increased potency to benefit human health.

  14. Quantum Error Correction

    NASA Astrophysics Data System (ADS)

    Lidar, Daniel A.; Brun, Todd A.

    2013-09-01

    Prologue; Preface; Part I. Background: 1. Introduction to decoherence and noise in open quantum systems Daniel Lidar and Todd Brun; 2. Introduction to quantum error correction Dave Bacon; 3. Introduction to decoherence-free subspaces and noiseless subsystems Daniel Lidar; 4. Introduction to quantum dynamical decoupling Lorenza Viola; 5. Introduction to quantum fault tolerance Panos Aliferis; Part II. Generalized Approaches to Quantum Error Correction: 6. Operator quantum error correction David Kribs and David Poulin; 7. Entanglement-assisted quantum error-correcting codes Todd Brun and Min-Hsiu Hsieh; 8. Continuous-time quantum error correction Ognyan Oreshkov; Part III. Advanced Quantum Codes: 9. Quantum convolutional codes Mark Wilde; 10. Non-additive quantum codes Markus Grassl and Martin Rötteler; 11. Iterative quantum coding systems David Poulin; 12. Algebraic quantum coding theory Andreas Klappenecker; 13. Optimization-based quantum error correction Andrew Fletcher; Part IV. Advanced Dynamical Decoupling: 14. High order dynamical decoupling Zhen-Yu Wang and Ren-Bao Liu; 15. Combinatorial approaches to dynamical decoupling Martin Rötteler and Pawel Wocjan; Part V. Alternative Quantum Computation Approaches: 16. Holonomic quantum computation Paolo Zanardi; 17. Fault tolerance for holonomic quantum computation Ognyan Oreshkov, Todd Brun and Daniel Lidar; 18. Fault tolerant measurement-based quantum computing Debbie Leung; Part VI. Topological Methods: 19. Topological codes Héctor Bombín; 20. Fault tolerant topological cluster state quantum computing Austin Fowler and Kovid Goyal; Part VII. Applications and Implementations: 21. Experimental quantum error correction Dave Bacon; 22. Experimental dynamical decoupling Lorenza Viola; 23. Architectures Jacob Taylor; 24. Error correction in quantum communication Mark Wilde; Part VIII. Critical Evaluation of Fault Tolerance: 25. Hamiltonian methods in QEC and fault tolerance Eduardo Novais, Eduardo Mucciolo and

  15. Adaptable DC offset correction

    NASA Technical Reports Server (NTRS)

    Golusky, John M. (Inventor); Muldoon, Kelly P. (Inventor)

    2009-01-01

    Methods and systems for adaptable DC offset correction are provided. An exemplary adaptable DC offset correction system evaluates an incoming baseband signal to determine an appropriate DC offset removal scheme; removes a DC offset from the incoming baseband signal based on the appropriate DC offset scheme in response to the evaluated incoming baseband signal; and outputs a reduced DC baseband signal in response to the DC offset removed from the incoming baseband signal.
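
    Two common DC offset removal schemes of the kind such a system might select between are sketched below (illustrative Python; the patent's actual selection logic and scheme set are not specified in this abstract).

```python
import numpy as np

def remove_dc_offset(signal, mode="mean"):
    """Remove a DC offset from a baseband signal.

    Two illustrative schemes (not the patent's specific methods): block-mean
    subtraction and a slowly tracking first-order estimator (leaky integrator).
    """
    signal = np.asarray(signal, dtype=float)
    if mode == "mean":
        return signal - signal.mean()
    if mode == "tracking":
        alpha, offset, out = 0.01, 0.0, np.empty_like(signal)
        for i, s in enumerate(signal):
            offset += alpha * (s - offset)   # running estimate of the DC component
            out[i] = s - offset
        return out
    raise ValueError("unknown mode")

# Example: a sine wave riding on a 0.7 DC offset
t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 5 * t) + 0.7
print(abs(remove_dc_offset(x).mean()) < 1e-12)  # True
```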

  16. Model-based wavefront sensorless adaptive optics system for large aberrations and extended objects.

    PubMed

    Yang, Huizhen; Soloviev, Oleg; Verhaegen, Michel

    2015-09-21

    A model-based wavefront sensorless (WFSless) adaptive optics (AO) system with a 61-element deformable mirror is simulated to correct the imaging of a turbulence-degraded extended object. A fast closed-loop control algorithm, based on the linear relation between the mean square of the aberration gradients and the second moment of the image intensity distribution, is used to generate the control signals for the actuators of the deformable mirror (DM). The restoration capability and the convergence rate of the AO system are investigated for wavefront aberrations of different turbulence strengths. Simulation results show that the model-based WFSless AO system can successfully restore images degraded by different turbulence strengths and achieve a correction very close to the achievable capability of the given DM. Compared with the ideal correction of the 61-element DM, the averaged relative error of the RMS value is 6%. The convergence rate of the AO system is independent of the turbulence strength and depends only on the number of actuators of the DM.
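
    The image-quality metric the control law exploits, the second moment of the image intensity distribution, can be computed as in the sketch below (illustrative Python with NumPy; the closed-loop update itself, which uses the linear relation to the mean-square aberration gradients, is omitted).

```python
import numpy as np

def intensity_second_moment(image):
    """Second moment of the far-field intensity distribution about its centroid.

    This is the image-quality metric the paper's control law exploits: it is
    (approximately) linearly related to the mean square of the wavefront
    aberration gradients, so driving it down sharpens the image.
    """
    img = np.asarray(image, dtype=float)
    total = img.sum()
    y, x = np.indices(img.shape)
    cx, cy = (x * img).sum() / total, (y * img).sum() / total
    return (((x - cx) ** 2 + (y - cy) ** 2) * img).sum() / total

# A tight spot has a smaller second moment than a blurred one
tight = np.zeros((64, 64)); tight[32, 32] = 1.0
blurred = np.ones((64, 64))
print(intensity_second_moment(tight) < intensity_second_moment(blurred))  # True
```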

  17. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  18. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  19. A Nonhydrostatic Model Based On A New Approach

    NASA Astrophysics Data System (ADS)

    Janjic, Z. I.

    Considerable experience with nonhydrostatic models has been accumulated on the scales of convective clouds and storms. However, numerical weather prediction (NWP) deals with motions on a much wider range of temporal and spatial scales. Thus, difficulties that may not be significant on the small scales may become important in NWP applications. With these considerations in mind, a new approach has been proposed and applied in developing nonhydrostatic models intended for NWP applications. Namely, instead of extending the cloud models to synoptic scales, the hydrostatic approximation is relaxed in a hydrostatic NWP model. In this way the model validity is extended to nonhydrostatic motions, and at the same time favorable features of the hydrostatic formulation are preserved. In order to apply this approach, the system of nonhydrostatic equations is split into two parts: (a) the part that corresponds to the hydrostatic system, except for corrections due to vertical acceleration, and (b) the system of equations that allows computation of the corrections appearing in the first system. This procedure does not require any additional approximation. In the model, "isotropic" horizontal finite differencing is employed that conserves a number of basic and derived dynamical and quadratic quantities. The hybrid pressure-sigma vertical coordinate has been chosen as the primary option. The forward-backward scheme is used for horizontally propagating fast waves, and an implicit scheme is used for vertically propagating sound waves. The Adams-Bashforth scheme is applied for the advection of the basic dynamical variables and for the Coriolis terms. In real data runs, the nonhydrostatic dynamics does not require extra computational boundary conditions at the top. The philosophy of the physical package and possible future developments of physical parameterizations are also reviewed. A two-dimensional model based on the described approach successfully reproduced classical

  20. A satellite and model based flood inundation climatology of Australia

    NASA Astrophysics Data System (ADS)

    Schumann, G.; Andreadis, K.; Castillo, C. J.

    2013-12-01

    To date there is no coherent and consistent database on observed or simulated flood event inundation and magnitude at large scales (continental to global). The only compiled data set showing a consistent history of flood inundation area and extent at a near global scale is provided by the MODIS-based Dartmouth Flood Observatory. However, MODIS satellite imagery is only available from 2000 and is hampered by a number of issues associated with flood mapping using optical images (e.g. classification algorithms, cloud cover, vegetation). Here, we present for the first time a proof-of-concept study in which we employ a computationally efficient 2-D hydrodynamic model (LISFLOOD-FP) complemented with a sub-grid channel formulation to generate a complete flood inundation climatology of the past 40 years (1973-2012) for the entire Australian continent. The model was built completely from freely available SRTM-derived data, including channel widths, bank heights and floodplain topography, which was corrected for vegetation canopy height using a global ICESat canopy dataset. Channel hydraulics were resolved using actual channel data and bathymetry was estimated within the model using hydraulic geometry. On the floodplain, the model simulated the flow paths and inundation variables at a 1 km resolution. The developed model was run over a period of 40 years and a floodplain inundation climatology was generated and compared to satellite flood event observations. Our proof-of-concept study demonstrates that this type of model can reliably simulate past flood events with reasonable accuracies both in time and space. The Australian model was forced with both observed flow climatology and VIC-simulated flows in order to assess the feasibility of a model-based flood inundation climatology at the global scale.

  1. MACE: model based analysis of ChIP-exo

    PubMed Central

    Wang, Liguo; Chen, Junsheng; Wang, Chen; Uusküla-Reimand, Liis; Chen, Kaifu; Medina-Rivera, Alejandra; Young, Edwin J.; Zimmermann, Michael T.; Yan, Huihuang; Sun, Zhifu; Zhang, Yuji; Wu, Stephen T.; Huang, Haojie; Wilson, Michael D.; Kocher, Jean-Pierre A.; Li, Wei

    2014-01-01

    Understanding the role of a given transcription factor (TF) in regulating gene expression requires precise mapping of its binding sites in the genome. Chromatin immunoprecipitation-exo, an emerging technique using λ exonuclease to digest TF unbound DNA after ChIP, is designed to reveal transcription factor binding site (TFBS) boundaries with near-single nucleotide resolution. Although ChIP-exo promises deeper insights into transcription regulation, no dedicated bioinformatics tool exists to leverage its advantages. Most ChIP-seq and ChIP-chip analytic methods are not tailored for ChIP-exo, and thus cannot take full advantage of high-resolution ChIP-exo data. Here we describe a novel analysis framework, termed MACE (model-based analysis of ChIP-exo) dedicated to ChIP-exo data analysis. The MACE workflow consists of four steps: (i) sequencing data normalization and bias correction; (ii) signal consolidation and noise reduction; (iii) single-nucleotide resolution border peak detection using the Chebyshev Inequality and (iv) border matching using the Gale-Shapley stable matching algorithm. When applied to published human CTCF, yeast Reb1 and our own mouse ONECUT1/HNF6 ChIP-exo data, MACE is able to define TFBSs with high sensitivity, specificity and spatial resolution, as evidenced by multiple criteria including motif enrichment, sequence conservation, direct sequence pileup, nucleosome positioning and open chromatin states. In addition, we show that the fundamental advance of MACE is the identification of two boundaries of a TFBS with high resolution, whereas other methods only report a single location of the same event. The two boundaries help elucidate the in vivo binding structure of a given TF, e.g. whether the TF may bind as dimers or in a complex with other co-factors. PMID:25249628
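
    The border-detection step can be illustrated with a distribution-free Chebyshev cutoff, as sketched below (illustrative Python; MACE's published workflow additionally performs normalization, bias correction, noise reduction and Gale-Shapley border matching, all omitted here).

```python
import math

def chebyshev_cutoff(mean, sd, p_value):
    """Chebyshev-inequality cutoff: P(|X - mean| >= k*sd) <= 1/k**2.

    Requiring a tail probability of at most p_value gives k = sqrt(1/p_value),
    with no assumption about the read-count distribution.
    """
    k = math.sqrt(1.0 / p_value)
    return mean + k * sd

def call_borders(coverage, background, p_value=0.01):
    """Flag positions whose coverage exceeds the Chebyshev cutoff computed
    from a background (e.g. flanking or control) region -- a simplified
    stand-in for MACE's border peak detection."""
    mu = sum(background) / len(background)
    sd = math.sqrt(sum((b - mu) ** 2 for b in background) / len(background))
    cutoff = chebyshev_cutoff(mu, sd, p_value)
    return [i for i, c in enumerate(coverage) if c > cutoff]

background = [3, 2, 4, 3, 2, 3, 4, 3, 2, 3]                  # assumed background coverage
coverage = [3, 2, 4, 3, 2, 3, 250, 4, 3, 2, 3, 2, 240, 3, 2, 3]
print(call_borders(coverage, background))                    # [6, 12]
```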

  2. Geological Corrections in Gravimetry

    NASA Astrophysics Data System (ADS)

    Mikuška, J.; Marušiak, I.

    2015-12-01

    Applying corrections for the known geology to gravity data can be traced back to the first quarter of the 20th century. Later on, mostly in areas with sedimentary cover, at local and regional scales, the correction known as gravity stripping has been in use since the mid 1960s, provided that there was enough geological information. Stripping at regional to global scales became possible after the release of the CRUST 2.0 and later CRUST 1.0 models in the years 2000 and 2013, respectively. The latter model in particular provides quite a new view on the relevant geometries and on the topographic and crustal densities, as well as on the crust/mantle density contrast. Thus, the isostatic corrections, which have often been used in the past, can now be replaced by procedures working with independent information interpreted primarily from seismic studies. We have developed software for performing geological corrections in the space domain, based on a priori geometry and density grids, which can be of either rectangular or spherical/ellipsoidal type with cells shaped as rectangles, tesseroids or triangles. It enables us to calculate the required gravitational effects not only in the form of surface maps or profiles but, for instance, also along vertical lines, which can shed some additional light on the nature of the geological correction. The software can work at a variety of scales and considers the input information to an optional distance from the calculation point up to the antipodes. Our main objective is to treat the geological correction as an alternative to accounting for the topography with varying densities, since the bottoms of the topographic masses, namely the geoid or ellipsoid, generally do not represent geological boundaries. We would also like to call attention to the possible distortions of the corrected gravity anomalies. This work was supported by the Slovak Research and Development Agency under the contract APVV-0827-12.

  3. Cognitive components underpinning the development of model-based learning.

    PubMed

    Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A

    2016-10-29

    Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning.

  4. Target Mass Corrections Revisited

    SciTech Connect

    W. Melnitchouk; F. Steffens

    2006-03-07

    We propose a new implementation of target mass corrections to nucleon structure functions which, unlike existing treatments, has the correct kinematic threshold behavior at finite Q^2 in the x → 1 limit. We illustrate the differences between the new approach and existing prescriptions by considering specific examples for the F_2 and F_L structure functions, and discuss the broader implications of our results, which call into question the notion of universal parton distribution at finite Q^2.

  5. Corrective midfoot osteotomies.

    PubMed

    Stapleton, John J; DiDomenico, Lawrence A; Zgonis, Thomas

    2008-10-01

    Corrective midfoot osteotomies involve complete separation of the forefoot and hindfoot through the level of the midfoot, followed by uni-, bi-, or triplanar realignment and arthrodesis. This technique can be performed through various approaches; however, in the high-risk patient, percutaneous and minimum incision techniques are necessary to limit the potential of developing soft tissue injury. These master level techniques require extensive surgical experience and detailed knowledge of lower extremity biomechanics. The authors discuss preoperative clinical and radiographic evaluation, specific operative techniques used, and postoperative management for the high-risk patient undergoing corrective midfoot osteotomy.

  6. Correction of ocular dystopia.

    PubMed

    Janecka, I P

    1996-04-01

    The purpose of this study was to examine results with elective surgical correction of enophthalmos. The study was a retrospective assessment in a university-based referral practice. A consecutive sample of 10 patients who developed ocular dystopia following orbital trauma was examined. The main outcome measures were a subjective evaluation by patients and objective measurements of patients' eye position. The intervention was three-dimensional orbital reconstruction with titanium plates. It is concluded that satisfactory correction of enophthalmos and ocular dystopia can be achieved with elective surgery using titanium plates. In addition, intraoperative measurement of eye position in three planes increases the precision of surgery.

  7. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1979-01-01

    Optical measurements of range and elevation angle are distorted by the earth's atmosphere. High precision refraction correction equations are presented which are ideally suited for surveying because their inputs are optically measured range and optically measured elevation angle. The outputs are true straight line range and true geometric elevation angle. The 'short distances' used in surveying allow the calculations of true range and true elevation angle to be quickly made using a programmable pocket calculator. Topics covered include the spherical form of Snell's Law; ray path equations; and integrating the equations. Short-, medium-, and long-range refraction corrections are presented in tables.
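
    The spherical form of Snell's law reduces, in a spherically stratified atmosphere, to the invariant n·r·sin(z) = constant along the ray; the sketch below traces a ray through a few concentric shells using that invariant (illustrative Python with made-up layer values, not the report's equations or tables).

```python
import math

def trace_elevation(layers, r0, e_measured):
    """Trace a ray upward through concentric spherical shells using the
    spherical form of Snell's law (the invariant n * r * sin(z) = const).

    layers: list of (radius_km, refractive_index) from the ground upward.
    r0, e_measured: observer radius (km) and measured elevation angle (rad).
    Returns the local zenith angle of the ray in each layer; comparing the
    exit direction with the measured one gives the bending that a refraction
    correction removes. Values here are purely illustrative.
    """
    z0 = math.pi / 2 - e_measured                 # zenith angle at the observer
    invariant = layers[0][1] * r0 * math.sin(z0)
    zenith_angles = []
    for r, n in layers:
        zenith_angles.append(math.asin(min(1.0, invariant / (n * r))))
    return zenith_angles

layers = [(6371.0, 1.000293), (6373.0, 1.000250), (6376.0, 1.000200), (6381.0, 1.000120)]
angles = trace_elevation(layers, 6371.0, math.radians(10.0))
print([round(math.degrees(a), 4) for a in angles])
```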

  8. Correction coil cable

    DOEpatents

    Wang, S.T.

    1994-11-01

    A wire cable assembly adapted for the winding of electrical coils is taught. A primary intended use is for use in particle tube assemblies for the Superconducting Super Collider. The correction coil cables have wires collected in wire array with a center rib sandwiched therebetween to form a core assembly. The core assembly is surrounded by an assembly housing having an inner spiral wrap and a counter wound outer spiral wrap. An alternate embodiment of the invention is rolled into a keystoned shape to improve radial alignment of the correction coil cable on a particle tube in a particle tube assembly. 7 figs.

  9. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  10. Models-Based Practice: Great White Hope or White Elephant?

    ERIC Educational Resources Information Center

    Casey, Ashley

    2014-01-01

    Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…

  11. Learning of Chemical Equilibrium through Modelling-Based Teaching

    ERIC Educational Resources Information Center

    Maia, Poliana Flavia; Justi, Rosaria

    2009-01-01

    This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students…

  12. The Effect of Modeling Based Science Education on Critical Thinking

    ERIC Educational Resources Information Center

    Bati, Kaan; Kaptan, Fitnat

    2015-01-01

    In this study to what degree the modeling based science education can influence the development of the critical thinking skills of the students was investigated. The research was based on pre-test-post-test quasi-experimental design with control group. The Modeling Based Science Education Program which was prepared with the purpose of exploring…

  13. Issues in Correctional Training and Casework. Correctional Monograph.

    ERIC Educational Resources Information Center

    Wolford, Bruce I., Ed.; Lawrenz, Pam, Ed.

    The eight papers contained in this monograph were drawn from two national meetings on correctional training and casework. Titles and authors are: "The Challenge of Professionalism in Correctional Training" (Michael J. Gilbert); "A New Perspective in Correctional Training" (Jack Lewis); "Reasonable Expectations in Correctional Officer Training:…

  14. Rethinking Correctional Staff Development.

    ERIC Educational Resources Information Center

    Williams, David C.

    There have been enduring conflicts in correctional institutions between personnel charged with rehabilitative duties and those who oversee authority. It is only within the past few years that realistic communication between these groups has been tolerated. The same period of time has been characterized by the infusion of training and staff…

  15. Thermodynamically Correct Bioavailability Estimations

    DTIC Science & Technology

    1992-04-30

    Approved for public release; distribution unlimited... research is to develop thermodynamically correct bioavailability estimations using chromatographic stationary phases as a model of the "interphase

  16. Errors and Their Corrections

    ERIC Educational Resources Information Center

    Joosten, Albert Max

    2016-01-01

    "Our primary concern is not that the child learns to do something without mistakes. Our real concern is that the child does what he needs, with interest." The reaction of so many adults to the mistakes of children is to correct, immediately and directly, says Joosten. To truly aid the child in development, we must learn to control our…

  17. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1980-01-01

    Optical measurements of range and elevation angles are distorted by refraction of Earth's atmosphere. Theoretical discussion of effect, along with equations for determining exact range and elevation corrections, is presented in report. Potentially useful in optical site surveying and related applications, analysis is easily programmed on pocket calculator. Input to equation is measured range and measured elevation; output is true range and true elevation.

  18. New Directions in Corrections.

    ERIC Educational Resources Information Center

    McKee, John M.

    A picture of the American prison situation in the past and in its present changing form is presented. The object of the correctional community is becoming more and more that of successfully reintegrating the ex-offender into the social community from which he has been separated. It is predicted that within the next five years: (1) Every state will…

  19. Pluristem Therapeutics, Inc.

    PubMed

    Prather, William

    2008-01-01

    Pluristem Therapeutics, Inc., based in Haifa, Israel, is a regenerative biotherapeutics company dedicated to the commercialization of nonpersonalized (allogeneic) cell therapy products. The Company is expanding noncontroversial placenta-derived mesenchymal stem cells via a proprietary 3D process, named PluriX, into therapeutics for a variety of degenerative, malignant and autoimmune disorders. Pluristem will be conducting Phase I trials in the USA with its first product, PLX-I, which addresses the global shortfall of matched tissue for bone marrow transplantation by improving the engraftment of hematopoietic stem cells contained in umbilical cord blood.

  20. Therapeutics for cognitive aging.

    PubMed

    Shineman, Diana W; Salthouse, Timothy A; Launer, Lenore J; Hof, Patrick R; Bartzokis, George; Kleiman, Robin; Luine, Victoria; Buccafusco, Jerry J; Small, Gary W; Aisen, Paul S; Lowe, David A; Fillit, Howard M

    2010-04-01

    This review summarizes the scientific talks presented at the conference "Therapeutics for Cognitive Aging," hosted by the New York Academy of Sciences and the Alzheimer's Drug Discovery Foundation on May 15, 2009. Attended by scientists from industry and academia, as well as by a number of lay people-approximately 200 in all-the conference specifically tackled the many aspects of developing therapeutic interventions for cognitive impairment. Discussion also focused on how to define cognitive aging and whether it should be considered a treatable, tractable disease.

  1. DELIVERY OF THERAPEUTIC PROTEINS

    PubMed Central

    Pisal, Dipak S.; Kosloski, Matthew P.; Balu-Iyer, Sathy V.

    2009-01-01

    The safety and efficacy of protein therapeutics are limited by three interrelated pharmaceutical issues: in vitro and in vivo instability, immunogenicity, and shorter half-lives. Novel drug modifications for overcoming these issues are under investigation and include covalent attachment of poly(ethylene glycol) (PEG), polysialic acid, or glycolic acid, as well as the development of new formulations containing nanoparticulate or colloidal systems (e.g. liposomes, polymeric microspheres, polymeric nanoparticles). Such strategies have the potential to develop into next-generation protein therapeutics. This review includes a general discussion on these delivery approaches. PMID:20049941

  2. Therapeutic Antioxidant Medical Gas

    PubMed Central

    Nakao, Atsunori; Sugimoto, Ryujiro; Billiar, Timothy R; McCurry, Kenneth R

    2009-01-01

    Medical gases are pharmaceutical gaseous molecules which offer solutions to medical needs and include traditional gases, such as oxygen and nitrous oxide, as well as gases with recently discovered roles as biological messenger molecules, such as carbon monoxide, nitric oxide and hydrogen sulphide. Medical gas therapy is a relatively unexplored field of medicine; however, a recent increase in the number of publications on medical gas therapies clearly indicates that there are significant opportunities for the use of gases as therapeutic tools for a variety of disease conditions. In this article, we review the recent advances in research on medical gases with antioxidant properties and discuss their clinical applications and therapeutic properties. PMID:19177183

  3. Therapeutic Recombinant Monoclonal Antibodies

    ERIC Educational Resources Information Center

    Bakhtiar, Ray

    2012-01-01

    During the last two decades, the rapid growth of biotechnology-derived techniques has led to a myriad of therapeutic recombinant monoclonal antibodies with significant clinical benefits. Recombinant monoclonal antibodies can be obtained from a number of natural sources such as animal cell cultures using recombinant DNA engineering. In contrast to…

  4. Developing Therapeutic Listening

    ERIC Educational Resources Information Center

    Lee, Billy; Prior, Seamus

    2013-01-01

    We present an experience-near account of the development of therapeutic listening in first year counselling students. A phenomenological approach was employed to articulate the trainees' lived experiences of their learning. Six students who had just completed a one-year postgraduate certificate in counselling skills were interviewed and the…

  5. Measuring Therapeutic Effectiveness.

    ERIC Educational Resources Information Center

    Callister, Sheldon L.

    In the recent past, there has been a great deal of effort directed toward developing techniques for documenting therapeutic outcome. Funding sources and the general public seem to be demanding more meaningful data which indicate, in a clear manner, whether or not the services they are paying for are of value. Mental health centers, like other…

  6. Antibody Therapeutics in Oncology

    PubMed Central

    Wold, Erik D; Smider, Vaughn V; Felding, Brunhilde H

    2016-01-01

    One of the newer classes of targeted cancer therapeutics is monoclonal antibodies. Monoclonal antibody therapeutics are a successful and rapidly expanding drug class due to their high specificity, activity, favourable pharmacokinetics, and standardized manufacturing processes. Antibodies are capable of recruiting the immune system to attack cancer cells through complement-dependent cytotoxicity or antibody dependent cellular cytotoxicity. In an ideal scenario the initial tumor cell destruction induced by administration of a therapeutic antibody can result in uptake of tumor associated antigens by antigen-presenting cells, establishing a prolonged memory effect. Mechanisms of direct tumor cell killing by antibodies include antibody recognition of cell surface bound enzymes to neutralize enzyme activity and signaling, or induction of receptor agonist or antagonist activity. Both approaches result in cellular apoptosis. In another and very direct approach, antibodies are used to deliver drugs to target cells and cause cell death. Such antibody drug conjugates (ADCs) direct cytotoxic compounds to tumor cells, after selective binding to cell surface antigens, internalization, and intracellular drug release. Efficacy and safety of ADCs for cancer therapy has recently been greatly advanced based on innovative approaches for site-specific drug conjugation to the antibody structure. This technology enabled rational optimization of function and pharmacokinetics of the resulting conjugates, and is now beginning to yield therapeutics with defined, uniform molecular characteristics, and unprecedented promise to advance cancer treatment. PMID:27081677

  7. Model-Based Reasoning in Humans Becomes Automatic with Training.

    PubMed

    Economides, Marcos; Kurth-Nelson, Zeb; Lübbert, Annika; Guitart-Masip, Marc; Dolan, Raymond J

    2015-09-01

    Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.
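
    The algorithmic distinction invoked above can be sketched on a toy two-stage task: a model-free learner caches action values with a TD update, while a model-based learner updates a transition model and recomputes values from it on demand (illustrative Python; the study's actual task, parameters and analyses differ).

```python
import random

# A toy two-stage task contrasting the two strategies (sketch only).
random.seed(1)
P = {"a1": {"sA": 0.7, "sB": 0.3}, "a2": {"sA": 0.3, "sB": 0.7}}  # true transitions
R = {"sA": 1.0, "sB": 0.0}                                        # second-stage reward

q_mf = {"a1": 0.0, "a2": 0.0}               # model-free cached values
visits = {a: {s: 1 for s in R} for a in P}  # model-based transition counts (+1 prior)
alpha = 0.1

for _ in range(2000):
    a = random.choice(["a1", "a2"])
    s = "sA" if random.random() < P[a]["sA"] else "sB"
    r = R[s]
    q_mf[a] += alpha * (r - q_mf[a])        # model-free: direct, reflexive value update
    visits[a][s] += 1                       # model-based: update the world model instead

# Model-based values are (re)computed from the learnt model whenever needed,
# which is flexible but costs computation -- the "deliberative" route.
def q_model_based(a):
    total = sum(visits[a].values())
    return sum(visits[a][s] / total * R[s] for s in R)

print(q_mf, {a: round(q_model_based(a), 2) for a in P})
```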

  8. Model-based HSF using by target point control function

    NASA Astrophysics Data System (ADS)

    Kim, Seongjin; Do, Munhoe; An, Yongbae; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu

    2015-03-01

    As the technology node shrinks, ArF immersion lithography approaches the limits of wafer patterning, and weak points are easily generated during mask processing. To obtain robust patterning results, the design house performs lithography rule checking (LRC). Despite LRC, weak points are still found at the verification stage of optical proximity correction (OPC); such a location is called a hot spot point (HSP). Many studies have addressed fixing HSPs. One of the most common hot spot fixing (HSF) methods is modification biasing, which consists of "Line-Resizing" and "Space-Resizing". In addition to rule-based biasing, resolution enhancement techniques (RET), including inverse lithography technology (ILT) and model-based assist features (MBAF), have been adopted to remove hot spots and maximize the process window. If an HSP is found during the OPC verification stage, various HSF methods can be applied; however, adding an HSF process to the regular OPC procedure increases the OPC turn-around time (TAT). In this paper, we introduce a new HSF method that achieves a shorter OPC TAT than the common HSF methods. The method consists of two concepts. The first is that the OPC target point is controlled to fix the HSP: the target point is moved to the optimum position where the edge placement error (EPE) can be zero at critical points. Many parameters, such as model accuracy or the OPC recipe, can cause a larger EPE. The second is control of the model offset error through target point adjustment. Figure 1 shows a case in which the EPE is not zero, meaning that the simulation contour was not well targeted after the OPC process. In contrast, Figure 2 shows the target point moved by -2.5 nm using the target point control function; as a result, the simulation contour matches the original layout. This function can be applied effectively to the OPC procedure of memory and logic devices.
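
    The target-point-control idea can be sketched as follows: measure the edge placement error between the simulated contour and the drawn layout, then shift the OPC target point by the negative of that offset (illustrative Python; names and sign convention are assumptions, not the paper's recipe).

```python
def shift_target_points(layout_edges, simulated_edges):
    """Shift OPC target points to cancel a systematic model offset.

    EPE is taken here as (simulated edge - layout edge); moving the target by
    -EPE aims the next OPC iteration back onto the drawn layout, which is the
    idea behind the target-point-control HSF flow described above (names and
    sign convention are illustrative).
    """
    corrected = []
    for layout, simulated in zip(layout_edges, simulated_edges):
        epe = simulated - layout            # nm, positive = contour outside target
        corrected.append(layout - epe)      # new target point compensates the offset
    return corrected

layout = [0.0, 50.0, 100.0]                 # drawn edge positions (nm)
simulated = [2.5, 51.0, 99.0]               # contour positions after OPC (nm)
print(shift_target_points(layout, simulated))  # [-2.5, 49.0, 101.0]
```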

  9. Clinical Utility and Safety of a Model-Based Patient-Tailored Dose of Vancomycin in Neonates.

    PubMed

    Leroux, Stéphanie; Jacqz-Aigrain, Evelyne; Biran, Valérie; Lopez, Emmanuel; Madeleneau, Doriane; Wallon, Camille; Zana-Taïeb, Elodie; Virlouvet, Anne-Laure; Rioualen, Stéphane; Zhao, Wei

    2016-04-01

    Pharmacokinetic modeling has often been applied to evaluate vancomycin pharmacokinetics in neonates. However, clinical application of the model-based personalized vancomycin therapy is still limited. The objective of the present study was to evaluate the clinical utility and safety of a model-based patient-tailored dose of vancomycin in neonates. A model-based vancomycin dosing calculator, developed from a population pharmacokinetic study, has been integrated into the routine clinical care in 3 neonatal intensive care units (Robert Debré, Cochin Port Royal, and Clocheville hospitals) between 2012 and 2014. The target attainment rate, defined as the percentage of patients with a first therapeutic drug monitoring serum vancomycin concentration achieving the target window of 15 to 25 mg/liter, was selected as an endpoint for evaluating the clinical utility. The safety evaluation was focused on nephrotoxicity. The clinical application of the model-based patient-tailored dose of vancomycin has been demonstrated in 190 neonates. The mean (standard deviation) gestational and postnatal ages of the study population were 31.1 (4.9) weeks and 16.7 (21.7) days, respectively. The target attainment rate increased from 41% to 72% without any case of vancomycin-related nephrotoxicity. This proof-of-concept study provides evidence for integrating model-based antimicrobial therapy in neonatal routine care.

  10. Overcoming limitations of model-based diagnostic reasoning systems

    NASA Technical Reports Server (NTRS)

    Holtzblatt, Lester J.; Marcotte, Richard A.; Piazza, Richard L.

    1989-01-01

    The development of a model-based diagnostic system to overcome the limitations of model-based reasoning systems is discussed. It is noted that model-based reasoning techniques can be used to analyze the failure behavior and diagnosability of system and circuit designs as part of the system process itself. One goal of current research is the development of a diagnostic algorithm which can reason efficiently about large numbers of diagnostic suspects and can handle both combinational and sequential circuits. A second goal is to address the model-creation problem by developing an approach for using design models to construct the GMODS model in an automated fashion.

  11. Voltage correction power flow

    SciTech Connect

    Rajicic, D.; Ackovski, R.; Taleski, R. . Dept. of Electrical Engineering)

    1994-04-01

    A method for power flow solution of weakly meshed distribution and transmission networks is presented. It is based on oriented ordering of network elements, which allows efficient construction of the loop impedance matrix and rational organization of the processes of power summation (backward sweep), current summation (backward sweep) and node voltage calculation (forward sweep). The first step of the algorithm is calculation of node voltages on the radial part of the network. The second step is calculation of the breakpoint currents. The procedure then returns to the first step, preceded by a voltage correction. It is illustrated that, with the voltage correction approach, the iterative voltage calculation for weakly meshed networks is faster and more reliable.
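
    A minimal sketch of the backward power-summation and forward voltage-calculation sweeps on a purely radial feeder, written under simplifying assumptions (branch losses neglected in the summation, a fixed number of sweeps, no breakpoint-current handling for the meshed part); the per-unit network data are invented for illustration:

    ```python
    import numpy as np

    # Hypothetical 4-bus radial feeder: bus 0 is the source (slack) bus.
    branches = [(0, 1, 0.01 + 0.02j), (1, 2, 0.02 + 0.04j), (1, 3, 0.015 + 0.03j)]  # (from, to, Z)
    loads = {1: 0.5 + 0.2j, 2: 0.8 + 0.3j, 3: 0.6 + 0.25j}    # per-unit complex loads
    v = {0: 1.0 + 0j, 1: 1.0 + 0j, 2: 1.0 + 0j, 3: 1.0 + 0j}  # flat start

    for _ in range(20):  # fixed number of sweeps for simplicity
        # Backward sweep: accumulate the power carried by each branch from the leaves up.
        s_branch = {}
        for f, t, z in reversed(branches):
            downstream = sum(s_branch[(a, b)] for a, b, _ in branches if a == t)
            s_branch[(f, t)] = loads.get(t, 0) + downstream
        # Forward sweep: update node voltages from the source outward.
        for f, t, z in branches:
            i_branch = np.conj(s_branch[(f, t)] / v[t])
            v[t] = v[f] - z * i_branch

    print({bus: round(abs(volt), 4) for bus, volt in v.items()})
    ```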

  12. Correction coil cable

    DOEpatents

    Wang, Sou-Tien

    1994-11-01

    A wire cable assembly (10, 310) adapted for the winding of electrical coils is taught. A primary intended use is in particle tube assemblies (532) for the superconducting super collider. The correction coil cables (10, 310) have wires (14, 314) collected in wire arrays (12, 312) with a center rib (16, 316) sandwiched therebetween to form a core assembly (18, 318). The core assembly (18, 318) is surrounded by an assembly housing (20, 320) having an inner spiral wrap (22, 322) and a counter wound outer spiral wrap (24, 324). An alternate embodiment (410) of the invention is rolled into a keystoned shape to improve radial alignment of the correction coil cable (410) on a particle tube (733) in a particle tube assembly (732).

  13. Correcting Duporcq's theorem

    PubMed Central

    Nawratil, Georg

    2014-01-01

    In 1898, Ernest Duporcq stated a famous theorem about rigid-body motions with spherical trajectories, without giving a rigorous proof. Today, this theorem is again of interest, as it is strongly connected with the topic of self-motions of planar Stewart–Gough platforms. We discuss Duporcq's theorem from this point of view and demonstrate that it is not correct. Moreover, we also present a revised version of this theorem. PMID:25540467

  14. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  15. [New therapeutic developments in cystic fibrosis].

    PubMed

    Bui, S; Macey, J; Fayon, M; Bihouée, T; Burgel, P-R; Colomb, V; Corvol, H; Durieu, I; Hubert, D; Marguet, C; Mas, E; Munck, A; Murris-Espin, M; Reix, P; Sermet-Gaudelus, I

    2016-12-01

    Since the discovery of the chloride secretion defect in cystic fibrosis in 1983 and of the CFTR (cystic fibrosis transmembrane conductance regulator) gene in 1989, knowledge about CFTR synthesis, maturation, intracellular trafficking and function has expanded dramatically. These discoveries have led to the classification of CF mutations into 6 classes with different pathophysiological mechanisms. In this article we review the state of the art on CFTR synthesis and its chloride secretion function. We then explore the consequences of the 6 classes of mutations on CFTR protein function and describe the new therapeutic developments aiming at correcting these defects.

  16. Model Based Iterative Reconstruction for Bright Field Electron Tomography (Postprint)

    DTIC Science & Technology

    2013-02-01

    Typical algorithms such as Filtered Back Projection (FBP) and the Simultaneous Iterative Reconstruction Technique (SIRT) are applied to the data. Model based iterative reconstruction (MBIR) provides a powerful framework for tomographic reconstruction.

  17. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  18. Multistage vector (MSV) therapeutics.

    PubMed

    Wolfram, Joy; Shen, Haifa; Ferrari, Mauro

    2015-12-10

    One of the greatest challenges in the field of medicine is obtaining controlled distribution of systemically administered therapeutic agents within the body. Indeed, biological barriers such as physical compartmentalization, pressure gradients, and excretion pathways adversely affect localized delivery of drugs to pathological tissue. The diverse nature of these barriers requires the use of multifunctional drug delivery vehicles that can overcome a wide range of sequential obstacles. In this review, we explore the role of multifunctionality in nanomedicine by primarily focusing on multistage vectors (MSVs). The MSV is an example of a promising therapeutic platform that incorporates several components, including a microparticle, nanoparticles, and small molecules. In particular, these components are activated in a sequential manner in order to successively address transport barriers.

  19. Multistage vector (MSV) therapeutics

    PubMed Central

    Wolfram, Joy; Shen, Haifa; Ferrari, Mauro

    2015-01-01

    One of the greatest challenges in the field of medicine is obtaining controlled distribution of systemically administered therapeutic agents within the body. Indeed, biological barriers such as physical compartmentalization, pressure gradients, and excretion pathways adversely affect localized delivery of drugs to pathological tissue. The diverse nature of these barriers requires the use of multifunctional drug delivery vehicles that can overcome a wide range of sequential obstacles. In this review, we explore the role of multifunctionality in nanomedicine by primarily focusing on multistage vectors (MSVs). The MSV is an example of a promising therapeutic platform that incorporates several components, including a microparticle, nanoparticles, and small molecules. In particular, these components are activated in a sequential manner in order to successively address transport barriers. PMID:26264836

  20. Therapeutic Hypothermia for Neuroprotection

    PubMed Central

    Karnatovskaia, Lioudmila V.; Wartenberg, Katja E.

    2014-01-01

    The earliest recorded application of therapeutic hypothermia in medicine spans about 5000 years; however, its use has become widespread since 2002, following the demonstration of both safety and efficacy of regimens requiring only a mild (32°C-35°C) degree of cooling after cardiac arrest. We review the mechanisms by which hypothermia confers neuroprotection as well as its physiological effects by body system and its associated risks. With regard to clinical applications, we present evidence on the role of hypothermia in traumatic brain injury, intracranial pressure elevation, stroke, subarachnoid hemorrhage, spinal cord injury, hepatic encephalopathy, and neonatal peripartum encephalopathy. Based on the current knowledge and areas undergoing or in need of further exploration, we feel that therapeutic hypothermia holds promise in the treatment of patients with various forms of neurologic injury; however, additional quality studies are needed before its true role is fully known. PMID:24982721

  1. Therapeutic cancer vaccines

    PubMed Central

    Melief, Cornelis J.M.; van Hall, Thorbald; Arens, Ramon; Ossendorp, Ferry; van der Burg, Sjoerd H.

    2015-01-01

    The clinical benefit of therapeutic cancer vaccines has been established. Whereas regression of lesions was shown for premalignant lesions caused by HPV, clinical benefit in cancer patients was mostly noted as prolonged survival. Suboptimal vaccine design and an immunosuppressive cancer microenvironment are the root causes of the lack of cancer eradication. Effective cancer vaccines deliver concentrated antigen to both HLA class I and II molecules of DCs, promoting both CD4 and CD8 T cell responses. Optimal vaccine platforms include DNA and RNA vaccines and synthetic long peptides. Antigens of choice include mutant sequences, selected cancer testis antigens, and viral antigens. Drugs or physical treatments can mitigate the immunosuppressive cancer microenvironment and include chemotherapeutics, radiation, indoleamine 2,3-dioxygenase (IDO) inhibitors, inhibitors of T cell checkpoints, agonists of selected TNF receptor family members, and inhibitors of undesirable cytokines. The specificity of therapeutic vaccination combined with such immunomodulation offers an attractive avenue for the development of future cancer therapies. PMID:26214521

  2. Reduced model-based decision-making in schizophrenia.

    PubMed

    Culbreth, Adam J; Westbrook, Andrew; Daw, Nathaniel D; Botvinick, Matthew; Barch, Deanna M

    2016-08-01

    Individuals with schizophrenia have a diminished ability to use reward history to adaptively guide behavior. However, tasks traditionally used to assess such deficits often rely on multiple cognitive and neural processes, leaving etiology unresolved. In the current study, we adopted recent computational formalisms of reinforcement learning to distinguish between model-based and model-free decision-making in hopes of specifying mechanisms associated with reinforcement-learning dysfunction in schizophrenia. Under this framework, decision-making is model-free to the extent that it relies solely on prior reward history, and model-based if it relies on prospective information such as motivational state, future consequences, and the likelihood of obtaining various outcomes. Model-based and model-free decision-making was assessed in 33 schizophrenia patients and 30 controls using a 2-stage 2-alternative forced choice task previously demonstrated to discern individual differences in reliance on the 2 forms of reinforcement-learning. We show that, compared with controls, schizophrenia patients demonstrate decreased reliance on model-based decision-making. Further, parameter estimates of model-based behavior correlate positively with IQ and working memory measures, suggesting that model-based deficits seen in schizophrenia may be partially explained by higher-order cognitive deficits. These findings demonstrate specific reinforcement-learning and decision-making deficits and thereby provide valuable insights for understanding disordered behavior in schizophrenia.
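
    For readers unfamiliar with this task, the sketch below shows how model-based and model-free first-stage values are commonly blended with a weighting parameter w in analyses of two-stage tasks; the transition structure and all numbers are illustrative, not the study's fitted values:

    ```python
    import numpy as np

    # First-stage actions lead to two second-stage states with these (assumed) probabilities.
    transitions = np.array([[0.7, 0.3],    # action 0: state A common, state B rare
                            [0.3, 0.7]])   # action 1
    q_second_stage = np.array([0.6, 0.4])  # learned values of the second-stage states
    q_model_free = np.array([0.55, 0.45])  # cached first-stage values from reward history

    q_model_based = transitions @ q_second_stage      # prospective evaluation via the model
    w = 0.6                                           # degree of reliance on model-based values
    q_hybrid = w * q_model_based + (1 - w) * q_model_free
    print(q_model_based, q_hybrid)
    ```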

  3. Antioxidant therapeutics for schizophrenia.

    PubMed

    Reddy, Ravinder; Reddy, Rajiv

    2011-10-01

    Pharmaceutical treatment for millions worldwide who have schizophrenia is limited to a handful of antipsychotics. Despite the proven efficacy of these drugs, the overall outcome for schizophrenia remains suboptimal. Thus, alternative treatment options are urgently needed. One possible approach may be antioxidant therapy. The extant evidence for the role of oxidative stress in the pathophysiology of schizophrenia offers a hypothesis-derived therapeutic approach in the form of antioxidants. Vitamins C and E, for example, are suitable for human clinical trials because they are readily available, inexpensive, and relatively safe. Research into the therapeutic use of antioxidants in schizophrenia can be grouped into two main clusters: for psychopathology and for side effects. Of these studies, some have been carefully conducted, but the majority are open label. Use of antioxidants for treatment-related side effects has been more extensively investigated. The totality of the evidence to date suggests that specific antioxidants, such as N-acetyl cysteine, may offer tangible benefits for the clinical syndrome of schizophrenia, and vitamin E may have salutary effects on the glycemic effects of antipsychotics. However, a great deal of fundamental clinical research remains to be done before antioxidants can be used routinely as therapeutics for schizophrenia and treatment-related complications.

  4. Polycyclic peptide therapeutics.

    PubMed

    Baeriswyl, Vanessa; Heinis, Christian

    2013-03-01

    Owing to their excellent binding properties, high stability, and low off-target toxicity, polycyclic peptides are an attractive molecule format for the development of therapeutics. Currently, only a handful of polycyclic peptides are used in the clinic; examples include the antibiotic vancomycin, the anticancer drugs actinomycin D and romidepsin, and the analgesic agent ziconotide. All clinically used polycyclic peptide drugs are derived from natural sources, such as soil bacteria in the case of vancomycin, actinomycin D and romidepsin, or the venom of a fish-hunting cone snail in the case of ziconotide. Unfortunately, nature provides peptide macrocyclic ligands for only a small fraction of therapeutic targets. For the generation of ligands of targets of choice, researchers have inserted artificial binding sites into natural polycyclic peptide scaffolds, such as cystine knot proteins, using rational design or directed evolution approaches. More recently, large combinatorial libraries of genetically encoded bicyclic peptides have been generated de novo and screened by phage display. In this Minireview, the properties of existing polycyclic peptide drugs are discussed and related to their interesting molecular architectures. Furthermore, technologies that allow the development of unnatural polycyclic peptide ligands are discussed. Recent application of these technologies has generated promising results, suggesting that polycyclic peptide therapeutics could potentially be developed for a broad range of diseases.

  5. Proteases as therapeutics

    PubMed Central

    Craik, Charles S.; Page, Michael J.; Madison, Edwin L.

    2015-01-01

    Proteases are an expanding class of drugs that hold great promise. The U.S. FDA (Food and Drug Administration) has approved 12 protease therapies, and a number of next generation or completely new proteases are in clinical development. Although they are a well-recognized class of targets for inhibitors, proteases themselves have not typically been considered as a drug class despite their application in the clinic over the last several decades; initially as plasma fractions and later as purified products. Although the predominant use of proteases has been in treating cardiovascular disease, they are also emerging as useful agents in the treatment of sepsis, digestive disorders, inflammation, cystic fibrosis, retinal disorders, psoriasis and other diseases. In the present review, we outline the history of proteases as therapeutics, provide an overview of their current clinical application, and describe several approaches to improve and expand their clinical application. Undoubtedly, our ability to harness proteolysis for disease treatment will increase with our understanding of protease biology and the molecular mechanisms responsible. New technologies for rationally engineering proteases, as well as improved delivery options, will expand greatly the potential applications of these enzymes. The recognition that proteases are, in fact, an established class of safe and efficacious drugs will stimulate investigation of additional therapeutic applications for these enzymes. Proteases therefore have a bright future as a distinct therapeutic class with diverse clinical applications. PMID:21406063

  6. Therapeutic antibody engineering

    PubMed Central

    Parren, Paul W.H.I.; Lugovskoy, Alexey A.

    2013-01-01

    It is an important event in any knowledge area when an authority in the field decides that it is time to share all accumulated knowledge and learnings by writing a text book. This does not occur often in the biopharmaceutical industry, likely due to both the highly dynamic environment with tight timelines and policies and procedures at many pharmaceutical companies that hamper knowledge sharing. To take on a task like this successfully, a strong drive combined with a desire and talent to teach, but also an accommodating and stimulating environment is required. Luckily for those interested in therapeutic monoclonal antibodies, Dr. William R. Strohl decided about two years ago that the time was right to write a book about the past, present and future of these fascinating molecules. Dr. Strohl’s great expertise and passion for biotechnology is evident from his life story and his strong academic and industry track record. Dr. Strohl pioneered natural product biotechnology, first in academia as a full professor of microbiology and biochemistry at Ohio State University in Columbus, Ohio and later in industry while at Merck. Despite his notable advances in recombinant natural products, industry interest in this area waned and in 2001 Dr. Strohl sought new opportunities by entering the field of antibody therapeutics. He initiated antibody discovery through phage display at Merck, and then moved to Centocor Research and Development Inc. (now Janssen Biotech, Inc.) in 2008 to head Biologics Research, where he now directs the discovery of innovative therapeutic antibody candidates.

  7. Attenuation correction in molecular fluorescence imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Yang, Bin; Tunnell, James W.

    2016-03-01

    Fluorescence-guided surgery has demonstrated more complete tumor resections in both preclinical models and clinical applications. However, intraoperative fluorescence-based imaging can be challenging due to attenuation of the fluorescence by intrinsic tissue scattering and absorption. Removing attenuation in fluorescence imaging is critical in many applications. We have developed both a model based approach and an experimental approach to retrieve attenuation corrected fluorescence based on spatial frequency domain imaging (SFDI). In the model based approach, we extended an attenuation correction model initially developed for point measurements into wide-field imaging with SFDI. To achieve attenuation correction, tissue optical properties were evaluated at both the excitation and emission wavelengths and then applied in the model. In an in-vitro phantom study, we achieved a relatively flat intensity profile over the entire absorption range, compared to an over 80% drop at the highest absorption level before correction. Similar performance was also observed in an ex-vivo tissue study. However, lengthy image acquisition and image processing make this method better suited to static imaging than to video-rate imaging. To achieve video-rate correction, we developed an experimental approach that reduces the effect of absorption by limiting the imaging depth with a high spatial frequency pattern. The absorption-reduced fluorescence image was obtained by performing a simple demodulation. The in-vitro phantom study showed an approximately 20% intensity drop at the highest absorption level, compared to an over 70% intensity drop before correction. This approach enabled video-rate attenuation corrected imaging at 19 fps, making the technique viable for clinical image-guided surgery.
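
    A minimal sketch of the three-phase AC demodulation commonly used in SFDI-style imaging, which is one plausible reading of the "simple demodulation" mentioned for the experimental approach; the input images here are synthetic arrays:

    ```python
    import numpy as np

    def demodulate(i0, i120, i240):
        """Standard three-phase AC demodulation of sinusoidally patterned images."""
        return (np.sqrt(2.0) / 3.0) * np.sqrt(
            (i0 - i120) ** 2 + (i120 - i240) ** 2 + (i240 - i0) ** 2
        )

    rng = np.random.default_rng(0)
    phase_images = [rng.uniform(0.4, 1.0, (4, 4)) for _ in range(3)]  # 0, 120, 240 degree shifts
    print(demodulate(*phase_images))
    ```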

  8. Biasing errors and corrections

    NASA Technical Reports Server (NTRS)

    Meyers, James F.

    1991-01-01

    The dependence of laser velocimeter measurement rate on flow velocity is discussed. Investigations outlining that any dependence is purely statistical, and is nonstationary both spatially and temporally, are described. Main conclusions drawn are that the times between successive particle arrivals should be routinely measured and the calculation of the velocity data rate correlation coefficient should be performed to determine if a dependency exists. If none is found, accept the data ensemble as an independent sample of the flow. If a dependency is found, the data should be modified to obtain an independent sample. Universal correcting procedures should never be applied because their underlying assumptions are not valid.

  9. Using Online Annotations to Support Error Correction and Corrective Feedback

    ERIC Educational Resources Information Center

    Yeh, Shiou-Wen; Lo, Jia-Jiunn

    2009-01-01

    Giving feedback on second language (L2) writing is a challenging task. This research proposed an interactive environment for error correction and corrective feedback. First, we developed an online corrective feedback and error analysis system called "Online Annotator for EFL Writing". The system consisted of five facilities: Document Maker,…

  10. Mental Health in Corrections: An Overview for Correctional Staff.

    ERIC Educational Resources Information Center

    Sowers, Wesley; Thompson, Kenneth; Mullins, Stephen

    This volume is designed to provide corrections practitioners with basic staff training on the needs of those with mental illness and impairments in our correctional systems. Chapter titles are: (1) "Mental Illness in the Correctional Setting"; (2) "Substance Use Disorders"; (3) "Problems with Mood"; (4) "Problems…

  11. Complications of auricular correction

    PubMed Central

    Staindl, Otto; Siedek, Vanessa

    2008-01-01

    The risk of complications of auricular correction is underestimated. There is around a 5% risk of early complications (haematoma, infection, fistulae caused by stitches and granulomae, allergic reactions, pressure ulcers, feelings of pain and asymmetry in side comparison) and a 20% risk of late complications (recurrences, telephone ear, excessive edge formation, auricle fitting too closely, narrowing of the auditory canal, keloids and complete collapse of the ear). Deformities are evaluated less critically by patients than by the surgeons, providing they do not concern how the ear is positioned. The causes of complications and deformities are, in the vast majority of cases, incorrect diagnosis and wrong choice of operating procedure. The choice of operating procedure must be adapted to suit the individual ear morphology. Bandaging technique and inspections and, if necessary, early revision are of great importance for the occurrence and progress of early complications, in addition to operation techniques. In cases of late complications such as keloids and auricles that are too closely fitting, unfixed full-thickness skin flaps have proved to be the most successful. Large deformities can often only be corrected to a limited degree of satisfaction. PMID:22073079

  12. Smooth eigenvalue correction

    NASA Astrophysics Data System (ADS)

    Hendrikse, Anne; Veldhuis, Raymond; Spreeuwers, Luuk

    2013-12-01

    Second-order statistics play an important role in data modeling. Nowadays, there is a tendency toward measuring more signals with higher resolution (e.g., high-resolution video), causing a rapid increase in the dimensionality of the measured samples, while the number of samples remains more or less the same. As a result, the eigenvalue estimates are significantly biased, as described by the Marčenko–Pastur equation in the limit of both the number of samples and their dimensionality going to infinity. By introducing a smoothness factor, we show that the Marčenko–Pastur equation can be used in practical situations where both the number of samples and their dimensionality remain finite. Based on this result we derive methods, one already known and one new to our knowledge, to estimate the sample eigenvalues when the population eigenvalues are known. However, usually the sample eigenvalues are known and the population eigenvalues are required. We therefore applied one of these methods in a feedback loop, resulting in an eigenvalue bias correction method. We compare this eigenvalue correction method with state-of-the-art methods and show that our method outperforms other methods, particularly in real-life situations often encountered in biometrics: underdetermined configurations, high-dimensional configurations, and configurations where the eigenvalues are exponentially distributed.
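
    A minimal sketch of the sample-eigenvalue bias the abstract starts from: when the dimensionality is comparable to the number of samples, the eigenvalues of the sample covariance spread around the population eigenvalues roughly as the Marčenko–Pastur law predicts. The dimensions are illustrative and no correction step is shown:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    dim, n_samples = 100, 200
    # Population covariance is the identity, so every population eigenvalue is 1.
    x = rng.standard_normal((n_samples, dim))
    sample_cov = x.T @ x / n_samples
    sample_eigenvalues = np.linalg.eigvalsh(sample_cov)
    # The spread below roughly matches the Marchenko-Pastur bounds (1 ± sqrt(dim/n))^2.
    print(sample_eigenvalues.min(), sample_eigenvalues.max())
    ```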

  13. Contact Lenses for Vision Correction

    MedlinePlus

  14. 75 FR 16516 - Dates Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-01

    NATIONAL ARCHIVES AND RECORDS ADMINISTRATION, Office of the Federal Register. Dates Correction. In the Notices section beginning on page 15401 in the issue of March 29, 2010, make the following correction: On pages...

  15. Yearbook of Correctional Education 1989.

    ERIC Educational Resources Information Center

    Duguid, Stephen, Ed.

    This yearbook contains conference papers, commissioned papers, reprints of earlier works, and research-in-progress. They offer a retrospective view as well as address the mission and perspective of correctional education, its international dimension, correctional education in action, and current research. Papers include "Correctional Education and…

  16. Radiation camera motion correction system

    DOEpatents

    Hoffer, P.B.

    1973-12-18

    The device determines the ratio of the intensity of radiation received by a radiation camera from two separate portions of the object. A correction signal is developed to maintain this ratio at a substantially constant value and this correction signal is combined with the camera signal to correct for object motion. (Official Gazette)

  17. Therapeutic Community in a California Prison: Treatment Outcomes after 5 Years

    ERIC Educational Resources Information Center

    Zhang, Sheldon X.; Roberts, Robert E. L.; McCollister, Kathryn E.

    2011-01-01

    Therapeutic communities have become increasingly popular among correctional agencies with drug-involved offenders. This quasi-experimental study followed a group of inmates who participated in a prison-based therapeutic community in a California state prison, with a comparison group of matched offenders, for more than 5 years after their initial…

  18. [Corrected transposition of the great arteries].

    PubMed

    Alva-Espinosa, Carlos

    2016-01-01

    Corrected transposition of the great arteries is one of the most fascinating entities in congenital heart disease. The apparently corrected condition is only temporary. Over time, most patients develop systemic heart failure, even in the absence of associated lesions. With current imaging studies, precise visualization is achieved in each case, though the treatment strategy remains unresolved. In asymptomatic patients or cases without associated lesions, focused follow-up to assess systemic ventricular function and the degree of tricuspid valve regurgitation is important. In cases with normal ventricular function and mild tricuspid regurgitation, it seems unreasonable to intervene surgically. In patients with significant associated lesions, surgery is indicated. In the long term, the traditional approach may not help tricuspid regurgitation and systemic ventricular failure. Anatomical correction is the proposed alternative to ease the right ventricular overload and to restore systemic left ventricular function. However, this is a prolonged operation, not without risks and long-term complications. In this review the clinical, diagnostic, and therapeutic aspects are reviewed in the light of the most significant and recent literature.

  19. The Therapeutic Roller Coaster

    PubMed Central

    CHU, JAMES A.

    1992-01-01

    Survivors of severe childhood abuse often encounter profound difficulties. In addition to posttraumatic and dissociative symptomatology, abuse survivors frequently have characterologic problems, particularly regarding self-care and maintaining relationships. Backgrounds of abuse, abandonment, and betrayal are often recapitulated and reenacted in therapy, making the therapeutic experience arduous and confusing for therapists and patients. Efforts must be directed at building an adequate psychotherapeutic foundation before undertaking exploration and abreaction of past traumatic experiences. This discussion sets out a model for treatment of childhood abuse survivors, describing stages of treatment and suggesting interventions. Common treatment dilemmas or "traps" are discussed, with recommendations for their resolution. PMID:22700116

  20. Argumentation in Science Education: A Model-based Framework

    NASA Astrophysics Data System (ADS)

    Böttcher, Florian; Meisert, Anke

    2011-02-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model if necessary in relation to alternative models. Secondly, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach in comparison with other analytical models will be presented to demonstrate the explicatory power and depth of the model-based perspective. Primarily, the framework of Toulmin to structurally analyse arguments is contrasted with the approach presented here. It will be demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second more complex argumentative sequence will also be analysed according to the invented analytical scheme to give a broader impression of its potential in practical use.

  1. Anomaly corrected heterotic horizons

    NASA Astrophysics Data System (ADS)

    Fontanella, A.; Gutowski, J. B.; Papadopoulos, G.

    2016-10-01

    We consider supersymmetric near-horizon geometries in heterotic supergravity up to two-loop order in sigma model perturbation theory. We identify the conditions for the horizons to admit enhancement of supersymmetry. We show that solutions which undergo supersymmetry enhancement exhibit an sl(2,R) symmetry, and we describe the geometry of their horizon sections. We also prove a modified Lichnerowicz type theorem, incorporating α' corrections, which relates Killing spinors to zero modes of near-horizon Dirac operators. Furthermore, we demonstrate that there are no AdS2 solutions in heterotic supergravity up to second order in α' for which the fields are smooth and the internal space is smooth and compact without boundary. We investigate a class of nearly supersymmetric horizons, for which the gravitino Killing spinor equation is satisfied on the spatial cross sections but not the dilatino one, and present a description of their geometry.

  2. Updating and correction.

    PubMed

    1994-09-09

    The current editions of two books edited by William T. Golden, Science Advice to the President and Science and Technology Advice to the President, Congress, and Judiciary, published this year by AAAS Press, are now being distributed by Transaction Publishers, New Brunswick, NJ 08903, at the prices $22.95 and $27.95 (paper), respectively, and are no longer available from AAAS. A related work, Golden's 1991 compilation Worldwide Science and Technology Advice to the Highest Levels of Government, originally published by Pergamon Press, is also being distributed by Transaction Publishers, at $25.95. For more information about the books see Science 1 July, p. 127. In the review of K. S. Thorne's Black Holes and Time Warps (13 May, p. 999-1000), the captions and illustrations on p. 1000 were mismatched. The correct order of the captions is (i) "A heavy rock..."; (ii) "Cosmic radio waves..."; and (iii) "The trajectories in space...."

  3. EDITORIAL: Politically correct physics?

    NASA Astrophysics Data System (ADS)

    Pople Deputy Editor, Stephen

    1997-03-01

    If you were a caring, thinking, liberally minded person in the 1960s, you marched against the bomb, against the Vietnam war, and for civil rights. By the 1980s, your voice was raised about the destruction of the rainforests and the threat to our whole planetary environment. At the same time, you opposed discrimination against any group because of race, sex or sexual orientation. You reasoned that people who spoke or acted in a discriminatory manner should be discriminated against. In other words, you became politically correct. Despite its oft-quoted excesses, the political correctness movement sprang from well-founded concerns about injustices in our society. So, on balance, I am all for it. Or, at least, I was until it started to invade science. Biologists were the first to feel the impact. No longer could they refer to 'higher' and 'lower' orders, or 'primitive' forms of life. To the list of undesirable 'isms' - sexism, racism, ageism - had been added a new one: speciesism. Chemists remained immune to the PC invasion, but what else could you expect from a group of people so steeped in tradition that their principal unit, the mole, requires the use of the thoroughly unreconstructed gram? Now it is the turn of the physicists. This time, the offenders are not those who talk disparagingly about other people or animals, but those who refer to 'forms of energy' and 'heat'. Political correctness has evolved into physical correctness. I was always rather fond of the various forms of energy: potential, kinetic, chemical, electrical, sound and so on. My students might merge heat and internal energy into a single, fuzzy concept loosely associated with moving molecules. They might be a little confused at a whole new crop of energies - hydroelectric, solar, wind, geothermal and tidal - but they could tell me what devices turned chemical energy into electrical energy, even if they couldn't quite appreciate that turning tidal energy into geothermal energy wasn't part of the

  4. Temperature Corrected Bootstrap Algorithm

    NASA Technical Reports Server (NTRS)

    Comiso, Joey C.; Zwally, H. Jay

    1997-01-01

    A temperature corrected Bootstrap Algorithm has been developed using Nimbus-7 Scanning Multichannel Microwave Radiometer data in preparation for the upcoming AMSR instrument aboard ADEOS and EOS-PM. The procedure first calculates the effective surface emissivity using emissivities of ice and water at 6 GHz and a mixing formulation that utilizes ice concentrations derived with the current Bootstrap algorithm but using brightness temperatures from the 6 GHz and 37 GHz channels. These effective emissivities are then used to calculate surface ice temperatures, which in turn are used to convert the 18 GHz and 37 GHz brightness temperatures to emissivities. Ice concentrations are then derived using the same technique as the Bootstrap algorithm, but using emissivities instead of brightness temperatures. The results show significant improvement in areas where the ice temperature is expected to vary considerably, such as near the continental areas in the Antarctic, where the ice temperature is colder than average, and in marginal ice zones.
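
    A minimal sketch of the emissivity-mixing and conversion steps described above, assuming simple linear mixing of ice and open-water emissivities at 6 GHz; the emissivity values and brightness temperatures are placeholders, not the calibrated constants of the algorithm:

    ```python
    def effective_emissivity(ice_concentration, e_ice_6ghz=0.92, e_water_6ghz=0.55):
        """Linear mixing of ice and open-water emissivities at 6 GHz (assumed values)."""
        return ice_concentration * e_ice_6ghz + (1 - ice_concentration) * e_water_6ghz

    def surface_temperature(tb_6ghz, ice_concentration):
        """Surface ice temperature inferred from the 6 GHz brightness temperature."""
        return tb_6ghz / effective_emissivity(ice_concentration)

    def channel_emissivity(tb_channel, t_surface):
        """Convert an 18 or 37 GHz brightness temperature to an emissivity."""
        return tb_channel / t_surface

    t_s = surface_temperature(tb_6ghz=245.0, ice_concentration=0.8)
    print(round(t_s, 1), round(channel_emissivity(tb_channel=230.0, t_surface=t_s), 3))
    ```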

  5. Relativistic quantum corrections to laser wakefield acceleration.

    PubMed

    Zhu, Jun; Ji, Peiyong

    2010-03-01

    The influence of quantum effects on the interaction of intense laser fields with plasmas is investigated by using a hydrodynamic model based on the framework of relativistic quantum theory. Starting from the covariant Wigner function and the Dirac equation, the hydrodynamic equations for relativistic quantum plasmas are derived. Based on the relativistic quantum hydrodynamic equations and the Poisson equation, the perturbations of the electron number density and the electric field of the laser wakefield containing quantum effects are deduced. It is found that the corrections generated by the quantum effects to the perturbations of the electron number density and the accelerating field of the laser wakefield cannot be neglected. Quantum effects suppress laser wakefields, which is a classical manifestation of quantum decoherence effects; however, the contribution of quantum effects to the laser wakefield correction is partially counteracted by relativistic effects. The analysis also reveals that quantum effects enlarge the effective frequencies of plasmas, and that the quantum behavior appears as a screening effect for plasma electrons.

  6. Relativistic quantum corrections to laser wakefield acceleration

    SciTech Connect

    Zhu Jun; Ji Peiyong

    2010-03-15

    The influence of quantum effects on the interaction of intense laser fields with plasmas is investigated by using a hydrodynamic model based on the framework of relativistic quantum theory. Starting from the covariant Wigner function and the Dirac equation, the hydrodynamic equations for relativistic quantum plasmas are derived. Based on the relativistic quantum hydrodynamic equations and the Poisson equation, the perturbations of the electron number density and the electric field of the laser wakefield containing quantum effects are deduced. It is found that the corrections generated by the quantum effects to the perturbations of the electron number density and the accelerating field of the laser wakefield cannot be neglected. Quantum effects suppress laser wakefields, which is a classical manifestation of quantum decoherence effects; however, the contribution of quantum effects to the laser wakefield correction is partially counteracted by relativistic effects. The analysis also reveals that quantum effects enlarge the effective frequencies of plasmas, and that the quantum behavior appears as a screening effect for plasma electrons.

  7. Mechanisms of Plasma Therapeutics

    NASA Astrophysics Data System (ADS)

    Graves, David

    2015-09-01

    In this talk, I address research directed towards biomedical applications of atmospheric pressure plasma such as sterilization, surgery, wound healing and anti-cancer therapy. The field has seen remarkable growth in the last 3-5 years, but the mechanisms responsible for the biomedical effects have remained mysterious. It is known that plasmas readily create reactive oxygen species (ROS) and reactive nitrogen species (RNS). ROS and RNS (or RONS), in addition to a suite of other radical and non-radical reactive species, are essential actors in an important sub-field of aerobic biology termed "redox" (or oxidation-reduction) biology. It is postulated that cold atmospheric plasma (CAP) can trigger a therapeutic shielding response in tissue in part by creating a time- and space-localized, burst-like form of oxy-nitrosative stress on near-surface exposed cells through the flux of plasma-generated RONS. RONS-exposed surface layers of cells communicate to the deeper levels of tissue via a form of the "bystander effect," similar to responses to other forms of cell stress. In this proposed model of CAP therapeutics, the plasma stimulates a cellular survival mechanism through which aerobic organisms shield themselves from infection and other challenges.

  8. Therapeutic endoscopy in gastroenterology.

    PubMed

    Celiński, K; Cichoz-Lach, H

    2007-08-01

    The role of therapeutic endoscopy in current gastroenterology is very important. Therapeutic endoscopy is useful in the treatment of gastrointestinal bleeding. Endoscopic control of gastrointestinal bleeding includes the following haemostasis techniques: photocoagulation, electrocoagulation, thermocoagulation and injection methods. Owing to these procedures, mortality has significantly decreased. Endoscopic haemostasis eliminates the risk of surgery, is less expensive and is better tolerated by patients. Colonoscopic polypectomy is a widely used technique. By removal of polyps the incidence of colon cancer can be decreased. The "hot biopsy" forceps can be used to excise polyps of up to 6 mm. Larger polyps can be removed safely by snare electrocautery and retrieved for histologic study. Endoscopic retrograde cholangiopancreatography has a therapeutic application designed to cut the sphincter of Oddi fibers of the distal common bile duct, which is currently indicated in choledocholithiasis, papillary stenosis with ascending cholangitis, and acute gallstone pancreatitis. Endoscopic sphincterotomy is now an established procedure indicated in patients with common bile duct calculi. Endoscopic decompression of the biliary tree - dilatation of benign strictures with balloon catheters and placement of an internal endoprosthesis - allows nonoperative decompression and significant palliation for patients with obstructing tumors.

  9. Person-centered Therapeutics

    PubMed Central

    Cloninger, C. Robert; Cloninger, Kevin M.

    2015-01-01

    A clinician’s effectiveness in treatment depends substantially on his or her attitude toward -- and understanding of -- the patient as a person endowed with self-awareness and the will to direct his or her own future. The assessment of personality in the therapeutic encounter is a crucial foundation for forming an effective working alliance with shared goals. Helping a person to reflect on their personality provides a mirror image of their strengths and weaknesses in adapting to life’s many challenges. The Temperament and Character Inventory (TCI) provides an effective way to describe personality thoroughly and to predict both the positive and negative aspects of health. Strengths and weaknesses in TCI personality traits allow strong predictions of individual differences of all aspects of well-being. Diverse therapeutic techniques, such as diet, exercise, mood self-regulation, meditation, or acts of kindness, influence health and personality development in ways that are largely indistinguishable from one another or from effective allopathic treatments. Hence the development of well-being appears to be the result of activating a synergistic set of mechanisms of well-being, which are expressed as fuller functioning, plasticity, and virtue in adapting to life’s challenges PMID:26052429

  10. Engineering therapeutic protein disaggregases

    PubMed Central

    Shorter, James

    2016-01-01

    Therapeutic agents are urgently required to cure several common and fatal neurodegenerative disorders caused by protein misfolding and aggregation, including amyotrophic lateral sclerosis (ALS), Parkinson’s disease (PD), and Alzheimer’s disease (AD). Protein disaggregases that reverse protein misfolding and restore proteins to native structure, function, and localization could mitigate neurodegeneration by simultaneously reversing 1) any toxic gain of function of the misfolded form and 2) any loss of function due to misfolding. Potentiated variants of Hsp104, a hexameric AAA+ ATPase and protein disaggregase from yeast, have been engineered to robustly disaggregate misfolded proteins connected with ALS (e.g., TDP-43 and FUS) and PD (e.g., α-synuclein). However, Hsp104 has no metazoan homologue. Metazoa possess protein disaggregase systems distinct from Hsp104, including Hsp110, Hsp70, and Hsp40, as well as HtrA1, which might be harnessed to reverse deleterious protein misfolding. Nevertheless, vicissitudes of aging, environment, or genetics conspire to negate these disaggregase systems in neurodegenerative disease. Thus, engineering potentiated human protein disaggregases or isolating small-molecule enhancers of their activity could yield transformative therapeutics for ALS, PD, and AD. PMID:27255695

  11. Nitrones as Therapeutics

    PubMed Central

    Floyd, Robert A.; Kopke, Richard D.; Choi, Chul-Hee; Foster, Steven B.; Doblas, Sabrina; Towner, Rheal A.

    2008-01-01

    Nitrones have the general chemical formula X-CH=NO-Y. They were first used to trap free radicals in chemical systems and then subsequently in biochemical systems. More recently several nitrones including PBN (α-phenyl-tert-butylnitrone) have been shown to have potent biological activity in many experimental animal models. Many diseases of aging including stroke, cancer development, Parkinson’s disease and Alzheimer’s disease are known to have enhanced levels of free radicals and oxidative stress. Some derivatives of PBN are significantly more potent than PBN and have undergone extensive commercial development in stroke. Recent research has shown that PBN-related nitrones also have anti-cancer activity in several experimental cancer models and have potential as therapeutics in some cancers. Also in recent observations nitrones have been shown to act synergistically in combination with antioxidants in the prevention of acute acoustic noise induced hearing loss. The mechanistic basis of the potent biological activity of PBN-related nitrones is not known. Even though PBN-related nitrones do decrease oxidative stress and oxidative damage, their potent biological anti-inflammatory activity and their ability to alter cellular signaling processes can not readily be explained by conventional notions of free radical trapping biochemistry. This review is focused on our observations and others where the use of selected nitrones as novel therapeutics have been evaluated in experimental models in the context of free radical biochemical and cellular processes considered important in pathologic conditions and age-related diseases. PMID:18793715

  12. The impact of a model-based clinical regional registry for attention-deficit hyperactivity disorder.

    PubMed

    Zanetti, Michele; Cartabia, Massimo; Didoni, Anna; Fortinguerra, Filomena; Reale, Laura; Mondini, Matteo; Bonati, Maurizio

    2016-03-17

    This article describes the development and clinical impact of the Italian Regional ADHD Registry, launched by the Italian Lombardy Region in June 2011 and aimed at collecting and monitoring the diagnostic and therapeutic pathways of care for children and adolescents with attention-deficit hyperactivity disorder. In particular, the model-based software used to run the registry and to manage clinical care data acquisition and monitoring is described. This software was developed, using the PROSAFE programme that already supports data collection in many Italian intensive care units, as a stand-alone case report form interface. The use of the ADHD regional registry led to an increase in the appropriateness of the clinical management of all patients included in the registry, proving to be an important instrument for ensuring an appropriate healthcare strategy for children and adolescents with attention-deficit/hyperactivity disorder.

  13. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    NASA Technical Reports Server (NTRS)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  14. Model-based hierarchical reinforcement learning and human action control.

    PubMed

    Botvinick, Matthew; Weinstein, Ari

    2014-11-05

    Recent work has reawakened interest in goal-directed or 'model-based' choice, where decisions are based on prospective evaluation of potential action outcomes. Concurrently, there has been growing attention to the role of hierarchy in decision-making and action control. We focus here on the intersection between these two areas of interest, considering the topic of hierarchical model-based control. To characterize this form of action control, we draw on the computational framework of hierarchical reinforcement learning, using this to interpret recent empirical findings. The resulting picture reveals how hierarchical model-based mechanisms might play a special and pivotal role in human decision-making, dramatically extending the scope and complexity of human behaviour.

  15. Combination therapeutics in complex diseases.

    PubMed

    He, Bing; Lu, Cheng; Zheng, Guang; He, Xiaojuan; Wang, Maolin; Chen, Gao; Zhang, Ge; Lu, Aiping

    2016-12-01

    The biological redundancies in molecular networks of complex diseases limit the efficacy of many single drug therapies. Combination therapeutics, as a common therapeutic method, involve pharmacological intervention using several drugs that interact with multiple targets in the molecular networks of diseases and may achieve better efficacy and/or less toxicity than monotherapy in practice. The development of combination therapeutics is complicated by several critical issues, including identifying multiple targets, targeting strategies and the drug combination. This review summarizes the current achievements in combination therapeutics, with a particular emphasis on the efforts to develop combination therapeutics for complex diseases.

  16. Management of antipsychotic treatment discontinuation and interruptions using model-based simulations

    PubMed Central

    Samtani, Mahesh N; Sheehan, John J; Fu, Dong-Jing; Remmerie, Bart; Sliwa, Jennifer Kern; Alphs, Larry

    2012-01-01

    Background: Medication nonadherence is a well described and prevalent clinical occurrence in schizophrenia. These pharmacokinetic model-based simulations analyze predicted antipsychotic plasma concentrations in nonadherence and treatment interruption scenarios and with treatment reinitiation. Methods: Starting from steady state, pharmacokinetic model-based simulations of active moiety plasma concentrations of oral, immediate-release risperidone 3 mg/day, risperidone long-acting injection 37.5 mg/14 days, oral paliperidone extended-release 6 mg/day, and paliperidone palmitate 117 mg (75 mg equivalents)/28 days were assessed under three treatment discontinuation/interruption scenarios, i.e., complete discontinuation, one week of interruption, and four weeks of interruption. In the treatment interruption scenarios, pharmacokinetic simulations were performed using medication-specific reinitiation strategies. Results: Following complete treatment discontinuation, plasma concentrations persisted longest with paliperidone palmitate, followed by risperidone long-acting injection, while oral formulations exhibited the most rapid decrease. One week of oral paliperidone or risperidone interruption resulted in near complete elimination from the systemic circulation within that timeframe, reflecting the rapid elimination rate of the active moiety. After 1 and 4 weeks of interruption, minimum plasma concentrations were higher with paliperidone palmitate than risperidone long-acting injection over the simulated period. Four weeks of treatment interruption followed by reinitiation resulted in plasma levels returning to predicted therapeutic levels within 1 week. Conclusion: Due to the long half-life of paliperidone palmitate (25–49 days), putative therapeutic plasma concentrations persisted longest in simulated cases of complete discontinuation or treatment interruption. These simulations may help clinicians better conceptualize the impact of antipsychotic nonadherence on plasma
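
    A minimal sketch of the kind of simulation described: mono-exponential decline of the active moiety after abrupt discontinuation, contrasted for a short half-life oral formulation and a long-acting depot. The steady-state level and half-lives are illustrative assumptions, not the published model parameters:

    ```python
    import numpy as np

    def concentration_after_stop(c_steady, half_life_days, t_days):
        """Mono-exponential decline from steady state after the last dose."""
        k = np.log(2) / half_life_days
        return c_steady * np.exp(-k * t_days)

    t = np.arange(0, 57, 7.0)  # eight weeks in weekly steps
    oral = concentration_after_stop(c_steady=40.0, half_life_days=1.0, t_days=t)
    depot = concentration_after_stop(c_steady=40.0, half_life_days=35.0, t_days=t)
    for day, c_oral, c_depot in zip(t, oral, depot):
        print(f"day {day:4.0f}: oral {c_oral:10.4f} ng/mL, depot {c_depot:6.1f} ng/mL")
    ```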

  17. Extending model-based diagnosis for analog thermodynamical devices

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas; Chien, Steve; Robertson, Charles

    1993-01-01

    The increasing complexity of process control applications has posed difficult problems in fault detection, isolation, and recovery. Deep knowledge-based approaches, such as model-based diagnosis, have offered some promise in addressing these problems. However, the difficulty of adapting these techniques to situations involving numerical reasoning and noise has limited their applicability. This paper describes an extension of classical model-based diagnosis techniques to deal with sparse data, noise, and complex noninvertible numerical models. These diagnosis techniques are being applied to the External Active Thermal Control System for Space Station Freedom.

  18. A Model Based Mars Climate Database for the Mission Design

    NASA Technical Reports Server (NTRS)

    2005-01-01

    A viewgraph presentation on a model based climate database is shown. The topics include: 1) Why a model based climate database?; 2) Mars Climate Database v3.1: Who uses it? (approx. 60 users!); 3) The new Mars Climate Database MCD v4.0; 4) MCD v4.0: what's new?; 5) Simulation of water ice clouds; 6) Simulation of the water ice cycle; 7) A new tool for surface pressure prediction; 8) Access to the database MCD 4.0; 9) How to access the database; and 10) New web access.

  19. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  20. Cascaded process model based control: packed absorption column application.

    PubMed

    Govindarajan, Anand; Jayaraman, Suresh Kumar; Sethuraman, Vijayalakshmi; Raul, Pramod R; Rhinehart, R Russell

    2014-03-01

    Nonlinear, adaptive, process-model based control is demonstrated in a cascaded single-input-single-output mode for pressure drop control in a pilot-scale packed absorption column. The process is shown to be nonlinear. Control is demonstrated in both servo and regulatory modes, for no wind-up in a constrained situation, and for bumpless transfer. Model adaptation is demonstrated and shown to provide process insight. The application procedure is revealed as a design guide to aid others in implementing process-model based control.
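
    A minimal sketch of a process-model based control step in the spirit of the abstract: an assumed first-order process model is inverted for the manipulated variable that drives the output along a reference trajectory toward the setpoint, with a crude gain adaptation from the observed mismatch. The model structure and tuning are assumptions, not the authors' absorption-column model:

    ```python
    class FirstOrderPMBC:
        """Assumed process model dy/dt = (K*u - y)/tau, inverted for the control input u."""

        def __init__(self, gain, time_constant, closed_loop_tau):
            self.K, self.tau, self.tau_c = gain, time_constant, closed_loop_tau

        def control(self, setpoint, measurement):
            # Desired rate along a first-order reference trajectory, then invert the model.
            desired_rate = (setpoint - measurement) / self.tau_c
            return (self.tau * desired_rate + measurement) / self.K

        def adapt_gain(self, u, measurement, observed_rate, step=0.1):
            # Nudge the model gain so the predicted rate matches the observed rate.
            predicted_rate = (self.K * u - measurement) / self.tau
            self.K += step * (observed_rate - predicted_rate) * self.tau / max(abs(u), 1e-6)

    controller = FirstOrderPMBC(gain=2.0, time_constant=5.0, closed_loop_tau=2.0)
    print(controller.control(setpoint=10.0, measurement=6.0))
    ```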

  1. Paediatric models in motion: requirements for model-based decision support at the bedside

    PubMed Central

    Barrett, Jeffrey S

    2015-01-01

    Optimal paediatric pharmacotherapy is reliant on a detailed understanding of the individual patient including their developmental status and disease state as well as the pharmaceutical agents he/she is receiving for treatment or management of side effects. Our appreciation for size and maturation effects on the pharmacokinetic/pharmacodynamic (PK/PD) phenomenon has improved to the point that we can develop predictive models that permit us to individualize therapy, especially in the situation where we are monitoring drug effects or therapeutic concentrations. The growth of efforts to guide paediatric pharmacotherapy via model-based decision support necessitates a coordinated and systematic approach to ensuring reliable and robust output to caregivers that represents the current standard of care and adheres to governance imposed by the host institution or coalition responsible. Model-based systems which guide caregivers on dosing paediatric patients in a more comprehensive manner are in development at several institutions. Care must be taken that these systems provide robust guidance with the current best practice. These systems must evolve as new information becomes available and ultimately are best constructed from diverse data representing global input on demographics, ethnic / racial diversity, diet and other lifestyle factors. Multidisciplinary involvement at the project team level is key to the ultimate clinical valuation. Likewise, early engagement of clinical champions is also critical for the success of model-based tools. Adherence to regulatory requirements as well as best practices with respect to software development and testing are essential if these tools are to be used as part of the routine standard of care. PMID:24251868

  2. In silico model-based inference: a contemporary approach for hypothesis testing in network biology.

    PubMed

    Klinke, David J

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics.
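
    To make the suggested combination of Bayesian statistics and chemical kinetics concrete, the sketch below computes a grid-approximated posterior for the rate constant of a first-order decay model from noisy concentration data. All numbers (true rate, noise level, prior range) are illustrative assumptions; the abstract itself does not specify a worked example.

      import numpy as np

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 10.0, 15)
      k_true, c0, sigma = 0.35, 1.0, 0.03
      data = c0 * np.exp(-k_true * t) + sigma * rng.normal(size=t.size)  # simulated measurements

      # Grid-approximated posterior over the rate constant k, flat prior on [0, 1].
      k_grid = np.linspace(0.0, 1.0, 2001)
      pred = c0 * np.exp(-np.outer(k_grid, t))              # model prediction for every candidate k
      log_like = -0.5 * np.sum((data - pred) ** 2, axis=1) / sigma**2
      post = np.exp(log_like - log_like.max())
      post /= post.sum()                                    # normalize over the grid
      k_mean = float(np.sum(k_grid * post))
      print("posterior mean of k:", round(k_mean, 3), "(true value 0.35)")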

  3. In silico model-based inference: a contemporary approach for hypothesis testing in network biology

    PubMed Central

    Klinke, David J.

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900’s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. PMID:25139179

  4. Improved model-based infrared reflectrometry for measuring deep trench structures.

    PubMed

    Zhang, Chuanwei; Liu, Shiyuan; Shi, Tielin; Tang, Zirong

    2009-11-01

    Model-based infrared reflectometry (MBIR) has been introduced recently for characterization of high-aspect-ratio deep trench structures in microelectronics. The success of this technique relies heavily on accurate modeling of trench structures and fast extraction of trench parameters. In this paper, we propose a modeling method named corrected effective medium approximation (CEMA) for accurate and fast reflectivity calculation of deep trench structures. We also develop a method combining an artificial neural network (ANN) and a Levenberg-Marquardt (LM) algorithm for robust and fast extraction of geometric parameters from the measured reflectance spectrum. The simulation and experimental work conducted on typical deep trench structures has verified the proposed methods and demonstrated that the improved MBIR metrology achieves highly accurate measurement results as well as fast computation speed.
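
    The parameter-extraction step described here is a standard nonlinear fit: an ANN supplies an initial guess, and a Levenberg-Marquardt refinement matches the forward model to the measured spectrum. The sketch below shows only that refinement step, with a toy two-parameter reflectance function standing in for the paper's CEMA forward model; the spectral axis, noise level, and initial guess are all assumed for illustration.

      import numpy as np
      from scipy.optimize import least_squares

      def toy_reflectance(params, k):
          # Placeholder stand-in for the real forward model R(depth, width; k);
          # the paper's CEMA calculation would go here.
          depth, width = params
          return np.exp(-depth * k) * (1.0 - width * k)

      def residuals(params, k, measured):
          return toy_reflectance(params, k) - measured

      k = np.linspace(0.05, 0.5, 200)                       # illustrative spectral axis
      rng = np.random.default_rng(0)
      measured = toy_reflectance([3.0, 0.8], k) + 0.01 * rng.normal(size=k.size)

      x0 = [2.5, 1.0]                                       # e.g. an ANN-provided starting point
      fit = least_squares(residuals, x0, args=(k, measured), method="lm")
      print("estimated depth and width:", fit.x)            # roughly recovers [3.0, 0.8]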

  5. [Research on point cloud smoothing in knee joint prosthesis modeling based on reverse engineering].

    PubMed

    Zhang, Guoliang; Yao, Jin; Wei, Xing; Pei, Fuxing; Zhou, Zongke

    2008-10-01

    At present, foreign standard knee joint prostheses are mostly used in clinical practice; they represent the biological characteristics of the human knee joint well. This paper therefore adopts reverse engineering technology, presents a novel positioning method for acquiring point data on the surface of a knee joint prosthesis, and proposes a three-point angle algorithm for removing noise errors, together with a correction based on the least squares plane, to smooth the point cloud. A prosthesis surface with improved accuracy and smoothness can then be generated, and from it the knee joint prosthesis model, providing a basis for the localization of knee joint prostheses. The new algorithm is mainly intended for surface modeling based on point cloud smoothing, including the surface of a knee joint prosthesis, surfaces of regular shape, and surfaces with gently changing curvature.
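
    One common way to realize the least-squares-plane smoothing mentioned above is to fit a plane to each point's local neighborhood and project the point onto it. The sketch below does exactly that with an SVD plane fit; the neighborhood size, noise level, and test point are illustrative assumptions rather than details taken from the paper.

      import numpy as np

      def project_to_ls_plane(neighborhood, point):
          # Fit a least-squares plane to an (N x 3) neighborhood and project
          # `point` onto it -- a simple smoothing step for a noisy point cloud.
          centroid = neighborhood.mean(axis=0)
          _, _, vt = np.linalg.svd(neighborhood - centroid)
          normal = vt[-1]                     # smallest singular direction = plane normal
          return point - np.dot(point - centroid, normal) * normal

      rng = np.random.default_rng(1)
      patch = np.c_[rng.uniform(0, 1, 30), rng.uniform(0, 1, 30),
                    0.02 * rng.normal(size=30)]            # a roughly planar local patch
      noisy_point = np.array([0.5, 0.5, 0.10])
      print(project_to_ls_plane(patch, noisy_point))        # z-coordinate pulled toward the plane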

  6. An Integrated Framework for Model-Based Distributed Diagnosis and Prognosis

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Daigle, Matthew J.; Roychoudhury, Indranil

    2012-01-01

    Diagnosis and prognosis are necessary tasks for system reconfiguration and fault-adaptive control in complex systems. Diagnosis consists of detection, isolation and identification of faults, while prognosis consists of prediction of the remaining useful life of systems. This paper presents a novel integrated framework for model-based distributed diagnosis and prognosis, where system decomposition is used to enable the diagnosis and prognosis tasks to be performed in a distributed way. We show how different submodels can be automatically constructed to solve the local diagnosis and prognosis problems. We illustrate our approach using a simulated four-wheeled rover for different fault scenarios. Our experiments show that our approach correctly performs distributed fault diagnosis and prognosis in an efficient and robust manner.

  7. Model-based near-wall reconstructions for immersed-boundary methods

    NASA Astrophysics Data System (ADS)

    Posa, Antonio; Balaras, Elias

    2014-08-01

    In immersed-boundary methods, the cost of resolving the thin boundary layers on a solid boundary at high Reynolds numbers is prohibitive. In the present work, we propose a new model-based, near-wall reconstruction to account for the lack of resolution and provide the correct wall shear stress and hydrodynamic forces. The models are analytical formulations of a generalized version of the two-layer model developed by Balaras et al. (AIAA J 34:1111-1119, 1996) for large-eddy simulations. We will present the results for the flow around a cylinder and a sphere, where we use Cartesian and cylindrical coordinate grids. We will demonstrate that the proposed treatment reproduces the wall stress very accurately on grids that are one order of magnitude coarser than those used in well-resolved simulations.

  8. Antibody Engineering and Therapeutics

    PubMed Central

    Almagro, Juan Carlos; Gilliland, Gary L; Breden, Felix; Scott, Jamie K; Sok, Devin; Pauthner, Matthias; Reichert, Janice M; Helguera, Gustavo; Andrabi, Raiees; Mabry, Robert; Bléry, Mathieu; Voss, James E; Laurén, Juha; Abuqayyas, Lubna; Barghorn, Stefan; Ben-Jacob, Eshel; Crowe, James E; Huston, James S; Johnston, Stephen Albert; Krauland, Eric; Lund-Johansen, Fridtjof; Marasco, Wayne A; Parren, Paul WHI; Xu, Kai Y

    2014-01-01

    The 24th Antibody Engineering & Therapeutics meeting brought together a broad range of participants who were updated on the latest advances in antibody research and development. Organized by IBC Life Sciences, the gathering is the annual meeting of The Antibody Society, which serves as the scientific sponsor. Preconference workshops on 3D modeling and delineation of clonal lineages were featured, and the conference included sessions on a wide variety of topics relevant to researchers, including systems biology; antibody deep sequencing and repertoires; the effects of antibody gene variation and usage on antibody response; directed evolution; knowledge-based design; antibodies in a complex environment; polyreactive antibodies and polyspecificity; the interface between antibody therapy and cellular immunity in cancer; antibodies in cardiometabolic medicine; antibody pharmacokinetics, distribution and off-target toxicity; optimizing antibody formats for immunotherapy; polyclonals, oligoclonals and bispecifics; antibody discovery platforms; and antibody-drug conjugates. PMID:24589717

  9. Outpatient therapeutic nuclear oncology.

    PubMed

    Turner, J Harvey

    2012-05-01

    In the beginning, nuclear medicine was radionuclide therapy, which has evolved into molecular tumour-targeted control of metastatic cancer. Safe, efficacious, clinical practice of therapeutic nuclear oncology may now be based upon accurate personalised dosimetry by quantitative gamma SPECT/CT imaging to prescribe tumoricidal activities without critical organ toxicity. Preferred therapy radionuclides possess gamma emission of modest energy and abundance to enable quantitative SPECT/CT imaging for calculation of the beta therapy dosimetry, without radiation exposure risk to hospital personnel, carers, family or members of the public. The safety of outpatient radiopharmaceutical therapy of cancer with Iodine-131, Samarium-153, Holmium-166, Rhenium-186, Rhenium-188, Lutetium-177 and Indium-111 is reviewed. Measured activity release rates and radiation exposure to carers and the public are all within recommendations and guidelines of international regulatory agencies and, when permitted by local regulatory authorities, allow cost-effective, safe, outpatient radionuclide therapy of cancer without isolation in hospital.

  10. Mitochondrial Energetics and Therapeutics

    PubMed Central

    Wallace, Douglas C.; Fan, Weiwei; Procaccio, Vincent

    2011-01-01

    Mitochondrial dysfunction has been linked to a wide range of degenerative and metabolic diseases, cancer, and aging. All these clinical manifestations arise from the central role of bioenergetics in cell biology. Although genetic therapies are maturing as the rules of bioenergetic genetics are clarified, metabolic therapies have been ineffectual. This failure results from our limited appreciation of the role of bioenergetics as the interface between the environment and the cell. A systems approach, which, ironically, was first successfully applied over 80 years ago with the introduction of the ketogenic diet, is required. Analysis of the many ways that a shift from carbohydrate glycolytic metabolism to fatty acid and ketone oxidative metabolism may modulate metabolism, signal transduction pathways, and the epigenome gives us an appreciation of the ketogenic diet and the potential for bioenergetic therapeutics. PMID:20078222

  11. Antimicrobial peptides: therapeutic potentials.

    PubMed

    Kang, Su-Jin; Park, Sung Jean; Mishig-Ochir, Tsogbadrakh; Lee, Bong-Jin

    2014-12-01

    The increasing appearance of multidrug-resistant pathogens has created an urgent need for suitable alternatives to current antibiotics. Antimicrobial peptides (AMPs), which act as defensive weapons against microbes, have received great attention because of broad-spectrum activities, unique action mechanisms and rare antibiotic-resistant variants. Despite desirable characteristics, they have shown limitations in pharmaceutical development due to toxicity, stability and manufacturing costs. Because of these drawbacks, only a few AMPs have been tested in Phase III clinical trials and no AMPs have been approved by the US FDA yet. However, these obstacles could be overcome by well-known methods such as changing physicochemical characteristics and introducing nonnatural amino acids, acetylation or amidation, as well as modern techniques like molecular targeted AMPs, liposomal formulations and drug delivery systems. Thus, the current challenge in this field is to develop therapeutic AMPs at a reasonable cost as well as to overcome the limitations.

  12. Aptamers in Therapeutics

    PubMed Central

    2016-01-01

    Aptamers are single-stranded DNA or RNA molecules selected by an iterative process known as Systematic Evolution of Ligands by Exponential Enrichment (SELEX). Advantages such as high temperature stability, animal-free and cost-effective production, and high affinity and selectivity for their targets make aptamers attractive alternatives to monoclonal antibodies for diagnostic and therapeutic purposes. An aptamer has been generated against vascular endothelial growth factor 165, which is involved in age-related macular degeneration; Macugen was the first FDA-approved aptamer-based drug to be commercialized. Other aptamers have since been developed against blood clotting proteins, cancer proteins, immunoglobulin E, agents involved in diabetic nephropathy, autoantibodies involved in autoimmune disorders, etc. Aptamers have also been developed against viruses and could work with other antiviral agents in treating infections. PMID:27504277

  13. Microfabricated therapeutic actuators

    DOEpatents

    Lee, Abraham P.; Northrup, M. Allen; Ciarlo, Dino R.; Krulevitch, Peter A.; Benett, William J.

    1999-01-01

    Microfabricated therapeutic actuators are fabricated using a shape memory polymer (SMP), a polyurethane-based material that undergoes a phase transformation at a specified temperature (Tg). At a temperature above Tg, the material is soft and can be easily reshaped into another configuration. As the temperature is lowered below Tg, the new shape is fixed and locked in as long as the material stays below Tg. Upon reheating the material to a temperature above Tg, the material will return to its original shape. By the use of such SMP material, SMP microtubing can be used as a release actuator for the delivery of embolic coils through catheters into aneurysms, for example. The microtubing can be manufactured in various sizes, and the phase change temperature Tg can be chosen for an intended temperature target and intended use.

  14. Microfabricated therapeutic actuators

    DOEpatents

    Lee, A.P.; Northrup, M.A.; Ciarlo, D.R.; Krulevitch, P.A.; Benett, W.J.

    1999-06-15

    Microfabricated therapeutic actuators are fabricated using a shape memory polymer (SMP), a polyurethane-based material that undergoes a phase transformation at a specified temperature (Tg). At a temperature above Tg, the material is soft and can be easily reshaped into another configuration. As the temperature is lowered below Tg, the new shape is fixed and locked in as long as the material stays below Tg. Upon reheating the material to a temperature above Tg, the material will return to its original shape. By the use of such SMP material, SMP microtubing can be used as a release actuator for the delivery of embolic coils through catheters into aneurysms, for example. The microtubing can be manufactured in various sizes, and the phase change temperature Tg can be chosen for an intended temperature target and intended use. 8 figs.

  15. Race-based therapeutics.

    PubMed

    Yancy, Clyde W

    2008-08-01

    The issue of race in medicine is problematic. Race is not a physiologic grouping, and all persons of a given race do not necessarily share the same clinical phenotype or genetic substrate. Despite clear signals that certain risk factors and diseases vary as a function of race, translating those differences into race-based therapeutics has been awkward and has done little to change the natural history of cardiovascular disease as it affects special populations. Among the varied special populations, the African American population appears to have the most significant and adverse variances for cardiovascular disease as well as worrisome signals that drug responsiveness varies. Recent guideline statements have now acknowledged certain treatment options that are most appropriate for African Americans with cardiovascular disease, especially hypertension and heart failure. As more physiologic markers of disease and drug responsiveness become available, the need for racial designations in medicine may lessen, and therapies can be optimized for all patients without regard to race or ethnicity.

  16. A panic attack in therapeutic recreation over being considered therapeutic.

    PubMed

    Lee, L L

    1987-01-01

    Ancillary professions have been called upon to account for therapeutic benefits from their services or be eliminated from the health care system. A singular focus on therapy, however, would negate the unique contribution of therapeutic recreation within, while simultaneously restricting services to health care settings. It is proposed that panic over therapeutic recreation services meeting health care goals has hindered evaluation and solidification of the leisure-based philosophy presented in the NTRS Philosophical Position Statement (NTRS, 1982). It is argued that emphasizing the leisure orientation of the philosophical position statement can secure therapeutic recreation's position within, yet, not deny services to those outside of the health care system. An overview is presented on the adequacy of the position statement philosophy for therapeutic recreation. A potential danger of attempting to explain therapeutic recreation in terms of non-leisure based philosophies is also discussed.

  17. Thermodynamics of Error Correction

    NASA Astrophysics Data System (ADS)

    Sartori, Pablo; Pigolotti, Simone

    2015-10-01

    Information processing at the molecular scale is limited by thermal fluctuations. This can cause undesired consequences in copying information since thermal noise can lead to errors that can compromise the functionality of the copy. For example, a high error rate during DNA duplication can lead to cell death. Given the importance of accurate copying at the molecular scale, it is fundamental to understand its thermodynamic features. In this paper, we derive a universal expression for the copy error as a function of entropy production and work dissipated by the system during wrong incorporations. Its derivation is based on the second law of thermodynamics; hence, its validity is independent of the details of the molecular machinery, be it any polymerase or artificial copying device. Using this expression, we find that information can be copied in three different regimes. In two of them, work is dissipated to either increase or decrease the error. In the third regime, the protocol extracts work while correcting errors, reminiscent of a Maxwell demon. As a case study, we apply our framework to study a copy protocol assisted by kinetic proofreading, and show that it can operate in any of these three regimes. We finally show that, for any effective proofreading scheme, error reduction is limited by the chemical driving of the proofreading reaction.

  18. Individualized correction of insulin measurement in hemolyzed serum samples.

    PubMed

    Wu, Zhi-Qi; Lu, Ju; Chen, Huanhuan; Chen, Wensen; Xu, Hua-Guo

    2016-11-05

    Insulin measurement plays a key role in the investigation of patients with hypoglycemia, subtype classification of diabetes mellitus, insulin resistance, and impaired beta cell function. However, even slight hemolysis can negatively affect insulin measurement due to RBC insulin-degrading enzyme (IDE). Here, we derived and validated an individualized correction equation in an attempt to eliminate the effects of hemolysis on insulin measurement. The effects of hemolysis on insulin measurement were studied by adding lysed self-RBCs to serum. A correction equation was derived, accounting for both percentage and exposure time of hemolysis. The performance of this individualized correction was evaluated in intentionally hemolyzed samples. Insulin concentration decreased with increasing percentage and exposure time of hemolysis. Based on the effects of hemolysis on insulin measurement of 17 donors (baseline insulin concentrations ranged from 156 to 2119 pmol/L), the individualized hemolysis correction equation was derived: INScorr = INSmeas/(0.705lgHbplasma/Hbserum - 0.001Time - 0.612). This equation can revert insulin concentrations of the intentionally hemolyzed samples to values that were statistically not different from the corresponding insulin baseline concentrations (p = 0.1564). Hemolysis could lead to a negative interference on insulin measurement; by individualized hemolysis correction equation for insulin measurement, we can correct and report reliable serum insulin results for a wide range of degrees of sample hemolysis. This correction would increase diagnostic accuracy, reduce inappropriate therapeutic decisions, and improve patient satisfaction with care.

  19. Full-Chip Layout Optimization for Process Margin Enhancement Using Model-Based Hotspot Fixing System

    NASA Astrophysics Data System (ADS)

    Kobayashi, Sachiko; Kyoh, Suigen; Kotani, Toshiya; Takekawa, Yoko; Inoue, Soichi; Nakamae, Koji

    2010-06-01

    As the design rule of integrated circuits is shrinking rapidly, it is necessary to use low-k1 lithography technologies. With low-k1 lithography, even if aggressive optical proximity correction is adopted, many sites become marginless spots, known as “hotspots”. For this problem, hotspot fixer (HSF) in design-for-manufacturability flow has been studied. In our previous work, we indicated the feasibility of layout modification using a simple line/space sizing rule for metal layers in 65-nm-node logic devices. However, in view of the continuous design-rule shrinkage and design complication, a more flexible modification method has become necessary to fix various types of hotspots. In this work, we have developed a brute-force model-based HSF. To further reduce the processing time, the hybrid flow of rule- and model-based HSFs is studied. The feasibility of such hybrid flow is studied by applying it to the full-chip layout modification of a logic test chip.

  20. Hierarchical searching in model-based LADAR ATR using statistical separability tests

    NASA Astrophysics Data System (ADS)

    DelMarco, Stephen; Sobel, Erik; Douglas, Joel

    2006-05-01

    In this work we investigate simultaneous object identification improvement and efficient library search for model-based object recognition applications. We develop an algorithm to provide efficient, prioritized, hierarchical searching of the object model database. A common approach to model-based object recognition chooses the object label corresponding to the best match score. However, due to corrupting effects the best match score does not always correspond to the correct object model. To address this problem, we propose a search strategy which exploits information contained in a number of representative elements of the library to drill down to a small class with high probability of containing the object. We first optimally partition the library into a hierarchic taxonomy of disjoint classes. A small number of representative elements are used to characterize each object model class. At each hierarchy level, the observed object is matched against the representative elements of each class to generate score sets. A hypothesis testing problem, using a distribution-free statistical test, is defined on the score sets and used to choose the appropriate class for a prioritized search. We conduct a probabilistic analysis of the computational cost savings, and provide a formula measuring the computational advantage of the proposed approach. We generate numerical results using match scores derived from matching highly-detailed CAD models of civilian ground vehicles used in 3-D LADAR ATR. We present numerical results showing effects on classification performance of significance level and representative element number in the score set hypothesis testing problem.

  1. Model-based coding of facial images based on facial muscle motion through isodensity maps

    NASA Astrophysics Data System (ADS)

    So, Ikken; Nakamura, Osamu; Minami, Toshi

    1991-11-01

    A model-based coding system has come under serious consideration for the next generation of image coding schemes, aimed at greater efficiency in TV telephone and TV conference systems. In this model-based coding system, the sender's model image is transmitted and stored at the receiving side before the start of the conversation. During the conversation, feature points are extracted from the facial image of the sender and are transmitted to the receiver. The facial expression of the sender is reconstructed from the feature points received and a wireframed model constructed at the receiving side. However, the conventional methods have the following problems: (1) Extreme changes of the gray level, such as in wrinkles caused by change of expression, cannot be reconstructed at the receiving side. (2) Extraction of stable feature points from facial images with irregular features such as spectacles or facial hair is very difficult. To cope with the first problem, a new algorithm based on isodensity lines which can represent detailed changes in expression by density correction has already been proposed and good results obtained. As for the second problem, we propose in this paper a new algorithm to reconstruct facial images by transmitting other feature points extracted from isodensity maps.

  2. StarPlan: A model-based diagnostic system for spacecraft

    NASA Technical Reports Server (NTRS)

    Heher, Dennis; Pownall, Paul

    1990-01-01

    The Sunnyvale Division of Ford Aerospace created a model-based reasoning capability for diagnosing faults in space systems. The approach employs reasoning about a model of the domain (as it is designed to operate) to explain differences between expected and actual telemetry; i.e., to identify the root cause of the discrepancy (at an appropriate level of detail) and determine necessary corrective action. A development environment, named Paragon, was implemented to support both model-building and reasoning. The major benefit of the model-based approach is the capability for the intelligent system to handle faults that were not anticipated by a human expert. The feasibility of this approach for diagnosing problems in a spacecraft was demonstrated in a prototype system, named StarPlan. Reasoning modules within StarPlan detect anomalous telemetry, establish goals for returning the telemetry to nominal values, and create a command plan for attaining the goals. Before commands are implemented, their effects are simulated to assure convergence toward the goal. After the commands are issued, the telemetry is monitored to assure that the plan is successful. These features of StarPlan, along with associated concerns, issues and future directions, are discussed.

  3. GOLD: Integration of model-based control systems with artificial intelligence and workstations

    SciTech Connect

    Lee, M.; Clearwater, S.

    1987-08-01

    Our experience with model based accelerator control started at SPEAR. Since that time nearly all accelerator beam lines have been controlled using model-based application programs, for example, PEP and SLC at SLAC. In order to take advantage of state-of-the-art hardware and software technology, the design and implementation of the accelerator control programs have undergone radical change with time. Consequently, SPEAR, PEP, and SLC all use different control programs. Since many of these application programs are imbedded deep into the control system, they had to be rewritten each time. Each time this rewriting has occurred a great deal of time and effort has been spent on training physicists and programmers to do the job. Now, we have developed these application programs for a fourth time. This time, however, the programs we are developing are generic so that we will not have to do it again. We have developed an integrated system called GOLD (Generic Orbit and Lattice Debugger) for debugging and correcting trajectory errors in accelerator lattices. The system consists of a lattice modeling program (COMFORT), a beam simulator (PLUS), a graphical workstation environment (micro-VAX) and an expert system (ABLE). This paper will describe some of the features and applications of our integrated system with emphasis on the automation offered by expert systems. 5 refs.

  4. GOLD: Integration of model-based control systems with artificial intelligence and workstations

    SciTech Connect

    Lee, M.; Clearwater, S.

    1987-08-01

    Our experience with model-based accelerator control started at SPEAR. Since that time nearly all accelerator beamlines have been controlled using model-based application programs, for example, PEP and SLC at SLAC. In order to take advantage of state-of-the-art hardware and software technology, the design and implementation of the accelerator control programs have undergone radical changes with time. Consequently, SPEAR, PEP and SLC all use different control programs. Since many of these application programs are embedded deep into the control system, they had to be rewritten each time. Each time this rewriting has occurred a great deal of time and effort has been spent on training physicists and programmers to do the job. Now, we have developed an integrated system called GOLD (Generic Orbit and Lattice Debugger) for debugging and correcting trajectory errors in accelerator lattices. The system consists of a lattice modeling program (COMFORT), a beam simulator (PLUS), a graphical workstation environment (micro-VAX) and an expert system (ABLE). This paper will describe some of the features and applications of our integrated system with emphasis on the automation offered by expert systems. 5 refs.

  5. Adaptive model-based control systems and methods for controlling a gas turbine

    NASA Technical Reports Server (NTRS)

    Brunell, Brent Jerome (Inventor); Mathews, Jr., Harry Kirk (Inventor); Kumar, Aditya (Inventor)

    2004-01-01

    Adaptive model-based control systems and methods are described so that performance and/or operability of a gas turbine in an aircraft engine, power plant, marine propulsion, or industrial application can be optimized under normal, deteriorated, faulted, failed and/or damaged operation. First, a model of each relevant system or component is created, and the model is adapted to the engine. Then, if/when deterioration, a fault, a failure or some kind of damage to an engine component or system is detected, that information is input to the model-based control as changes to the model, constraints, objective function, or other control parameters. With all the information about the engine condition and state, and directives on the control goals in terms of an objective function and constraints, the control then solves an optimization so that the optimal control action can be determined and taken. This model and control may be updated in real-time to account for engine-to-engine variation, deterioration, damage, faults and/or failures using optimal corrective control action command(s).
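
    The core loop sketched in this abstract is a constrained optimization over candidate control actions using the adapted engine model. The snippet below illustrates that pattern with a one-variable toy surrogate model (fuel flow in, predicted thrust and turbine temperature out); the model equations, the thrust target, and the temperature limit are all invented placeholders, not values from the patent.

      import numpy as np
      from scipy.optimize import minimize

      def predict(u):
          # Toy surrogate of the adapted engine model: fuel flow u -> (thrust, temperature).
          thrust = 120.0 * u - 8.0 * u ** 2
          temperature = 900.0 + 450.0 * u
          return thrust, temperature

      THRUST_TARGET = 300.0          # control goal (objective function)
      TEMP_LIMIT = 1400.0            # constraint, e.g. a turbine temperature limit

      def objective(x):
          thrust, _ = predict(x[0])
          return (thrust - THRUST_TARGET) ** 2

      constraints = [{"type": "ineq", "fun": lambda x: TEMP_LIMIT - predict(x[0])[1]}]
      result = minimize(objective, x0=[1.0], bounds=[(0.0, 3.0)], constraints=constraints)
      print("optimal fuel-flow command:", result.x[0])   # settles on the temperature constraint here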

  6. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  7. Models Based Practices in Physical Education: A Sociocritical Reflection

    ERIC Educational Resources Information Center

    Landi, Dillon; Fitzpatrick, Katie; McGlashan, Hayley

    2016-01-01

    In this paper, we reflect on models-based practices in physical education using a sociocritical lens. Drawing links between neoliberal moves in education, and critical approaches to the body and physicality, we take a view that models are useful tools that are worth integrating into physical education, but we are apprehensive to suggest they…

  8. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    SciTech Connect

    Noonan, Nicholas James

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  9. A new distance measure for model-based sequence clustering.

    PubMed

    García-García, Darío; Parrado Hernández, Emilio; Díaz-de María, Fernando

    2009-07-01

    We review the existing alternatives for defining model-based distances for clustering sequences and propose a new one based on the Kullback-Leibler divergence. This distance is shown to be especially useful in combination with spectral clustering. For improved performance in real-world scenarios, a model selection scheme is also proposed.
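
    The abstract does not give the exact definition, so the sketch below only illustrates the general recipe: fit a simple probabilistic model to each sequence, use a symmetrized Kullback-Leibler divergence between the fitted models as the pairwise distance, and hand the resulting affinity matrix to spectral clustering. The per-sequence Gaussian models and the exponential affinity are assumptions for the example, not the paper's construction.

      import numpy as np
      from sklearn.cluster import SpectralClustering

      def kl_gauss(mu_p, var_p, mu_q, var_q):
          # KL divergence between two 1-D Gaussians, KL(p || q).
          return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

      def model_distance(seq_a, seq_b):
          # Symmetrized KL between simple Gaussian models fitted to each sequence.
          pa = (seq_a.mean(), seq_a.var() + 1e-9)
          pb = (seq_b.mean(), seq_b.var() + 1e-9)
          return kl_gauss(*pa, *pb) + kl_gauss(*pb, *pa)

      rng = np.random.default_rng(0)
      sequences = [rng.normal(0, 1, 100) for _ in range(10)] + \
                  [rng.normal(5, 1, 100) for _ in range(10)]

      n = len(sequences)
      dist = np.array([[model_distance(sequences[i], sequences[j]) for j in range(n)] for i in range(n)])
      affinity = np.exp(-dist)                  # turn distances into similarities
      labels = SpectralClustering(n_clusters=2, affinity="precomputed", random_state=0).fit_predict(affinity)
      print(labels)                             # recovers the two groups of sequences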

  10. Expediting model-based optoacoustic reconstructions with tomographic symmetries

    SciTech Connect

    Lutzweiler, Christian; Deán-Ben, Xosé Luís; Razansky, Daniel

    2014-01-15

    Purpose: Image quantification in optoacoustic tomography implies the use of accurate forward models of excitation, propagation, and detection of optoacoustic signals while inversions with high spatial resolution usually involve very large matrices, leading to unreasonably long computation times. The development of fast and memory efficient model-based approaches represents then an important challenge to advance on the quantitative and dynamic imaging capabilities of tomographic optoacoustic imaging. Methods: Herein, a method for simplification and acceleration of model-based inversions, relying on inherent symmetries present in common tomographic acquisition geometries, has been introduced. The method is showcased for the case of cylindrical symmetries by using polar image discretization of the time-domain optoacoustic forward model combined with efficient storage and inversion strategies. Results: The suggested methodology is shown to render fast and accurate model-based inversions in both numerical simulations andpost mortem small animal experiments. In case of a full-view detection scheme, the memory requirements are reduced by one order of magnitude while high-resolution reconstructions are achieved at video rate. Conclusions: By considering the rotational symmetry present in many tomographic optoacoustic imaging systems, the proposed methodology allows exploiting the advantages of model-based algorithms with feasible computational requirements and fast reconstruction times, so that its convenience and general applicability in optoacoustic imaging systems with tomographic symmetries is anticipated.

  11. Impact of Model-Based Teaching on Argumentation Skills

    ERIC Educational Resources Information Center

    Ogan-Bekiroglu, Feral; Belek, Deniz Eren

    2014-01-01

    The purpose of this study was to examine effects of model-based teaching on students' argumentation skills. Experimental design guided to the research. The participants of the study were pre-service physics teachers. The argumentative intervention lasted seven weeks. Data for this research were collected via video recordings and written arguments.…

  12. Wind field model-based estimation of Seasat scatterometer winds

    NASA Technical Reports Server (NTRS)

    Long, David G.

    1993-01-01

    A model-based approach to estimating near-surface wind fields over the ocean from Seasat scatterometer (SASS) measurements is presented. The approach is a direct assimilation technique in which wind field model parameters are estimated directly from the scatterometer measurements of the radar backscatter of the ocean's surface using maximum likelihood principles. The wind field estimate is then computed from the estimated model parameters. The wind field model used in this approach is based on geostrophic approximation and on simplistic assumptions about the wind field vorticity and divergence but includes ageostrophic winds. Nine days of SASS data were processed to obtain unique wind estimates. Comparisons in performance to the traditional two-step (point-wise wind retrieval followed by ambiguity removal) wind estimate method and the model-based method are provided using both simulated radar backscatter measurements and actual SASS measurements. In the latter case the results are compared to wind fields determined using subjective ambiguity removal. While the traditional approach results in missing measurements and reduced effective swath width due to fore/aft beam cell coregistration problems, the model-based approach uses all available measurements to increase the effective swath width and to reduce data gaps. The results reveal that the model-based wind estimates have accuracy comparable to traditionally estimated winds with less 'noise' in the directional estimates, particularly at low wind speeds.

  13. Gravitational correction to vacuum polarization

    NASA Astrophysics Data System (ADS)

    Jentschura, U. D.

    2015-02-01

    We consider the gravitational correction to (electronic) vacuum polarization in the presence of a gravitational background field. The Dirac propagators for the virtual fermions are modified to include the leading gravitational correction (potential term) which corresponds to a coordinate-dependent fermion mass. The mass term is assumed to be uniform over a length scale commensurate with the virtual electron-positron pair. The on-mass shell renormalization condition ensures that the gravitational correction vanishes on the mass shell of the photon, i.e., the speed of light is unaffected by the quantum field theoretical loop correction, in full agreement with the equivalence principle. Nontrivial corrections are obtained for off-shell, virtual photons. We compare our findings to other works on generalized Lorentz transformations and combined quantum-electrodynamic gravitational corrections to the speed of light which have recently appeared in the literature.

  14. When Does Model-Based Control Pay Off?

    PubMed Central

    2016-01-01

    Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to “model-free” and “model-based” strategies in reinforcement learning. Model-free strategies are computationally cheap, but sometimes inaccurate, because action values can be accessed by inspecting a look-up table constructed through trial-and-error. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, does not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand. PMID:27564094
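
    The distinction summarized here can be made concrete on a toy Markov decision process: a model-based agent plans with known transition probabilities and rewards, while a model-free agent fills in a look-up table of action values from sampled experience alone. The two-state example below illustrates that contrast; it is not the two-step task used in this literature, and all numbers are invented.

      import numpy as np

      # Tiny 2-state, 2-action MDP with known transitions and rewards (illustrative only).
      P = np.array([[[0.7, 0.3], [0.3, 0.7]],      # P[s, a, s']
                    [[0.9, 0.1], [0.2, 0.8]]])
      R = np.array([[1.0, 0.0], [0.0, 2.0]])        # R[s, a]
      gamma = 0.9

      # Model-based: plan in the causal model (value iteration).
      V = np.zeros(2)
      for _ in range(200):
          V = np.max(R + gamma * (P @ V), axis=1)
      Q_model_based = R + gamma * (P @ V)

      # Model-free: Q-learning from sampled experience only (look-up table).
      rng = np.random.default_rng(0)
      Q = np.zeros((2, 2))
      s = 0
      for _ in range(20000):
          a = rng.integers(2) if rng.random() < 0.1 else int(np.argmax(Q[s]))
          s_next = rng.choice(2, p=P[s, a])
          Q[s, a] += 0.05 * (R[s, a] + gamma * np.max(Q[s_next]) - Q[s, a])
          s = s_next
      print("model-based Q:\n", np.round(Q_model_based, 2))
      print("model-free  Q:\n", np.round(Q, 2))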

  15. Processor register error correction management

    SciTech Connect

    Bose, Pradip; Cher, Chen-Yong; Gupta, Meeta S.

    2016-12-27

    Processor register protection management is disclosed. In embodiments, a method of processor register protection management can include determining a sensitive logical register for executable code generated by a compiler, generating an error-correction table identifying the sensitive logical register, and storing the error-correction table in a memory accessible by a processor. The processor can be configured to generate a duplicate register of the sensitive logical register identified by the error-correction table.

  16. [Nuclear transfer and therapeutic cloning].

    PubMed

    Xu, Xiao-Ming; Lei, An-Min; Hua, Jin-Lian; Dou, Zhong-Ying

    2005-03-01

    Nuclear transfer and therapeutic cloning have widespread and attractive prospects in animal agriculture and biomedical applications. We review evidence that the quality of oocytes and the nuclear reprogramming of somatic donor cells are the main reasons for the common abnormalities in cloned animals and for the low efficiency of cloning, and we outline the problems and prospects of therapeutic cloning, in which basic problems of nuclear transfer still limit clinical application. Research on the isolation and culture of nuclear transfer embryonic stem (ntES) cells, and on the directed differentiation of ntES cells into important functional cell types, should be emphasized and could improve efficiency. Adult stem cells may help to treat some serious diseases, but they cannot replace therapeutic cloning. Ethical concerns have also impeded the development of therapeutic cloning. Many techniques must be improved and basic research strengthened before somatic nuclear transfer and therapeutic cloning can be applied to agricultural reproduction and better benefit human life.

  17. In Situ Mosaic Brightness Correction

    NASA Technical Reports Server (NTRS)

    Deen, Robert G.; Lorre, Jean J.

    2012-01-01

    In situ missions typically have pointable, mast-mounted cameras, which are capable of taking panoramic mosaics comprised of many individual frames. These frames are mosaicked together. While the mosaic software applies radiometric correction to the images, in many cases brightness/contrast seams still exist between frames. This is largely due to errors in the radiometric correction, and the absence of correction for photometric effects in the mosaic processing chain. The software analyzes the overlaps between adjacent frames in the mosaic and determines correction factors for each image in an attempt to reduce or eliminate these brightness seams.
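
    A simple way to realize the kind of per-image correction factors described here is to require that overlapping frames agree in their shared regions and to solve for one multiplicative gain per frame in a least-squares sense, working in the log domain and pinning one frame to remove the overall scale. The sketch below illustrates that idea with made-up overlap statistics; it is not the algorithm used by the mosaic software.

      import numpy as np

      def estimate_gains(overlap_means):
          # `overlap_means` maps a frame pair (i, j) to the mean brightness each frame
          # observes in their shared overlap. We solve log g_i - log g_j = log(mean_j / mean_i)
          # in a least-squares sense, anchoring frame 0 at gain 1.
          n = 1 + max(max(pair) for pair in overlap_means)
          rows, rhs = [], []
          for (i, j), (mean_i, mean_j) in overlap_means.items():
              row = np.zeros(n)
              row[i], row[j] = 1.0, -1.0
              rows.append(row)
              rhs.append(np.log(mean_j / mean_i))
          rows.append(np.eye(n)[0])      # anchor: log g_0 = 0
          rhs.append(0.0)
          log_g, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
          return np.exp(log_g)

      # Frames 0-2 in a row; frame 1 is ~20% too bright, frame 2 ~10% too dark.
      overlaps = {(0, 1): (100.0, 120.0), (1, 2): (120.0, 90.0)}
      print(estimate_gains(overlaps))    # approximately [1.00, 0.83, 1.11]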

  18. Therapeutic use of nicergoline.

    PubMed

    Winblad, Bengt; Fioravanti, Mario; Dolezal, Tomas; Logina, Inara; Milanov, Ivan Gospodinov; Popescu, Dinu Cristian; Solomon, Alina

    2008-01-01

    The ergot alkaloid derivative nicergoline became clinically available about 35 years ago in the 1970s. Nicergoline has a broad spectrum of action: (i) as an alpha(1)-adrenoceptor antagonist, it induces vasodilation and increases arterial blood flow; (ii) it enhances cholinergic and catecholaminergic neurotransmitter function; (iii) it inhibits platelet aggregation; (iv) it promotes metabolic activity, resulting in increased utilization of oxygen and glucose; and (v) it has neurotrophic and antioxidant properties. Acting on several basic pathophysiological mechanisms, nicergoline has therapeutic potential in a number of disorders. This article provides an overview of the published clinical evidence relating to the efficacy and safety of nicergoline (30 mg twice daily) in the treatment of dementia (including Alzheimer's disease and vascular dementia) and vascular and balance disorders. For dementia of different aetiologies, the therapeutic benefit of nicergoline has been established, with up to 89% of patients showing improvements in cognition and behaviour. After as little as 2 months of treatment, symptom improvement is apparent compared with placebo, and most patients are still improved or stable after 12 months. Concomitant neurophysiological changes in the brain indicate (after only 4-8 weeks' treatment) improved vigilance and information processing. In patients with balance disorders, mean improvements of 44-78% in symptom severity and quality of life have been observed with nicergoline. Although clinical experience with nicergoline in vascular disorders is limited to relatively short-term, small-scale studies, it has been successfully used in rehabilitation therapy of patients with chronic ischaemic stroke. Open-label evaluations suggest that nicergoline may also be valuable in glaucoma, depression and peripheral arteriopathy. Adverse events of nicergoline, if any, are related to the central nervous system, the metabolic system and the overall body. Most are

  19. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation, specifically to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification and design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, which imposes a high learning curve for specification languages and their associated tools, while schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulating correctness properties for system models can be a difficult problem. This is relevant to NASA because it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technology transfer potential, and next steps.
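
    As a purely illustrative example of the target notation (not a property from the presentation), a requirement such as "every command received from the ground is eventually acknowledged" and a safety requirement such as "the thruster never fires while the instrument cover is open" could be rendered in LTL as:

      \[ \mathbf{G}\,(\mathit{cmdReceived} \rightarrow \mathbf{F}\,\mathit{cmdAcknowledged}) \]
      \[ \mathbf{G}\,\lnot(\mathit{thrusterFiring} \land \mathit{coverOpen}) \]

    Here G ("globally") means the formula holds at every point of an execution and F ("finally") means it holds at some future point; the proposition names are hypothetical.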

  20. Non-linear control logics for vibrations suppression: a comparison between model-based and non-model-based techniques

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Orsini, Lorenzo; Resta, Ferruccio

    2015-04-01

    Non-linear behavior is present in many mechanical systems under normal operating conditions. In these cases, common engineering practice is to linearize the equation of motion around a particular operating point and to design a linear controller. The main disadvantage is that the stability properties and validity of the controller are only local. To improve controller performance, non-linear control techniques represent a very attractive solution for many smart structures. The aim of this paper is to compare non-linear model-based and non-model-based control techniques. In particular, the model-based sliding mode control (SMC) technique is considered because of its easy implementation and the strong robustness of the controller even under heavy model uncertainties. Among the non-model-based techniques, fuzzy control (FC), which allows the controller to be designed according to if-then rules, is considered; it defines the controller without a reference model of the system, offering advantages such as intrinsic robustness. These techniques have been tested on a nonlinear pendulum system.
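
    For readers unfamiliar with SMC, the sketch below applies a textbook sliding mode law (a feedback-linearizing term plus a switching term smoothed with tanh to limit chattering) to a simple damped pendulum and regulates it to the hanging position. The pendulum parameters, gains, and boundary-layer width are illustrative assumptions; this is not the controller designed in the paper.

      import numpy as np

      # Damped pendulum: theta'' = -(g/l)*sin(theta) - c*theta' + u  (illustrative parameters)
      g, l, c = 9.81, 1.0, 0.1
      dt, k_smc, lam = 0.001, 15.0, 5.0
      theta, omega = 1.0, 0.0          # initial angle [rad] and angular rate
      theta_ref = 0.0                  # regulate to the hanging position

      for _ in range(5000):
          e, e_dot = theta - theta_ref, omega
          s = e_dot + lam * e                                  # sliding surface
          u = (g / l) * np.sin(theta) + c * omega - lam * e_dot - k_smc * np.tanh(s / 0.05)
          omega += dt * (-(g / l) * np.sin(theta) - c * omega + u)
          theta += dt * omega
      print("final angle [rad]:", round(theta, 4))             # close to 0 once the surface is reached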

  1. New orbit correction method uniting global and local orbit corrections

    NASA Astrophysics Data System (ADS)

    Nakamura, N.; Takaki, H.; Sakai, H.; Satoh, M.; Harada, K.; Kamiya, Y.

    2006-01-01

    A new orbit correction method, called the eigenvector method with constraints (EVC), is proposed and formulated to unite global and local orbit corrections for ring accelerators, especially synchrotron radiation (SR) sources. The EVC can exactly correct the beam positions at arbitrarily selected ring positions such as light source points, simultaneously reducing closed orbit distortion (COD) around the whole ring. Computer simulations clearly demonstrate these features of the EVC for both cases of the Super-SOR light source and the Advanced Light Source (ALS) that have typical structures of high-brilliance SR sources. In addition, the effects of errors in beam position monitor (BPM) reading and steering magnet setting on the orbit correction are analytically expressed and also compared with the computer simulations. Simulation results show that the EVC is very effective and useful for orbit correction and beam position stabilization in SR sources.
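
    The combination of exact correction at selected points with global COD reduction can be written as an equality-constrained least-squares problem; one generic way to solve it is through the bordered (KKT) normal-equation system shown below. The response matrix, BPM readings, and choice of constrained monitors are random toy data, and the sketch is meant only to illustrate the structure of such a correction, not the authors' EVC algorithm.

      import numpy as np

      def corrector_strengths(R, x, R_c, x_c):
          # Minimize ||R @ theta + x|| over all BPMs subject to exact cancellation
          # R_c @ theta + x_c = 0 at selected points (e.g., light source points).
          n = R.shape[1]
          m = R_c.shape[0]
          kkt = np.block([[R.T @ R, R_c.T],
                          [R_c, np.zeros((m, m))]])
          rhs = np.concatenate([-R.T @ x, -x_c])
          return np.linalg.solve(kkt, rhs)[:n]

      rng = np.random.default_rng(2)
      R = rng.normal(size=(40, 8))       # response matrix: 40 BPMs x 8 steering magnets (toy numbers)
      x = rng.normal(size=40)            # measured closed-orbit distortion
      R_c, x_c = R[:2], x[:2]            # the first two BPMs must be corrected exactly
      theta = corrector_strengths(R, x, R_c, x_c)
      print("residual at constrained BPMs:", np.round(R_c @ theta + x_c, 12))
      print("rms orbit after correction:", np.std(R @ theta + x))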

  2. Therapeutic cloning: The ethical limits

    SciTech Connect

    Whittaker, Peter A. . E-mail: p.whittaker@lancaster.ac.uk

    2005-09-01

    A brief outline of stem cells, stem cell therapy and therapeutic cloning is given. The position of therapeutic cloning with regard to other embryonic manipulations - IVF-based reproduction, embryonic stem cell formation from IVF embryos and reproductive cloning - is indicated. The main ethically challenging stages in therapeutic cloning are considered to be the nuclear transfer process (including the source of eggs for it) and the destruction of an embryo to provide stem cells for therapeutic use. The extremely polarised nature of the debate regarding the status of an early human embryo is noted, and some potential alternative strategies for preparing immunocompatible pluripotent stem cells are indicated.

  3. Clinical applications of therapeutic phlebotomy

    PubMed Central

    Kim, Kyung Hee; Oh, Ki Young

    2016-01-01

    Phlebotomy is the removal of blood from the body, and therapeutic phlebotomy is the preferred treatment for blood disorders in which the removal of red blood cells or serum iron is the most efficient method for managing the symptoms and complications. Therapeutic phlebotomy is currently indicated for the treatment of hemochromatosis, polycythemia vera, porphyria cutanea tarda, sickle cell disease, and nonalcoholic fatty liver disease with hyperferritinemia. This review discusses therapeutic phlebotomy and the related disorders and also offers guidelines for establishing a therapeutic phlebotomy program. PMID:27486346

  4. Designing phage therapeutics.

    PubMed

    Goodridge, Lawrence D

    2010-01-01

    Phage therapy is the application of phages to bodies, substances, or environments to effect the biocontrol of pathogenic or nuisance bacteria. To be effective, phages, minimally, must be capable of attaching to bacteria (adsorption), killing those bacteria (usually associated with phage infection), and otherwise surviving (resisting decay) until they achieve attachment and subsequent killing. While a strength of phage therapy is that phages that possess appropriate properties can be chosen from a large diversity of naturally occurring phages, a more rational approach to phage therapy also can include post-isolation manipulation of phages genetically, phenotypically, or in terms of combining different products into a single formulation. Genetic manipulation, especially in these modern times, can involve genetic engineering, though a more traditional approach involves the selection of spontaneously occurring phage mutants during serial transfer protocols. While genetic modification typically is done to give rise to phenotypic changes in phages, phage phenotype alone can also be modified in vitro, prior to phage application for therapeutic purposes, as for the sake of improving phage lethality (such as by linking phage virions to antibacterial chemicals such as chloramphenicol) or survival capabilities (e.g., via virion PEGylation). Finally, phages, both naturally occurring isolates or otherwise modified constructs, can be combined into cocktails which provide collectively enhanced capabilities such as expanded overall host range. Generally these strategies represent different routes towards improving phage therapy formulations and thereby efficacy through informed design.

  5. Plasmids encoding therapeutic agents

    DOEpatents

    Keener, William K.

    2007-08-07

    Plasmids encoding anti-HIV and anti-anthrax therapeutic agents are disclosed. Plasmid pWKK-500 encodes a fusion protein containing DP178 as a targeting moiety, the ricin A chain, an HIV protease cleavable linker, and a truncated ricin B chain. N-terminal extensions of the fusion protein include the maltose binding protein and a Factor Xa protease site. C-terminal extensions include a hydrophobic linker, an L domain motif peptide, a KDEL ER retention signal, another Factor Xa protease site, an out-of-frame buforin II coding sequence, the lacZ.alpha. peptide, and a polyhistidine tag. More than twenty derivatives of plasmid pWKK-500 are described. Plasmids pWKK-700 and pWKK-800 are similar to pWKK-500 wherein the DP178-encoding sequence is substituted by RANTES- and SDF-1-encoding sequences, respectively. Plasmid pWKK-900 is similar to pWKK-500 wherein the HIV protease cleavable linker is substituted by a lethal factor (LF) peptide-cleavable linker.

  6. Therapeutic antibody technology 97.

    PubMed

    Larrick, J W; Gavilondo, J

    1998-01-01

    Almost 200 antibody aficionados attended the Therapeutic Antibody Technology 97 meeting, held September 21-24, 1997 at the Holiday Inn, Union Square in the heart of San Francisco, CA. The meeting was sponsored by the Palo Alto Institute of Molecular Medicine and organized by James W. Larrick (PAIMM) and Dennis R. Burton (Scripps Research Institute). The meeting featured excellent discussions on many interesting talks and a number of poster presentations. It is likely that another meeting will be organized in 2 years, however in the meantime, an effort is underway to organize a 'Virtual Antibody Society' to be set up on the web server at Scripps Research Institute in La Jolla, CA (Questions and comments on this project can be sent to: Jwlarrick@aol.com or Burton@scripps.edu). Richard Lerner (Scripps) gave the keynote address on 'Catalytic Antibodies', describing recent work with Carlos Barbas on so-called reactive immunization to generate a high activity aldolase catalytic antibody. This antibody, soon to be described in an article in Science, is the first commercially available catalytic antibody.

  7. Leech Therapeutic Applications

    PubMed Central

    Abdualkader, A. M.; Ghawi, A. M.; Alaama, M.; Awang, M.; Merzouk, A.

    2013-01-01

    Hematophagous animals including leeches have been known to possess biologically active compounds in their secretions, especially in their saliva. The blood-sucking annelids, leeches, have been used for therapeutic purposes since the beginning of civilization. Ancient Egyptian, Indian, Greek and Arab physicians used leeches for a wide range of diseases starting from the conventional use for bleeding to systemic ailments, such as skin diseases, nervous system abnormalities, urinary and reproductive system problems, inflammation, and dental problems. Recently, extensive research on leech saliva unveiled the presence of a variety of bioactive peptides and proteins, including antithrombin (hirudin, bufrudin), antiplatelet (calin, saratin), factor Xa inhibitors (lefaxin), antibacterial (theromacin, theromyzin) and others. Consequently, the leech has made a comeback as a new remedy for many chronic and life-threatening abnormalities, such as cardiovascular problems, cancer, metastasis, and infectious diseases. In the 20th century, leech therapy established itself in plastic and microsurgery as a protective tool against venous congestion and served to salvage replanted digits and flaps. Many clinics for plastic surgery all over the world started to use leeches for cosmetic purposes. Despite the efficacious properties of leech therapy, the safety and complications of leeching are still controversial. PMID:24019559

  8. Phytonutrients as therapeutic agents.

    PubMed

    Gupta, Charu; Prakash, Dhan

    2014-09-01

    Nutrients present in various foods play an important role in maintaining the normal functions of the human body. The major nutrients present in foods include carbohydrates, proteins, lipids, vitamins, and minerals. Besides these, there are some bioactive food components known as "phytonutrients" that play an important role in human health. They have a tremendous impact on the health care system and may provide medical health benefits including the prevention and/or treatment of disease and various physiological disorders. Phytonutrients play a positive role by maintaining and modulating immune function to prevent specific diseases. Being natural products, they hold great promise in clinical therapy as they possess none of the side effects that are usually associated with chemotherapy or radiotherapy. They are also comparatively cheap and thus significantly reduce health care costs. Phytonutrients are the plant nutrients with specific biological activities that support human health. Some of the important bioactive phytonutrients include polyphenols, terpenoids, resveratrol, flavonoids, isoflavonoids, carotenoids, limonoids, glucosinolates, phytoestrogens, phytosterols, anthocyanins, ω-3 fatty acids, and probiotics. They exert specific pharmacological effects in human health, such as anti-microbial, anti-oxidant, anti-inflammatory, anti-allergic, anti-spasmodic, anti-cancer, anti-aging, hepatoprotective, hypolipidemic, neuroprotective, hypotensive, anti-diabetic, anti-osteoporotic, CNS-stimulant, analgesic, UVB-protective (against UVB-induced carcinogenesis), immuno-modulatory, and carminative effects. This mini-review attempts to summarize the major important types of phytonutrients and their role in promoting human health and as therapeutic agents, along with the current market trend and commercialization.

  9. Correcting Slightly Less Simple Movements

    ERIC Educational Resources Information Center

    Aivar, M. P.; Brenner, E.; Smeets, J. B. J.

    2005-01-01

    Many studies have analysed how goal directed movements are corrected in response to changes in the properties of the target. However, only simple movements to single targets have been used in those studies, so little is known about movement corrections under more complex situations. Evidence from studies that ask for movements to several targets…

  10. 75 FR 70951 - Notice, Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-19

    ... From the Federal Register Online via the Government Publishing Office NATIONAL COUNCIL ON DISABILITY (NCD) Sunshine Act Meetings Notice, Correction Type: Quarterly Meeting. Summary: NCD published a...., Suite 850, Washington, DC 20004; 202-272-2004 (voice), 202-272-2074 TTY; 202-272-2022 Fax. Correction...

  11. Error Correction, Revision, and Learning

    ERIC Educational Resources Information Center

    Truscott, John; Hsu, Angela Yi-ping

    2008-01-01

    Previous research has shown that corrective feedback on an assignment helps learners reduce their errors on that assignment during the revision process. Does this finding constitute evidence that learning resulted from the feedback? Differing answers play an important role in the ongoing debate over the effectiveness of error correction,…

  12. Feature Referenced Error Correction Apparatus.

    DTIC Science & Technology

    A feature referenced error correction apparatus utilizing the multiple images of the interstage level image format to compensate for positional...images and by the generation of an error correction signal in response to the sub-frame registration errors. (Author)

  13. Diamagnetic Corrections and Pascal's Constants

    ERIC Educational Resources Information Center

    Bain, Gordon A.; Berry, John F.

    2008-01-01

    Measured magnetic susceptibilities of paramagnetic substances must typically be corrected for their underlying diamagnetism. This correction is often accomplished by using tabulated values for the diamagnetism of atoms, ions, or whole molecules. These tabulated values can be problematic since many sources contain incomplete and conflicting data.…

  14. Barometric and Earth Tide Correction

    SciTech Connect

    Toll, Nathaniel J.

    2005-11-10

    BETCO corrects for barometric and earth tide effects in long-term water level records. A regression deconvolution method is used to solve a series of linear equations to determine an impulse response function for the well pressure head. Using the response function, a pressure head correction is calculated and applied.
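
    As an illustration of the regression deconvolution idea (not the BETCO code itself), the following Python sketch estimates an impulse response function from lagged barometric-pressure changes by ordinary least squares and removes the predicted response from the head record; the lag length, variable layout, and function name are assumptions, and the earth-tide terms handled by BETCO are omitted.

        import numpy as np

        def barometric_correction(head, baro, n_lags=24):
            """Remove the barometric response from a water-level (head) record.

            head, baro : equally sampled numpy arrays; n_lags : assumed length
            of the impulse response function in samples.
            """
            dh = np.diff(head)                      # changes in pressure head
            db = np.diff(baro)                      # changes in barometric pressure
            n = len(db) - n_lags
            # Lag matrix: each row holds the current and previous pressure changes.
            X = np.column_stack([db[i:i + n] for i in range(n_lags, 0, -1)])
            y = dh[n_lags:n_lags + n]
            # Least-squares solution of the linear system gives the impulse response.
            irf, *_ = np.linalg.lstsq(X, y, rcond=None)
            # Predicted barometric contribution to head changes, accumulated to head.
            predicted = np.convolve(db, irf, mode="full")[:len(dh)]
            corrected = np.asarray(head, dtype=float).copy()
            corrected[1:] -= np.cumsum(predicted)
            return corrected, irf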

  15. Corrections Education Evaluation System Model.

    ERIC Educational Resources Information Center

    Nelson, Orville; And Others

    The purpose of this project was to develop an evaluation system for the competency-based vocational program developed by Wisconsin's Division of Corrections, Department of Public Instruction (DPI), and the Vocational, Technical, and Adult Education System (VTAE). Site visits were conducted at five correctional institutions in March and April of…

  16. Potential therapeutic interventions for fragile X syndrome

    PubMed Central

    Levenga, Josien; de Vrij, Femke M.S.; Oostra, Ben A.; Willemsen, Rob

    2010-01-01

    Fragile X syndrome (FXS) is caused by a lack of the fragile X mental retardation protein (FMRP); FMRP deficiency in neurons of patients with FXS causes intellectual disability (IQ<70) and several behavioural problems, including hyperactivity and autistic-like features. In the brain, no gross morphological malformations have been found, although subtle spine abnormalities have been reported. FXS has been linked to altered group I metabotropic glutamate receptor (mGluR)-dependent and independent forms of synaptic plasticity. Here, we discuss potential targeted therapeutic strategies developed to specifically correct disturbances in the excitatory mGluR and the inhibitory gamma-aminobutyric acid (GABA) receptor pathways that have been tested in animal models and/or in clinical trials with patients with FXS. PMID:20864408

  17. When not to trust therapeutic drug monitoring

    PubMed Central

    Westergreen-Thorne, Mathew; Lee, Sook Yan; Shah, Nilesh; Dodd, Alan

    2016-01-01

    Therapeutic drug monitoring (TDM) is the measurement of serum or plasma drug concentration to allow the individualization of dosing. We describe the case of a patient who was prescribed inappropriately large doses of vancomycin due to inaccurate TDM. Specifically, our laboratory reported progressively lower vancomycin concentrations despite dose increases. Eventually, when duplicate samples were sent to a different laboratory vancomycin concentrations were found to be in the toxic range. We hypothesize this was due to the patient generating immunoglobulin antibodies against her infection that interfered with the original TDM immunoassay. Immunogenic TDM interference has been known to rarely occur in patients with immune related comorbidities; however, if we are correct, this is a unique case as this patient did not have such a background. This case illustrates the importance of using clinical judgement when interpreting TDM as, in this case, substantial harm to the patient was likely only narrowly avoided. PMID:27606069

  18. 3-D model-based tracking for UAV indoor localization.

    PubMed

    Teulière, Céline; Marchand, Eric; Eck, Laurent

    2015-05-01

    This paper proposes a novel model-based tracking approach for 3-D localization. One main difficulty of standard model-based approach lies in the presence of low-level ambiguities between different edges. In this paper, given a 3-D model of the edges of the environment, we derive a multiple hypotheses tracker which retrieves the potential poses of the camera from the observations in the image. We also show how these candidate poses can be integrated into a particle filtering framework to guide the particle set toward the peaks of the distribution. Motivated by the UAV indoor localization problem where GPS signal is not available, we validate the algorithm on real image sequences from UAV flights.

  19. Model-based hierarchical reinforcement learning and human action control

    PubMed Central

    Botvinick, Matthew; Weinstein, Ari

    2014-01-01

    Recent work has reawakened interest in goal-directed or ‘model-based’ choice, where decisions are based on prospective evaluation of potential action outcomes. Concurrently, there has been growing attention to the role of hierarchy in decision-making and action control. We focus here on the intersection between these two areas of interest, considering the topic of hierarchical model-based control. To characterize this form of action control, we draw on the computational framework of hierarchical reinforcement learning, using this to interpret recent empirical findings. The resulting picture reveals how hierarchical model-based mechanisms might play a special and pivotal role in human decision-making, dramatically extending the scope and complexity of human behaviour. PMID:25267822

  20. Model-based reinforcement learning with dimension reduction.

    PubMed

    Tangkaratt, Voot; Morimoto, Jun; Sugiyama, Masashi

    2016-12-01

    The goal of reinforcement learning is to learn an optimal policy which controls an agent to acquire the maximum cumulative reward. The model-based reinforcement learning approach learns a transition model of the environment from data, and then derives the optimal policy using the transition model. However, learning an accurate transition model in high-dimensional environments requires a large amount of data which is difficult to obtain. To overcome this difficulty, in this paper, we propose to combine model-based reinforcement learning with the recently developed least-squares conditional entropy (LSCE) method, which simultaneously performs transition model estimation and dimension reduction. We also further extend the proposed method to imitation learning scenarios. The experimental results show that policy search combined with LSCE performs well for high-dimensional control tasks including real humanoid robot control.
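
    The general loop described above, learn a transition model from data and then derive a policy from it, can be illustrated with the Python sketch below. Plain regularized linear regression stands in for the LSCE estimator (which additionally performs dimension reduction), and the random-shooting planner, reward function, and bounds are illustrative assumptions rather than the authors' method.

        import numpy as np

        rng = np.random.default_rng(0)

        def fit_transition_model(S, A, S_next, reg=1e-3):
            # Linear stand-in for the learned transition model: s' ≈ [s, a, 1] @ W.
            X = np.hstack([S, A, np.ones((len(S), 1))])
            W = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ S_next)
            return W

        def predict(W, s, a):
            return np.hstack([s, a, 1.0]) @ W

        def plan_action(W, s, reward_fn, horizon=5, n_candidates=64, action_dim=1):
            # Random-shooting planner: roll candidate action sequences through the
            # learned model and return the first action of the best sequence.
            best_a, best_ret = None, -np.inf
            for _ in range(n_candidates):
                seq = rng.uniform(-1.0, 1.0, size=(horizon, action_dim))
                s_sim, ret = s, 0.0
                for a in seq:
                    s_sim = predict(W, s_sim, a)
                    ret += reward_fn(s_sim, a)
                if ret > best_ret:
                    best_ret, best_a = ret, seq[0]
            return best_a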

  1. Outlier Identification in Model-Based Cluster Analysis

    PubMed Central

    Evans, Katie; Love, Tanzy; Thurston, Sally W.

    2015-01-01

    In model-based clustering based on normal-mixture models, a few outlying observations can influence the cluster structure and number. This paper develops a method to identify these; however, it does not attempt to identify clusters amidst a large field of noisy observations. We identify outliers as those observations in a cluster with minimal membership proportion or for which the cluster-specific variance with and without the observation is very different. Results from a simulation study demonstrate the ability of our method to detect true outliers without falsely identifying many non-outliers, and improved performance over other approaches under most scenarios. We use the contributed R package MCLUST for model-based clustering, but propose a modified prior for the cluster-specific variance which avoids degeneracies in estimation procedures. We also compare results from our outlier method to published results on National Hockey League data. PMID:26806993
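
    A loose Python analogue of the "minimal membership proportion" criterion described above might look as follows; the paper itself works in R with MCLUST and additionally uses a leave-one-out variance criterion, and the number of components and threshold here are assumptions.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def flag_low_membership_outliers(X, n_components=3, threshold=0.6, seed=0):
            """Flag observations whose largest posterior membership probability is low.

            Only the minimal-membership criterion is mirrored here; the
            variance-with/without-observation criterion is omitted.
            """
            gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(X)
            resp = gmm.predict_proba(X)              # posterior membership probabilities
            max_membership = resp.max(axis=1)
            outliers = np.where(max_membership < threshold)[0]
            return outliers, max_membership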

  2. Model Based Document and Report Generation for Systems Engineering

    NASA Technical Reports Server (NTRS)

    Delp, Christopher; Lam, Doris; Fosse, Elyse; Lee, Cin-Young

    2013-01-01

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  3. Fuzzy model-based observers for fault detection in CSTR.

    PubMed

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

    Among the vast variety of fuzzy model-based observers reported in the literature, which would be the proper one to use for fault detection in a class of chemical reactor? In this study four fuzzy model-based observers for sensor fault detection of a Continuous Stirred Tank Reactor were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) an Utkin fuzzy observer. A negative fault signal, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions.
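
    For context, the simplest of the observer structures compared above, reduced to its non-fuzzy (single linear model) form, is a Luenberger observer whose output residual serves as the fault indicator. The sketch below assumes a discrete-time linear model; the system matrices, observer gain, and any residual threshold are placeholders, and the Takagi-Sugeno fuzzy blending used in the study is not reproduced.

        import numpy as np

        def luenberger_residuals(A, B, C, L, u_seq, y_seq, x0):
            """Run a discrete-time Luenberger observer and return output residuals.

            x_hat[k+1] = A x_hat[k] + B u[k] + L (y[k] - C x_hat[k])
            A sustained residual above a chosen threshold flags a sensor fault.
            """
            x_hat = np.asarray(x0, dtype=float)
            residuals = []
            for u, y in zip(u_seq, y_seq):
                r = y - C @ x_hat                    # innovation / fault residual
                residuals.append(r)
                x_hat = A @ x_hat + B @ u + L @ r    # observer update
            return np.array(residuals)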

  4. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer-aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  5. REAL-TIME MODEL-BASED ELECTRICAL POWERED WHEELCHAIR CONTROL

    PubMed Central

    Wang, Hongwu; Salatin, Benjamin; Grindle, Garrett G.; Ding, Dan; Cooper, Rory A.

    2009-01-01

    The purpose of this study was to evaluate the effects of three different control methods on driving speed variation and wheel-slip of an electric-powered wheelchair (EPW). A kinematic model as well as 3-D dynamic model was developed to control the velocity and traction of the wheelchair. A smart wheelchair platform was designed and built with a computerized controller and encoders to record wheel speeds and to detect the slip. A model based, a proportional-integral-derivative (PID) and an open-loop controller were applied with the EPW driving on four different surfaces at three specified speeds. The speed errors, variation, rise time, settling time and slip coefficient were calculated and compared for a speed step-response input. Experimental results showed that model based control performed best on all surfaces across the speeds. PMID:19733494
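
    As a point of reference only, one of the baselines compared above, a discrete-time PID speed loop, can be sketched as follows; the gains, sample time, and interpretation of the output as a motor command are placeholders, and the model-based controller built on the kinematic and dynamic wheelchair models is not reproduced here.

        class PIDSpeedController:
            """Minimal discrete PID speed controller (baseline sketch)."""

            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, target_speed, measured_speed):
                error = target_speed - measured_speed
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                # Returned value is a motor command (e.g., voltage or duty cycle).
                return self.kp * error + self.ki * self.integral + self.kd * derivative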

  6. Identifying Model-Based Reconfiguration Goals through Functional Deficiencies

    NASA Technical Reports Server (NTRS)

    Benazera, Emmanuel; Trave-Massuyes, Louise

    2004-01-01

    Model-based diagnosis is now advanced to the point where autonomous systems can face some uncertain and faulty situations with success. The next step toward more autonomy is to have the system recover itself after faults occur, a process known as model-based reconfiguration. After faults occur, given a prediction of the nominal behavior of the system and the result of the diagnosis operation, this paper details how to automatically determine the functional deficiencies of the system. These deficiencies are characterized in the case of uncertain state estimates. A methodology is then presented to determine the reconfiguration goals based on the deficiencies. Finally, a recovery process interleaves planning and model predictive control to restore the functionalities in prioritized order.

  7. Model-based inversion for a shallow ocean application

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1994-03-01

    A model-based approach to invert or estimate the sound speed profile (SSP) from noisy pressure-field measurements is discussed. The resulting model-based processor (MBP) is based on the state-space representation of the normal-mode propagation model. Using data obtained from the well-known Hudson Canyon experiment, a noisy shallow water ocean environment, the processor is designed and the results compared to those predicted using various propagation models and data. It is shown that the MBP not only predicts the sound speed quite well, but also is able to simultaneously provide enhanced estimates of both modal and pressure-field measurements which are useful for localization and rapid ocean environmental characterization.

  8. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has been known for a long time, evolving from Wald's work in the late 1940s, followed by Middleton's classic exposition in the 1960s, and coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
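
    For orientation, the starting point of the theory reviewed above, Wald's sequential probability ratio test, can be sketched for a Gaussian mean-shift problem as below; the model-based, state-space detectors developed in the chapter are not reproduced, and the error rates and distribution parameters are illustrative.

        import numpy as np

        def sprt_gaussian_mean(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
            """Wald's SPRT for H0: mean = mu0 vs H1: mean = mu1 with known sigma.

            Returns ('H0' | 'H1' | 'undecided', number of samples consumed).
            """
            upper = np.log((1 - beta) / alpha)      # accept H1 at or above this
            lower = np.log(beta / (1 - alpha))      # accept H0 at or below this
            llr = 0.0
            for n, x in enumerate(samples, start=1):
                # Incremental log-likelihood ratio for one Gaussian observation.
                llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
                if llr >= upper:
                    return "H1", n
                if llr <= lower:
                    return "H0", n
            return "undecided", len(samples)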

  9. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

    Sequential detection theory has been known for a long time, evolving from Wald's work in the late 1940s, followed by Middleton's classic exposition in the 1960s, and coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  10. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  11. Model based document and report generation for systems engineering

    NASA Astrophysics Data System (ADS)

    Delp, C.; Lam, D.; Fosse, E.; Lee, Cin-Young

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  12. Model-based control of fuel cells:. (1) Regulatory control

    NASA Astrophysics Data System (ADS)

    Golbert, Joshua; Lewin, Daniel R.

    This paper describes a model-based controller for the regulation of a proton exchange membrane (PEM) fuel cell. The model accounts for spatial dependencies of voltage, current, material flows, and temperatures in the fuel channel. Analysis of the process model shows that the effective gain of the process undergoes a sign change in the normal operating range of the fuel cell, indicating that it cannot be stabilized using a linear controller with integral action. Consequently, a nonlinear model-predictive-controller based on a simplified model has been developed, enabling the use of optimal control to satisfy power demands robustly. The models and controller have been realized in the MATLAB and SIMULINK environment. Initial results indicate improved performance and robustness when using model-based control in comparison with that obtained using an adaptive controller.
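
    The receding-horizon idea behind such a model predictive controller can be illustrated without the paper's fuel cell model: at each step a short sequence of control moves is optimized against a plant model and only the first move is applied. In the Python sketch below the first-order surrogate model, bounds, weights, and function names are all assumptions.

        import numpy as np
        from scipy.optimize import minimize

        def surrogate_model(power, current, dt=1.0, tau=5.0, gain=0.8):
            # Placeholder first-order model: delivered power lags the current setpoint.
            return power + dt / tau * (gain * current - power)

        def mpc_step(power_now, demand, horizon=10, u_prev=0.0):
            """Solve one receding-horizon problem; return the first control move."""
            def cost(u_seq):
                p, c = power_now, 0.0
                for k, u in enumerate(u_seq):
                    p = surrogate_model(p, u)
                    u_last = u_seq[k - 1] if k else u_prev
                    c += (p - demand) ** 2 + 0.01 * (u - u_last) ** 2   # tracking + smoothness
                return c
            res = minimize(cost, x0=np.full(horizon, u_prev),
                           bounds=[(0.0, 100.0)] * horizon)
            return res.x[0]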

  13. Real-time model based electrical powered wheelchair control.

    PubMed

    Wang, Hongwu; Salatin, Benjamin; Grindle, Garrett G; Ding, Dan; Cooper, Rory A

    2009-12-01

    The purpose of this study was to evaluate the effects of three different control methods on driving speed variation and wheel slip of an electric-powered wheelchair (EPW). A kinematic model as well as 3D dynamic model was developed to control the velocity and traction of the wheelchair. A smart wheelchair platform was designed and built with a computerized controller and encoders to record wheel speeds and to detect the slip. A model based, a proportional-integral-derivative (PID) and an open-loop controller were applied with the EPW driving on four different surfaces at three specified speeds. The speed errors, variation, rise time, settling time and slip coefficient were calculated and compared for a speed step-response input. Experimental results showed that model based control performed best on all surfaces across the speeds.

  14. Therapeutic Devices for Epilepsy

    PubMed Central

    Fisher, Robert S.

    2011-01-01

    Therapeutic devices provide new options for treating drug-resistant epilepsy. These devices act by a variety of mechanisms to modulate neuronal activity. Only vagus nerve stimulation, which continues to develop new technology, is approved for use in the United States. Deep brain stimulation (DBS) of anterior thalamus for partial epilepsy recently was approved in Europe and several other countries. Responsive neurostimulation, which delivers stimuli to one or two seizure foci in response to a detected seizure, recently completed a successful multicenter trial. Several other trials of brain stimulation are in planning or underway. Transcranial magnetic stimulation (TMS) may provide a noninvasive method to stimulate cortex. Controlled studies of TMS have split on efficacy, which may depend on whether a seizure focus is near a possible region for stimulation. Seizure detection devices in the form of “shake” detectors via portable accelerometers can provide notification of an ongoing tonic-clonic seizure, or peace of mind in the absence of notification. Prediction of seizures from various aspects of EEG is in early stages. Prediction appears to be possible in a subpopulation of people with refractory seizures, and a clinical trial of an implantable prediction device is underway. Cooling of neocortex or hippocampus can reversibly attenuate epileptiform EEG activity and seizures, but engineering problems remain in its implementation. Optogenetics is a new technique that can control excitability of specific populations of neurons with light. Inhibition of epileptiform activity has been demonstrated in hippocampal slices, but use in humans will require more work. In general, devices provide useful palliation for otherwise uncontrollable seizures, but with a different risk profile than with most drugs. Optimizing the place of devices in therapy for epilepsy will require further development and clinical experience. PMID:22367987

  15. Model-Based Reasoning in the Detection of Satellite Anomalies

    DTIC Science & Technology

    1990-12-01

    Conference on Artificial Intelligence. 1363-1368. Detroit, Michigan, August 89. Chu, Wei-Hai. "Generic Expert System Shell for Diagnostic Reasoning... Intelligence. 1324-1330. Detroit, Michigan, August 89. de Kleer, Johan and Brian C. Williams. "Diagnosing Multiple Faults," Artificial Intelligence, 32(1): 97...Benjamin Kuipers. "Model-Based Monitoring of Dynamic Systems," Proceedings of the Eleventh International Joint Conference on Artificial Intelligence. 1238

  16. The limited usefulness of models based on recollection and familiarity.

    PubMed

    Wais, Peter E

    2013-04-01

    A recent report concluded that magnetoencephalographic signals of neural activity associated with memory based on the recollection process are independent from signals associated with memory based on the familiarity process. These data can be interpreted equally well, however, as indications of memory aggregated from both processes and showing that signals associated with high-confidence recognition are dissociable from signals associated with low-confidence recognition. The usefulness of interpreting neural data according to psychological models based on recollection and familiarity is discussed.

  17. Model based control of dynamic atomic force microscope

    NASA Astrophysics Data System (ADS)

    Lee, Chibum; Salapaka, Srinivasa M.

    2015-04-01

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for the dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement when compared to the conventional proportional-integral designs and is verified by experiments.

  18. Model-Based Sensor Selection for Helicopter Gearbox Monitoring

    DTIC Science & Technology

    1996-04-01

    fault diagnosis of helicopter gearboxes is therefore necessary to prevent major breakdowns due to progression of undetected...in the gearbox. Once the presence of a fault is prompted by the fault detection network, fault diagnosis is performed by the Structure-Based...Components Figure 3: Overview of fault detection and diagnosis in the proposed model-based diagnostic system for helicopter gearboxes. the OH-58A gearbox

  19. Model based control of dynamic atomic force microscope.

    PubMed

    Lee, Chibum; Salapaka, Srinivasa M

    2015-04-01

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for the dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement when compared to the conventional proportional-integral designs and is verified by experiments.

  20. Model based control of dynamic atomic force microscope

    SciTech Connect

    Lee, Chibum; Salapaka, Srinivasa M.

    2015-04-15

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for the dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement when compared to the conventional proportional-integral designs and is verified by experiments.

  1. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    System engineering practices for complex systems and networks now require that requirement, architecture, and concept of operations product development teams, simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In case of NASA's space communication networks, since the networks are geographically distributed, and so are its subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, Program System Engineering (PSE) team has been able to model each network from their top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

  2. The Evolution of Therapeutic Recreation.

    ERIC Educational Resources Information Center

    Riley, Bob; Skalko, Thomas K.

    1998-01-01

    Reviews elements that impact the delivery of therapeutic recreation services, emphasizing elements that are external to the discipline and influence practice and elements that are internal to the discipline and must be addressed if therapeutic recreation is to continue its evolution as a competitive health and human service discipline.…

  3. Toward Constructing the Therapeutic System.

    ERIC Educational Resources Information Center

    Andolfi, Maurizio; Angelo, Claudio

    1988-01-01

    Describes the therapist as an active participant in the construction of the therapeutic system, explaining how the therapist constructs complex relationships within the evolving therapeutic process. Reevaluates the importance of the individual in the family as an agent of change and as a mediator of triangular relational messages. (Author/NB)

  4. A cloud model-based approach for water quality assessment.

    PubMed

    Wang, Dong; Liu, Dengfeng; Ding, Hao; Singh, Vijay P; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun

    2016-07-01

    Water quality assessment entails essentially a multi-criteria decision-making process accounting for qualitative and quantitative uncertainties and their transformation. Considering uncertainties of randomness and fuzziness in water quality evaluation, a cloud model-based assessment approach is proposed. The cognitive cloud model, derived from information science, can realize the transformation between qualitative concept and quantitative data, based on probability and statistics and fuzzy set theory. When applying the cloud model to practical assessment, three technical issues are considered before the development of a complete cloud model-based approach: (1) bilateral boundary formula with nonlinear boundary regression for parameter estimation, (2) hybrid entropy-analytic hierarchy process technique for calculation of weights, and (3) mean of repeated simulations for determining the degree of final certainty. The cloud model-based approach is tested by evaluating the eutrophication status of 12 typical lakes and reservoirs in China and comparing with other four methods, which are Scoring Index method, Variable Fuzzy Sets method, Hybrid Fuzzy and Optimal model, and Neural Networks method. The proposed approach yields information concerning membership for each water quality status which leads to the final status. The approach is found to be representative of other alternative methods and accurate.
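
    The transformation between a qualitative grade and quantitative data rests on the normal cloud generator, which a short Python sketch can illustrate; the grade parameters (Ex, En, He) and the chlorophyll-a example values below are hypothetical, and the boundary-regression parameter estimation and entropy-AHP weighting of the full approach are not shown.

        import numpy as np

        rng = np.random.default_rng(1)

        def cloud_certainty(x, Ex, En, He, n_runs=500):
            """Mean certainty degree of a measurement x for a grade (Ex, En, He).

            Each run draws a sample entropy En' ~ N(En, He^2) and evaluates
            exp(-(x - Ex)^2 / (2 En'^2)); averaging over runs smooths the randomness.
            """
            En_prime = np.abs(rng.normal(En, He, size=n_runs))
            En_prime = np.clip(En_prime, 1e-9, None)          # avoid division by zero
            mu = np.exp(-((x - Ex) ** 2) / (2.0 * En_prime ** 2))
            return mu.mean()

        # Hypothetical grades for a chlorophyll-a reading of 6.5 (units arbitrary).
        grades = {"mesotrophic": (4.0, 1.5, 0.2), "eutrophic": (10.0, 3.0, 0.4)}
        certainties = {name: cloud_certainty(6.5, *p) for name, p in grades.items()}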

  5. Multiple Damage Progression Paths in Model-Based Prognostics

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Goebel, Kai Frank

    2011-01-01

    Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component wear is driven by several different degradation phenomena, each resulting in their own damage progression path, overlapping to contribute to the overall degradation of the component. We develop a model-based prognostics methodology using particle filters, in which the problem of characterizing multiple damage progression paths is cast as a joint state-parameter estimation problem. The estimate is represented as a probability distribution, allowing the prediction of end of life and remaining useful life within a probabilistic framework that supports uncertainty management. We also develop a novel variance control mechanism that maintains an uncertainty bound around the hidden parameters to limit the amount of estimation uncertainty and, consequently, reduce prediction uncertainty. We construct a detailed physics-based model of a centrifugal pump, to which we apply our model-based prognostics algorithms. We illustrate the operation of the prognostic solution with a number of simulation-based experiments and demonstrate the performance of the chosen approach when multiple damage mechanisms are active
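
    A rough illustration of casting damage tracking as joint state-parameter estimation with a particle filter is sketched below (it is not the authors' pump model or variance-control scheme): an unknown wear rate is appended to the state and jittered between steps to keep the particle set diverse. The damage model, noise levels, and priors are assumptions.

        import numpy as np

        rng = np.random.default_rng(2)

        def prognostic_particle_filter(measurements, n_particles=500, dt=1.0,
                                       meas_std=0.05, jitter_std=1e-4):
            """Joint state-parameter particle filter for a linear wear model.

            State: damage d; parameter: wear rate w, with d[k+1] = d[k] + w * dt.
            Returns the posterior mean damage and wear rate after the last update.
            """
            d = np.zeros(n_particles)                       # damage particles
            w = rng.uniform(0.0, 0.02, n_particles)         # wear-rate particles (prior)
            for z in measurements:
                # Propagate damage; jitter the parameter (simple variance control).
                w = np.abs(w + rng.normal(0.0, jitter_std, n_particles))
                d = d + w * dt + rng.normal(0.0, 0.01, n_particles)
                # Weight by measurement likelihood, then multinomial resampling.
                weights = np.exp(-0.5 * ((z - d) / meas_std) ** 2) + 1e-300
                weights /= weights.sum()
                idx = rng.choice(n_particles, n_particles, p=weights)
                d, w = d[idx], w[idx]
            return d.mean(), w.mean()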

  6. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Goebel, Kai

    2011-01-01

    Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.

  7. Gaussian model-based partitioning using iterated local search.

    PubMed

    Brusco, Michael J; Shireman, Emilie; Steinley, Douglas; Brudvig, Susan; Cradit, J Dennis

    2017-02-01

    The emergence of Gaussian model-based partitioning as a viable alternative to K-means clustering fosters a need for discrete optimization methods that can be efficiently implemented using model-based criteria. A variety of alternative partitioning criteria have been proposed for more general data conditions that permit elliptical clusters, different spatial orientations for the clusters, and unequal cluster sizes. Unfortunately, many of these partitioning criteria are computationally demanding, which makes the multiple-restart (multistart) approach commonly used for K-means partitioning less effective as a heuristic solution strategy. As an alternative, we propose an approach based on iterated local search (ILS), which has proved effective in previous combinatorial data analysis contexts. We compared multistart, ILS and hybrid multistart-ILS procedures for minimizing a very general model-based criterion that assumes no restrictions on cluster size or within-group covariance structure. This comparison, which used 23 data sets from the classification literature, revealed that the ILS and hybrid heuristics generally provided better criterion function values than the multistart approach when all three methods were constrained to the same 10-min time limit. In many instances, these differences in criterion function values reflected profound differences in the partitions obtained.
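
    A bare-bones version of the iterated local search strategy described above is sketched below; for brevity the within-cluster sum of squares stands in for the more general Gaussian model-based criterion, and the perturbation size and acceptance rule are illustrative choices.

        import numpy as np

        rng = np.random.default_rng(3)

        def wss(X, labels, k):
            # Within-cluster sum of squares, a stand-in for the model-based criterion.
            return sum(((X[labels == j] - X[labels == j].mean(axis=0)) ** 2).sum()
                       for j in range(k) if np.any(labels == j))

        def local_search(X, labels, k, max_iter=50):
            # K-means-style relocation until the assignment stops changing.
            for _ in range(max_iter):
                centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                    else X[rng.integers(len(X))] for j in range(k)])
                new = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
                if np.array_equal(new, labels):
                    break
                labels = new
            return labels

        def iterated_local_search(X, k, n_iter=20, perturb_frac=0.1):
            best = local_search(X, rng.integers(0, k, len(X)), k)
            best_val = wss(X, best, k)
            for _ in range(n_iter):
                cand = best.copy()
                idx = rng.choice(len(X), max(1, int(perturb_frac * len(X))), replace=False)
                cand[idx] = rng.integers(0, k, len(idx))     # perturbation step
                cand = local_search(X, cand, k)
                val = wss(X, cand, k)
                if val < best_val:                           # accept only improvements
                    best, best_val = cand, val
            return best, best_val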

  8. Model-based pattern dummy generation for logic devices

    NASA Astrophysics Data System (ADS)

    Jang, Jongwon; Kim, Cheolkyun; Ko, Sungwoo; Byun, Seokyoung; Yang, Hyunjo; Yim, Donggyu

    2014-03-01

    The insertion of SRAF (Sub-Resolution Assist Feature) is one of the most frequently used methods to enlarge the process window area. In most cases, the size of SRAF is proportional to the focus margin of drawn patterns. However, there is a trade-off between SRAF size and SRAF printing, because SRAF is not supposed to be patterned on the wafer. For this reason, many OPC engineers have tried to put bigger and more SRAFs within the limits of the possible. The fact that many papers about predicting SRAF printability have been published in recent years reflects this circumstance. Pattern dummy is inserted to enhance the lithographic process margin and CD uniformity, unlike CMP dummy, which targets uniform metal line height. It is common practice to place pattern dummies at designated locations, taking the pitch of the real patterns into consideration, at the design step. However, from the lithographic point of view it is not always desirable to generate pattern dummies based on rules. In this paper, we introduce a model-based pattern dummy insertion method, which places pattern dummies at the locations where model-based SRAF would be located. We applied the model-based pattern dummy to the layers in logic devices, and studied which layer is more efficient for the insertion of dummies.

  9. Metrics for antibody therapeutics development.

    PubMed

    Reichert, Janice M

    2010-01-01

    A wide variety of full-size monoclonal antibodies (mAbs) and therapeutics derived from alternative antibody formats can be produced through genetic and biological engineering techniques. These molecules are now filling the preclinical and clinical pipelines of every major pharmaceutical company and many biotechnology firms. Metrics for the development of antibody therapeutics, including averages for the number of candidates entering clinical study and development phase lengths for mAbs approved in the United States, were derived from analysis of a dataset of over 600 therapeutic mAbs that entered clinical study sponsored, at least in part, by commercial firms. The results presented provide an overview of the field and context for the evaluation of on-going and prospective mAb development programs. The expansion of therapeutic antibody use through supplemental marketing approvals and the increase in the study of therapeutics derived from alternative antibody formats are discussed.

  10. Transdermal delivery of therapeutic agent

    NASA Technical Reports Server (NTRS)

    Kwiatkowski, Krzysztof C. (Inventor); Hayes, Ryan T. (Inventor); Magnuson, James W. (Inventor); Giletto, Anthony (Inventor)

    2008-01-01

    A device for the transdermal delivery of a therapeutic agent to a biological subject that includes a first electrode comprising a first array of electrically conductive microprojections for providing electrical communication through a skin portion of the subject to a second electrode comprising a second array of electrically conductive microprojections. Additionally, a reservoir for holding the therapeutic agent surrounding the first electrode and a pulse generator for providing an exponential decay pulse between the first and second electrodes may be provided. A method includes the steps of piercing a stratum corneum layer of skin with two arrays of conductive microprojections, encapsulating the therapeutic agent into biocompatible charged carriers, surrounding the conductive microprojections with the therapeutic agent, generating an exponential decay pulse between the two arrays of conductive microprojections to create a non-uniform electrical field and electrokinetically driving the therapeutic agent through the stratum corneum layer of skin.

  11. Bacteriophage Procurement for Therapeutic Purposes

    PubMed Central

    Weber-Dąbrowska, Beata; Jończyk-Matysiak, Ewa; Żaczek, Maciej; Łobocka, Małgorzata; Łusiak-Szelachowska, Marzanna; Górski, Andrzej

    2016-01-01

    Bacteriophages (phages), discovered 100 years ago, are able to infect and destroy only bacterial cells. In the current crisis of antibiotic efficacy, phage therapy is considered as a supplementary or even alternative therapeutic approach. Evolution of multidrug-resistant and pandrug-resistant bacterial strains poses a real threat, so it is extremely important to have the possibility to isolate new phages for therapeutic purposes. Our phage laboratory and therapy center has extensive experience with phage isolation, characterization, and therapeutic application. In this article we present current progress in bacteriophages isolation and use for therapeutic purposes, our experience in this field and its practical implications for phage therapy. We attempt to summarize the state of the art: properties of phages, the methods for their isolation, criteria of phage selection for therapeutic purposes and limitations of their use. Perspectives for the use of genetically engineered phages to specifically target bacterial virulence-associated genes are also briefly presented. PMID:27570518

  12. Using rule-based shot dose assignment in model-based MPC applications

    NASA Astrophysics Data System (ADS)

    Bork, Ingo; Buck, Peter; Wang, Lin; Müller, Uwe

    2014-10-01

    Shrinking feature sizes and the need for tighter CD (Critical Dimension) control require the introduction of new technologies in mask making processes. One of those methods is the dose assignment of individual shots on VSB (Variable Shaped Beam) mask writers to compensate CD non-linearity effects and improve dose edge slope. Using increased dose levels only for most critical features, generally only for the smallest CDs on a mask, the change in mask write time is minimal while the increase in image quality can be significant. This paper describes a method combining rule-based shot dose assignment with model-based shot size correction. This combination proves to be very efficient in correcting mask linearity errors while also improving dose edge slope of small features. Shot dose assignment is based on tables assigning certain dose levels to a range of feature sizes. The dose to feature size assignment is derived from mask measurements in such a way that shape corrections are kept to a minimum. For example, if a 50nm drawn line on mask results in a 45nm chrome line using nominal dose, a dose level is chosen which is closest to getting the line back on target. Since CD non-linearity is different for lines, line-ends and contacts, different tables are generated for the different shape categories. The actual dose assignment is done via DRC rules in a pre-processing step before executing the shape correction in the MPC engine. Dose assignment to line ends can be restricted to critical line/space dimensions since it might not be required for all line ends. In addition, adding dose assignment to a wide range of line ends might increase shot count which is undesirable. The dose assignment algorithm is very flexible and can be adjusted based on the type of layer and the best balance between accuracy and shot count. These methods can be optimized for the number of dose levels available for specific mask writers. The MPC engine now needs to be able to handle different dose
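
    The rule-based half of this flow amounts to a lookup from drawn feature size (and shape category) to a discrete dose class, applied before the model-based shape correction. The Python sketch below only illustrates that mechanism; the CD ranges, dose classes, and line-end rule are entirely hypothetical, since the real assignments are derived from mask measurements as described above.

        # Hypothetical CD-to-dose-class table: (min_cd_nm, max_cd_nm, dose_class).
        DOSE_RULES = [
            (0.0, 45.0, 3),     # smallest features receive the highest extra dose
            (45.0, 60.0, 2),
            (60.0, 80.0, 1),
        ]
        NOMINAL_DOSE_CLASS = 0

        def assign_dose_class(cd_nm, shape_type="line"):
            """Return a discrete dose class for a drawn feature size.

            Separate tables per shape category (lines, line ends, contacts) could
            be handled the same way; a single illustrative table is used here.
            """
            if shape_type == "line_end" and cd_nm > 60.0:
                return NOMINAL_DOSE_CLASS            # skip non-critical line ends
            for lo, hi, dose_class in DOSE_RULES:
                if lo <= cd_nm < hi:
                    return dose_class
            return NOMINAL_DOSE_CLASS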

  13. Automated model-based calibration of imaging spectrographs

    NASA Astrophysics Data System (ADS)

    Kosec, Matjaž; Bürmen, Miran; Tomaževič, Dejan; Pernuš, Franjo; Likar, Boštjan

    2012-03-01

    Hyper-spectral imaging has gained recognition as an important non-invasive research tool in the field of biomedicine. Among the variety of available hyperspectral imaging systems, systems comprising an imaging spectrograph, lens, wideband illumination source and a corresponding camera stand out for the short acquisition time and good signal to noise ratio. The individual images acquired by imaging spectrograph-based systems contain full spectral information along one spatial dimension. Due to the imperfections in the camera lens and in particular the optical components of the imaging spectrograph, the acquired images are subjected to spatial and spectral distortions, resulting in scene dependent nonlinear spectral degradations and spatial misalignments which need to be corrected. However, the existing correction methods require complex calibration setups and a tedious manual involvement, therefore, the correction of the distortions is often neglected. Such simplified approach can lead to significant errors in the analysis of the acquired hyperspectral images. In this paper, we present a novel fully automated method for correction of the geometric and spectral distortions in the acquired images. The method is based on automated non-rigid registration of the reference and acquired images corresponding to the proposed calibration object incorporating standardized spatial and spectral information. The obtained transformation was successfully used for sub-pixel correction of various hyperspectral images, resulting in significant improvement of the spectral and spatial alignment. It was found that the proposed calibration is highly accurate and suitable for routine use in applications involving either diffuse reflectance or transmittance measurement setups.

  14. Chaperones as potential therapeutics for Krabbe disease.

    PubMed

    Graziano, Adriana Carol Eleonora; Pannuzzo, Giovanna; Avola, Rosanna; Cardile, Venera

    2016-11-01

    Krabbe's disease (KD) is an autosomal recessive, neurodegenerative disorder. It is classified among the lysosomal storage diseases (LSDs). It was first described early in the twentieth century, but the genetic defect in the galactocerebrosidase (GALC) gene was not discovered until the beginning of the 1970s, 20 years before the GALC cloning. Recently, in 2011, the crystal structures of the GALC enzyme and the GALC-product complex were obtained. For this reason, compared with other LSDs, research on possible therapeutic interventions is much more recent. Thus, it is not surprising that some treatment options are still under preclinical investigation, whereas their relevance for other pathologies of the same group has already been tested in clinical studies. This is specifically the case for pharmacological chaperone therapy (PCT), a promising strategy for selectively correcting defective protein folding and trafficking and for enhancing enzyme activity by small molecules. These compounds bind directly to a partially folded biosynthetic intermediate, stabilize the protein, and allow completion of the folding process to yield a functional protein. Here, we review the chaperones that have shown therapeutic potential in preclinical studies for KD, underscoring the need to invigorate research into KD-directed PCT, which will benefit from recent insights into the molecular understanding of GALC structure, drug design, and development in cellular models. © 2016 Wiley Periodicals, Inc.

  15. Potential therapeutic approaches for Angelman syndrome

    PubMed Central

    Bi, Xiaoning; Sun, Jiandong; Ji, Angela X.; Baudry, Michel

    2016-01-01

    INTRODUCTION Angelman syndrome (AS) is a neurodevelopmental disorder caused by deficiency of maternally inherited UBE3A, an ubiquitin E3 ligase. Despite recent progress in understanding the mechanism underlying UBE3A imprinting, there is no effective treatment. Further investigation of the roles played by UBE3A in the central nervous system (CNS) is needed for developing effective therapies. AREA COVERED This review covers the literature related to genetic classifications of AS, recent discoveries regarding the regulation of UBE3A imprinting, alterations in cell signaling in various brain regions, and potential therapeutic approaches. Since a large proportion of AS patients exhibit comorbid autism spectrum disorder (ASD), potential common molecular bases are discussed. EXPERT OPINION Advances in understanding UBE3A imprinting provide a unique opportunity to induce paternal UBE3A expression, thus targeting the syndrome at its “root.” However, such efforts have yielded less-than-expected rescue effects in AS mouse models, raising the concern that activation of paternal UBE3A after a critical period cannot correct all the CNS defects that developed in a UBE3A-deficient environment. On the other hand, targeting abnormal downstream cell signaling pathways has provided promising rescue effects in preclinical research. Thus, combined reinstatement of paternal UBE3A expression with targeting abnormal signaling pathways should provide better therapeutic effects. PMID:26558806

  16. Clinical, epidemiological, and therapeutic profile of dermatophytosis*

    PubMed Central

    Pires, Carla Andréa Avelar; da Cruz, Natasha Ferreira Santos; Lobato, Amanda Monteiro; de Sousa, Priscila Oliveira; Carneiro, Francisca Regina Oliveira; Mendes, Alena Margareth Darwich

    2014-01-01

    BACKGROUND The cutaneous mycoses, mainly caused by dermatophyte fungi, are among the most common fungal infections worldwide. It is estimated that 10% to 15% of the population will be infected by a dermatophyte at some point in their lives, thus making this a group of diseases with great public health importance. OBJECTIVE To analyze the clinical, epidemiological, and therapeutic profile of dermatophytosis in patients enrolled at the Dermatology service of Universidade do Estado do Pará, Brazil, from July 2010 to September 2012. METHOD A total of 145 medical records of patients diagnosed with dermatophytosis were surveyed. Data were collected and subsequently recorded according to a protocol developed by the researchers. This protocol consisted of information regarding epidemiological and clinical aspects of the disease and the therapy employed. RESULTS The main clinical form of dermatophyte infection was onychomycosis, followed by tinea corporis, tinea pedis, and tinea capitis. Furthermore, the female population and the age group of 51 to 60 years were the most affected. Regarding therapy, there was a preference for treatments that combine topical and systemic drugs, and the most widely used drugs were fluconazole (systemic) and ciclopirox olamine (topical). CONCLUSION This study showed the importance of recurrent analysis of the epidemiological profile of dermatophytosis to enable correct therapeutic and preventive management of these conditions, which have significant clinical consequences, with chronic, difficult-to-treat lesions that can decrease patient quality of life and cause disfigurement. PMID:24770502

  17. Therapeutic approaches for spinal cord injury

    PubMed Central

    Cristante, Alexandre Fogaça; de Barros Filho, Tarcísio Eloy Pessoa; Marcon, Raphael Martus; Letaif, Olavo Biraghi; da Rocha, Ivan Dias

    2012-01-01

    This study reviews the literature concerning possible therapeutic approaches for spinal cord injury. Spinal cord injury is a disabling and irreversible condition that has high economic and social costs. There are both primary and secondary mechanisms of damage to the spinal cord. The primary lesion is the mechanical injury itself. The secondary lesion results from one or more biochemical and cellular processes that are triggered by the primary lesion. The frustration of health professionals in treating a severe spinal cord injury was described in 1700 BC in an Egyptian surgical papyrus that was translated by Edwin Smith; the papyrus reported spinal fractures as a “disease that should not be treated.” Over the last two decades, several studies have been performed to obtain more effective treatments for spinal cord injury. Most of these studies approach a patient with acute spinal cord injury in one of four manners: corrective surgery or a physical, biological or pharmacological treatment method. Science is unraveling the mechanisms of cell protection and neuroregeneration, but clinically, we only provide supportive care for patients with spinal cord injuries. By combining these treatments, researchers attempt to enhance the functional recovery of patients with spinal cord injuries. Advances in the last decade have allowed us to encourage the development of experimental studies in the field of spinal cord regeneration. The combination of several therapeutic strategies should, at minimum, allow for partial functional recoveries for these patients, which could improve their quality of life. PMID:23070351

  18. Development of Novel Activin-Targeted Therapeutics

    PubMed Central

    Chen, Justin L; Walton, Kelly L; Al-Musawi, Sara L; Kelly, Emily K; Qian, Hongwei; La, Mylinh; Lu, Louis; Lovrecz, George; Ziemann, Mark; Lazarus, Ross; El-Osta, Assam; Gregorevic, Paul; Harrison, Craig A

    2015-01-01

    Soluble activin type II receptors (ActRIIA/ActRIIB), via binding to diverse TGF-β proteins, can increase muscle and bone mass, correct anemia or protect against diet-induced obesity. While exciting, these multiple actions of soluble ActRIIA/IIB limit their therapeutic potential and highlight the need for new reagents that target specific ActRIIA/IIB ligands. Here, we modified the activin A and activin B prodomains, regions required for mature growth factor synthesis, to generate specific activin antagonists. Initially, the prodomains were fused to the Fc region of mouse IgG2A antibody and, subsequently, “fastener” residues (Lys45, Tyr96, His97, and Ala98; activin A numbering) that confer latency to other TGF-β proteins were incorporated. For the activin A prodomain, these modifications generated a reagent that potently (IC50 5 nmol/l) and specifically inhibited activin A signaling in vitro, and activin A-induced muscle wasting in vivo. Interestingly, the modified activin B prodomain inhibited both activin A and B signaling in vitro (IC50 ~2 nmol/l) and in vivo, suggesting it could serve as a general activin antagonist. Importantly, unlike soluble ActRIIA/IIB, the modified prodomains did not inhibit myostatin or GDF-11 activity. To underscore the therapeutic utility of specifically antagonising activin signaling, we demonstrate that the modified activin prodomains promote significant increases in muscle mass. PMID:25399825

  19. Exubera. Inhale therapeutic systems.

    PubMed

    Bindra, Sanjit; Cefalu, William T

    2002-05-01

    Inhale, in collaboration with Pfizer and Aventis Pharma (formerly Hoechst Marion Roussel; HMR), is developing an insulin formulation utilizing its pulmonary delivery technology for macromolecules for the potential treatment of type I and II diabetes. By July 2001, the phase III program had been completed and the companies had begun to assemble data for MAA and NDA filings; however, it was already clear at this time that additional data might be required for filing. By December 2001, it had been decided that the NDA should include an increased level of controlled, long-term pulmonary safety data in diabetic patients and a major study was planned to be completed in 2002, with the NDA filed thereafter (during 2002). US-05997848 was issued to Inhale Therapeutic Systems in December 1999, and corresponds to WO-09524183, filed in February 1995. Equivalent applications have appeared to date in Australia, Brazil, Canada, China, Czech Republic, Europe, Finland, Hungary, Japan, Norway, New Zealand, Poland and South Africa. This family of applications is specific to pulmonary delivery of insulin. In February 1999, Lehman Brothers gave this inhaled insulin a 60% probability of reaching market, with a possible launch date of 2001. The analysts estimated peak sales at $3 billion in 2011. In May 2000, Aventis predicted that estimated peak sales would be in excess of $1 billion. In February 2000, Merrill Lynch expected product launch in 2002 and predicted that it would be a multibillion-dollar product. Merrill Lynch analysts predicted, in September and November 2000, that the product would be launched by 2002, with sales in that year of €75 million, rising to €500 million in 2004. In April 2001, Merrill Lynch predicted that filing for this drug would occur in 2001. Following the report of the potential delay in regulatory filing, issued in July 2001, Deutsche Banc Alex Brown predicted a filing would take place in the fourth quarter of 2002 and launch would take place in the first

  20. BP artificial neural network based wave front correction for sensor-less free space optics communication

    NASA Astrophysics Data System (ADS)

    Li, Zhaokun; Zhao, Xiaohui

    2017-02-01

    Sensor-less adaptive optics (AO) is one of the most promising methods for compensating strong wave front disturbances in free space optics communication (FSO). In this study, the back propagation (BP) artificial neural network is applied to the sensor-less AO system to design a distortion correction scheme. Compared with other model-based approaches, this method needs only one or a few online measurements to correct the wave front distortion, which enhances the real-time capability of the system and largely improves the Strehl Ratio (SR). Necessary comparisons in numerical simulation with other model-based and model-free correction methods proposed in Refs. [6,8,9,10] are given to show the validity and advantage of the proposed method.
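    The abstract describes a data-driven mapping from a small number of intensity measurements to a wave front correction. The sketch below only illustrates that idea: a tiny BP (backpropagation) network trained on synthetic data to map intensity features to Zernike-mode coefficients, which a sensor-less AO loop would then negate on the corrector. The feature set, network size, and training data are all assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch only: a small BP network mapping focal-plane intensity
# features to Zernike aberration coefficients. Data are synthetic; this is not
# the system described in the paper.
rng = np.random.default_rng(0)

n_features, n_modes, n_hidden = 32, 5, 64
X = rng.normal(size=(2000, n_features))            # stand-in intensity features
true_map = rng.normal(size=(n_features, n_modes))
Y = np.tanh(X @ true_map)                          # stand-in Zernike coefficients

W1 = rng.normal(scale=0.1, size=(n_features, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_modes))
lr = 1e-2

for epoch in range(200):
    H = np.tanh(X @ W1)                            # hidden layer
    Y_hat = H @ W2                                 # predicted Zernike coefficients
    err = Y_hat - Y
    # backpropagation of the mean-squared error
    grad_W2 = H.T @ err / len(X)
    grad_W1 = X.T @ ((err @ W2.T) * (1 - H**2)) / len(X)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

# correction: drive the corrector with the negative of the estimated aberration
sample = X[:1]
correction = -(np.tanh(sample @ W1) @ W2)
print("estimated correction (Zernike modes):", correction.round(3))
```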

  1. Historical review: Cytokines as therapeutics and targets of therapeutics.

    PubMed

    Vilcek, Jan; Feldmann, Marc

    2004-04-01

    Cytokine research has spawned the introduction of new therapies that have revolutionized the treatment of many important diseases. These therapeutic advances have resulted from two very different strategies. The first therapeutic strategy embodies the administration of purified, recombinant cytokines. The second relies on the administration of therapeutics that inhibit the harmful effects of upregulated, endogenous cytokines. Examples of successful cytokine therapeutics include hematopoietic growth factors (colony stimulating factors) and interferons. Prime examples of cytokine antagonists that have profoundly altered the treatment of some inflammatory disorders are agents that inhibit the effects of tumor necrosis factor (TNF). In this article, we highlight some of the studies that have been responsible for the introduction of cytokine and anti-cytokine therapies, with emphasis on the development of interferons and anti-TNF agents.

  2. Nucleic acids as therapeutic agents.

    PubMed

    Alvarez-Salas, Luis M

    2008-01-01

    Therapeutic nucleic acids (TNAs) and their precursors are applied to treat several pathologies and infections. TNA-based therapy has different rationales and mechanisms and can be classified into three main groups: 1) Therapeutic nucleotides and nucleosides; 2) Therapeutic oligonucleotides; and 3) Therapeutic polynucleotides. This review will focus on those TNAs that have reached clinical trials with anticancer and antiviral protocols, the two most common applications of TNAs. Although therapeutic nucleotides and nucleosides that interfere with nucleic acid metabolism and DNA polymerization have been successfully used as anticancer and antiviral drugs, they often produce toxic secondary effects related to dosage and continuous use. The use of oligonucleotides such as ribozymes and antisense oligodeoxynucleotides (AS-ODNs) showed promise as therapeutic moieties but faced several issues such as nuclease sensitivity, off-target effects and efficient delivery. Nevertheless, immunostimulatory oligodeoxynucleotides and AS-ODNs represent the most successful group of therapeutic oligonucleotides in the clinic. A newer group of therapeutic oligonucleotides, the aptamers, is rapidly advancing towards early detection and treatment alternatives that have attracted commercial interest. Despite the very high in vitro efficiency of small interfering RNAs (siRNAs), they present issues with intracellular target accessibility, specificity and delivery. DNA vaccines showed great promise, but they resulted in very poor responses in the clinic and further development is uncertain. Despite their many issues, the exquisite specificity and versatility of therapeutic oligonucleotides attract a great deal of research and resources that will certainly make them the TNAs of choice for treating cancer and viral diseases in the near future.

  3. [Therapeutic management of neurodermatitis atopica].

    PubMed

    Kägi, M K

    1998-08-01

    The therapy of atopic dermatitis remains a challenge. The success of any therapeutic concept is based on a broad and early diagnostic approach which allows relevant provocation factors and allergens to be ruled out. During remission periods the regular use of a topical basic therapy consisting of drug-free emollients is recommended. Topical corticosteroids as well as systemic or local antimicrobial therapy and antihistamines are essential during periods of acute exacerbation. Although a great number of new therapeutic approaches have been published in recent years, data on most of these therapeutic modalities are not sufficient to allow unrestricted use in all patients with atopic dermatitis.

  4. 2012 Technical Corrections Fact Sheet

    EPA Pesticide Factsheets

    Final Rule: 2012 Technical Corrections, Clarifying and Other Amendments to the Greenhouse Gas Reporting Rule, and Confidentiality Determinations for Certain Data Elements of the Fluorinated Gas Source Category

  5. Correction of the crooked nose.

    PubMed

    Potter, Jason K

    2012-02-01

    Correction of the deviated nose is one of the most difficult tasks in rhinoplasty surgery and should be approached in a systematic manner to ensure a satisfied patient and surgeon. Correction of the deviated nose is unique in that the patient's complaints frequently include aesthetic and functional characteristics. Equal importance should be given to the preoperative, intraoperative, and postoperative aspects of the patient's treatment to ensure a favorable outcome.

  6. Model based control of polymer composite manufacturing processes

    NASA Astrophysics Data System (ADS)

    Potaraju, Sairam

    2000-10-01

    The objective of this research is to develop tools that help process engineers design, analyze and control polymeric composite manufacturing processes to achieve higher productivity and cost reduction. Current techniques for process design and control of composite manufacturing suffer from the paucity of good process models that can accurately represent these non-linear systems. Existing models developed by researchers in the past are designed to be process and operation specific, hence generating new simulation models is time consuming and requires significant effort. To address this issue, an Object Oriented Design (OOD) approach is used to develop a component-based model building framework. Process models for two commonly used industrial processes (Injected Pultrusion and Autoclave Curing) are developed using this framework to demonstrate the flexibility. Steady state and dynamic validation of this simulator is performed using a bench scale injected pultrusion process. This simulator could not be implemented online for control due to computational constraints. Models that are fast enough for online implementation, with nearly the same degree of accuracy, are developed using a two-tier scheme. First, lower dimensional models that capture essential resin flow, heat transfer and cure kinetics important from a process monitoring and control standpoint are formulated. The second step is to reduce these low dimensional models to Reduced Order Models (ROM) suited for online model based estimation, control and optimization. Model reduction is carried out using the Proper Orthogonal Decomposition (POD) technique in conjunction with a Galerkin formulation procedure. Subsequently, a nonlinear model-based estimation and inferential control scheme based on the ROM is implemented. In particular, this research work contributes in the following general areas: (1) Design and implementation of versatile frameworks for modeling and simulation of manufacturing processes using object
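    The second tier described above builds reduced order models with POD and a Galerkin projection. The snippet below is a minimal, self-contained illustration of the POD step only, on synthetic snapshot data; the actual resin-flow, heat-transfer and cure-kinetics equations, and their Galerkin projection, are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, n_snapshots = 500, 80

# synthetic snapshots with low-dimensional structure plus noise, standing in
# for resin flow / temperature fields at successive times (illustration only)
modes = rng.normal(size=(n_nodes, 5))
coeffs = rng.normal(size=(5, n_snapshots))
snapshots = modes @ coeffs + 0.01 * rng.normal(size=(n_nodes, n_snapshots))

# POD basis from the SVD of the snapshot matrix
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1      # smallest basis capturing 99% energy
Phi = U[:, :r]                                  # reduced-order basis

# a full-order state is approximated by r coefficients: x ~ Phi @ a
x_full = snapshots[:, 0]
a = Phi.T @ x_full                              # projection onto the POD basis
x_rec = Phi @ a
err = np.linalg.norm(x_full - x_rec) / np.linalg.norm(x_full)
print(f"retained modes: {r}, relative reconstruction error: {err:.2e}")
```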

  7. Quantum error correction for beginners.

    PubMed

    Devitt, Simon J; Munro, William J; Nemoto, Kae

    2013-07-01

    Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large-scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation is now a much larger field and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault-tolerance, not as a detailed guide, but rather as a basic introduction. The development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault-tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future.

  8. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time model based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during an operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
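    The detection logic above (residual against the model prediction, significance test, persistence check) can be illustrated with a minimal sketch. All signals, noise levels and thresholds below are assumed for illustration and are not taken from the patent.

```python
import numpy as np

# Illustrative only: synthetic signal, assumed noise level and thresholds.
rng = np.random.default_rng(2)

expected = np.zeros(300)                         # model-predicted response at this location
measured = expected + rng.normal(0.0, 1.0, 300)  # nominal operation with unit sensor noise
measured[200:] += 5.0                            # injected structural anomaly

sigma = 1.0                                      # assumed sensor noise standard deviation
z = (measured - expected) / sigma                # modeling error signal, in noise units
significant = np.abs(z) > 3.0                    # error significance test

persistence, threshold = 0, 8
for k, flag in enumerate(significant):
    persistence = persistence + 1 if flag else 0
    if persistence >= threshold:                 # anomaly only if significance persists
        print(f"structural anomaly indicated at sample {k}")
        break
```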

  9. Temporal and contextual knowledge in model-based expert systems

    NASA Technical Reports Server (NTRS)

    Toth-Fejel, Tihamer; Heher, Dennis

    1987-01-01

    A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP, which when executed, verifies that the domain expert did not make any mistakes. The Achille's heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.

  10. Stabilization of model-based networked control systems

    NASA Astrophysics Data System (ADS)

    Miranda, Francisco; Abreu, Carlos; Mendes, Paulo M.

    2016-06-01

    A class of networked control systems called Model-Based Networked Control Systems (MB-NCSs) is considered. Stabilization of MB-NCSs is studied using feedback controls and simulation of stabilization for different feedbacks is made with the purpose to reduce the network trafic. The feedback control input is applied in a compensated model of the plant that approximates the plant dynamics and stabilizes the plant even under slow network conditions. Conditions for global exponential stabilizability and for the choosing of a feedback control input for a given constant time between the information moments of the network are derived. An optimal control problem to obtain an optimal feedback control is also presented.

  11. Fiber optic displacement measurement model based on finite reflective surface

    NASA Astrophysics Data System (ADS)

    Li, Yuhe; Guan, Kaisen; Hu, Zhaohui

    2016-10-01

    We present a fiber optic displacement measurement model based on finite reflective plate. The theoretical model was derived, and simulation analysis of light intensity distribution, reflective plate width, and the distance between fiber probe and reflective plate were conducted in details. The three dimensional received light intensity distribution and the characteristic curve of light intensity were studied as functions of displacement of finite reflective plate. Experiments were carried out to verify the established model. The physical fundamentals and the effect of operating parameters on measuring system performance were revealed in the end.

  12. Model-based benefit-risk assessment: can Archimedes help?

    PubMed

    Krishna, R

    2009-03-01

    In December 2008, the US Food and Drug Administration issued a new draft Guidance for Industry on Diabetes Mellitus--evaluating cardiovascular risk in new antidiabetic therapies to treat Type 2 diabetes. This guidance comes at a time when recent discussions have focused on delineation of cardiovascular risk reduction for new antidiabetic drugs. Computational tools that can enable early prediction of cardiovascular risk are reviewed with specific reference to Archimedes (Kaiser Permanente), with an aim of proposing a model-based solution and enabling decisions to be made as early as possible in the drug development value chain.

  13. Evaluating model accuracy for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Roden, Joseph

    1992-01-01

    Described here is an approach to automatically assessing the accuracy of various components of a model. In this approach, actual data from the operation of a target system is used to drive statistical measures to evaluate the prediction accuracy of various portions of the model. We describe how these statistical measures of model accuracy can be used in model-based reasoning for monitoring and design. We then describe the application of these techniques to the monitoring and design of the water recovery system of the Environmental Control and Life Support System (ECLSS) of Space Station Freedom.
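    As a rough illustration of scoring model components against operational data, the sketch below computes simple per-component accuracy statistics (RMSE and bias) on synthetic data; the ECLSS water recovery model itself and the specific statistical measures used by the authors are not reproduced.

```python
import numpy as np

# Illustrative only: synthetic "operational" data and made-up component names.
rng = np.random.default_rng(3)

observed = rng.normal(size=200)
predictions = {
    "pump_flow_model": observed + rng.normal(0, 0.1, 200),   # accurate component
    "filter_dp_model": observed + rng.normal(0, 0.8, 200),   # less accurate component
}

for name, pred in predictions.items():
    resid = observed - pred
    rmse = np.sqrt(np.mean(resid**2))            # prediction accuracy of this portion of the model
    bias = np.mean(resid)
    print(f"{name}: RMSE={rmse:.3f}, bias={bias:+.3f}")
```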

  14. Kinetic modeling based probabilistic segmentation for molecular images.

    PubMed

    Saad, Ahmed; Hamarneh, Ghassan; Möller, Torsten; Smith, Ben

    2008-01-01

    We propose a semi-supervised, kinetic modeling based segmentation technique for molecular imaging applications. It is an iterative, self-learning algorithm based on uncertainty principles, designed to alleviate low signal-to-noise ratio (SNR) and partial volume effect (PVE) problems. Synthetic fluorodeoxyglucose (FDG) and simulated Raclopride dynamic positron emission tomography (dPET) brain images with excessive noise levels are used to validate our algorithm. We show, qualitatively and quantitatively, that our algorithm outperforms state-of-the-art techniques in identifying different functional regions and recovering the kinetic parameters.

  15. Purely optical navigation with model-based state prediction

    NASA Astrophysics Data System (ADS)

    Sendobry, Alexander; Graber, Thorsten; Klingauf, Uwe

    2010-10-01

    State-of-the-art Inertial Navigation Systems (INS) based on Micro-Electro-Mechanical Systems (MEMS) lack precision, especially in GPS-denied environments like urban canyons or in pure indoor missions. The proposed Optical Navigation System (ONS) provides bias-free ego-motion estimates using triple redundant sensor information. In combination with a model based state prediction, our system is able to estimate the velocity, position and attitude of an arbitrary aircraft. Simulating a high performance flow-field estimator, the algorithm can compete with conventional low-cost INS. By using measured velocities instead of accelerations, the drift behavior of the system states is not as pronounced as for an INS.

  16. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  17. A model-based executive for commanding robot teams

    NASA Technical Reports Server (NTRS)

    Barrett, Anthony

    2005-01-01

    The paper presents a way to robustly command a system of systems as a single entity. Instead of modeling each component system in isolation and then manually crafting interaction protocols, this approach starts with a model of the collective population as a single system. By compiling the model into separate elements for each component system and utilizing a teamwork model for coordination, it circumvents the complexities of manually crafting robust interaction protocols. The resulting systems are both globally responsive by virtue of a team oriented interaction model and locally responsive by virtue of a distributed approach to model-based fault detection, isolation, and recovery.

  18. An extended aqueous solvation model based on atom-weighted solvent accessible surface areas: SAWSA v2.0 model.

    PubMed

    Hou, Tingjun; Zhang, Wei; Huang, Qin; Xu, Xiaojie

    2005-02-01

    A new method is proposed for calculating aqueous solvation free energy based on atom-weighted solvent accessible surface areas. The method, SAWSA v2.0, gives the aqueous solvation free energy by summing the contributions of component atoms and a correction factor. We applied two different sets of atom typing rules and fitting processes for small organic molecules and proteins, respectively. For small organic molecules, the model classified the atoms in organic molecules into 65 basic types. Additionally, for small organic molecules we proposed a correction factor of "hydrophobic carbon" to account for the aggregation of hydrocarbons and compounds with long hydrophobic aliphatic chains. The contributions for each atom type and correction factor were derived by multivariate regression analysis of 379 neutral molecules and 39 ions with known experimental aqueous solvation free energies. Based on the new atom typing rules, the correlation coefficient (r) for fitting the whole set of neutral organic molecules is 0.984, and the absolute mean error is 0.40 kcal mol(-1), which is much better than those of the model proposed by Wang et al. and the SAWSA model previously proposed by us. Furthermore, the SAWSA v2.0 model was compared with the simple atom-additive model based on the number of atom types (NA). The calculated results show that for small organic molecules, the predictions from the SAWSA v2.0 model are slightly better than those from the atom-additive model based on NA. However, for macromolecules such as proteins, due to the connection between their molecular conformation and their molecular surface area, the atom-additive model based on the number of atom types has little predictive power. In order to investigate the predictive power of our model, a systematic comparison was performed on seven solvation models including SAWSA v2.0, GB/SA_1, GB/SA_2, PB/SA_1, PB/SA_2, AM1/SM5.2R and SM5.0R. The results showed that for organic molecules the SAWSA v2.0 model is better
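    The core of the method is an atom-additive sum: each atom contributes its solvent accessible surface area weighted by a fitted atom-type coefficient, plus a correction term. The sketch below illustrates only that bookkeeping; the atom types, coefficients and surface areas shown are invented placeholders, not the fitted SAWSA v2.0 parameters.

```python
# Illustrative only: hypothetical atom-type coefficients (kcal/mol/A^2) and areas.
atom_coeff = {"C.aliphatic": 0.010, "O.hydroxyl": -0.060, "N.amine": -0.050}

atoms = [  # (atom type, solvent accessible surface area in A^2) for a toy molecule
    ("C.aliphatic", 35.0),
    ("C.aliphatic", 28.0),
    ("O.hydroxyl", 22.0),
]
hydrophobic_correction = 0.0   # would be nonzero for long hydrophobic aliphatic chains

# solvation free energy = sum of atom-weighted SASA contributions + correction factor
dG_solv = sum(atom_coeff[t] * sasa for t, sasa in atoms) + hydrophobic_correction
print(f"estimated solvation free energy: {dG_solv:.2f} kcal/mol")
```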

  19. How to Use Equipment Therapeutically.

    ERIC Educational Resources Information Center

    Bowne, Douglas

    1986-01-01

    Shares therapeutic and economic practices surrounding equipment used in New York's Higher Horizons adventure program of therapy for troubled youth. Encourages educators, therapists, and administrators to explore relationship between equipment selection, program goals, and clients. (NEC)

  20. Mesenchymal Stem Cells as Therapeutics

    PubMed Central

    Parekkadan, Biju; Milwid, Jack M.

    2013-01-01

    Mesenchymal stem cells (MSCs) are multipotent cells that are being clinically explored as a new therapeutic for treating a variety of immune-mediated diseases. First heralded as a regenerative therapy for skeletal tissue repair, MSCs have recently been shown to modulate endogenous tissue and immune cells. Preclinical studies of the mechanism of action suggest that the therapeutic effects afforded by MSC transplantation are short-lived and related to dynamic, paracrine interactions between MSCs and host cells. Therefore, representations of MSCs as drug-loaded particles may allow for pharmacokinetic models to predict the therapeutic activity of MSC transplants as a function of drug delivery mode. By integrating principles of MSC biology, therapy, and engineering, the field is armed to usher in the next generation of stem cell therapeutics. PMID:20415588

  1. Therapeutic Recreation and Adult Education.

    ERIC Educational Resources Information Center

    Jones, David

    1993-01-01

    Therapeutic recreation is a means of empowering individuals with disabilities through arts or sports. The field has developed differently in the United States and the United Kingdom; the former emphasizes professionalization and the latter the right to adult education. (SK)

  2. Inhalation delivery of protein therapeutics.

    PubMed

    Kane, Colleen; O'Neil, Karyn; Conk, Michelle; Picha, Kristen

    2013-04-01

    Inhaled therapeutics are used routinely to treat a variety of pulmonary diseases including asthma, COPD and cystic fibrosis. In addition, biological therapies represent the fastest growing segment of approved pharmaceuticals. However, despite the increased availability of biological therapies, nearly all inhaled therapeutics are small molecule drugs with only a single inhaled protein therapeutic approved. There remains a significant unmet need for therapeutics in pulmonary diseases, and biological therapies with potential to alter disease progression represent a significant opportunity to treat these challenging diseases. This review provides a background into efforts to develop inhaled biological therapies and highlights some of the associated challenges. In addition, we speculate on the ideal properties of a biologic therapy for inhaled delivery.

  3. RNAi therapeutics for CNS disorders.

    PubMed

    Boudreau, Ryan L; Davidson, Beverly L

    2010-06-18

    RNA interference (RNAi) is a process of sequence-specific gene silencing and serves as a powerful molecular tool to manipulate gene expression in vitro and in vivo. RNAi technologies have been applied to study gene function and validate drug targets. Researchers are investigating RNAi-based compounds as novel therapeutics to treat a variety of human diseases that are currently lacking sufficient treatment. To date, numerous studies support that RNAi therapeutics can improve disease phenotypes in various rodent models of human disease. Here, we focus on the development of RNAi-based therapies aimed at treating neurological disorders for which reduction of mutant or toxic gene expression may provide clinical benefit. We review RNAi-based gene-silencing strategies, proof-of-concept studies testing therapeutic RNAi for CNS disorders, and highlight the most recent research aimed at transitioning RNAi-based therapeutics toward clinical trials.

  4. Targeted Strategies for Henipavirus Therapeutics

    PubMed Central

    Bossart, Katharine N; Bingham, John; Middleton, Deborah

    2007-01-01

    Hendra and Nipah viruses are related emergent paramyxoviruses that infect and cause disease in animals and humans. Disease manifests as a generalized vasculitis affecting multiple organs, but is the most severe in the respiratory and central nervous systems. The high case fatality and person-to-person transmission associated with the most recent NiV outbreaks, and the recent re-emergence of HeV, emphasize the importance and necessity of effective therapeutics for these novel agents. In recent years henipavirus research has revealed a more complete understanding of pathogenesis and, as a consequence, viable approaches towards vaccines and therapeutics have emerged. All strategies target early steps in viral replication including receptor binding and membrane fusion. Animal models have been developed, some of which may prove more valuable than others for evaluating the efficacy of therapeutic agents and regimes. Assessments of protective host immunity and drug pharmacokinetics will be crucial to the further advancement of therapeutic compounds. PMID:19440455

  5. An Accurate Temperature Correction Model for Thermocouple Hygrometers

    PubMed Central

    Savage, Michael J.; Cass, Alfred; de Jager, James M.

    1982-01-01

    Numerous water relation studies have used thermocouple hygrometers routinely. However, the accurate temperature correction of hygrometer calibration curve slopes seems to have been largely neglected in both psychrometric and dewpoint techniques. In the case of thermocouple psychrometers, two temperature correction models are proposed, each based on measurement of the thermojunction radius and calculation of the theoretical voltage sensitivity to changes in water potential. The first model relies on calibration at a single temperature and the second at two temperatures. Both these models were more accurate than the temperature correction models currently in use for four psychrometers calibrated over a range of temperatures (15-38°C). The model based on calibration at two temperatures is superior to that based on only one calibration. The model proposed for dewpoint hygrometers is similar to that for psychrometers. It is based on the theoretical voltage sensitivity to changes in water potential. Comparison with empirical data from three dewpoint hygrometers calibrated at four different temperatures indicates that these instruments need only be calibrated at, e.g. 25°C, if the calibration slopes are corrected for temperature. PMID:16662241

  6. An accurate temperature correction model for thermocouple hygrometers.

    PubMed

    Savage, M J; Cass, A; de Jager, J M

    1982-02-01

    Numerous water relation studies have used thermocouple hygrometers routinely. However, the accurate temperature correction of hygrometer calibration curve slopes seems to have been largely neglected in both psychrometric and dewpoint techniques. In the case of thermocouple psychrometers, two temperature correction models are proposed, each based on measurement of the thermojunction radius and calculation of the theoretical voltage sensitivity to changes in water potential. The first model relies on calibration at a single temperature and the second at two temperatures. Both these models were more accurate than the temperature correction models currently in use for four psychrometers calibrated over a range of temperatures (15-38 degrees C). The model based on calibration at two temperatures is superior to that based on only one calibration. The model proposed for dewpoint hygrometers is similar to that for psychrometers. It is based on the theoretical voltage sensitivity to changes in water potential. Comparison with empirical data from three dewpoint hygrometers calibrated at four different temperatures indicates that these instruments need only be calibrated at, e.g. 25 degrees C, if the calibration slopes are corrected for temperature.
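    The paper's models are built from the thermojunction radius and a theoretical voltage sensitivity, which are not reproduced here. As a purely illustrative stand-in for the two-temperature variant, the sketch below interpolates a calibration slope linearly between two calibration temperatures and applies it to a reading; all numbers are hypothetical.

```python
# Hedged illustration only: linear interpolation of a calibration slope between
# two calibration temperatures. This is not the model derived in the paper.
def corrected_slope(T, T1, slope1, T2, slope2):
    """Calibration slope (e.g. microvolts per MPa) interpolated to temperature T."""
    return slope1 + (slope2 - slope1) * (T - T1) / (T2 - T1)

# hypothetical calibration slopes measured at 15 C and 35 C
slope_25 = corrected_slope(25.0, 15.0, 0.47, 35.0, 0.62)
water_potential = -1.85 / slope_25   # hypothetical -1.85 uV reading converted with the corrected slope
print(f"slope at 25 C: {slope_25:.3f} uV/MPa, inferred water potential: {water_potential:.2f} MPa")
```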

  7. Correction method for line extraction in vision measurement.

    PubMed

    Shao, Mingwei; Wei, Zhenzhong; Hu, Mengjie; Zhang, Guangjun

    2015-01-01

    Over-exposure and perspective distortion are two of the main factors underlying inaccurate feature extraction. First, based on Steger's method, we propose a method for correcting curvilinear structures (lines) extracted from over-exposed images. A new line model based on the Gaussian line profile is developed, and its description in the scale space is provided. The line position is analytically determined by the zero crossing of its first-order derivative, and the bias due to convolution with the normal Gaussian kernel function is eliminated on the basis of the related description. The model considers over-exposure features and is capable of detecting the line position in an over-exposed image. Simulations and experiments show that the proposed method is not significantly affected by the exposure level and is suitable for correcting lines extracted from an over-exposed image. In our experiments, the corrected result is found to be more precise than the uncorrected result by around 45.5%. Second, we analyze perspective distortion, which is inevitable during line extraction owing to the projective camera model. The perspective distortion can be rectified on the basis of the bias introduced as a function of related parameters. The properties of the proposed model and its application to vision measurement are discussed. In practice, the proposed model can be adopted to correct line extraction according to specific requirements by employing suitable parameters.
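    A minimal sketch of the underlying line model is given below: the line centre is taken at the zero crossing of the first derivative of a Gaussian-smoothed intensity profile, here on a synthetic, crudely saturated profile. The paper's analytical bias correction for over-exposure is not reproduced.

```python
import numpy as np

# Illustrative only: synthetic 1D line profile with crude saturation clipping.
x = np.linspace(-10.0, 10.0, 401)
profile = np.exp(-(x - 1.3) ** 2 / (2 * 1.5 ** 2))   # bright line centred at x = 1.3
profile = np.minimum(profile, 0.8)                   # saturation mimics over-exposure

sigma = 2.0
kx = np.arange(-15, 16) * (x[1] - x[0])
gauss = np.exp(-kx ** 2 / (2 * sigma ** 2))
gauss /= gauss.sum()
smoothed = np.convolve(profile, gauss, mode="same")  # Gaussian-smoothed profile

d1 = np.gradient(smoothed, x)                        # first-order derivative
# sub-pixel zero crossing of d1 near the response maximum marks the line centre
i = int(np.argmax(smoothed))
x0 = x[i] - d1[i] * (x[i + 1] - x[i]) / (d1[i + 1] - d1[i])
print(f"extracted line position: {x0:.3f} (true centre 1.3)")
```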

  8. Correction Method for Line Extraction in Vision Measurement

    PubMed Central

    Shao, Mingwei; Wei, Zhenzhong; Hu, Mengjie; Zhang, Guangjun

    2015-01-01

    Over-exposure and perspective distortion are two of the main factors underlying inaccurate feature extraction. First, based on Steger’s method, we propose a method for correcting curvilinear structures (lines) extracted from over-exposed images. A new line model based on the Gaussian line profile is developed, and its description in the scale space is provided. The line position is analytically determined by the zero crossing of its first-order derivative, and the bias due to convolution with the normal Gaussian kernel function is eliminated on the basis of the related description. The model considers over-exposure features and is capable of detecting the line position in an over-exposed image. Simulations and experiments show that the proposed method is not significantly affected by the exposure level and is suitable for correcting lines extracted from an over-exposed image. In our experiments, the corrected result is found to be more precise than the uncorrected result by around 45.5%. Second, we analyze perspective distortion, which is inevitable during line extraction owing to the projective camera model. The perspective distortion can be rectified on the basis of the bias introduced as a function of related parameters. The properties of the proposed model and its application to vision measurement are discussed. In practice, the proposed model can be adopted to correct line extraction according to specific requirements by employing suitable parameters. PMID:25984762

  9. Therapeutic Vaccines for Chronic Infections

    NASA Astrophysics Data System (ADS)

    Autran, Brigitte; Carcelain, Guislaine; Combadiere, Béhazine; Debre, Patrice

    2004-07-01

    Therapeutic vaccines aim to prevent severe complications of a chronic infection by reinforcing host defenses when some immune control, albeit insufficient, can already be demonstrated and when a conventional antimicrobial therapy either is not available or has limited efficacy. We focus on the rationale and challenges behind this still controversial strategy and provide examples from three major chronic infectious diseases (human immunodeficiency virus, hepatitis B virus, and human papillomavirus) for which the efficacy of therapeutic vaccines is currently being evaluated.

  10. Functional Gene Correction for Cystic Fibrosis in Lung Epithelial Cells Generated From Patient iPSCs

    PubMed Central

    Firth, Amy L; Menon, Tushar; Parker, Gregory S; Qualls, Susan J; Lewis, Benjamin M; Ke, Eugene; Dargitz, Carl T; Wright, Rebecca; Khanna, Ajai; Gage, Fred H; Verma, Inder M

    2015-01-01

    SUMMARY Lung disease is a major cause of death in the USA, with current therapeutic approaches only serving to manage symptoms. The most common chronic and life-threatening genetic disease of the lung is cystic fibrosis (CF), caused by mutations in the cystic fibrosis transmembrane conductance regulator (CFTR). We have generated induced pluripotent stem cells (iPSC) from CF patients carrying a homozygous deletion of F508 in the CFTR gene, which results in defective processing of CFTR to the cell membrane. This mutation was precisely corrected using CRISPR to target corrective sequences to the endogenous CFTR genomic locus, in combination with a completely excisable selection system, which significantly improved the efficiency of this correction. The corrected iPSC were subsequently differentiated to mature airway epithelial cells, where recovery of normal CFTR expression and function was demonstrated. This isogenic iPSC-based model system for CF could be adapted for the development of new therapeutic approaches. PMID:26299960

  11. 75 FR 9100 - Proxy Disclosure Enhancements; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-01

    ... COMMISSION 17 CFR Part 249 RIN 3235-AK28 Proxy Disclosure Enhancements; Correction AGENCY: Securities and Exchange Commission. ACTION: Final rule; correction. SUMMARY: We are making technical corrections to..., we are making three corrections to Form 8-K. We are correcting Form 8-K to add an instruction,...

  12. A probabilistic graphical model based stochastic input model construction

    SciTech Connect

    Wan, Jiang; Zabaras, Nicholas

    2014-09-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media.
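    The last step described above represents the reduced random field with a polynomial chaos expansion in standard random variables. The sketch below illustrates that idea in one dimension only, expanding a lognormal variable in probabilists' Hermite polynomials by Monte Carlo projection; the graphical-model structure learning and the flow-in-random-media examples are not reproduced.

```python
import numpy as np

# Illustrative only: 1D polynomial chaos expansion of Y = exp(xi), xi ~ N(0, 1).
rng = np.random.default_rng(4)
xi = rng.standard_normal(200_000)
Y = np.exp(xi)                                    # target (lognormal) random variable

def hermite_prob(k, x):
    """Probabilists' Hermite polynomials He_0..He_3."""
    return [np.ones_like(x), x, x**2 - 1, x**3 - 3 * x][k]

coeffs = []
for k in range(4):
    Hk = hermite_prob(k, xi)
    coeffs.append(np.mean(Y * Hk) / np.mean(Hk**2))   # projection onto each basis polynomial

Y_pce = sum(c * hermite_prob(k, xi) for k, c in enumerate(coeffs))
print("PCE coefficients:", np.round(coeffs, 3))        # theory: exp(0.5) / k!
print("relative L2 truncation error:", np.linalg.norm(Y - Y_pce) / np.linalg.norm(Y))
```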

  13. Model-Based Diagnostics for Propellant Loading Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Foygel, Michael; Smelyanskiy, Vadim N.

    2011-01-01

    The loading of spacecraft propellants is a complex, risky operation. Therefore, diagnostic solutions are necessary to quickly identify when a fault occurs, so that recovery actions can be taken or an abort procedure can be initiated. Model-based diagnosis solutions, established using an in-depth analysis and understanding of the underlying physical processes, offer the advanced capability to quickly detect and isolate faults, identify their severity, and predict their effects on system performance. We develop a physics-based model of a cryogenic propellant loading system, which describes the complex dynamics of liquid hydrogen filling from a storage tank to an external vehicle tank, as well as the influence of different faults on this process. The model takes into account the main physical processes such as highly nonequilibrium condensation and evaporation of the hydrogen vapor, pressurization, and also the dynamics of liquid hydrogen and vapor flows inside the system in the presence of helium gas. Since the model incorporates multiple faults in the system, it provides a suitable framework for model-based diagnostics and prognostics algorithms. Using this model, we analyze the effects of faults on the system, derive symbolic fault signatures for the purposes of fault isolation, and perform fault identification using a particle filter approach. We demonstrate the detection, isolation, and identification of a number of faults using simulation-based experiments.
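    Fault identification above is done with a particle filter over the physics-based model. The sketch below shows the same mechanism on a toy tank model: an unknown leak coefficient (the "fault") is estimated from noisy level measurements with a bootstrap particle filter. The dynamics, noise levels and priors are all assumptions, not the cryogenic loading model.

```python
import numpy as np

# Illustrative only: toy tank dynamics, assumed noise levels and priors.
rng = np.random.default_rng(5)
dt, n_steps, n_particles = 1.0, 60, 2000
true_leak = 0.03                                     # injected fault severity

def step(level, leak):                               # toy dynamics: constant fill minus leak
    return np.maximum(level + dt * (0.05 - leak * level), 0.0)

# simulate "measured" data from the faulty system
level, measurements = 1.0, []
for _ in range(n_steps):
    level = step(level, true_leak)
    measurements.append(level + rng.normal(0, 0.02))

# bootstrap particle filter over (level, leak coefficient)
levels = np.full(n_particles, 1.0)
leaks = rng.uniform(0.0, 0.1, n_particles)
for z in measurements:
    levels = step(levels, leaks) + rng.normal(0, 0.005, n_particles)   # propagate
    w = np.exp(-0.5 * ((z - levels) / 0.02) ** 2)                      # weight by likelihood
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)                    # resample
    levels, leaks = levels[idx], leaks[idx] + rng.normal(0, 0.001, n_particles)

print(f"estimated leak coefficient: {leaks.mean():.3f} (true {true_leak})")
```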

  14. Phase-field elasticity model based on mechanical jump conditions

    NASA Astrophysics Data System (ADS)

    Schneider, Daniel; Tschukin, Oleg; Choudhury, Abhik; Selzer, Michael; Böhlke, Thomas; Nestler, Britta

    2015-05-01

    Computational models based on the phase-field method typically operate on a mesoscopic length scale and resolve structural changes of the material and furthermore provide valuable information about microstructure and mechanical property relations. An accurate calculation of the stresses and mechanical energy at the transition region is therefore indispensable. We derive a quantitative phase-field elasticity model based on force balance and Hadamard jump conditions at the interface. Comparing the simulated stress profiles calculated with Voigt/Taylor (Annalen der Physik 274(12):573, 1889), Reuss/Sachs (Z Angew Math Mech 9:49, 1929) and the proposed model with the theoretically predicted stress fields in a plate with a round inclusion under hydrostatic tension, we show the quantitative characteristics of the model. In order to validate the elastic contribution to the driving force for phase transition, we demonstrate the absence of excess energy, calculated by Durga et al. (Model Simul Mater Sci Eng 21(5):055018, 2013), in a one-dimensional equilibrium condition of serial and parallel material chains. To validate the driving force for systems with curved transition regions, we relate simulations to the Gibbs-Thomson equilibrium condition (Johnson and Alexander, J Appl Phys 59(8):2735, 1986).
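    For reference, the two interface conditions named above are usually written in the standard forms below (notation assumed here, not quoted from the paper): traction continuity across the interface with unit normal n, and the Hadamard (rank-one) compatibility condition on the jump of the deformation gradient.

```latex
% Force balance (traction continuity) and Hadamard jump condition at the interface:
\[
  [\![\boldsymbol{\sigma}]\!]\,\mathbf{n}
    = \left(\boldsymbol{\sigma}^{\alpha}-\boldsymbol{\sigma}^{\beta}\right)\mathbf{n}
    = \mathbf{0},
  \qquad
  [\![\mathbf{F}]\!] = \mathbf{a}\otimes\mathbf{n}.
\]
```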

  15. Qualitative-Modeling-Based Silicon Neurons and Their Networks

    PubMed Central

    Kohno, Takashi; Sekikawa, Munehisa; Li, Jing; Nanami, Takuya; Aihara, Kazuyuki

    2016-01-01

    The ionic conductance models of neuronal cells can finely reproduce a wide variety of complex neuronal activities. However, the complexity of these models has prompted the development of qualitative neuron models. They are described by differential equations with a reduced number of variables and their low-dimensional polynomials, which retain the core mathematical structures. Such simple models form the foundation of a bottom-up approach in computational and theoretical neuroscience. We proposed a qualitative-modeling-based approach for designing silicon neuron circuits, in which the mathematical structures in the polynomial-based qualitative models are reproduced by differential equations with silicon-native expressions. This approach can realize low-power-consuming circuits that can be configured to realize various classes of neuronal cells. In this article, our qualitative-modeling-based silicon neuron circuits for analog and digital implementations are quickly reviewed. One of our CMOS analog silicon neuron circuits can realize a variety of neuronal activities with a power consumption less than 72 nW. The square-wave bursting mode of this circuit is explained. Another circuit can realize Class I and II neuronal activities with about 3 nW. Our digital silicon neuron circuit can also realize these classes. An auto-associative memory realized on an all-to-all connected network of these silicon neurons is also reviewed, in which the neuron class plays important roles in its performance. PMID:27378842

  16. Towards model-based control of Parkinson's disease.

    PubMed

    Schiff, Steven J

    2010-05-13

    Modern model-based control theory has led to transformative improvements in our ability to track the nonlinear dynamics of systems that we observe, and to engineer control systems of unprecedented efficacy. In parallel with these developments, our ability to build computational models to embody our expanding knowledge of the biophysics of neurons and their networks is maturing at a rapid rate. In the treatment of human dynamical disease, our employment of deep brain stimulators for the treatment of Parkinson's disease is gaining increasing acceptance. Thus, the confluence of these three developments--control theory, computational neuroscience and deep brain stimulation--offers a unique opportunity to create novel approaches to the treatment of this disease. This paper explores the relevant state of the art of science, medicine and engineering, and proposes a strategy for model-based control of Parkinson's disease. We present a set of preliminary calculations employing basal ganglia computational models, structured within an unscented Kalman filter for tracking observations and prescribing control. Based upon these findings, we will offer suggestions for future research and development.
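    The tracking strategy above uses an unscented Kalman filter around a neuronal model. The sketch below shows only the unscented transform at its core, propagating sigma points through a toy two-state FitzHugh-Nagumo-like map; the basal ganglia models and the control law are not reproduced, and all parameters are assumed.

```python
import numpy as np

# Illustrative only: unscented transform on a toy two-state nonlinear map.
def sigma_points(mean, cov, kappa=1.0):
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)        # matrix square root of (n+kappa)*P
    pts = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def f(x):                                            # assumed FitzHugh-Nagumo-like Euler step
    v, u = x
    return np.array([v + 0.1 * (v - v**3 / 3 - u),
                     u + 0.1 * 0.08 * (v + 0.7 - 0.8 * u)])

mean, cov = np.array([-1.0, 0.5]), 0.05 * np.eye(2)
pts, w = sigma_points(mean, cov)
prop = np.array([f(p) for p in pts])                 # propagate sigma points, not a linearization
mean_pred = w @ prop
cov_pred = (prop - mean_pred).T @ np.diag(w) @ (prop - mean_pred) + 1e-4 * np.eye(2)
print("predicted mean:", mean_pred.round(3))
print("predicted covariance:\n", cov_pred.round(4))
```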

  17. Model-based diagnosis of a carbon dioxide removal assembly

    NASA Astrophysics Data System (ADS)

    Throop, David R.; Scarl, Ethan A.

    1992-03-01

    Model-based diagnosis (MBD) has been applied to a variety of mechanisms, but few of these have been in fluid flow domains. Important mechanism variables in these domains are continuous, and the mechanisms commonly contain complex recycle patterns. These properties violate some of the common assumptions for MBD. The CO2 removal assembly (CDRA) for the cabin atmosphere aboard NASA's Space Station Freedom is such a mechanism. Early work on diagnosing similar mechanisms showed that purely associative diagnostic systems could not adequately handle these mechanisms' frequent reconfigurations. This suggested a model-based approach and KATE was adapted to the domain. KATE is a constraint-based MBD shell. It has been successfully applied to liquid flow problems in handling liquid oxygen. However, that domain does not involve complex recycle streams, but the CDRA does. KATE had solved constraint sets by propagating parameter values through constraints; this method often fails on constraint sets which describe recycle systems. KATE was therefore extended to allow it to use external algebraic programs to solve its constraint sets. This paper describes the representational challenges involved in that extension, and describes adaptations which allowed KATE to work within the representational limitations imposed by those algebraic programs. It also presents preliminary results of the CDRA modeling.

  18. Application of model based control to robotic manipulators

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1988-01-01

    A robot that can duplicate human motion capabilities in such activities as balancing, reaching, lifting, and moving has been built and tested. These capabilities are achieved through the use of real time Model-Based Control (MBC) techniques which have recently been demonstrated. MBC accounts for all manipulator inertial forces and provides stable manipulator motion control even at high speeds. To effectively demonstrate the unique capabilities of MBC, an experimental robotic manipulator was constructed, which stands upright, balancing on a two wheel base. The mathematical modeling of dynamics inherent in MBC permits the control system to perform functions that are impossible with conventional non-model based methods. These capabilities include: (1) Stable control at all speeds of operation; (2) Operations requiring dynamic stability such as balancing; (3) Detection and monitoring of applied forces without the use of load sensors; (4) Manipulator safing via detection of abnormal loads. The full potential of MBC has yet to be realized. The experiments performed for this research are only an indication of the potential applications. MBC has no inherent stability limitations and its range of applicability is limited only by the attainable sampling rate, modeling accuracy, and sensor resolution. Manipulators could be designed to operate at the highest speed mechanically attainable without being limited by control inadequacies. Manipulators capable of operating many times faster than current machines would certainly increase productivity for many tasks.
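    Model-based control of a manipulator is commonly realized in the computed-torque (inverse-dynamics) form: the control law cancels the modeled inertial and gravity terms and imposes stable error dynamics. The sketch below illustrates this for a single rigid link; the masses, gains and reference trajectory are assumed for illustration and are not taken from the experimental balancing robot described above.

```python
import numpy as np

# Illustrative only: computed-torque control of a single link, assumed parameters.
m, l, g, dt = 1.0, 0.5, 9.81, 0.001
I = m * l**2                                     # point mass at the link tip
Kp, Kv = 100.0, 20.0

def dynamics(q, qd, tau):                        # true plant: I*qdd + m*g*l*sin(q) = tau
    return (tau - m * g * l * np.sin(q)) / I

q, qd = 0.0, 0.0
for k in range(3000):
    t = k * dt
    q_ref, qd_ref, qdd_ref = np.sin(t), np.cos(t), -np.sin(t)
    # model-based control: cancel the modeled dynamics, impose stable error dynamics
    tau = I * (qdd_ref + Kv * (qd_ref - qd) + Kp * (q_ref - q)) + m * g * l * np.sin(q)
    qdd = dynamics(q, qd, tau)
    qd += qdd * dt
    q += qd * dt

print(f"tracking error after 3 s: {abs(q - np.sin(3.0)):.2e} rad")
```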

  19. Superelement model based parallel algorithm for vehicle dynamics

    NASA Astrophysics Data System (ADS)

    Agrawal, O. P.; Danhof, K. J.; Kumar, R.

    1994-05-01

    This paper presents a superelement model based parallel algorithm for planar vehicle dynamics. The vehicle model is made up of a chassis and two suspension systems each of which consists of an axle-wheel assembly and two trailing arms. In this model, the chassis is treated as a Cartesian element and each suspension system is treated as a superelement. The parameters associated with the superelements are computed using an inverse dynamics technique. Suspension shock absorbers and the tires are modeled by nonlinear springs and dampers. The Euler-Lagrange approach is used to develop the system equations of motion. This leads to a system of differential and algebraic equations in which the constraints internal to superelements appear only explicitly. The above formulation is implemented on a multiprocessor machine. The numerical flow chart is divided into modules and the computation of several modules is performed in parallel to gain computational efficiency. In this implementation, the master (parent processor) creates a pool of slaves (child processors) at the beginning of the program. The slaves remain in the pool until they are needed to perform certain tasks. Upon completion of a particular task, a slave returns to the pool. This improves the overall response time of the algorithm. The formulation presented is general, which makes it attractive for a general purpose code development. Speedups obtained in the different modules of the dynamic analysis computation are also presented. Results show that the superelement model based parallel algorithm can significantly reduce the vehicle dynamics simulation time.
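    The master/slave pool described above maps naturally onto a modern process pool: the parent creates workers once, farms out the per-superelement computations each time step, and assembles the results. The sketch below shows only that dispatch pattern, with a toy placeholder standing in for the superelement computation.

```python
from multiprocessing import Pool

# Illustrative only: the per-superelement computation is a toy placeholder.
def superelement_forces(args):
    sid, state = args
    # placeholder for the per-superelement force/parameter computation
    return sid, sum(x * x for x in state)

if __name__ == "__main__":
    states = {0: [0.1, 0.2, 0.3], 1: [0.4, 0.5, 0.6]}   # two suspension superelements
    with Pool(processes=2) as pool:                      # master creates the worker pool once
        for step in range(3):                            # reused at every integration step
            results = dict(pool.map(superelement_forces, states.items()))
            # master assembles the system equations from the per-superelement results
            print(f"step {step}: {results}")
```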

  20. Towards model-based control of Parkinson's disease

    PubMed Central

    Schiff, Steven J.

    2010-01-01

    Modern model-based control theory has led to transformative improvements in our ability to track the nonlinear dynamics of systems that we observe, and to engineer control systems of unprecedented efficacy. In parallel with these developments, our ability to build computational models to embody our expanding knowledge of the biophysics of neurons and their networks is maturing at a rapid rate. In the treatment of human dynamical disease, our employment of deep brain stimulators for the treatment of Parkinson’s disease is gaining increasing acceptance. Thus, the confluence of these three developments—control theory, computational neuroscience and deep brain stimulation—offers a unique opportunity to create novel approaches to the treatment of this disease. This paper explores the relevant state of the art of science, medicine and engineering, and proposes a strategy for model-based control of Parkinson’s disease. We present a set of preliminary calculations employing basal ganglia computational models, structured within an unscented Kalman filter for tracking observations and prescribing control. Based upon these findings, we will offer suggestions for future research and development. PMID:20368246

  1. Connectotyping: model based fingerprinting of the functional connectome.

    PubMed

    Miranda-Dominguez, Oscar; Mills, Brian D; Carpenter, Samuel D; Grant, Kathleen A; Kroenke, Christopher D; Nigg, Joel T; Fair, Damien A

    2014-01-01

    A better characterization of how an individual's brain is functionally organized will likely bring dramatic advances to many fields of study. Here we show a model-based approach toward characterizing resting state functional connectivity MRI (rs-fcMRI) that is capable of identifying a so-called "connectotype", or functional fingerprint in individual participants. The approach rests on a simple linear model that proposes the activity of a given brain region can be described by the weighted sum of its functional neighboring regions. The resulting coefficients correspond to a personalized model-based connectivity matrix that is capable of predicting the timeseries of each subject. Importantly, the model itself is subject specific and has the ability to predict an individual at a later date using a limited number of non-sequential frames. While we show that there is a significant amount of shared variance between models across subjects, the model's ability to discriminate an individual is driven by unique connections in higher order control regions in frontal and parietal cortices. Furthermore, we show that the connectotype is present in non-human primates as well, highlighting the translational potential of the approach.
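    The connectotype model described above is a set of per-region linear regressions: each region's timeseries is fit as a weighted sum of all other regions, and the fitted weights form the subject-specific connectivity matrix. The sketch below illustrates this on synthetic data; frame counts, parcellation size and preprocessing are assumptions, not the rs-fcMRI pipeline used in the paper.

```python
import numpy as np

# Illustrative only: synthetic stand-in for an rs-fcMRI timeseries matrix.
rng = np.random.default_rng(7)
n_frames, n_regions = 300, 20
ts = rng.standard_normal((n_frames, n_regions))           # frames x regions

connectotype = np.zeros((n_regions, n_regions))            # zero diagonal by construction
for i in range(n_regions):
    others = np.delete(np.arange(n_regions), i)
    beta, *_ = np.linalg.lstsq(ts[:, others], ts[:, i], rcond=None)
    connectotype[i, others] = beta                          # weights predicting region i

predicted = ts @ connectotype.T                             # model-predicted timeseries
mean_r = np.mean([np.corrcoef(ts[:, i], predicted[:, i])[0, 1] for i in range(n_regions)])
print("connectotype shape:", connectotype.shape)
print("mean prediction correlation:", round(float(mean_r), 3))
```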

  2. Elastic therapeutic tape: do they have the same material properties?

    PubMed Central

    Boonkerd, Chuanpis; Limroongreungrat, Weerawat

    2016-01-01

    [Purpose] Elastic therapeutic tape has been widely used for rehabilitation and treatment of sports injuries. Tapes with different elastic properties serve different treatment purposes, and inappropriate tension reduces tape effectiveness. Many tapes are available on the market, but studies on tape properties are limited. The aim of this study was to examine the material properties of elastic therapeutic tape. [Subjects and Methods] Brands of elastic therapeutic tape included KinesioTex®, ATex, Mueller, 3M, and ThaiTape. The Material Testing System Insight® 1 Electromechanical Testing Systems was used to apply a tensile force on elastic therapeutic tape. Ten specimens of each brand were tested. Stress, load, and Young’s modulus at 25%, 50%, 75%, 100%, and maximum point were collected. One-way analysis of variance with post hoc testing was used to analyze tape parameters. [Results] Maximum elongation and Young’s modulus at all percentages were significantly different between brands. There were no differences in maximum load and maximum stress. [Conclusion] Mechanical properties are different for commercial elastic therapeutic tapes. Physiotherapists and other clinicians should be aware of mechanical tape properties to correctly apply kinesio tape. PMID:27190472

  3. Elastic therapeutic tape: do they have the same material properties?

    PubMed

    Boonkerd, Chuanpis; Limroongreungrat, Weerawat

    2016-04-01

    [Purpose] Elastic therapeutic tape has been widely used for rehabilitation and treatment of sports injuries. Tapes with different elastic properties serve different treatment purposes, and inappropriate tension reduces tape effectiveness. Many tapes are available on the market, but studies on tape properties are limited. The aim of this study was to examine the material properties of elastic therapeutic tape. [Subjects and Methods] Brands of elastic therapeutic tape included KinesioTex(®), ATex, Mueller, 3M, and ThaiTape. The Material Testing System Insight(®) 1 Electromechanical Testing Systems was used to apply a tensile force on elastic therapeutic tape. Ten specimens of each brand were tested. Stress, load, and Young's modulus at 25%, 50%, 75%, 100%, and maximum point were collected. One-way analysis of variance with post hoc testing was used to analyze tape parameters. [Results] Maximum elongation and Young's modulus at all percentages were significantly different between brands. There were no differences in maximum load and maximum stress. [Conclusion] Mechanical properties are different for commercial elastic therapeutic tapes. Physiotherapists and other clinicians should be aware of mechanical tape properties to correctly apply kinesio tape.

  4. Five-Year Outcomes of Therapeutic Community Treatment of Drug-Involved Offenders after Release from Prison

    ERIC Educational Resources Information Center

    Inciardi, James A.; Martin, Steven S.; Butzin, Clifford A.

    2004-01-01

    With growing numbers of drug-involved offenders, substance abuse treatment has become a critical part of corrections. A multistage therapeutic community implemented in the Delaware correctional system has as its centerpiece a residential treatment program during work release--the transition between prison and community. An evaluation of this…

  5. What is a Therapeutic HIV Vaccine?

    MedlinePlus

    ... Services HIV Overview What is a Therapeutic HIV Vaccine? (Last updated 10/17/2016; last reviewed 10/ ... from the body. What is a therapeutic HIV vaccine? A therapeutic HIV vaccine is a vaccine that’s ...

  6. A new approach to account for the medium-dependent effect in model-based dose calculations for kilovoltage x-rays.

    PubMed

    Pawlowski, Jason M; Ding, George X

    2011-07-07

    This study presents a new approach to accurately account for the medium-dependent effect in model-based dose calculations for kilovoltage (kV) x-rays. This approach is based on the hypothesis that the correction factors needed to convert dose from model-based dose calculations to absorbed dose-to-medium depend on both the attenuation characteristics of the absorbing media and the changes to the energy spectrum of the incident x-rays as they traverse media with an effective atomic number different than that of water. Using Monte Carlo simulation techniques, we obtained empirical medium-dependent correction factors that take both effects into account. We found that the correction factors can be expressed as a function of a single quantity, called the effective bone depth, which is a measure of the amount of bone that an x-ray beam must penetrate to reach a voxel. Since the effective bone depth can be calculated from volumetric patient CT images, the medium-dependent correction factors can be obtained for model-based dose calculations based on patient CT images. We tested the accuracy of this new approach on 14 patients for the case of calculating imaging dose from kilovoltage cone-beam computed tomography used for patient setup in radiotherapy, and compared it with the Monte Carlo method, which is regarded as the 'gold standard'. For all patients studied, the new approach resulted in mean dose errors of less than 3%. This is in contrast to current available inhomogeneity corrected methods, which have been shown to result in mean errors of up to -103% for bone and 8% for soft tissue. Since there is a huge gain in the calculation speed relative to the Monte Carlo method (∼two orders of magnitude) with an acceptable loss of accuracy, this approach provides an alternative accurate dose calculation method for kV x-rays.
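
    As a rough illustration of the quantity this abstract builds on, the sketch below computes an "effective bone depth" by accumulating bone thickness along a single straight beam direction through a thresholded CT volume and then applies a placeholder correction factor. The beam geometry, the HU threshold, and the linear form of the correction are assumptions made for illustration; the published factors are fitted to Monte Carlo data.

      import numpy as np

      def effective_bone_depth(bone_mask, voxel_size_cm, beam_axis=0):
          # Cumulative bone thickness (cm) traversed along one axis before
          # reaching each voxel; straight-line, single-direction geometry is
          # an illustrative simplification.
          upstream_bone = np.cumsum(bone_mask, axis=beam_axis) - bone_mask
          return upstream_bone * voxel_size_cm

      def medium_correction(depth_cm, a=1.0, b=0.04):
          # Placeholder monotone mapping from effective bone depth to a
          # correction factor; real factors would be fitted to Monte Carlo data.
          return a + b * depth_cm

      rng = np.random.default_rng(1)
      hu = rng.normal(0.0, 300.0, size=(32, 32, 32))     # stand-in CT volume (HU)
      bone_mask = (hu > 250.0).astype(float)             # crude bone segmentation
      depth = effective_bone_depth(bone_mask, voxel_size_cm=0.1)
      model_dose = np.ones_like(depth)                   # dose from a model-based engine
      dose_to_medium = model_dose * medium_correction(depth)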

  7. 78 FR 76193 - Special Notice; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-16

    ... Questionnaire)] Special Notice; Correction AGENCY: Veterans Benefits Administration, Department of Veterans Affairs. ACTION: Notice; correction. SUMMARY: The Department of Veterans Affairs (VA) published a... of Veterans Affairs, 810 Vermont Avenue NW., Washington, DC 20420, at (202) 632-7492. Correction...

  8. 77 FR 60039 - Availability of Records; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-02

    ... Part 1631 Availability of Records; Correction AGENCY: Federal Retirement Thrift Investment Board. ACTION: Direct final rule; correction. SUMMARY: The Federal Retirement Thrift Investment Board (Agency... corrections to FRTIB regulations stemming from the direct final rule published in the February 27,...

  9. String-Corrected Black Holes

    SciTech Connect

    Hubeny, Veronika; Maloney, Alexander; Rangamani, Mukund

    2005-02-07

    We investigate the geometry of four dimensional black hole solutions in the presence of stringy higher curvature corrections to the low energy effective action. For certain supersymmetric two charge black holes these corrections drastically alter the causal structure of the solution, converting seemingly pathological null singularities into timelike singularities hidden behind a finite area horizon. We establish, analytically and numerically, that the string-corrected two-charge black hole metric has the same Penrose diagram as the extremal four-charge black hole. The higher derivative terms lead to another dramatic effect -- the gravitational force exerted by a black hole on an inertial observer is no longer purely attractive! The magnitude of this effect is related to the size of the compactification manifold.

  10. Universality of quantum gravity corrections.

    PubMed

    Das, Saurya; Vagenas, Elias C

    2008-11-28

    We show that the existence of a minimum measurable length and the related generalized uncertainty principle (GUP), predicted by theories of quantum gravity, influence all quantum Hamiltonians. Thus, they predict quantum gravity corrections to various quantum phenomena. We compute such corrections to the Lamb shift, the Landau levels, and the tunneling current in a scanning tunneling microscope. We show that these corrections can be interpreted in two ways: (a) either that they are exceedingly small, beyond the reach of current experiments, or (b) that they predict upper bounds on the quantum gravity parameter in the GUP, compatible with experiments at the electroweak scale. Thus, more accurate measurements in the future should either be able to test these predictions, or further tighten the above bounds and predict an intermediate length scale between the electroweak and the Planck scale.

  11. Error Field Correction in ITER

    SciTech Connect

    Park, Jong-kyu; Boozer, Allen H.; Menard, Jonathan E.; Schaffer, Michael J.

    2008-05-22

    A new method for correcting magnetic field errors in the ITER tokamak is developed using the Ideal Perturbed Equilibrium Code (IPEC). The dominant external magnetic field for driving islands is shown to be localized to the outboard midplane for three ITER equilibria that represent the projected range of operational scenarios. The coupling matrices between the poloidal harmonics of the external magnetic perturbations and the resonant fields on the rational surfaces that drive islands are combined for different equilibria and used to determine an ordered list of the dominant errors in the external magnetic field. It is found that efficient and robust error field correction is possible with a fixed setting of the correction currents relative to the currents in the main coils across the range of ITER operating scenarios that was considered.
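
    One way to reproduce the flavor of this analysis, combining coupling matrices from several equilibria and ranking the external error-field distributions that most strongly drive resonant fields, is a singular value decomposition of the stacked matrices. The matrices below are random stand-ins, and the stacking-plus-SVD step is an assumed simplification of the IPEC-based procedure.

      import numpy as np

      rng = np.random.default_rng(9)
      # Stand-in coupling matrices C_i: external-field harmonics (columns) to
      # resonant fields on rational surfaces (rows), one per equilibrium.
      couplings = [rng.standard_normal((8, 20)) for _ in range(3)]

      # Stack the equilibria and take an SVD; the leading right singular vectors
      # give an ordered list of the external-field distributions that most
      # strongly drive islands across the scenarios considered.
      C = np.vstack(couplings)
      _, s, Vt = np.linalg.svd(C, full_matrices=False)
      dominant_error_modes = Vt[:3]
      print("relative importance of leading modes:", np.round(s[:3] / s[0], 2))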

  12. Corrections in Montana: A Consultation on Corrections in Montana.

    ERIC Educational Resources Information Center

    Montana State Advisory Committee to the U.S. Commission on Civil Rights, Helena.

    The findings and recommendations of a two-day conference on the civil and human rights of inmates of Montana's correctional institutions are contained in this report. The views of private citizens and experts from local, state, and federal organizations are presented in edited form under seven subject headings: existing prison reform legislation,…

  13. When Correction Turns Positive: Processing Corrective Prosody in Dutch

    PubMed Central

    Dimitrova, Diana V.; Stowe, Laurie A.; Hoeks, John C. J.

    2015-01-01

    Current research on spoken language does not provide a consistent picture as to whether prosody, the melody and rhythm of speech, conveys a specific meaning. Perception studies show that English listeners assign meaning to prosodic patterns, and, for instance, associate some accents with contrast, whereas Dutch listeners behave more controversially. In two ERP studies we tested how Dutch listeners process words carrying two types of accents, which either provided new information (new information accents) or corrected information (corrective accents), both in single sentences (experiment 1) and after corrective and new information questions (experiment 2). In both experiments corrective accents elicited a sustained positivity as compared to new information accents, which started earlier in context than in single sentences. The positivity was not modulated by the nature of the preceding question, suggesting that the underlying neural mechanism likely reflects the construction of an interpretation to the accented word, either by identifying an alternative in context or by inferring it when no context is present. Our experimental results provide strong evidence for inferential processes related to prosodic contours in Dutch. PMID:25973607

  14. Correction.

    PubMed

    2015-03-01

    In the January 2015 issue of Cyberpsychology, Behavior, and Social Networking (vol. 18, no. 1, pp. 3–7), the article "Individual Differences in Cyber Security Behaviors: An Examination of Who Is Sharing Passwords." by Prof. Monica Whitty et al., has an error in wording in the abstract. The sentence in question was originally printed as: Contrary to our hypotheses, we found older people and individuals who score high on self-monitoring were more likely to share passwords. It should read: Contrary to our hypotheses, we found younger people and individuals who score high on self-monitoring were more likely to share passwords. The authors wish to apologize for the error.

  15. Correction.

    PubMed

    1991-09-25

    In 'Temperature taking - getting it right' (Nursing Standard, December 12, 1990), the author erroneously referred to TempaDOT thermometers as using liquid crystal indicators. They in fact function through a colour change, using a mix of two organic chemicals.

  16. Correction

    NASA Astrophysics Data System (ADS)

    2016-09-01

    The feature article “Neutrons for new drugs” (August pp26-29) stated that neutron crystallography was used to determine the structures of “well-known complex biological molecules such as lysine, insulin and trypsin”.

  17. Correction.

    PubMed

    1992-12-11

    Last month, the U.S. Postal Service (USPS) prompted a 13 November Random Sample naming a group of scientists whose faces were appearing, USPS said, on stamps belonging to its Black Heritage Series. Among them: chemist Percy Lavon Julian; George Washington Carver; physician Charles R. Drew; astronomer and mathematician Benjamin Banneker; and inventor Jan Matzeliger. Science readers knew better. Two of the quintet appeared years ago: a stamp bearing Carver's picture was issued in 1948, and Drew appeared in the Great Americans Series in 1981.

  18. Corrections

    NASA Astrophysics Data System (ADS)

    2004-05-01

    1. The first photograph on p12 of News in Physics Education January 2004 is of Prof. Paul Black and not Prof. Jonathan Osborne, as stated. 2. The review of Flowlog on p209 of the March 2004 issue wrongly gives the maximum sampling rate of the analogue inputs as 25 kHz (40 ms) instead of 25 kHz (40 µs) and the digital inputs as 100 kHz (10 ms) instead of 100 kHz (10 µs). 3. The letter entitled 'A trial of two energies' by Eric McIldowie on pp212-4 of the March 2004 issue was edited to fit the space available. We regret that a few small errors were made in doing this. Rather than detail these, the interested reader can access the whole of the original letter as a Word file from the link below.

  19. Correction

    NASA Astrophysics Data System (ADS)

    2014-09-01

    An error was made in the spelling of the name of Noah Petro, who was quoted in the news article "The summer of supermoons," published in the 19 August 2014 issue of Eos (95(33), 297, doi:10.1002/2014EO330005). Eos regrets the error.

  20. Correction

    NASA Astrophysics Data System (ADS)

    2014-09-01

    An error was made in the spelling of the name of Mark Abbott, who was quoted in the news article "Earth observation plan looks toward balancing U.S. federal priorities," published in the 19 August 2014 issue of Eos (95(33), 295-296, doi:10.1002/2014EO330003). Eos regrets the error.

  1. Correction

    NASA Astrophysics Data System (ADS)

    2014-08-01

    In the About AGU article "AGU Union Fellows elected for 2014," published in the 29 July 2014 issue of Eos (95(30), 272, doi:10.1002/2014EO300008), a joint research group affiliation was inadvertently omitted for one Fellow. Antje Boetius is with the Alfred Wegener Institute, Bremerhaven, Germany, and the Max Planck Institute for Marine Microbiology, Bremen, Germany.

  2. Correction

    NASA Astrophysics Data System (ADS)

    2013-03-01

    In the 5 March 2013 issue of Eos, the Forum, "Consider nominating a woman for an AGU award" (Eos, 94(10), 99, doi:10.1002/2013EO100003), incorrectly stated that of the three female AGU medalists in 2012, two became new Fellows at that time. This was based on incorrect information in AGU's records. In fact, one of the three female 2012 medalists became a new Fellow at that time; the other two had been elected Fellows earlier.

  3. Automated Confocal Microscope Bias Correction

    NASA Astrophysics Data System (ADS)

    Dorval, Thierry; Genovesio, Auguste

    2006-10-01

    Illumination artifacts systematically occur in 2D cross-section confocal microscopy imaging. These biases can strongly corrupt higher-level image processing such as segmentation, fluorescence evaluation, or pattern extraction/recognition. This paper presents a new, fully automated bias correction methodology based on preprocessing of a large image database. The method is well suited to High Content Screening (HCS), a method dedicated to drug discovery. Our method assumes that the number of images available is large enough to allow a reliable statistical computation of an average bias image. A relevant segmentation evaluation protocol and experimental results validate our correction algorithm, which improves object extraction compared with uncorrected images.
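
    A minimal sketch of the core idea, assuming the screening plate provides many independent fields of view: averaging them cancels specimen structure and leaves the systematic illumination pattern, which is then divided out of each image. Array sizes and the flat-field-style division are illustrative choices.

      import numpy as np

      def estimate_bias(images):
          # Average many independent fields of view; specimen structure averages
          # out while the systematic illumination (bias) pattern remains.
          bias = images.mean(axis=0)
          return bias / bias.mean()                # normalise to unit mean

      def correct(image, bias, eps=1e-6):
          # Flat-field-style correction: divide out the normalised bias.
          return image / (bias + eps)

      stack = np.random.default_rng(2).random((500, 64, 64))   # stand-in for an HCS plate
      bias = estimate_bias(stack)
      corrected = correct(stack[0], bias)
      print(corrected.shape, float(bias.mean()))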

  4. DARHT Radiographic Grid Scale Correction

    SciTech Connect

    Warthen, Barry J.

    2015-02-13

    Recently it became apparent that the radiographic grid which has been used to calibrate the dimensional scale of DARHT radiographs was not centered at the location where the objects have been centered. This offset produced an error of 0.188% in the dimensional scaling of the radiographic images processed using the assumption that the grid and objects had the same center. This paper will show the derivation of the scaling correction, explain how new radiographs are being processed to account for the difference in location, and provide the details of how to correct radiographic images processed with the erroneous scale factor.
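
    For readers who only need the magnitude involved, the two-line calculation below shows what a 0.188% scale error means for a measured dimension. The sign convention (whether the erroneous scale over- or under-states sizes) is an assumption here; the paper derives the exact correction.

      scale_error = 0.00188                 # 0.188 % scaling error
      measured_mm = 125.0                   # dimension taken from an old radiograph
      corrected_mm = measured_mm / (1.0 + scale_error)
      print(f"corrected dimension: {corrected_mm:.3f} mm")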

  5. Model-based fault diagnosis in continuous dynamic systems.

    PubMed

    Lo, C H; Wong, Y K; Rad, A B

    2004-07-01

    Traditional fault detection and isolation methods are based on quantitative models which are sometimes difficult and costly to obtain. In this paper, qualitative bond graph (QBG) reasoning is adopted as the modeling scheme to generate a set of qualitative equations. The QBG method provides a unified approach for modeling engineering systems, in particular, mechatronic systems. An input-output qualitative equation derived from QBG formalism performs continuous system monitoring. Fault diagnosis is activated when a discrepancy is observed between measured abnormal behavior and predicted system behavior. Genetic algorithms (GAs) are then used to search for possible faulty components among a system of qualitative equations. In order to demonstrate the performance of the proposed algorithm, we have tested it on a laboratory-scale servo-tank liquid process rig. Results of the proposed model-based fault detection and diagnosis algorithm for the process rig are presented and discussed.

  6. Model-based design of peptide chromatographic purification processes.

    PubMed

    Gétaz, David; Stroehlein, Guido; Butté, Alessandro; Morbidelli, Massimo

    2013-04-05

    In this work we present a general procedure for the model-based optimization of a polypeptide crude mixture purification process through its application to a case of industrial relevance. This is done to show how much modeling can be beneficial to optimize complex chromatographic processes in the industrial environment. The target peptide elution profile was modeled with a two sites adsorption equilibrium isotherm exhibiting two inflection points. The variation of the isotherm parameters with the modifier concentration was accounted for. The adsorption isotherm parameters of the target peptide were obtained by the inverse method. The elution of the impurities was approximated by lumping them into pseudo-impurities and by regressing their adsorption isotherm parameters directly as a function of the corresponding parameters of the target peptide. After model calibration and validation by comparison with suitable experimental data, Pareto optimizations of the process were carried out so as to select the optimal batch process.

  7. Model-based advanced process control of coagulation.

    PubMed

    Baxter, C W; Shariff, R; Stanley, S J; Smith, D W; Zhang, Q; Saumer, E D

    2002-01-01

    The drinking water treatment industry has seen a recent increase in the use of artificial neural networks (ANNs) for process modelling and offline process control tools and applications. While conceptual frameworks for integrating the ANN technology into the real-time control of complex treatment processes have been proposed, actual working systems have yet to be developed. This paper presents development and application of an ANN model-based advanced process control system for the coagulation process at a pilot-scale water treatment facility in Edmonton, Alberta, Canada. The system was successfully used to maintain a user-defined set point for effluent quality, by automatically varying operating conditions in response to changes in influent water quality. This new technology has the potential to realize significant operational cost saving for utilities when applied in full-scale applications.

  8. Enhancements to the KATE model-based reasoning system

    NASA Technical Reports Server (NTRS)

    Thomas, Stan J.

    1994-01-01

    KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. This report describes two software efforts which enhance the functionality and usability of KATE. The first addition, a flow solver, adds to KATE a tool for modeling the flow of liquid in a pipe system. The second addition adds support for editing KATE knowledge base files to the Emacs editor. The body of this report discusses design and implementation issues having to do with these two tools. It will be useful to anyone maintaining or extending either the flow solver or the editor enhancements.

  9. Performability modeling based on real data: A case study

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.

    1987-01-01

    Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.

  10. Numerical Analysis of Modeling Based on Improved Elman Neural Network

    PubMed Central

    Jie, Shao

    2014-01-01

    A model based on the improved Elman neural network (IENN) is proposed to analyze nonlinear circuits with the memory effect. The hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions in this model. The error curves of the sum of squared error (SSE) varying with the number of hidden neurons and the iteration step are studied to determine the number of hidden layer neurons. Simulation results for the half-bridge class-D power amplifier (CDPA) with a two-tone signal and broadband signals as input have shown that the proposed behavioral model can reconstruct the CDPA system accurately and depict the memory effect of CDPAs well. Compared with the Volterra-Laguerre (VL) model, Chebyshev neural network (CNN) model, and basic Elman neural network (BENN) model, the proposed model has better performance. PMID:25054172
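
    The sketch below shows the basic ingredient highlighted here, hidden units built from Chebyshev orthogonal basis functions rather than sigmoids, in a deliberately simplified feedforward form fitted by least squares. The toy amplifier nonlinearity, the basis order, and the omission of the Elman context (recurrent) units are assumptions made to keep the example short.

      import numpy as np

      def chebyshev_basis(x, order=6):
          # Evaluate T_0..T_{order-1} at x (x assumed scaled to [-1, 1]).
          T = [np.ones_like(x), x]
          for _ in range(2, order):
              T.append(2 * x * T[-1] - T[-2])
          return np.stack(T[:order], axis=-1)

      # One-step behavioural model y = f(x) with a Chebyshev-activated hidden
      # layer, fitted by linear least squares on the basis outputs.
      rng = np.random.default_rng(8)
      x = rng.uniform(-1, 1, 400)
      y = np.tanh(2 * x) + 0.3 * x**3 + 0.01 * rng.standard_normal(400)  # toy nonlinear amplifier
      Phi = chebyshev_basis(x, order=6)
      w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
      print("training RMSE:", float(np.sqrt(np.mean((Phi @ w - y) ** 2))))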

  11. Model-based approach for elevator performance estimation

    NASA Astrophysics Data System (ADS)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated based on the estimated car acceleration. The proposed procedure is experimentally evaluated by comparing the key performance indicators calculated from the estimated car acceleration with the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with the sliding mode observer.
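
    As a compact illustration of the observer step, the sketch below runs a discrete-time Kalman filter on a constant-acceleration kinematic model and estimates car acceleration from noisy position (encoder) samples. The state-space model, noise covariances, and ride profile are assumptions for illustration; the model described here also includes the electrical machine and its control loop.

      import numpy as np

      dt = 0.01                                    # encoder sampling period (s), illustrative
      # Constant-acceleration kinematic model: state = [position, velocity, acceleration]
      F = np.array([[1, dt, 0.5 * dt**2],
                    [0, 1, dt],
                    [0, 0, 1]])
      H = np.array([[1.0, 0.0, 0.0]])              # only position (encoder) is measured
      Q = 1e-4 * np.eye(3)                         # process noise (assumed)
      R = np.array([[1e-6]])                       # encoder noise (assumed)

      x = np.zeros((3, 1))
      P = np.eye(3)

      def kalman_step(x, P, z):
          # Predict with the kinematic model, then correct with the encoder sample z.
          x = F @ x
          P = F @ P @ F.T + Q
          y = z - H @ x
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          x = x + K @ y
          P = (np.eye(3) - K @ H) @ P
          return x, P

      t = np.arange(0, 5, dt)
      true_pos = 0.5 * 0.8 * t**2                  # ride with 0.8 m/s^2 acceleration
      for z in true_pos + 1e-3 * np.random.default_rng(3).standard_normal(t.size):
          x, P = kalman_step(x, P, np.array([[z]]))
      print("estimated acceleration:", float(x[2]))  # basis for ride-quality indicators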

  12. On the Performance of Stochastic Model-Based Image Segmentation

    NASA Astrophysics Data System (ADS)

    Lei, Tianhu; Sewchand, Wilfred

    1989-11-01

    A new stochastic model-based image segmentation technique for X-ray CT images has been developed and has been extended to the more general nondiffraction CT images, which include MRI, SPECT, and certain types of ultrasound images [1,2]. The nondiffraction CT image is modeled by a Finite Normal Mixture. The technique utilizes the information theoretic criterion to detect the number of the region images, uses the Expectation-Maximization algorithm to estimate the parameters of the image, and uses the Bayesian classifier to segment the observed image. How does this technique over/under-estimate the number of the region images? What is the probability of errors in the segmentation of this technique? This paper addresses these two problems and is a continuation of [1,2].
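
    The pipeline outlined here, a finite normal mixture fitted by EM, a model order chosen by an information criterion, and pixels labeled by a Bayes classifier, can be sketched with scikit-learn as below. BIC stands in for the information-theoretic criterion used in the paper, and the synthetic three-class image is purely illustrative.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(4)
      # Synthetic "CT slice": three tissue classes with Gaussian intensities
      image = np.concatenate([rng.normal(m, 8, 2000) for m in (40, 110, 190)]).reshape(60, 100)
      pixels = image.reshape(-1, 1)

      # Fit finite normal mixtures by EM and pick the model order with an
      # information criterion (BIC here, as a stand-in).
      models = {k: GaussianMixture(n_components=k, random_state=0).fit(pixels) for k in range(1, 6)}
      best_k = min(models, key=lambda k: models[k].bic(pixels))

      # Bayes classification of each pixel under the selected mixture
      labels = models[best_k].predict(pixels).reshape(image.shape)
      print("selected number of regions:", best_k, "label counts:", np.bincount(labels.ravel()))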

  13. [Model-based biofuels system analysis: a review].

    PubMed

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  14. A Novel Software Evolution Model Based on Software Networks

    NASA Astrophysics Data System (ADS)

    Pan, Weifeng; Li, Bing; Ma, Yutao; Liu, Jing

    Many published papers have analyzed the forming mechanisms and evolution laws of OO software systems from the perspectives of software reuse, software patterns, etc. So far, however, few models have been built solely on software components such as methods and classes and their interactions. In this paper, a novel Software Evolution Model based on Software Networks (called SEM-SN) is proposed. It uses a software network at class level to represent software systems, and uses the software network’s dynamical generating process to simulate activities in the real software development process, such as new classes’ dynamical creations and their dynamical interactions with already existing classes. It also introduces the concept of node/edge ageing to describe the decaying of classes with time. Empirical results on eight open-source Object-Oriented (OO) software systems demonstrate that SEM-SN roughly describes the evolution process of software systems and the emergence of their complex network characteristics.

  15. Model-based condition monitoring for lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Kim, Taesic; Wang, Yebin; Fang, Huazhen; Sahinoglu, Zafer; Wada, Toshihiro; Hara, Satoshi; Qiao, Wei

    2015-11-01

    Condition monitoring for batteries involves tracking changes in physical parameters and operational states such as state of health (SOH) and state of charge (SOC), and is fundamentally important for building high-performance and safety-critical battery systems. A model-based condition monitoring strategy is developed in this paper for lithium-ion batteries on the basis of an electrical circuit model incorporating the hysteresis effect. It systematically integrates 1) a fast upper-triangular and diagonal recursive least squares algorithm for parameter identification of the battery model, 2) a smooth variable structure filter for the SOC estimation, and 3) a recursive total least squares algorithm for estimating the maximum capacity, which indicates the SOH. The proposed solution enjoys advantages including high accuracy, low computational cost, and simple implementation, and therefore is suitable for deployment and use in real-time embedded battery management systems (BMSs). Simulations and experiments validate the effectiveness of the proposed strategy.
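
    To make the parameter identification step concrete, the sketch below runs a plain exponentially weighted recursive least squares estimator on a hypothetical first-order battery regression (terminal voltage explained by its previous value and the present and previous current). The regression form, forgetting factor, and synthetic data are assumptions; the strategy described here uses a faster upper-triangular and diagonal factorized RLS variant.

      import numpy as np

      class RecursiveLeastSquares:
          # Plain exponentially weighted RLS; illustrates the estimation idea only.
          def __init__(self, n_params, lam=0.995):
              self.theta = np.zeros(n_params)
              self.P = 1e3 * np.eye(n_params)
              self.lam = lam

          def update(self, phi, y):
              phi = np.asarray(phi, dtype=float).ravel()
              Pphi = self.P @ phi
              k = Pphi / (self.lam + phi @ Pphi)            # gain vector
              self.theta = self.theta + k * (y - phi @ self.theta)
              self.P = (self.P - np.outer(k, phi) @ self.P) / self.lam
              return self.theta

      # Hypothetical regression form: v[k] = a*v[k-1] + b*i[k] + c*i[k-1] + noise
      rls = RecursiveLeastSquares(3)
      rng = np.random.default_rng(5)
      v_prev, i_prev = 3.7, 0.0
      for _ in range(500):
          i_now = rng.uniform(-2, 2)                         # measured current (A)
          v_now = 0.98 * v_prev + 0.05 * i_now + 0.02 * i_prev + 1e-3 * rng.standard_normal()
          rls.update([v_prev, i_now, i_prev], v_now)
          v_prev, i_prev = v_now, i_now
      print("identified parameters:", rls.theta)             # feed SOC/SOH estimators downstream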

  16. The ubiquity of model-based reinforcement learning.

    PubMed

    Doll, Bradley B; Simon, Dylan A; Daw, Nathaniel D

    2012-12-01

    The reward prediction error (RPE) theory of dopamine (DA) function has enjoyed great success in the neuroscience of learning and decision-making. This theory is derived from model-free reinforcement learning (RL), in which choices are made simply on the basis of previously realized rewards. Recently, attention has turned to correlates of more flexible, albeit computationally complex, model-based methods in the brain. These methods are distinguished from model-free learning by their evaluation of candidate actions using expected future outcomes according to a world model. Puzzlingly, signatures from these computations seem to be pervasive in the very same regions previously thought to support model-free learning. Here, we review recent behavioral and neural evidence about these two systems, in attempt to reconcile their enigmatic cohabitation in the brain.
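
    The distinction drawn here can be made concrete with a toy Markov decision process: a model-free learner updates action values incrementally from sampled transitions, while a model-based learner plans by iterating over an explicit transition and reward model. The small random MDP, learning rate, and iteration counts below are arbitrary illustrative choices.

      import numpy as np

      n_states, n_actions, gamma = 4, 2, 0.9
      rng = np.random.default_rng(6)
      # A small random MDP standing in for the task environment
      P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))   # transition model
      R = rng.random((n_states, n_actions))                              # reward model

      # Model-free: incremental Q-learning from sampled transitions
      Q_mf = np.zeros((n_states, n_actions))
      s = 0
      for _ in range(5000):
          a = rng.integers(n_actions)
          s2 = rng.choice(n_states, p=P[s, a])
          Q_mf[s, a] += 0.1 * (R[s, a] + gamma * Q_mf[s2].max() - Q_mf[s, a])
          s = s2

      # Model-based: plan with the (here, known) world model via value iteration
      Q_mb = np.zeros((n_states, n_actions))
      for _ in range(200):
          Q_mb = R + gamma * P @ Q_mb.max(axis=1)
      print(np.round(Q_mf, 2), np.round(Q_mb, 2), sep="\n")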

  17. Model-based parameterisation of a hydrocyclone air-core

    PubMed

    Podd; Schlaberg; Hoyle

    2000-03-01

    An important metric for the accurate control of a hydrocyclone is the diameter of its air-core. Ultrasonic data from a 16-transducer, 1.5 MHz pulse-echo tomographic system are analysed to determine the variation of the air-core diameter with various operating conditions. The back-projection image reconstruction method is not accurate enough for this task. Sub-millimetre accuracy is obtained, however, by applying a combination of signal processing and model-based reconstruction, using the fact that there is a small variation in the air-core boundary position. The findings correspond well to the results obtained from X-ray and electrical resistance modalities.

  18. The algorithmic anatomy of model-based evaluation.

    PubMed

    Daw, Nathaniel D; Dayan, Peter

    2014-11-05

    Despite many debates in the first half of the twentieth century, it is now largely a truism that humans and other animals build models of their environments and use them for prediction and control. However, model-based (MB) reasoning presents severe computational challenges. Alternative, computationally simpler, model-free (MF) schemes have been suggested in the reinforcement learning literature, and have afforded influential accounts of behavioural and neural data. Here, we study the realization of MB calculations, and the ways that this might be woven together with MF values and evaluation methods. There are as yet mostly only hints in the literature as to the resulting tapestry, so we offer more preview than review.

  19. Model-Based Systems Engineering Pilot Program at NASA Langley

    NASA Technical Reports Server (NTRS)

    Vipavetz, Kevin G.; Murphy, Douglas G.; Infeld, Samatha I.

    2012-01-01

    NASA Langley Research Center conducted a pilot program to evaluate the benefits of using a Model-Based Systems Engineering (MBSE) approach during the early phase of the Materials International Space Station Experiment-X (MISSE-X) project. The goal of the pilot was to leverage MBSE tools and methods, including the Systems Modeling Language (SysML), to understand the net gain of utilizing this approach on a moderate-size flight project. The System Requirements Review (SRR) success criteria were used to guide the work products desired from the pilot. This paper discusses the pilot project implementation, provides SysML model examples, identifies lessons learned, and describes plans for further use of MBSE on MISSE-X.

  20. Automated Decomposition of Model-based Learning Problems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor-rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large-scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  1. CDMBE: A Case Description Model Based on Evidence

    PubMed Central

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

    By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), which is suitable for the continental law system, is proposed to describe criminal cases. The model's logic adopts credibility-based logical reasoning and performs evidence-based reasoning quantitatively from the evidence. In order to be consistent with practical inference rules, five types of relationships and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a single figure, and the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006

  2. 77 FR 61229 - Availability of Records; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-09

    ... Part 1631 Availability of Records; Correction AGENCY: Federal Retirement Thrift Investment Board... INFORMATION: This document contains corrections to FRTIB regulations stemming from the direct final...

  3. MSTAR's extensible search engine and model-based inferencing toolkit

    NASA Astrophysics Data System (ADS)

    Wissinger, John; Ristroph, Robert; Diemunsch, Joseph R.; Severson, William E.; Fruedenthal, Eric

    1999-08-01

    The DARPA/AFRL 'Moving and Stationary Target Acquisition and Recognition' (MSTAR) program is developing a model-based vision approach to Synthetic Aperture Radar (SAR) Automatic Target Recognition (ATR). The motivation for this work is to develop a high performance ATR capability that can identify ground targets in highly unconstrained imaging scenarios that include variable image acquisition geometry, arbitrary target pose and configuration state, differences in target deployment situation, and strong intra-class variations. The MSTAR approach utilizes radar scattering models in an on-line hypothesize-and-test operation that compares predicted target signature statistics with features extracted from image data in an attempt to determine a 'best fit' explanation of the observed image. Central to this processing paradigm is the Search algorithm, which provides intelligent control in selecting features to measure and hypotheses to test, as well as in making the decision about when to stop processing and report a specific target type or clutter. Intelligent management of computation performed by the Search module is a key enabler to scaling the model-based approach to the large hypothesis spaces typical of realistic ATR problems. In this paper, we describe the present state of design and implementation of the MSTAR Search engine, as it has matured over the last three years of the MSTAR program. The evolution has been driven by a continually expanding problem domain that now includes 30 target types, viewed under arbitrary squint/depression, with articulations, reconfigurations, revetments, variable background, and up to 30% blocking occlusion. We believe that the research directions that have been inspired by MSTAR's challenging problem domain are leading to broadly applicable search methodologies that are relevant to computer vision systems in many areas.

  4. Neural mass model-based tracking of anesthetic brain states.

    PubMed

    Kuhlmann, Levin; Freestone, Dean R; Manton, Jonathan H; Heyse, Bjorn; Vereecke, Hugo E M; Lipping, Tarmo; Struys, Michel M R F; Liley, David T J

    2016-06-01

    Neural mass model-based tracking of brain states from electroencephalographic signals holds the promise of simultaneously tracking brain states while inferring underlying physiological changes in various neuroscientific and clinical applications. Here, neural mass model-based tracking of brain states using the unscented Kalman filter applied to estimate parameters of the Jansen-Rit cortical population model is evaluated through the application of propofol-based anesthetic state monitoring. In particular, 15 subjects underwent propofol anesthesia induction from awake to anesthetised while behavioral responsiveness was monitored and frontal electroencephalographic signals were recorded. The unscented Kalman filter Jansen-Rit model approach applied to frontal electroencephalography achieved reasonable testing performance for classification of the anesthetic brain state (sensitivity: 0.51; chance sensitivity: 0.17; nearest neighbor sensitivity 0.75) when compared to approaches based on linear (autoregressive moving average) modeling (sensitivity 0.58; nearest neighbor sensitivity: 0.91) and a high performing standard depth of anesthesia monitoring measure, Higuchi Fractal Dimension (sensitivity: 0.50; nearest neighbor sensitivity: 0.88). Moreover, it was found that the unscented Kalman filter based parameter estimates of the inhibitory postsynaptic potential amplitude varied in the physiologically expected direction with increases in propofol concentration, while the estimates of the inhibitory postsynaptic potential rate constant did not. These results combined with analysis of monotonicity of parameter estimates, error analysis of parameter estimates, and observability analysis of the Jansen-Rit model, along with considerations of extensions of the Jansen-Rit model, suggests that the Jansen-Rit model combined with unscented Kalman filtering provides a valuable reference point for future real-time brain state tracking studies. This is especially true for studies of

  5. Toward a model-based cognitive neuroscience of mind wandering.

    PubMed

    Hawkins, G E; Mittner, M; Boekel, W; Heathcote, A; Forstmann, B U

    2015-12-03

    People often "mind wander" during everyday tasks, temporarily losing track of time, place, or current task goals. In laboratory-based tasks, mind wandering is often associated with performance decrements in behavioral variables and changes in neural recordings. Such empirical associations provide descriptive accounts of mind wandering - how it affects ongoing task performance - but fail to provide true explanatory accounts - why it affects task performance. In this perspectives paper, we consider mind wandering as a neural state or process that affects the parameters of quantitative cognitive process models, which in turn affect observed behavioral performance. Our approach thus uses cognitive process models to bridge the explanatory divide between neural and behavioral data. We provide an overview of two general frameworks for developing a model-based cognitive neuroscience of mind wandering. The first approach uses neural data to segment observed performance into a discrete mixture of latent task-related and task-unrelated states, and the second regresses single-trial measures of neural activity onto structured trial-by-trial variation in the parameters of cognitive process models. We discuss the relative merits of the two approaches, and the research questions they can answer, and highlight that both approaches allow neural data to provide additional constraint on the parameters of cognitive models, which will lead to a more precise account of the effect of mind wandering on brain and behavior. We conclude by summarizing prospects for mind wandering as conceived within a model-based cognitive neuroscience framework, highlighting the opportunities for its continued study and the benefits that arise from using well-developed quantitative techniques to study abstract theoretical constructs.

  6. Surveillance for isocyanate asthma: a model based cost effectiveness analysis

    PubMed Central

    Wild, D; Redlich, C; Paltiel, A

    2005-01-01

    Aims: Because logistical and financial obstacles impede using large prospective cohort studies, surveillance decisions in occupational settings must often be made without evidence of relative benefits and costs. Using the example of isocyanate induced asthma, the most commonly reported immune mediated occupational asthma, the authors developed a model based approach to evaluate the costs and benefits of surveillance from both an employer and a societal perspective. Methods: The authors used a mathematical simulation model of isocyanate asthma to compare annual surveillance to passive case finding. Outcome measures included symptom free days (SFD), quality adjusted life years (QALY), direct costs, productivity losses, and incremental cost effectiveness ratio (CER), measured from the employer and the societal perspectives. Input data were obtained from a variety of published sources. Results: For 100 000 exposed workers, surveillance resulted in 683 fewer cases of disability over 10 years. Surveillance conferred benefits at an incremental cost of $24,000/QALY (employer perspective; $13.33/SFD) and was cost saving from the societal perspective. Results were sensitive to assumptions about sensitisation rate, removal rates, and time to diagnosis, but not to assumptions about therapy costs and disability rates. Conclusions: Baseline results placed the CER for surveillance for isocyanate asthma within the acceptable range. Costs from the societal and employer perspective differed substantially with a more attractive CER from the societal perspective, suggesting opportunities for employer/societal cost sharing. The analysis demonstrates the value of a model based approach to evaluate the cost effectiveness of surveillance programmes for isocyanate asthma, and to inform shared decision making among clinicians, patients, employers, and society. Such a modeling approach may be applicable to surveillance programmes for other work related conditions. PMID:16234399
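
    The headline economic outputs of such a model reduce to an incremental cost-effectiveness ratio, the extra cost of surveillance over passive case finding divided by the extra benefit. The sketch below shows that calculation; the cost and QALY inputs are made-up illustrative numbers, not the study's data.

      def icer(cost_new, cost_old, effect_new, effect_old):
          # Incremental cost-effectiveness ratio: extra cost per extra unit of
          # effect (e.g., dollars per QALY gained).
          return (cost_new - cost_old) / (effect_new - effect_old)

      # Illustrative numbers only, per worker over 10 years
      cost_surveillance, cost_passive = 1200.0, 300.0     # direct costs, $
      qaly_surveillance, qaly_passive = 8.700, 8.662      # quality-adjusted life years
      print(f"${icer(cost_surveillance, cost_passive, qaly_surveillance, qaly_passive):,.0f} per QALY")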

  7. Emerging therapeutics for Alzheimer's disease.

    PubMed

    Chiang, Karen; Koo, Edward H

    2014-01-01

    Despite decades of intense research, therapeutics for Alzheimer's disease (AD) are still limited to symptomatic treatments that possess only short-term efficacy. Recently, several large-scale Phase III trials targeting amyloid-β production or clearance have failed to show efficacy, leading to a reexamination of the amyloid hypothesis as well as highlighting the need to explore alternatives in both clinical testing strategies and drug discovery targets. In this review, we discuss therapeutics currently being tested in clinical trials and up-and-coming interventions that have shown promise in animal models, devoting attention to the mechanisms that may underlie their ability to influence disease progression and placing particular emphasis on tau therapeutics.

  8. Polymeric anti-HIV therapeutics.

    PubMed

    Danial, Maarten; Klok, Harm-Anton

    2015-01-01

    The scope of this review is to highlight the application of polymer therapeutics in an effort to curb the transmission and infection of the human immunodeficiency virus (HIV). Following a description of the HIV life cycle, the use of approved antiretroviral drugs that inhibit critical steps in the HIV infection process is highlighted. After that, a comprehensive overview of the structure and inhibitory properties of polymeric anti-HIV therapeutic agents is presented. This overview will include inhibitors based on polysaccharides, synthetic polymers, dendritic polymers, polymer conjugates as well as polymeric DC-SIGN antagonists. The review will conclude with a section that discusses the applications of polymers and polymer conjugates as systemic and topical anti-HIV therapeutics.

  9. [Therapeutic drug monitoring of levetiracetam].

    PubMed

    Dailly, Eric; Bouquié, Régis; Bentué-Ferrer, Danièle

    2010-01-01

    Levetiracetam is an anticonvulsant drug used to treat partial seizures, myoclonic seizures of juvenile myoclonic epilepsy and primary generalized tonic-clonic seizures. A review of the literature using an evidence-based medicine method highlighted parameters (age, renal failure, pregnancy, combination with other anticonvulsant drugs) which affect levetiracetam pharmacokinetics, but found no significant relationship between the plasma concentration of levetiracetam and efficacy or toxicity. Concentrations usually observed in therapeutic use range from 6 to 18 mg/L. However, the determination of an individual therapeutic concentration, associated with an effective and well tolerated therapy, could be recommended, particularly before pregnancy. Consequently, therapeutic drug monitoring of levetiracetam, which is not currently recommended, could possibly be useful in specific clinical situations.

  10. [Therapeutic drug monitoring of oxcarbazepine].

    PubMed

    Bouquié, Régis; Dailly, Eric; Bentué-Ferrer, Danièle

    2010-01-01

    Oxcarbazepine is an analogue of carbamazepine, used for the treatment of partial seizures with or without secondary generalization. The two forms, R and S, of the mono-hydroxylated derivative (MHD) are responsible for most of the anti-convulsant activity, and it is the concentrations of MHD that are relevant in therapeutic drug monitoring (TDM). Analysis of the current literature provides no well-established relationship between the plasma concentration of MHD and efficacy or toxicity. Although there is no validated therapeutic range, the trough (residual) concentrations of MHD usually observed in therapy lie between 12 and 30 mg/L. In certain pathological or physiological circumstances, the pharmacokinetic variability of oxcarbazepine can be considerable, but this unpredictability does not in itself justify TDM of MHD. Based on the available evidence, TDM of MHD is not routinely warranted but may be useful in specific situations such as pregnancy or renal insufficiency.

  11. Oligonucleotide conjugates for therapeutic applications

    PubMed Central

    Winkler, Johannes

    2013-01-01

    Insufficient pharmacokinetic properties and poor cellular uptake are the main hurdles for successful therapeutic development of oligonucleotide agents. The covalent attachment of various ligands designed to influence the biodistribution and cellular uptake or for targeting specific tissues is an attractive possibility to advance therapeutic applications and to expand development options. In contrast to advanced formulations, which often consist of multiple reagents and are sensitive to a variety of preparation conditions, oligonucleotide conjugates are defined molecules, enabling structure-based analytics and quality control techniques. This review gives an overview of current developments of oligonucleotide conjugates for therapeutic applications. Attached ligands comprise peptides, proteins, carbohydrates, aptamers and small molecules, including cholesterol, tocopherol and folic acid. Important linkage types and conjugation methods are summarized. The distinct ligands directly influence biochemical parameters, uptake mechanisms and pharmacokinetic properties. PMID:23883124

  12. Therapeutic and recreational methadone cardiotoxicity.

    PubMed

    Lusetti, Monia; Licata, Manuela; Silingardi, Enrico; Reggiani Bonetti, Luca; Palmiere, Cristian

    2016-04-01

    Several classes of drugs have been associated with an increased risk of cardiovascular disease and the occurrence of arrhythmias potentially involved in sudden deaths in chronic users, even at therapeutic doses. The study presented herein focuses on pathological changes involving the heart possibly due to methadone use. A total of 60 cases were included in the study and divided into three groups (therapeutic methadone users: 20 cases, recreational methadone users: 20 cases, and a sudden death group of subjects who had never taken methadone: 20 cases). Autopsies, histology, biochemistry and toxicology were performed in all cases. Macroscopic and microscopic investigation results in therapeutic methadone users were similar to those observed in sudden, unexpected deaths in non-methadone users. In recreational methadone consumers, macroscopic and microscopic examination of the heart failed to provide results consistent with acute or chronic myocardial or coronary damage, thereby corroborating the hypothesis of death most likely following respiratory depression.

  13. Potential therapeutic applications of biosurfactants.

    PubMed

    Gudiña, Eduardo J; Rangarajan, Vivek; Sen, Ramkrishna; Rodrigues, Lígia R

    2013-12-01

    Biosurfactants have recently emerged as promising molecules for their structural novelty, versatility, and diverse properties that are potentially useful for many therapeutic applications. Mainly due to their surface activity, these molecules interact with cell membranes of several organisms and/or with the surrounding environments, and thus can be viewed as potential cancer therapeutics or as constituents of drug delivery systems. Some types of microbial surfactants, such as lipopeptides and glycolipids, have been shown to selectively inhibit the proliferation of cancer cells and to disrupt cell membranes causing their lysis through apoptosis pathways. Moreover, biosurfactants as drug delivery vehicles offer commercially attractive and scientifically novel applications. This review covers the current state-of-the-art in biosurfactant research for therapeutic purposes, providing new directions towards the discovery and development of molecules with novel structures and diverse functions for advanced applications.

  14. Therapeutic cloning and reproductive liberty.

    PubMed

    Sparrow, Robert

    2009-04-01

    Concern for "reproductive liberty" suggests that decisions about embryos should normally be made by the persons who would be the genetic parents of the child that would be brought into existence if the embryo were brought to term. Therapeutic cloning would involve creating and destroying an embryo, which, if brought to term, would be the offspring of the genetic parents of the person undergoing therapy. I argue that central arguments in debates about parenthood and genetics therefore suggest that therapeutic cloning would be prima facie unethical unless it occurred with the consent of the parents of the person being cloned. Alternatively, if therapeutic cloning is thought to be legitimate, this undermines the case for some uses of reproductive cloning by implying that the genetic relation it establishes between clones and DNA donors does not carry the same moral weight as it does in cases of normal reproduction.

  15. Oligonucleotide conjugates for therapeutic applications.

    PubMed

    Winkler, Johannes

    2013-07-01

    Insufficient pharmacokinetic properties and poor cellular uptake are the main hurdles for successful therapeutic development of oligonucleotide agents. The covalent attachment of various ligands designed to influence the biodistribution and cellular uptake or for targeting specific tissues is an attractive possibility to advance therapeutic applications and to expand development options. In contrast to advanced formulations, which often consist of multiple reagents and are sensitive to a variety of preparation conditions, oligonucleotide conjugates are defined molecules, enabling structure-based analytics and quality control techniques. This review gives an overview of current developments of oligonucleotide conjugates for therapeutic applications. Attached ligands comprise peptides, proteins, carbohydrates, aptamers and small molecules, including cholesterol, tocopherol and folic acid. Important linkage types and conjugation methods are summarized. The distinct ligands directly influence biochemical parameters, uptake mechanisms and pharmacokinetic properties.

  16. Improving chemoradiotherapy with nanoparticle therapeutics

    PubMed Central

    Eblan, Michael Joseph; Wang, Andrew Zhuang

    2014-01-01

    Chemoradiotherapy has been a key treatment paradigm in cancer management. One of the main research objectives in cancer research has been to identify agents and strategies to improve the therapeutic index of chemoradiation. Recent development of nanoparticle (NP)-based chemotherapeutics offers a unique opportunity to improve the delivery of chemotherapy, which can in turn improve chemoradiotherapy’s efficacy while lowering toxicity. NP-based chemotherapeutics also possess several characteristics that are well suited for chemoradiotherapy. Therefore, NP chemotherapeutics hold high potential in improving the therapeutic index of chemoradiotherapy. This manuscript reviews the NP properties that are favorable for chemoradiation and the rationale to utilize nanotherapeutics in chemoradiation. This review also discusses the preclinical and clinical data on using NP therapeutics in chemoradiotherapy. PMID:25429359

  17. Two concepts of therapeutic optimism

    PubMed Central

    Jansen, Lynn A

    2011-01-01

    Researchers and ethicists have long been concerned about the expectations for direct medical benefit expressed by participants in early phase clinical trials. Early work on the issue considered the possibility that participants misunderstand the purpose of clinical research or that they are misinformed about the prospects for medical benefit from these trials. Recently, however, attention has turned to the possibility that research participants are simply expressing optimism or hope about their participation in these trials. The ethical significance of this therapeutic optimism remains unclear. This paper argues that there are two distinct phenomena that can be associated with the term ‘therapeutic optimism’—one is ethically benign and the other is potentially worrisome. Distinguishing these two phenomena is crucial for understanding the nature and ethical significance of therapeutic optimism. The failure to draw a distinction between these phenomena also helps to explain why different writers on the topic often speak past one another. PMID:21551464

  18. Therapeutic target database update 2014: a resource for targeted therapeutics.

    PubMed

    Qin, Chu; Zhang, Cheng; Zhu, Feng; Xu, Feng; Chen, Shang Ying; Zhang, Peng; Li, Ying Hong; Yang, Sheng Yong; Wei, Yu Quan; Tao, Lin; Chen, Yu Zong

    2014-01-01

    Here we describe an update of the Therapeutic Target Database (http://bidd.nus.edu.sg/group/ttd/ttd.asp) for better serving the bench-to-clinic communities and for enabling more convenient data access, processing and exchange. Extensive efforts from the research, industry, clinical, regulatory and management communities have been collectively directed at the discovery, investigation, application, monitoring and management of targeted therapeutics. Increasing efforts have been directed at the development of stratified and personalized medicines. These efforts may be facilitated by the knowledge of the efficacy targets and biomarkers of targeted therapeutics. Therefore, we added search tools for using the International Classification of Disease ICD-10-CM and ICD-9-CM codes to retrieve the target, biomarker and drug information (currently enabling the search of almost 900 targets, 1800 biomarkers and 6000 drugs related to 900 disease conditions). We added information of almost 1800 biomarkers for 300 disease conditions and 200 drug scaffolds for 700 drugs. We significantly expanded Therapeutic Target Database data contents to cover >2300 targets (388 successful and 461 clinical trial targets), 20,600 drugs (2003 approved and 3147 clinical trial drugs), 20,000 multitarget agents against almost 400 target-pairs and the activity data of 1400 agents against 300 cell lines.

  19. A kinetic model-based algorithm to classify NGS short reads by their allele origin.

    PubMed

    Marinoni, Andrea; Rizzo, Ettore; Limongelli, Ivan; Gamba, Paolo; Bellazzi, Riccardo

    2015-02-01

    Genotyping Next Generation Sequencing (NGS) data of a diploid genome aims to assign the zygosity of identified variants through comparison with a reference genome. Current methods typically employ probabilistic models that rely on the pileup of bases at each locus and on a priori knowledge. We present a new algorithm, called Kimimila (KInetic Modeling based on InforMation theory to Infer Labels of Alleles), which is able to assign reads to alleles by using a distance geometry approach and to infer the variant genotypes accurately, without any kind of assumption. The performance of the model has been assessed on simulated and real data from the 1000 Genomes Project and the results have been compared with several commonly used genotyping methods, i.e., GATK, Samtools, VarScan, FreeBayes and Atlas2. Although our algorithm does not make use of a priori knowledge, the percentage of correctly genotyped variants is comparable to that of these algorithms. Furthermore, our method allows the user to split the read pool depending on the inferred allele origin.

  20. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

    The automated and cost-effective detection of buildings at ultra-high spatial resolution is of major importance for various engineering and smart city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery acquired with low-cost imaging sensors. In particular, the developed approach computes, through advanced structure from motion, bundle adjustment and dense image matching, a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by significant geometric distortions and a fish-eye effect. An unsupervised multi-region graph-cut segmentation and a rule-based classification are responsible for delivering the initial multi-class classification map. The DTM is then calculated based on an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction, and the scene's buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating object-level detection rates of 88% for correctness and above 75% for completeness.
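
    A heavily simplified stand-in for the fusion step is shown below: above-ground height from the difference of the DSM and DTM is thresholded and combined with a per-pixel "building" label from the classification map. The height threshold, class coding, and synthetic rasters are assumptions; the actual pipeline adds grammar-based reconstruction on top of this kind of mask.

      import numpy as np

      def detect_buildings(dsm, dtm, class_map, building_class=2, min_height=2.5):
          # Toy fusion: above-ground height (normalised DSM) combined with a
          # per-pixel 'building' label from the classification map.
          ndsm = dsm - dtm                          # above-ground height, metres
          return (ndsm > min_height) & (class_map == building_class)

      rng = np.random.default_rng(7)
      dtm = rng.normal(100, 0.2, (50, 50))          # bare-earth model
      dsm = dtm.copy()
      dsm[10:20, 10:25] += 6.0                      # a 6 m tall block (a "building")
      class_map = np.zeros((50, 50), int)
      class_map[10:20, 10:25] = 2
      mask = detect_buildings(dsm, dtm, class_map)
      print("building pixels:", int(mask.sum()))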

  1. Toward a Model-Based Approach to Flight System Fault Protection

    NASA Technical Reports Server (NTRS)

    Day, John; Murray, Alex; Meakin, Peter

    2012-01-01

    Fault Protection (FP) is a distinct and separate systems engineering sub-discipline that is concerned with the off-nominal behavior of a system. Flight system fault protection is an important part of the overall flight system systems engineering effort, with its own products and processes. As with other aspects of systems engineering, the FP domain is highly amenable to expression and management in models. However, while there are standards and guidelines for performing FP-related analyses, there are no standards or guidelines for formally relating the FP analyses to each other or to the system hardware and software design. As a result, the material generated for these analyses effectively creates separate models that are only loosely related to the system being designed. Developing approaches that enable modeling of FP concerns in the same model as the system hardware and software design allows the establishment of formal relationships that have great potential for improving the efficiency, correctness, and verification of the implementation of flight system FP. This paper begins with an overview of the FP domain, then presents a SysML/UML model of the FP domain and the particular analyses it contains, by way of showing a potential model-based approach to flight system fault protection, and concludes with an exposition of the use of the FP models in flight software (FSW) engineering. The analyses are small examples, inspired by current real-project examples of FP analyses.

  2. Model-Based Diagnosis and Prognosis of a Water Recycling System

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Hafiychuk, Vasyl; Goebel, Kai Frank

    2013-01-01

    A water recycling system (WRS) deployed at NASA Ames Research Center's Sustainability Base (an energy-efficient office building that integrates some novel technologies developed for space applications) will serve as a testbed for long-duration testing of next-generation spacecraft water recycling systems for future human spaceflight missions. This system cleans graywater (waste water collected from sinks and showers) and recycles it into clean water. Like all engineered systems, the WRS is prone to standard degradation due to regular use, as well as other faults. Diagnostic and prognostic applications will be deployed on the WRS to ensure its safe, efficient, and correct operation. The diagnostic and prognostic results can be used to enable condition-based maintenance to avoid unplanned outages, and perhaps extend the useful life of the WRS. Diagnosis involves detecting when a fault occurs, isolating the root cause of the fault, and identifying the extent of damage. Prognosis involves predicting when the system will reach its end of life, irrespective of whether an abnormal condition is present or not. In this paper, we first develop a physics model of both nominal and faulty system behavior of the WRS. We then apply an integrated model-based diagnosis and prognosis framework to the simulation model of the WRS for several different fault scenarios to detect, isolate, and identify faults, and to predict the end of life in each fault scenario, and present the experimental results.

  3. Statistical and neural network classifiers in model-based 3-D object recognition

    NASA Astrophysics Data System (ADS)

    Newton, Scott C.; Nutter, Brian S.; Mitra, Sunanda

    1991-02-01

    For autonomous machines equipped with vision capabilities in a controlled environment, 3-D model-based object identification methodologies will in general solve rigid-body recognition problems. In an uncontrolled environment, however, several factors pose difficulties for correct identification. We have addressed the problem of 3-D object recognition using a number of methods, including neural network classifiers and a Bayesian-like classifier, for matching image data with model projection-derived data [1, 2]. The neural network classifiers began operation as simple feature-vector classifiers; however, unmodelled signal behavior was learned with additional samples, yielding great improvement in classification rates. The model analysis drastically shortened the training time of both classification systems. In an environment where signal behavior is not accurately modelled, two separate forms of learning give the systems the ability to update estimates of this behavior, provided sufficient samples are available to learn this new information. Given sufficient information and a well-controlled environment, identification of 3-D objects from a limited number of classes is indeed possible.

  4. An approach to model validation and model-based prediction -- polyurethane foam case study.

    SciTech Connect

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    hypothesis tests as a part of the validation step to provide feedback to analysts and modelers. Decisions on how to proceed in making model-based predictions are made based on these analyses together with the application requirements. Updating, modifying, and understanding the boundaries associated with the model are also assisted through this feedback. (4) We include a "model supplement term" when model problems are indicated. This term provides a (bias) correction to the model so that it will better match the experimental results and more accurately account for uncertainty. Presumably, as the models continue to develop and are used for future applications, the causes for these apparent biases will be identified and the need for this supplementary modeling will diminish. (5) We use a response-modeling approach for our predictions that allows for general types of prediction and for assessment of prediction uncertainty. This approach is demonstrated through a case study supporting the assessment of a weapon's response when subjected to a hydrocarbon fuel fire. The foam decomposition model provides an important element of the response of a weapon system in this abnormal thermal environment. Rigid foam is used to encapsulate critical components in the weapon system, providing the needed mechanical support as well as thermal isolation. Because the foam begins to decompose at temperatures above 250 C, modeling the decomposition is critical to assessing a weapon's response. The validation analysis indicates that the model tends to "exaggerate" the effect of temperature changes when compared to the experimental results. The data, however, are too few and too restricted in terms of experimental design to make confident statements regarding modeling problems. For illustration, we assume these indications are correct and compensate for this apparent bias by constructing a model supplement term for use in the model-based predictions. Several hypothetical prediction problems are

  5. Therapeutics for Equine Endocrine Disorders.

    PubMed

    Durham, Andy E

    2017-02-09

    Equine endocrine disease is commonly encountered by equine practitioners. Pituitary pars intermedia dysfunction (PPID) and equine metabolic syndrome (EMS) predominate. The most logical therapeutic approach in PPID uses dopamine agonists; pergolide mesylate is the most common. Bromocriptine and cabergoline are alternative drugs with similar actions. Drugs from other classes have a poor evidence basis, although cyproheptadine and trilostane might be considered. EMS requires management changes as the primary approach; reasonable justification for use of drugs such as levothyroxine and metformin may apply. Therapeutic options exist in rare cases of diabetes mellitus, diabetes insipidus, hyperthyroidism, and critical illness-related corticosteroid insufficiency.

  6. [Cerebral oedema: new therapeutic ways].

    PubMed

    Quintard, H; Ichai, C

    2014-06-01

    Cerebral oedema (CO) after brain injury can arise through several mechanisms. Vasogenic and cytotoxic oedema are the forms usually described, but osmotic and hydrostatic CO, secondary to plasma hypotonia and to an increase in blood pressure respectively, can also be encountered. The combination of these mechanisms can worsen injuries. The consequences are major, leading quickly to death secondary to intracranial hypertension and later to neuropsychological sequelae. Therapeutic measures to control this phenomenon are therefore essential, and osmotherapy is currently the only established option. A better understanding of the pathophysiological disorders involved, in particular energy metabolism (lactate), aquaporin function and inflammation, is leading to new therapeutic hopes. The promising experimental results now need to be confirmed by clinical data.

  7. Freud, transference, and therapeutic action.

    PubMed

    Abend, Sander M

    2009-07-01

    The author traces the development of Freud's conception of the nature and significance of transference in the psychoanalytic process. He notes that from 1910 onward, Freud was convinced that the analysis of the transference is the sole factor involved in the therapeutic action of psychoanalytic treatment, despite the fact that, late in his career, he observed and described the power of reconstruction to be effective as well. The author agrees with those analysts who contend that, while the analysis of the transference is essential to proper analytic technique, it is not the only agent of therapeutic impact.

  8. Multichannel error correction code decoder

    NASA Technical Reports Server (NTRS)

    Wagner, Paul K.; Ivancic, William D.

    1993-01-01

    A brief overview of a processing satellite for a mesh very-small-aperture (VSAT) communications network is provided. The multichannel error correction code (ECC) decoder system, the uplink signal generation and link simulation equipment, and the time-shared decoder are described. The testing is discussed. Applications of the time-shared decoder are recommended.

  9. "Free Speech" and "Political Correctness"

    ERIC Educational Resources Information Center

    Scott, Peter

    2016-01-01

    "Free speech" and "political correctness" are best seen not as opposing principles, but as part of a spectrum. Rather than attempting to establish some absolute principles, this essay identifies four trends that impact on this debate: (1) there are, and always have been, legitimate debates about the--absolute--beneficence of…

  10. The Politics of Political Correctness.

    ERIC Educational Resources Information Center

    Minsky, Leonard

    1992-01-01

    This article reacts to President Bush's entry into the dispute over "political correctness" on college campuses. The paper summarizes discussions of students, faculty, and others in the Washington, D.C. area which concluded that this seeming defense of free speech is actually an attack on affirmative action and multiculturalism stemming…

  11. Political Correctness and American Academe.

    ERIC Educational Resources Information Center

    Drucker, Peter F.

    1994-01-01

    Argues that today's political correctness atmosphere is a throwback to attempts made by the Nazis and Stalinists to force society into conformity. Academia, it is claimed, is being forced to conform to gain control of the institution of higher education. It is predicted that this effort will fail. (GR)

  12. Professional Preparation in Adapted Physical Education Therapeutic Recreation and Corrective Therapy.

    ERIC Educational Resources Information Center

    American Alliance for Health, Physical Education, and Recreation, Washington, DC.

    The second in a series of seven booklets provides information on professional training for personnel in physical education and recreation for handicapped persons. Reviewed is the state of the art, and considered in separate sections (each with an annotated bibliography) are places of employment, educational requirements and resource contacts for…

  13. Therapeutic Recreation Education: 1999 Survey.

    ERIC Educational Resources Information Center

    Anderson, Stephen C.; Ashton-Shaeffer, Candace; Autry, Cari E.

    2000-01-01

    Examines the current status of therapeutic recreation education, documenting changes that have occurred over 30 years. Using data from surveys conducted every 10 years beginning in 1969, the study provides information on trends in programs, faculty, students, curriculum accreditation, and professional certification in programs in the United States…

  14. Therapeutic Recreation Majors' Work Preference.

    ERIC Educational Resources Information Center

    Barber, Elizabeth H.; Magafas, Anita

    1992-01-01

    Investigates the client age/disability work preference of 76 therapeutic recreation undergraduate students at 3 universities. Results indicate a preference to work with younger clients, disability groups, and physically impaired clients. Chronically ill clients were last in work preference. Students need exposure to the benefits of working with…

  15. Novel therapeutic approaches for haemophilia.

    PubMed

    Shetty, S; Ghosh, K

    2015-03-01

    The major therapy for haemophilia is plasma-derived or recombinant clotting factors, which are evolving steadily to increase potency, stability and half-life. Research in the area of haemophilia therapeutics, however, is not restricted to modifications of the recombinant products; alternate therapeutic strategies are being developed which are in different phases of experimental and clinical trials. This chapter reviews the diverse molecular innovations being developed for alternate therapeutic approaches in haemophilia. The data are mainly extracted from the literature and conference abstracts. Some of the novel therapeutic approaches include inhibition of anticoagulant pathway factors (activated protein C, antithrombin, tissue factor pathway inhibitor) by monoclonal antibodies, peptide inhibitors, or DNA or RNA aptamers; use of variant coagulation factors (factor Xa, factor Va) which are more resistant to inactivation or enzymatically more active; and antibody-mediated therapy, including a humanized anti-factor IXa/X bispecific antibody mimicking factor VIII. Other approaches include nonsense mutation suppression, induction of prothrombotic microparticles by P-selectin-immunoglobulin chimeras, and suppression of fibrinolytic potential either by antifibrinolytics or by the use of mutant molecules of fibrinolytic inhibitors. A few products are proposed as 'stand-alone' treatments for haemophilia, while others can be used as adjuvant therapies to recombinant factors with the aim of reducing the amount of factor intake. Efforts are underway to produce an alternate, novel drug for haemophilia that has an increased half-life, is subcutaneously injectable and non-immunogenic, and is effective both in the presence and in the absence of inhibitors.

  16. [Therapeutic drug monitoring of olanzapine].

    PubMed

    Djerada, Zoubir; Brousse, Georges; Niel, Philippe; Llorca, Pierre-Michel; Eschalier, Alain; Bentue-Ferrer, Danièle; Libert, Fréderic

    2016-10-25

    Olanzapine, an atypical antipsychotic, is used to treat schizophrenia and bipolar disorder. Its therapeutic drug monitoring (TDM) is quite commonly performed. Olanzapine is well absorbed orally (bioavailability: 85%), with peak plasma concentration occurring between 4 and 6 hours after oral administration. It is extensively metabolized by different hepatic enzymes (including the CYP1A2 and CYP2D6 isoforms) into a large number of inactive metabolites, and its half-life is between 30 and 60 hours. No specific therapeutic range or threshold concentration has reached consensus, but the high intra- and inter-individual variability, as well as studies suggesting a correlation between circulating olanzapine concentrations and the occurrence of therapeutic relapse or toxic phenomena, appear to justify TDM for this molecule. Given these data, TDM of olanzapine is recommended, with a therapeutic window of 20 μg/L to 80 μg/L.

  17. Development of new RNAi therapeutics.

    PubMed

    Liu, G; Wong-Staal, F; Li, Q-X

    2007-02-01

    RNAi-mediated gene inactivation has become a cornerstone of present-day gene function studies that are the foundation of mechanism- and target-based drug discovery and development, which could potentially shorten the otherwise long process of drug development. In particular, the coming of age of the "RNAi drug" could provide promising new therapeutics bypassing traditional approaches. However, there are technological hurdles to overcome and biological limitations to consider in order to achieve effective therapeutics. Major hurdles include the intrinsically poor pharmacokinetic properties of siRNA; major biological restrictions include off-target effects, the interferon response and interference with endogenous miRNA. Recent innovations in nucleic acid chemistry, formulations and delivery methods have gradually made it possible to develop effective RNAi-based therapeutics. Careful design based on the newest RNAi/miRNA biology can also help to minimize potential tissue toxicity. If systemic application proves successful, RNAi drugs will no doubt revolutionize the whole drug development process. This review attempts to describe the progress in this area, including applications in preclinical models and recent favorable experience in a number of human trials for local diseases, along with a discussion of the potential limitations of RNAi therapeutics.

  18. Therapeutic role of dietary fibre.

    PubMed Central

    Hunt, R.; Fedorak, R.; Frohlich, J.; McLennan, C.; Pavilanis, A.

    1993-01-01

    The current status of dietary fibre and fibre supplements in health and disease is reported, and the components of dietary fibre and its respective mechanical and metabolic effects with emphasis on its therapeutic potential are reviewed. Practical management guidelines are provided to help physicians encourage patients identified as having fibre deficiency to increase dietary fibre intake to the recommended level. PMID:8388284

  19. Level 2 Therapeutic Model Site

    ERIC Educational Resources Information Center

    Spears, Brad; Sanchez, David; Bishop, Jane; Rogers, Sharon; DeJong, Judith A.

    2006-01-01

    L2, one of the original sites first funded under the Therapeutic Residential Model Initiative in 2001-2002, is operated as a peripheral dormitory. This dormitory cares for 185 boys and girls in grades 1-12 who attend local public schools. L2 presented an outstanding proposal which identified gaps in services and presented a reasonable budget to…

  20. Therapy Talk: Analyzing Therapeutic Discourse

    ERIC Educational Resources Information Center

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  1. 76 FR 15212 - Operations Specifications; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-21

    ... Federal Aviation Administration 14 CFR Part 129 RIN 2120-AJ45 Operations Specifications; Correction AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Final rule; correction. SUMMARY: The FAA is correcting... document. DATES: The final rule and this correction will become effective on April 11, 2011. FOR...

  2. 78 FR 63100 - Unified Registration System; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-23

    ... System; Correction AGENCY: Federal Motor Carrier Safety Administration (FMCSA), DOT. ACTION: Final rule; correction. SUMMARY: FMCSA makes corrections to its August 23, 2013, final rule regarding the Unified... Friday, August 23, 2013, the following corrections are made: Sec. 390.3 0 1. In Part 390--Federal...

  3. 76 FR 30254 - Taxpayer Assistance Orders; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-25

    ... 9519] RIN 1545-BF33 Taxpayer Assistance Orders; Correction AGENCY: Internal Revenue Service (IRS), Treasury. ACTION: Correction to final regulations. SUMMARY: This document contains a correction to final...) relating to taxpayer assistance orders. DATES: This correction is effective May 25, 2011 and...

  4. 24 CFR 1720.515 - Corrections.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Corrections. 1720.515 Section 1720... Proceedings Hearings § 1720.515 Corrections. Corrections of the official transcript ordered by the administrative law judge shall be included in the record. Corrections shall not be ordered by the...

  5. 5 CFR 1604.6 - Error correction.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Error correction. 1604.6 Section 1604.6... correction. (a) General rule. A service member's employing agency must correct the service member's account... corrections must be made in accordance with 5 CFR part 1605. (b) Missed bonus contributions. This paragraph...

  6. 5 CFR 1601.34 - Error correction.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 3 2011-01-01 2011-01-01 false Error correction. 1601.34 Section 1601.34... Contribution Allocations and Interfund Transfer Requests § 1601.34 Error correction. Errors in processing... in the wrong investment fund, will be corrected in accordance with the error correction...

  7. 5 CFR 1604.6 - Error correction.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 3 2011-01-01 2011-01-01 false Error correction. 1604.6 Section 1604.6... correction. (a) General rule. A service member's employing agency must correct the service member's account... corrections must be made in accordance with 5 CFR part 1605. (b) Missed bonus contributions. This paragraph...

  8. 24 CFR 1720.515 - Corrections.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 5 2011-04-01 2011-04-01 false Corrections. 1720.515 Section 1720... Proceedings Hearings § 1720.515 Corrections. Corrections of the official transcript ordered by the administrative law judge shall be included in the record. Corrections shall not be ordered by the...

  9. 5 CFR 1601.34 - Error correction.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Error correction. 1601.34 Section 1601.34... Contribution Allocations and Interfund Transfer Requests § 1601.34 Error correction. Errors in processing... in the wrong investment fund, will be corrected in accordance with the error correction...

  10. Internal Correction Of Errors In A DRAM

    NASA Technical Reports Server (NTRS)

    Zoutendyk, John A.; Watson, R. Kevin; Schwartz, Harvey R.; Nevill, Leland R.; Hasnain, Zille

    1989-01-01

    An error-correcting Hamming code is built into the circuit. A 256 K dynamic random-access memory (DRAM) circuit incorporates Hamming error-correcting code in its layout. The feature provides faster detection and correction of errors at lower cost in equipment, operating time, and software. The on-chip error-correcting feature also makes the new DRAM less susceptible to single-event upsets.
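
    The single-error-correcting behaviour of a Hamming code can be illustrated with a minimal software sketch (the DRAM above implements the code directly in hardware; the Hamming(7,4) variant, bit layout and test values below are illustrative assumptions, not the chip's actual scheme):

        # Minimal Hamming(7,4) sketch: 4 data bits are encoded into 7 code bits and any
        # single flipped bit is located by the syndrome and corrected.
        def hamming74_encode(d):
            """d: list of 4 data bits -> 7 code bits laid out as [p1, p2, d1, p3, d2, d3, d4]."""
            d1, d2, d3, d4 = d
            p1 = d1 ^ d2 ^ d4
            p2 = d1 ^ d3 ^ d4
            p3 = d2 ^ d3 ^ d4
            return [p1, p2, d1, p3, d2, d3, d4]

        def hamming74_decode(c):
            """c: 7 code bits -> (corrected 4 data bits, 1-based error position or 0)."""
            c = list(c)
            s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity check over positions 1, 3, 5, 7
            s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity check over positions 2, 3, 6, 7
            s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity check over positions 4, 5, 6, 7
            syndrome = s1 + 2 * s2 + 4 * s3  # position of the flipped bit, 0 if none
            if syndrome:
                c[syndrome - 1] ^= 1         # flip the erroneous bit back
            return [c[2], c[4], c[5], c[6]], syndrome

        if __name__ == "__main__":
            word = hamming74_encode([1, 0, 1, 1])
            word[5] ^= 1                     # simulate a single-event upset
            print(hamming74_decode(word))    # -> ([1, 0, 1, 1], 6)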

  11. 7 CFR 800.165 - Corrected certificates.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... this process shall be corrected according to this section. (b) Who may correct. Only official personnel.... According to this section and the instructions, corrected certificates shall show (i) the terms “Corrected... that has been superseded by another certificate or on the basis of a subsequent analysis for...

  12. Effects of lump characteristics on plutonium self absorption correction methods

    SciTech Connect

    Curtis, D. C.; Wormald, M. R.; Croft, S.

    2007-07-01

    An evaluation study has been undertaken to assess the robustness of several published Pu self-absorption correction methods against variation in size, shape, density etc. for use in the gamma assay of nuclear waste. The correction methods studied are a numerical plutonium self absorption correction (PuSAC) technique, the Fleissner 2-line, Fleissner 3-line and Infinite Energy Extrapolation methods with both linear and polynomial extrapolation to 1/E=0. The performance of these methods has been compared for a limited set of measured encapsulated PuO{sub 2} sources plus a range of modelled unencapsulated Pu lumps. An indication of the magnitude of the uncertainties of the numerical PuSAC method has been determined for cases of blind assays where the Pu material, shape and distribution are unknown with the aim of ultimately applying it to real waste. The importance of the range of Pu lumps used in the baseline modelled dataset has been examined. Data are presented to illustrate how the uncertainties in the method are affected by the shape, composition, density, number and mass distribution of Pu particles in a sample for a given modelled base dataset. (authors)

  13. Correction for acoustic attenuation effects in optoacoustic tomographic reconstructions

    NASA Astrophysics Data System (ADS)

    Deán-Ben, X. Luís; Razansky, Daniel; Ntziachristos, Vasilis

    2011-07-01

    The feasibility of correcting for the effects of acoustic attenuation in optoacoustic tomographic reconstructions obtained with model-based inversion is shown in this work. Acoustic attenuation is a physical phenomenon that takes place inevitably in actual acoustic media and becomes significant at high ultrasonic frequencies. The frequency dependence of acoustic attenuation and the associated dispersion lead to reduction of amplitude and broadening of the optoacoustic signals, which in turn cause, respectively, quantification errors and loss of resolution in the reconstructed images. In this work we imaged an agar phantom with embedded microparticles in three different scenarios, namely with the signals acquired with no attenuation, with the signals collected by placing an attenuating sample in between the phantom and the ultrasonic transducer and with the signals corrected for the effects of acoustic attenuation. The results obtained show that the quantification inaccuracies and the loss of resolution of the images can be partially corrected at the expense of introducing noise at high spatial frequencies due to the amplification of the high frequency components of the noise in the signals.

  14. Model-based cartilage thickness measurement in the submillimeter range

    SciTech Connect

    Streekstra, G. J.; Strackee, S. D.; Maas, M.; Wee, R. ter; Venema, H. W.

    2007-09-15

    Current methods of image-based thickness measurement in thin sheet structures utilize second derivative zero crossings to locate the layer boundaries. It is generally acknowledged that the nonzero width of the point spread function (PSF) limits the accuracy of this measurement procedure. We propose a model-based method that strongly reduces PSF-induced bias by incorporating the PSF into the thickness estimation method. We estimated the bias in thickness measurements in simulated thin sheet images as obtained from second derivative zero crossings. To gain insight into the range of sheet thickness where our method is expected to yield improved results, sheet thickness was varied between 0.15 and 1.2 mm with an assumed PSF as present in the high-resolution modes of current computed tomography (CT) scanners [full width at half maximum (FWHM) 0.5-0.8 mm]. Our model-based method was evaluated in practice by measuring layer thickness from CT images of a phantom mimicking two parallel cartilage layers in an arthrography procedure. CT arthrography images of cadaver wrists were also evaluated, and thickness estimates were compared to those obtained from high-resolution anatomical sections that served as a reference. The thickness estimates from the simulated images reveal that the method based on second derivative zero crossings shows considerable bias for layers in the submillimeter range. This bias is negligible for sheet thickness larger than 1 mm, where the size of the sheet is more than twice the FWHM of the PSF but can be as large as 0.2 mm for a 0.5 mm sheet. The results of the phantom experiments show that the bias is effectively reduced by our method. The deviations from the true thickness, due to random fluctuations induced by quantum noise in the CT images, are of the order of 3% for a standard wrist imaging protocol. In the wrist the submillimeter thickness estimates from the CT arthrography images correspond within 10% to those estimated from the anatomical

  15. Minocycline Protects against Neurologic Complications of Rapid Correction of Hyponatremia

    PubMed Central

    Soupart, Alain; Pochet, Roland; Brion, Jean Pierre

    2010-01-01

    Osmotic demyelination syndrome is a devastating neurologic condition that occurs after rapid correction of serum sodium in patients with hyponatremia. Pathologic features of this injury include a well-demarcated region of myelin loss, a breakdown of the blood–brain barrier, and infiltration of microglia. The semisynthetic tetracycline minocycline is protective in some animal models of central nervous system injury, including demyelination, suggesting that it may also protect against demyelination resulting from rapid correction of chronic hyponatremia. Using a rat model of osmotic demyelination syndrome, we found that treatment with minocycline significantly decreases brain demyelination, alleviates neurologic manifestations, and reduces mortality associated with rapid correction of hyponatremia. Mechanistically, minocycline decreased the permeability of the blood–brain barrier, inhibited microglial activation, decreased both the expression of IL1α and protein nitrosylation, and reduced the loss of GFAP immunoreactivity. In conclusion, minocycline modifies the course of osmotic demyelination in rats, suggesting its possible therapeutic use in the setting of inadvertent rapid correction of chronic hyponatremia in humans. PMID:21051736

  16. Radial lens distortion correction with sub-pixel accuracy for X-ray micro-tomography.

    PubMed

    Vo, Nghia T; Atwood, Robert C; Drakopoulos, Michael

    2015-12-14

    Distortion correction or camera calibration for an imaging system which is highly configurable and requires frequent disassembly for maintenance or replacement of parts needs a speedy method for recalibration. Here we present direct techniques for calculating distortion parameters of a non-linear model based on the correct determination of the center of distortion. These techniques are fast, very easy to implement, and accurate at sub-pixel level. The implementation at the X-ray tomography system of the I12 beamline, Diamond Light Source, which strictly requires sub-pixel accuracy, shows excellent performance in the calibration image and in the reconstructed images.
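
    As a rough illustration of the kind of correction involved (a hedged sketch only; the coefficients, distortion centre and polynomial form below are assumptions, not the parameters or the direct estimation techniques of the paper), distorted pixel coordinates can be mapped back through a polynomial radial model about the estimated centre of distortion:

        # Sketch of undistorting pixel coordinates with a polynomial radial model,
        # r_corrected = r * (1 + k1*r^2 + k2*r^4), about an estimated centre of distortion.
        import numpy as np

        def undistort_points(xy, center, k1, k2):
            """xy: (N, 2) distorted pixel coordinates -> (N, 2) corrected coordinates."""
            xy = np.asarray(xy, dtype=float)
            d = xy - center                        # coordinates relative to the distortion centre
            r2 = np.sum(d ** 2, axis=1, keepdims=True)
            scale = 1.0 + k1 * r2 + k2 * r2 ** 2   # radial correction factor
            return center + d * scale

        if __name__ == "__main__":
            center = np.array([1024.0, 768.0])     # hypothetical centre of distortion
            pts = np.array([[1500.0, 900.0], [200.0, 100.0]])
            print(undistort_points(pts, center, k1=-2e-8, k2=1e-15))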

  17. Therapeutic activity of modified U1 core spliceosomal particles

    PubMed Central

    Rogalska, Malgorzata Ewa; Tajnik, Mojca; Licastro, Danilo; Bussani, Erica; Camparini, Luca; Mattioli, Chiara; Pagani, Franco

    2016-01-01

    Modified U1 snRNAs bound to intronic sequences downstream of the 5′ splice site correct exon skipping caused by different types of mutations. Here we evaluate the therapeutic activity and structural requirements of these exon-specific U1 snRNA (ExSpeU1) particles. In a severe spinal muscular atrophy mouse model, ExSpeU1, introduced by germline transgenesis, increases SMN2 exon 7 inclusion and SMN protein production, and extends life span. In vitro, RNA mutant analysis and silencing experiments show that while the U1A protein is dispensable, the 70K and stem loop IV elements mediate most of the splicing rescue activity through improvement of exon and intron definition. Our findings indicate that precise engineering of the U1 core spliceosomal RNA particle has therapeutic potential in pathologies associated with exon-skipping mutations. PMID:27041075

  18. New Innovations: Therapeutic opportunities for Intellectual disabilities

    PubMed Central

    Picker, Jonathan D.; Walsh, Christopher A.

    2013-01-01

    Intellectual disability is common and is associated with significant morbidity. Until the latter half of the twentieth century there were no efficacious treatments. Following initial breakthroughs associated with newborn screening and metabolic corrections, little progress was made until recently. With improved understanding of genetic and cellular mechanisms, novel treatment options are beginning to appear for a number of specific conditions. Fragile X and tuberous sclerosis offer paradigms for the development of targeted therapeutics, but advances in the understanding of other disorders, such as Down syndrome and Rett syndrome, are also resulting in promising treatment directions. In addition, better understanding of the underlying neurobiology is leading to novel developments in enzyme replacement for storage disorders and adjunctive therapies for metabolic disorders, as well as potentially more generalizable approaches that target dysfunctional cell regulation via RNA and chromatin. Physiologic therapies, including deep brain stimulation and transcranial magnetic stimulation, offer yet another direction to enhance cognitive functioning. Current options and evolving opportunities for the intellectually disabled are reviewed and exemplified. PMID:24038210

  19. Diagnostic and therapeutic management of hepatocellular carcinoma

    PubMed Central

    Bellissimo, Francesco; Pinzone, Marilia Rita; Cacopardo, Bruno; Nunnari, Giuseppe

    2015-01-01

    Hepatocellular carcinoma (HCC) is an increasing health problem, representing the second cause of cancer-related mortality worldwide. The major risk factor for HCC is cirrhosis. In developing countries, viral hepatitis represents the major risk factor, whereas in developed countries the epidemic of obesity, diabetes and nonalcoholic steatohepatitis contributes to the observed increase in HCC incidence. Cirrhotic patients are recommended to undergo HCC surveillance by abdominal ultrasound at 6-month intervals. The current diagnostic algorithms for HCC rely on typical radiological hallmarks in dynamic contrast-enhanced imaging, while the use of α-fetoprotein as an independent tool for HCC surveillance is not recommended by current guidelines due to its low sensitivity and specificity. Early diagnosis is crucial for curative treatments. Surgical resection, radiofrequency ablation and liver transplantation are considered the cornerstones of curative therapy, while for patients with more advanced HCC the recommended options include sorafenib and trans-arterial chemo-embolization. A multidisciplinary team, consisting of hepatologists, surgeons, radiologists, oncologists and pathologists, is fundamental for correct management. In this paper, we review the diagnostic and therapeutic management of HCC, with a focus on the most recent evidence and recommendations from guidelines. PMID:26576088

  20. Model based iterative reconstruction for Bright Field electron tomography

    NASA Astrophysics Data System (ADS)

    Venkatakrishnan, Singanallur V.; Drummy, Lawrence F.; De Graef, Marc; Simmons, Jeff P.; Bouman, Charles A.

    2013-02-01

    Bright Field (BF) electron tomography (ET) has been widely used in the life sciences to characterize biological specimens in 3D. While BF-ET is the dominant modality in the life sciences, it has been generally avoided in the physical sciences because of anomalous measurements in the data caused by a phenomenon called "Bragg scatter", visible when crystalline samples are imaged. These measurements cause undesirable artifacts in the reconstruction when typical algorithms such as Filtered Back Projection (FBP) and the Simultaneous Iterative Reconstruction Technique (SIRT) are applied to the data. Model based iterative reconstruction (MBIR) provides a powerful framework for tomographic reconstruction that incorporates a model for data acquisition, noise in the measurement, and a model for the object, to obtain reconstructions that are qualitatively superior and quantitatively accurate. In this paper we present a novel MBIR algorithm for BF-ET which accounts for the presence of anomalous measurements from Bragg scatter in the data during the iterative reconstruction. Our method accounts for the anomalies by formulating the reconstruction as the minimization of a cost function which rejects measurements that deviate significantly from the typical Beer's law model widely assumed for BF-ET. Results on simulated as well as real data show that our method can dramatically improve the reconstructions compared to FBP and MBIR without anomaly rejection, suppressing the artifacts due to the Bragg anomalies.
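
    The anomaly-rejection idea can be sketched, in a much simplified form, as a reconstruction whose data-fit term down-weights rays that disagree strongly with the assumed Beer's-law forward model (this is an illustrative iteratively reweighted least-squares stand-in, not the authors' cost function or prior model):

        # Simplified anomaly-rejection sketch: rays whose residual against the Beer's-law
        # forward model is large (e.g. Bragg-scattered measurements) receive a small weight
        # in each iteratively reweighted least-squares update.
        import numpy as np

        def huber_weights(residual, delta):
            """Unit weight for small residuals, decaying weight for outliers."""
            a = np.abs(residual)
            return np.where(a <= delta, 1.0, delta / a)

        def irls_reconstruct(A, y, delta=0.1, n_iter=20):
            """A: (M, N) projection matrix, y: (M,) log-converted BF measurements."""
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                r = y - A @ x                      # residual w.r.t. the Beer's-law model
                w = huber_weights(r, delta)        # anomalous rays are down-weighted
                Aw = A * w[:, None]
                x = np.linalg.solve(Aw.T @ A + 1e-8 * np.eye(A.shape[1]), Aw.T @ y)
            return x

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            A = rng.random((200, 50))
            x_true = rng.random(50)
            y = A @ x_true
            y[::17] += 5.0                         # inject anomalous measurements
            x_hat = irls_reconstruct(A, y, delta=0.5)
            print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))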

  1. Model-based optimal planning of hepatic radiofrequency ablation.

    PubMed

    Chen, Qiyong; Müftü, Sinan; Meral, Faik Can; Tuncali, Kemal; Akçakaya, Murat

    2016-07-19

    This article presents a model-based pre-treatment optimal planning framework for hepatic tumour radiofrequency (RF) ablation. Conventional hepatic RF ablation methods rely on a pre-specified input voltage and treatment length based on the tumour size. Using these experimentally obtained pre-specified treatment parameters in RF ablation is not optimal for achieving the expected level of cell death and usually results in more healthy tissue damage than desired. In this study we present a pre-treatment planning framework that provides tools to control the levels of both healthy tissue preservation and tumour cell death. Over the geometry of the tumour and surrounding tissue, we formulate the RF ablation planning as a constrained optimization problem. With specific constraints over the temperature profile (TP) in pre-determined areas of the target geometry, we consider two different cost functions based on the history of the TP and the Arrhenius index (AI) of the target location, respectively. We optimally compute the input voltage variation to minimize the damage to the healthy tissue while ensuring complete cell death in the tumour and the immediate area covering the tumour. As an example, we use a simulation of a 1D symmetric target geometry mimicking the application of a single-electrode RF probe. Results demonstrate that, compared to conventional methods, both cost functions improve healthy tissue preservation.
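
    The Arrhenius index mentioned above is a standard thermal-damage measure: the time integral of a temperature-dependent reaction rate. A minimal sketch is shown below; the frequency factor, activation energy and heating curve are generic assumptions drawn from the ablation literature, not values taken from this article:

        # Arrhenius index (AI) sketch: AI = integral of A_f * exp(-E_a / (R * T(t))) dt over the
        # temperature history at one location; AI >= 1 is a common cell-death criterion.
        import numpy as np

        R_GAS = 8.314          # universal gas constant, J/(mol K)
        A_FREQ = 7.39e39       # frequency factor, 1/s (assumed liver-tissue value)
        E_A = 2.577e5          # activation energy, J/mol (assumed liver-tissue value)

        def arrhenius_index(temps_celsius, dt):
            """temps_celsius: temperature history at one location; dt: time step in seconds."""
            T = np.asarray(temps_celsius) + 273.15
            return np.sum(A_FREQ * np.exp(-E_A / (R_GAS * T))) * dt

        if __name__ == "__main__":
            t = np.arange(0.0, 600.0, 1.0)                     # a 10-minute treatment
            temps = 37.0 + 23.0 * (1.0 - np.exp(-t / 60.0))    # hypothetical heating curve toward 60 C
            ai = arrhenius_index(temps, dt=1.0)
            print("AI =", ai, "->", "cell death" if ai >= 1.0 else "viable")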

  2. Tyre pressure monitoring using a dynamical model-based estimator

    NASA Astrophysics Data System (ADS)

    Reina, Giulio; Gentile, Angelo; Messina, Arcangelo

    2015-04-01

    In the last few years, various control systems have been investigated in the automotive field with the aim of increasing the level of safety and stability, avoiding roll-over, and customising handling characteristics. One critical issue connected with their integration is the lack of state and parameter information. As an example, vehicle handling depends to a large extent on tyre inflation pressure. When inflation pressure drops, handling and comfort performance generally deteriorate; it also results in increased fuel consumption and reduced tyre lifetime. Therefore, it is important to keep tyres within the normal inflation pressure range. This paper introduces a model-based approach to estimate tyre inflation pressure online. First, basic vertical dynamic modelling of the vehicle is discussed. Then, a parameter estimation framework for dynamic analysis is presented. Several important vehicle parameters, including tyre inflation pressure, can be estimated using the estimated states. This method aims to work during normal driving using information from standard sensors only. On the one hand, the driver is informed about the inflation pressure and warned of sudden changes. On the other hand, accurate estimation of the vehicle states is available as possible input to onboard control systems.

  3. A consensus opinion model based on the evolutionary game

    NASA Astrophysics Data System (ADS)

    Yang, Han-Xin

    2016-08-01

    We propose a consensus opinion model based on the evolutionary game. In our model, two connected agents both receive a benefit if they hold the same opinion; otherwise they both pay a cost. Agents update their opinions by comparing payoffs with neighbors. The opinion of an agent with a higher payoff is more likely to be imitated. We apply this model to scale-free networks with tunable degree distribution. Interestingly, we find that there exists an optimal ratio of cost to benefit, leading to the shortest consensus time. Qualitative analysis is obtained by examining the evolution of the opinion clusters. Moreover, we find that the consensus time decreases as the average degree of the network increases, but increases with the noise introduced to permit irrational choices. The dependence of the consensus time on the network size is found to take a power-law form. For small or large ratios of cost to benefit, the consensus time decreases as the degree exponent increases. However, for moderate ratios of cost to benefit, the consensus time increases with the degree exponent. Our results may provide new insights into opinion dynamics driven by evolutionary game theory.
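
    A toy simulation conveys the basic mechanism (a sketch under stated assumptions: the Fermi imitation rule, the payoff definition and the parameter values below are simplifications, not the exact update scheme or settings of the paper):

        # Toy opinion game on a scale-free network: agents earn benefit b per agreeing
        # neighbour and pay cost c per disagreeing one, and imitate a random neighbour
        # with a probability that grows with the payoff difference (Fermi rule).
        import math
        import random
        import networkx as nx

        def consensus_time(G, b=1.0, c=0.5, noise=0.5, max_steps=200000, seed=0):
            rng = random.Random(seed)
            opinion = {v: rng.choice([0, 1]) for v in G}

            def payoff(v):
                return sum(b if opinion[v] == opinion[u] else -c for u in G[v])

            for step in range(max_steps):
                if len(set(opinion.values())) == 1:            # full consensus reached
                    return step
                v = rng.choice(list(G))
                u = rng.choice(list(G[v]))
                gap = min((payoff(v) - payoff(u)) / noise, 700.0)   # clamp to avoid overflow
                p = 1.0 / (1.0 + math.exp(gap))
                if rng.random() < p:                           # imitate the neighbour's opinion
                    opinion[v] = opinion[u]
            return max_steps

        if __name__ == "__main__":
            G = nx.barabasi_albert_graph(200, 3, seed=1)       # scale-free test network
            print("steps to consensus:", consensus_time(G))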

  4. Model-based engineering for laser weapons systems

    NASA Astrophysics Data System (ADS)

    Panthaki, Malcolm; Coy, Steve

    2011-10-01

    The Comet Performance Engineering Workspace is an environment that enables integrated, multidisciplinary modeling and design/simulation process automation. One of the many multi-disciplinary applications of the Comet Workspace is the integrated Structural, Thermal, Optical Performance (STOP) analysis of complex, multi-disciplinary space systems containing Electro-Optical (EO) sensors such as those designed and developed by and for NASA and the Department of Defense. The Comet software is currently able to integrate performance simulation data and processes from a wide range of 3-D CAD and analysis software programs, including CODE V from Optical Research Associates and SigFit from Sigmadyne Inc., which are used to simulate the optics performance of EO sensor systems in space-borne applications. Over the past year, Comet Solutions has been working with MZA Associates of Albuquerque, NM, under a contract with the Air Force Research Laboratories. This funded effort is a "risk reduction effort" to help determine whether the combination of Comet and WaveTrain, a wave optics systems engineering analysis environment developed and maintained by MZA Associates and used by the Air Force Research Laboratory, will result in an effective Model-Based Engineering (MBE) environment for the analysis and design of laser weapons systems. This paper will review the results of this effort and future steps.

  5. A Model Based on Crowdsourcing for Detecting Natural Hazards

    NASA Astrophysics Data System (ADS)

    Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.

    2015-12-01

    Remote sensing technology provides a new method for the detection, early warning, mitigation and relief of natural hazards. Given the suddenness and the unpredictability of the location of natural hazards, as well as the actual demands of hazards work, this article proposes an evaluation model for remote sensing detection of natural hazards based on crowdsourcing. Firstly, using a crowdsourcing model and with the help of the Internet and the power of hundreds of millions of Internet users, this evaluation model provides visual interpretation of high-resolution remote sensing images of hazard areas and collects massive amounts of valuable disaster data; secondly, the model adopts a dynamic voting consistency strategy to evaluate the disaster data provided by the crowdsourcing workers; thirdly, the model pre-estimates the disaster severity with a disaster pre-evaluation model based on regional buffers; lastly, the model actuates the corresponding expert system according to the forecast results. The idea of this model breaks the boundaries between geographic information professionals and the public, enables public participation and citizen science to be realized, and improves the accuracy and timeliness of hazard assessment results.

  6. Model-Based Predictive Control of Turbulent Channel Flow

    NASA Astrophysics Data System (ADS)

    Kellogg, Steven M.; Collis, S. Scott

    1999-11-01

    In recent simulations of optimal turbulence control, the time horizon over which the control is determined matches the time horizon over which the flow is advanced. A popular workhorse of the controls community, Model-Based Predictive Control (MBPC), suggests using longer predictive horizons than advancement windows. Including additional time information in the optimization may generate improved controls. When the advancement horizon is smaller than the predictive horizon, part of the optimization and resulting control are discarded. Although this inherent inefficiency may be justified by improved control predictions, it has hampered prior investigations of MBPC for turbulent flow due to the expense associated with optimal control based on Direct Numerical Simulation. The current approach overcomes this by using our optimal control formulation based on Large Eddy Simulation. This presentation summarizes the results of optimal control simulations for turbulent channel flow using various ratios of advancement and predictive horizons. These results provide clues as to the roles of foresight, control history, cost functional, and turbulence structures for optimal control of wall-bounded turbulence.

  7. Model-Based Reasoning in Upper-division Lab Courses

    NASA Astrophysics Data System (ADS)

    Lewandowski, Heather

    2015-05-01

    Modeling, which includes developing, testing, and refining models, is a central activity in physics. Well-known examples from AMO physics include everything from the Bohr model of the hydrogen atom to the Bose-Hubbard model of interacting bosons in a lattice. Modeling, while typically considered a theoretical activity, is most fully represented in the laboratory, where measurements of real phenomena intersect with theoretical models, leading to refinement of models and experimental apparatus. However, experimental physicists use models in complex ways and the process is often not made explicit in physics laboratory courses. We have developed a framework to describe the modeling process in physics laboratory activities. The framework attempts to abstract and simplify the complex modeling process undertaken by expert experimentalists. The framework can be applied to understand typical processes such as the modeling of measurement tools, the modeling of "black boxes," and signal processing. We demonstrate that the framework captures several important features of model-based reasoning in a way that can reveal common student difficulties in the lab and guide the development of curricula that emphasize modeling in the laboratory. We also use the framework to examine troubleshooting in the lab and to guide students to effective methods and strategies.

  8. Model-based quantitative laser Doppler flowmetry in skin

    NASA Astrophysics Data System (ADS)

    Fredriksson, Ingemar; Larsson, Marcus; Strömberg, Tomas

    2010-09-01

    Laser Doppler flowmetry (LDF) can be used for assessing the microcirculatory perfusion. However, conventional LDF (cLDF) gives only a relative perfusion estimate for an unknown measurement volume, with no information about the blood flow speed distribution. To overcome these limitations, a model-based analysis method for quantitative LDF (qLDF) is proposed. The method uses inverse Monte Carlo technique with an adaptive three-layer skin model. By analyzing the optimal model where measured and simulated LDF spectra detected at two different source-detector separations match, the absolute microcirculatory perfusion for a specified speed region in a predefined volume is determined. qLDF displayed errors <12% when evaluated using simulations of physiologically relevant variations in the layer structure, in the optical properties of static tissue, and in blood absorption. Inhomogeneous models containing small blood vessels, hair, and sweat glands displayed errors <5%. Evaluation models containing single larger blood vessels displayed significant errors but could be dismissed by residual analysis. In vivo measurements using local heat provocation displayed a higher perfusion increase with qLDF than cLDF, due to nonlinear effects in the latter. The qLDF showed that the perfusion increase occurred due to an increased amount of red blood cells with a speed >1 mm/s.

  9. Propagating uncertainties in statistical model based shape prediction

    NASA Astrophysics Data System (ADS)

    Syrkina, Ekaterina; Blanc, Rémi; Székely, Gàbor

    2011-03-01

    This paper addresses the question of accuracy assessment and confidence regions estimation in statistical model based shape prediction. Shape prediction consists in estimating the shape of an organ based on a partial observation, due e.g. to a limited field of view or poorly contrasted images, and generally requires a statistical model. However, such predictions can be impaired by several sources of uncertainty, in particular the presence of noise in the observation, limited correlations between the predictors and the shape to predict, as well as limitations of the statistical shape model - in particular the number of training samples. We propose a framework which takes these into account and derives confidence regions around the predicted shape. Our method relies on the construction of two separate statistical shape models, for the predictors and for the unseen parts, and exploits the correlations between them assuming a joint Gaussian distribution. Limitations of the models are taken into account by jointly optimizing the prediction and minimizing the shape reconstruction error through cross-validation. An application to the prediction of the shape of the proximal part of the human tibia given the shape of the distal femur is proposed, as well as the evaluation of the reliability of the estimated confidence regions, using a database of 184 samples. Potential applications are reconstructive surgery, e.g. to assess whether an implant fits in a range of acceptable shapes, or functional neurosurgery when the target's position is not directly visible and needs to be inferred from nearby visible structures.
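
    At its core, such a prediction with confidence information reduces to conditioning a joint Gaussian over the observed and unseen shape coefficients. The sketch below illustrates that step on random stand-in data (the variable names, dimensions and data are assumptions; the cross-validated optimization and the femur/tibia models of the paper are not reproduced):

        # Joint-Gaussian prediction sketch: split each training shape vector into an
        # observed part x and an unseen part y; the conditional p(y | x) gives the
        # predicted shape (mean) and a confidence region (covariance).
        import numpy as np

        def fit_joint_gaussian(X, Y):
            """X: (n, dx) observed coefficients, Y: (n, dy) unseen coefficients."""
            Z = np.hstack([X, Y])
            mu = Z.mean(axis=0)
            S = np.cov(Z, rowvar=False)
            dx = X.shape[1]
            return mu[:dx], mu[dx:], S[:dx, :dx], S[:dx, dx:], S[dx:, dx:]

        def predict(x_obs, mu_x, mu_y, Sxx, Sxy, Syy):
            """Conditional mean and covariance of the unseen part given one observation."""
            K = Sxy.T @ np.linalg.pinv(Sxx)        # regression matrix
            mean = mu_y + K @ (x_obs - mu_x)
            cov = Syy - K @ Sxy                    # shrinks as predictor correlation grows
            return mean, cov

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            X = rng.normal(size=(184, 5))                          # stand-in predictor coefficients
            Y = X @ rng.normal(size=(5, 3)) + 0.1 * rng.normal(size=(184, 3))
            mean, cov = predict(X[0], *fit_joint_gaussian(X, Y))
            print(mean, np.sqrt(np.diag(cov)))                     # prediction and per-axis uncertainty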

  10. Model-Based Tomographic Reconstruction of Objects Containing Known Components

    PubMed Central

    Stayman, J. Webster; Otake, Yoshito; Prince, Jerry L.; Khanna, A. Jay; Siewerdsen, Jeffrey H.

    2015-01-01

    The likelihood of finding manufactured components (surgical tools, implants, etc.) within a tomographic field-of-view has been steadily increasing. One reason is the aging population and proliferation of prosthetic devices, such that more people undergoing diagnostic imaging have existing implants, particularly hip and knee implants. Another reason is that use of intraoperative imaging (e.g., cone-beam CT) for surgical guidance is increasing, wherein surgical tools and devices such as screws and plates are placed within or near to the target anatomy. When these components contain metal, the reconstructed volumes are likely to contain severe artifacts that adversely affect the image quality in tissues both near and far from the component. Because physical models of such components exist, there is a unique opportunity to integrate this knowledge into the reconstruction algorithm to reduce these artifacts. We present a model-based penalized-likelihood estimation approach that explicitly incorporates known information about component geometry and composition. The approach uses an alternating maximization method that jointly estimates the anatomy and the position and pose of each of the known components. We demonstrate that the proposed method can produce nearly artifact-free images even near the boundary of a metal implant in simulated vertebral pedicle screw reconstructions and even under conditions of substantial photon starvation. The simultaneous estimation of device pose also provides quantitative information on device placement that could be valuable to quality assurance and verification of treatment delivery. PMID:22614574

  11. Model-based source localization of extracellular action potentials.

    PubMed

    Somogyvári, Zoltán; Zalányi, László; Ulbert, István; Erdi, Péter

    2005-09-30

    A new model-based analysis method was set up for revealing information encrypted in the extracellular spatial potential patterns of neocortical action potentials. Spikes were measured by an extracellular linear multiple microelectrode in vivo in cat primary auditory cortex and were analyzed based on current source density (CSD) distribution models. The validity of the monopole and other point-source approximations was tested on the measured potential patterns by numerical fitting. We found that point-source models could not provide an accurate description of the measured patterns. We introduced a new model of the CSD distribution on a spiking cell, called the counter-current model (CCM). This new model was shown to provide a better description of the spatial current distribution of the cell during the initial negative deflection of the extracellular action potential, from the onset of the spike to the negative peak. The new model was tested on simulated extracellular potentials. We proved numerically that all the parameters of the model could be determined accurately from measurements, so fitting of the CCM allowed extraction of these parameters from the measurements. Due to the model fitting, CSD could be calculated with much higher accuracy than with the traditional method, because the distance dependence of the spatial potential patterns was explicitly taken into consideration in our method. The average CSD distribution of the neocortical action potentials was calculated and the spatial decay constant of the dendritic trees was determined by applying our new method.
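
    The point-source test described above amounts to fitting a simple forward model to the spatial potential pattern along the probe. A hedged sketch of such a monopole fit is given below (the conductivity, electrode geometry and synthetic data are assumptions; the counter-current model itself is not reproduced):

        # Monopole-fit sketch: the potential at electrode depth z from a point source of
        # amplitude I at depth z0 and lateral distance d is I / (4*pi*sigma*sqrt((z-z0)^2 + d^2)).
        # A large fitting residual indicates that the point-source approximation is inadequate.
        import numpy as np
        from scipy.optimize import curve_fit

        SIGMA = 0.3                                  # assumed extracellular conductivity, S/m
        Z = np.arange(16) * 100e-6                   # 16 contacts, 100 um spacing (assumed)

        def monopole(z, I, z0, d):
            return I / (4 * np.pi * SIGMA * np.sqrt((z - z0) ** 2 + d ** 2))

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            v_meas = monopole(Z, -50e-9, 0.7e-3, 80e-6) + rng.normal(0.0, 2e-6, Z.size)
            popt, _ = curve_fit(monopole, Z, v_meas, p0=[-10e-9, 0.5e-3, 100e-6])
            residual = np.linalg.norm(v_meas - monopole(Z, *popt))
            print("fitted (I, z0, d):", popt, "residual:", residual)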

  12. Model-Based Visual Self-localization Using Gaussian Spheres

    NASA Astrophysics Data System (ADS)

    Gonzalez-Aguirre, David; Asfour, Tamim; Bayro-Corrochano, Eduardo; Dillmann, Ruediger

    A novel model-based approach for global self-localization using active stereo vision and density Gaussian spheres is presented. The proposed object recognition components deliver noisy percept subgraphs, which are filtered and fused into an ego-centered reference frame. In subsequent stages, the required vision-to-model associations are extracted by selecting ego-percept subsets in order to prune and match the corresponding world-model subgraph. Ideally, these coupled subgraphs hold necessary information to obtain the model-to-world transformation, i.e., the pose of the robot. However, the estimation of the pose is not robust due to the uncertainties introduced when recovering Euclidean metric from images and during the mapping from the camera to the ego-center. The approach models the uncertainty of the percepts with a radial normal distribution. This formulation allows a closed-form solution which not only derives the maximal density position depicting the optimal ego-center but also ensures the solution even in situations where pure geometric spheres might not intersect.

  13. Nonlinear model-based method for clustering periodically expressed genes.

    PubMed

    Tian, Li-Ping; Liu, Li-Zhi; Zhang, Qian-Wei; Wu, Fang-Xiang

    2011-01-01

    Clustering periodically expressed genes from their time-course expression data could help understand the molecular mechanisms of the associated biological processes. In this paper, we propose a nonlinear model-based clustering method for periodically expressed gene profiles. As periodically expressed genes are associated with periodic biological processes, the proposed method naturally assumes that a periodically expressed gene dataset is generated by a number of periodical processes. Each periodical process is modelled by a linear combination of trigonometric sine and cosine functions in time plus a Gaussian noise term. A two-stage method is proposed to estimate the model parameters, and a relocation-iteration algorithm is employed to assign each gene to an appropriate cluster. A bootstrapping method and an average adjusted Rand index (AARI) are employed to measure the quality of the clustering. One synthetic dataset and two biological datasets were employed to evaluate the performance of the proposed method. The results show that our method provides better quality clustering than other clustering methods (e.g., k-means) for periodically expressed gene data, and thus it is an effective cluster analysis method for periodically expressed gene data.
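
    For a fixed period, the per-gene model described above is linear in its coefficients, so a single profile can be fitted by ordinary least squares, as sketched below (the time grid, period and noise level are illustrative assumptions; the two-stage estimation and relocation-iteration clustering of the paper are not shown):

        # First-harmonic periodic model for one expression profile:
        # x(t) = a0 + a1*cos(2*pi*t/T) + b1*sin(2*pi*t/T) + Gaussian noise.
        import numpy as np

        def fit_periodic_profile(t, x, period):
            """Least-squares fit; returns (a0, a1, b1) and a noise-variance estimate."""
            w = 2.0 * np.pi / period
            D = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])  # design matrix
            coef, *_ = np.linalg.lstsq(D, x, rcond=None)
            residual = x - D @ coef
            return coef, residual.var()

        if __name__ == "__main__":
            t = np.arange(0.0, 120.0, 10.0)                           # sampling times, minutes
            rng = np.random.default_rng(2)
            x = 3.0 + 1.5 * np.cos(2 * np.pi * t / 60.0) - 0.8 * np.sin(2 * np.pi * t / 60.0)
            x = x + rng.normal(0.0, 0.2, t.size)
            print(fit_periodic_profile(t, x, period=60.0))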

  14. Toward dynamic model-based prognostics for transmission gears

    NASA Astrophysics Data System (ADS)

    Wang, Wenyi

    2002-07-01

    This paper presents a novel methodology for the diagnosis and prognosis of crucial gear faults, such as gear tooth fatigue cracking. Currently, effective detection of tooth cracking can be achieved using the autoregressive (AR) modeling approach, in which the gear vibration signal is modeled by an AR model and tooth cracking is detected by identifying sudden changes in the model's error signal. The model parameters can be estimated under the criteria of minimum power or maximum kurtosis of the model errors. However, these parameters carry no physical meaning for the monitored gear system. It is proposed that the AR model be replaced by a gear dynamics model (GDM) containing physically meaningful parameters such as mass, damping and stiffness. By identifying and tracking changes in these parameters, it becomes possible to perform diagnosis and prognosis of gear faults; for example, a reduction in mesh stiffness may indicate cracking of a gear tooth. Towards physical model-based prognosis, an adaptive (optimization) strategy has been developed for approximating a gear signal using a simplified gear signal model. Preliminary results show that this strategy provides a feasible adaptive process for updating model parameters based on the measured gear signal.
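
    For context, the sketch below illustrates the AR-model error-signal detection that the paper takes as its starting point (not the proposed gear dynamics model): fit an AR model to a vibration signal and flag revolutions where the residual's excess kurtosis jumps. The signal, AR order and threshold are illustrative assumptions.

      import numpy as np
      from scipy.stats import kurtosis

      rng = np.random.default_rng(1)
      n_rev, samples_per_rev = 50, 200
      n = n_rev * samples_per_rev
      x = np.sin(2 * np.pi * 20 * np.arange(n) / samples_per_rev)  # mesh tone
      x = x + 0.1 * rng.standard_normal(n)
      x[40 * samples_per_rev + 10] += 2.0    # simulated crack-induced impulse

      p = 8                                  # AR model order, assumed
      # Least-squares AR fit: x[n] ~ sum_k a_k * x[n-k]
      A = np.column_stack([x[p - k - 1 : n - k - 1] for k in range(p)])
      a, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
      resid = x[p:] - A @ a                  # AR model error signal

      # Excess kurtosis of the residual per revolution; impulses raise it sharply.
      usable = resid[: (resid.size // samples_per_rev) * samples_per_rev]
      k_rev = kurtosis(usable.reshape(-1, samples_per_rev), axis=1)
      print("suspect revolutions:", np.where(k_rev > 3.0)[0])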

  15. Advanced electron crystallography through model-based imaging

    PubMed Central

    Van Aert, Sandra; De Backer, Annick; Martinez, Gerardo T.; den Dekker, Arnold J.; Van Dyck, Dirk; Bals, Sara; Van Tendeloo, Gustaaf

    2016-01-01

    The increasing need for precise determination of the atomic arrangement of non-periodic structures in materials design and the control of nanostructures explains the growing interest in quantitative transmission electron microscopy. The aim is to extract precise and accurate numbers for unknown structure parameters including atomic positions, chemical concentrations and atomic numbers. For this purpose, statistical parameter estimation theory has been shown to provide reliable results. In this theory, observations are considered purely as data planes, from which structure parameters have to be determined using a parametric model describing the images. As such, the positions of atom columns can be measured with a precision of the order of a few picometres, even though the resolution of the electron microscope is still one or two orders of magnitude larger. Moreover, small differences in average atomic number, which cannot be distinguished visually, can be quantified using high-angle annular dark-field scanning transmission electron microscopy images. In addition, this theory allows one to measure compositional changes at interfaces, to count atoms with single-atom sensitivity, and to reconstruct atomic structures in three dimensions. This feature article brings the reader up to date, summarizing the underlying theory and highlighting some of the recent applications of quantitative model-based transmission electron microscopy. PMID:26870383
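
    As a toy illustration of the parametric-model fitting that underlies this precision, the sketch below fits a single 2D Gaussian peak (position, height, width, background) to a synthetic, Poisson-noisy image of one atom column; the image, pixel grid and peak model are assumptions for illustration only.

      import numpy as np
      from scipy.optimize import least_squares

      n = 32
      yy, xx = np.mgrid[0:n, 0:n].astype(float)

      def column_model(p):
          # 2D Gaussian peak plus constant background.
          x0, y0, height, width, background = p
          return background + height * np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2)
                                              / (2 * width ** 2))

      rng = np.random.default_rng(2)
      truth = (15.37, 16.12, 100.0, 2.5, 10.0)               # sub-pixel column position
      image = rng.poisson(column_model(truth)).astype(float)  # counting noise

      fit = least_squares(lambda p: (column_model(p) - image).ravel(),
                          x0=(16.0, 16.0, 80.0, 3.0, 5.0))
      print("estimated column position (pixels):", fit.x[:2])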

  16. Quantifying fatigue risk in model-based fatigue risk management.

    PubMed

    Rangan, Suresh; Van Dongen, Hans P A

    2013-02-01

    The question of what constitutes a maximally acceptable level of fatigue risk is hotly debated in model-based fatigue risk management in commercial aviation and other transportation modes. A quantitative approach to addressing this issue, referred to by the Federal Aviation Administration with regard to its final rule for commercial aviation "Flightcrew Member Duty and Rest Requirements," is to compare predictions from a mathematical fatigue model against a fatigue threshold. While this accounts for the duty time spent at elevated fatigue risk, it does not account for the degree of fatigue risk and may therefore result in misleading schedule assessments. We propose an alternative approach based on the first-order approximation that fatigue risk is proportional both to the duty time spent below the fatigue threshold and to the distance of the fatigue predictions from the threshold, that is, the area under the curve (AUC). The AUC approach is straightforward to implement for schedule assessments in commercial aviation and also provides a useful fatigue metric for evaluating thousands of scheduling options in industrial schedule optimization tools.
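
    A minimal sketch of the AUC metric under assumed numbers, using the convention that the model output is an effectiveness-type score where values below the threshold indicate elevated fatigue: the metric integrates how far the prediction falls below the threshold over duty time, rather than only how long it stays below it.

      import numpy as np

      duty_hours = np.linspace(0, 10, 121)             # 5-minute resolution
      predicted = 90.0 - 3.5 * duty_hours              # toy fatigue-model output
      threshold = 77.0
      dt = duty_hours[1] - duty_hours[0]

      depth_below = np.clip(threshold - predicted, 0.0, None)
      time_below = np.sum(predicted < threshold) * dt  # duty time at elevated risk
      auc = np.sum(depth_below) * dt                   # area under the curve

      print(f"time below threshold: {time_below:.2f} h, AUC: {auc:.2f} units*h")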

  17. A nonlinear regression model-based predictive control algorithm.

    PubMed

    Dubay, R; Abu-Ayyad, M; Hernandez, J M

    2009-04-01

    This paper presents a unique approach to designing a nonlinear regression model-based predictive controller (NRPC) for single-input-single-output (SISO) and multi-input-multi-output (MIMO) processes that are common in industrial applications. The innovation of this strategy is that the controller structure allows nonlinear open-loop modeling to be conducted while closed-loop control is executed at every sampling instant. Consequently, the system matrix is regenerated at every sampling instant using a continuous function, providing a more accurate prediction of the plant. Computer simulations are carried out on nonlinear plants, demonstrating that the new approach is easily implemented and provides tight control. The proposed algorithm is also implemented on two real-time SISO applications, a DC motor and a plastic injection molding machine, and on a nonlinear MIMO thermal system comprising three temperature zones to be controlled with interacting effects. The experimental closed-loop responses of the proposed algorithm were compared to those of a multi-model dynamic matrix controller (MPC), with improved results for various set-point trajectories. Good disturbance rejection was attained, resulting in improved tracking of multi-set-point profiles in comparison with multi-model MPC.
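
    The sketch below is a simplified receding-horizon controller in the spirit described above, with the prediction matrix regenerated every sampling instant from a nonlinear plant model; the toy plant, horizons and weights are assumptions and this is not the authors' NRPC formulation.

      import numpy as np

      def plant(x, u, dt=0.1):
          # Toy nonlinear first-order plant: dx/dt = -x - 0.5*x^3 + u.
          return x + dt * (-x - 0.5 * x ** 3 + u)

      def simulate(x, u_seq):
          out = []
          for u in u_seq:
              x = plant(x, u)
              out.append(x)
          return np.array(out)

      P, M, lam = 15, 4, 0.5       # prediction horizon, control horizon, move weight
      x, u, setpoint = 0.0, 0.0, 1.5

      for _ in range(60):
          # Free response with the input held constant, from the nonlinear model.
          y_free = simulate(x, np.full(P, u))
          # Step response about the current operating point -> dynamic matrix A.
          g = simulate(x, np.full(P, u + 1.0)) - y_free
          A = np.zeros((P, M))
          for j in range(M):
              A[j:, j] = g[: P - j]
          # Least-squares control moves with move suppression; apply the first one.
          du = np.linalg.solve(A.T @ A + lam * np.eye(M), A.T @ (setpoint - y_free))
          u = u + du[0]
          x = plant(x, u)

      print(f"final output {x:.3f} for setpoint {setpoint}")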

  18. [Soil moisture estimation model based on multiple vegetation index].

    PubMed

    Wu, Hai-long; Yu, Xin-xiao; Zhang, Zhen-ming; Zhang, Yan

    2014-06-01

    Estimating soil moisture conveniently and accurately is a hot issue in water resource monitoring for agriculture and forestry, and estimation based on vegetation indices has been widely recognized and applied. Eight vegetation indices were computed from hyperspectral data measured with a portable spectrometer. The indices most strongly correlated with surface vegetation temperature were selected by grey relational analysis (GRA). These selected indices were then used in a multiple linear regression to establish a soil moisture estimation model based on multiple vegetation indices, and the model accuracy was evaluated. The evaluation indicated a satisfactory fit with a significance of 0.000 (P < 0.001) and a high correlation between estimated and measured soil moisture, with R2 = 0.6361 and RMSE = 2.1499. This method introduces multiple vegetation indices into soil water content estimation at the micro scale by non-contact measurement with a portable spectrometer, and such estimation could be an appropriate replacement for remote sensing inversion and direct measurement. The model can estimate soil moisture quickly and accurately, and provides a theoretical and technical reference for water resource management in agriculture and forestry.
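
    A minimal sketch of the regression step, assuming synthetic values for three already-selected indices (the grey relational analysis step is not reproduced): soil moisture is fitted as a multiple linear function of the indices and the fit quality is reported as R2 and RMSE.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 40
      ndvi = rng.uniform(0.2, 0.8, n)        # illustrative selected indices
      msavi = rng.uniform(0.1, 0.6, n)
      pri = rng.uniform(-0.1, 0.1, n)
      soil_moisture = 5 + 20 * ndvi + 10 * msavi + 30 * pri + rng.normal(0, 2, n)

      X = np.column_stack([np.ones(n), ndvi, msavi, pri])
      beta, *_ = np.linalg.lstsq(X, soil_moisture, rcond=None)

      pred = X @ beta
      ss_res = np.sum((soil_moisture - pred) ** 2)
      ss_tot = np.sum((soil_moisture - soil_moisture.mean()) ** 2)
      print("R2 =", 1 - ss_res / ss_tot, "RMSE =", np.sqrt(ss_res / n))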

  19. Model-Based Approaches for Teaching and Practicing Personality Assessment.

    PubMed

    Blais, Mark A; Hopwood, Christopher J

    2017-01-01

    Psychological assessment is a complex professional skill. Competence in assessment requires an extensive knowledge of personality, neuropsychology, social behavior, and psychopathology, a background in psychometrics, familiarity with a range of multimethod tools, cognitive flexibility, skepticism, and interpersonal sensitivity. This complexity makes assessment a challenge to teach and learn, particularly as the investment of resources and time in assessment has waned in psychological training programs over the last few decades. In this article, we describe 3 conceptual models that can assist in teaching and learning psychological assessment. The transtheoretical model of personality provides a personality systems-based framework for understanding how multimethod assessment data relate to major personality systems and can be combined to describe and explain complex human behavior. The quantitative psychopathology-personality trait model is an empirical model based on the hierarchical organization of individual differences. Application of this model can help students understand diagnostic comorbidity and symptom heterogeneity, focus on more meaningful high-order domains, and identify the most effective assessment tools for addressing a given question. The interpersonal situation model is rooted in interpersonal theory and can help students connect test data to here-and-now interactions with patients. We conclude by demonstrating the utility of these models using a case example.

  20. Model-based optimization of tapered free-electron lasers

    NASA Astrophysics Data System (ADS)

    Mak, Alan; Curbis, Francesca; Werin, Sverker

    2015-04-01

    The energy extraction efficiency is a figure of merit for a free-electron laser (FEL). It can be enhanced by the technique of undulator tapering, which enables the sustained growth of radiation power beyond the initial saturation point. In the development of a single-pass x-ray FEL, it is important to exploit the full potential of this technique and optimize the taper profile a_w(z). Our approach to the optimization is based on the theoretical model by Kroll, Morton, and Rosenbluth, whereby the taper profile a_w(z) is not a predetermined function (such as linear or exponential) but is determined by the physics of a resonant particle. For further enhancement of the energy extraction efficiency, we propose a modification to the model, which involves manipulations of the resonant particle's phase. Using the numerical simulation code GENESIS, we apply our model-based optimization methods to a case of the future FEL at the MAX IV Laboratory (Lund, Sweden), as well as a case of the LCLS-II facility (Stanford, USA).
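
    A highly simplified sketch of how a taper profile follows from the physics of a resonant particle: prescribe how the resonant energy gamma_r(z) decreases after the initial saturation point and keep the resonance condition lambda = lambda_u/(2*gamma_r^2)*(1 + K^2/2) satisfied at a fixed radiation wavelength, which determines the undulator parameter K(z). The wavelengths, energies and quadratic gamma_r(z) profile are illustrative assumptions and do not correspond to the MAX IV or LCLS-II cases.

      import numpy as np

      lambda_r = 4e-9          # radiation wavelength (m), assumed
      lambda_u = 0.03          # undulator period (m), assumed
      K0 = 1.2                 # initial undulator parameter, assumed
      z = np.linspace(0.0, 60.0, 601)        # position along the undulator (m)
      z0 = 20.0                              # taper start near initial saturation

      # Resonant energy before the taper, from the resonance condition.
      gamma0 = np.sqrt(lambda_u / (2 * lambda_r) * (1 + K0 ** 2 / 2))
      eta = 1e-4                             # assumed quadratic energy-loss rate (1/m^2)
      gamma_r = gamma0 * (1 - eta * np.clip(z - z0, 0.0, None) ** 2)

      # Resonance condition inverted for the undulator parameter along z.
      K = np.sqrt(2 * (2 * gamma_r ** 2 * lambda_r / lambda_u - 1))
      print("K at z = 0, 20, 60 m:", K[0], K[200], K[600])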