Science.gov

Sample records for model-based therapeutic correction

  1. Model based scatter correction for cone-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Wiegert, Jens; Bertram, Matthias; Rose, Georg; Aach, Til

    2005-04-01

    Scattered radiation is a major source of image degradation and nonlinearity in flat-detector-based cone-beam CT. Due to the larger irradiated volume, the amount of scattered radiation in true cone-beam geometry is considerably higher than in fan-beam CT. On the one hand, this reduces the signal-to-noise ratio, since the additional scattered photons contribute only to the noise and not to the measured signal; on the other hand, cupping and streak artifacts arise in the reconstructed volume. Anti-scatter grids composed of lead lamellae and interspacing material decrease the SNR for flat-detector-based CB-CT geometry, because the beneficial scatter-attenuating effect is overcompensated by the absorption of primary radiation. Additionally, due to the high amount of scatter that still remains behind the grid, cupping and streak artifacts cannot be reduced sufficiently. Computerized scatter correction schemes are therefore essential for achieving artifact-free reconstructed images in cone-beam CT. In this work, a fast model-based scatter correction algorithm is proposed, aiming at accurately estimating the level and spatial distribution of the scattered radiation background in each projection. This will allow streak and cupping artifacts due to scattering to be reduced effectively in cone-beam CT applications.
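
    A minimal sketch of the general idea (not the authors' algorithm): scatter in each projection is estimated as a smooth, low-frequency background and subtracted before reconstruction. The scatter fraction and kernel width below are invented placeholders; Python with NumPy/SciPy is assumed.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def estimate_scatter(projection, scatter_fraction=0.3, kernel_sigma_px=40):
        # Scatter is low-frequency: approximate it as a heavily blurred,
        # scaled copy of the measured projection.
        return scatter_fraction * gaussian_filter(projection, kernel_sigma_px)

    def correct_projection(projection):
        scatter = estimate_scatter(projection)
        # Clip so the corrected primary signal stays non-negative.
        return np.clip(projection - scatter, a_min=0.0, a_max=None)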

  2. Evolutionary modeling-based approach for model errors correction

    NASA Astrophysics Data System (ADS)

    Wan, S. Q.; He, W. P.; Wang, L.; Jiang, W.; Zhang, W.

    2012-08-01

    The inverse problem of using the information in historical data to estimate model errors is a frontier research topic. In this study, we investigate such a problem using the classic Lorenz (1963) equations as the prediction model and the Lorenz equations with a periodic evolutionary function as an accurate representation of reality, used to generate the "observational data." On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. On this basis, a new EM-based approach for estimating model errors is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors; in effect, it combines statistics and dynamics to a certain extent.
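
    As an illustration of the setup described above, the sketch below (with an assumed form and amplitude for the periodic term) integrates the classic Lorenz (1963) system as the prediction model and a periodically forced variant as the "reality" that generates observational data; the difference is the model error an EM approach would try to recover.

    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz63(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = s
        return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

    def lorenz63_periodic(t, s, amp=2.0, omega=0.5):
        # "Reality": classic Lorenz plus an assumed periodic evolutionary term.
        dx, dy, dz = lorenz63(t, s)
        return [dx + amp * np.sin(omega * t), dy, dz]

    t_eval = np.linspace(0.0, 20.0, 2001)
    obs = solve_ivp(lorenz63_periodic, (0.0, 20.0), [1.0, 1.0, 1.0], t_eval=t_eval)
    model = solve_ivp(lorenz63, (0.0, 20.0), [1.0, 1.0, 1.0], t_eval=t_eval)
    model_error = obs.y - model.y  # signal the evolutionary modeling must identify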

  3. Model-based sensor-less wavefront aberration correction in optical coherence tomography.

    PubMed

    Verstraete, Hans R G W; Wahls, Sander; Kalkman, Jeroen; Verhaegen, Michel

    2015-12-15

    Several sensor-less wavefront aberration correction methods that correct nonlinear wavefront aberrations by maximizing the optical coherence tomography (OCT) signal are tested on an OCT setup. A conventional coordinate search method is compared to two model-based optimization methods. The first model-based method takes advantage of the well-known optimization algorithm (NEWUOA) and utilizes a quadratic model. The second model-based method (DONE) is new and utilizes a random multidimensional Fourier-basis expansion. The model-based algorithms achieve lower wavefront errors with up to ten times fewer measurements. Furthermore, the newly proposed DONE method outperforms the NEWUOA method significantly. The DONE algorithm is tested on OCT images and shows a significantly improved image quality. PMID:26670496

  4. Quantitative fully 3D PET via model-based scatter correction

    SciTech Connect

    Ollinger, J.M.

    1994-05-01

    We have investigated the quantitative accuracy of fully 3D PET using model-based scatter correction by measuring the half-life of Ga-68 in the presence of scatter from F-18. The inner chamber of a Data Spectrum cardiac phantom was filled with 18.5 MBq of Ga-68. The outer chamber was filled with an equivalent amount of F-18. The cardiac phantom was placed in a 22 x 30.5 cm elliptical phantom containing anthropomorphic lung inserts filled with a water-Styrofoam mixture. Ten frames of dynamic data were collected over 13.6 hours on a Siemens-CTI 953B scanner with the septa retracted. The data were corrected using model-based scatter correction, which uses the emission images, transmission images and an accurate physical model to directly calculate the scatter distribution. Both uncorrected and corrected data were reconstructed using the Promis algorithm. The scatter correction required 4.3% of the total reconstruction time. The scatter fraction in a small volume of interest in the center of the inner chamber of the cardiac insert rose from 4.0% in the first interval to 46.4% in the last interval as the ratio of F-18 activity to Ga-68 activity rose from 1:1 to 33:1. Fitting a single exponential to the last three data points yields estimates of the half-life of Ga-68 of 77.01 minutes and 68.79 minutes for uncorrected and corrected data, respectively. Thus, scatter correction reduces the error from 13.3% to 1.2%. This suggests that model-based scatter correction is accurate in the heterogeneous attenuating medium found in the chest, making quantitative, fully 3D PET in the body possible.
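
    For reference, the half-life check described above amounts to a log-linear fit of late-time activity. The sketch below uses invented activity values that roughly reproduce the reported result; only the procedure mirrors the abstract.

    import numpy as np

    t_minutes = np.array([600.0, 700.0, 800.0])     # acquisition mid-times
    activity = np.array([1.00, 0.365, 0.133])       # relative VOI activity (illustrative)

    # ln(A) = ln(A0) - lambda * t  ->  straight-line fit in log space
    slope, _ = np.polyfit(t_minutes, np.log(activity), 1)
    half_life = np.log(2.0) / -slope
    true_half_life = 67.7                            # Ga-68 half-life, minutes
    print(f"estimated T1/2 = {half_life:.1f} min, "
          f"error = {100 * abs(half_life - true_half_life) / true_half_life:.1f}%")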

  5. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics.

    PubMed

    Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin

    2016-09-02

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window introduces large dynamic aberrations that vary with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, the aberration is captured using Lukosz modes, and we use the low-spatial-frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by a few low-order Lukosz modes. To optimize the dynamic correction, only the dominant Lukosz modes need to be corrected, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window at a scanning rate of 10 degrees per second, and a 52-channel DM is used for correction. For a 128 × 128 image, the mean image sharpness during dynamic correction is 1.436 × 10−5 with optimized correction and 1.427 × 10−5 with un-optimized correction. We also demonstrate that model-based WSLAO can converge two times faster than the traditional stochastic parallel gradient descent (SPGD) method.
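
    A minimal sketch of the kind of metric described above, assuming a square image and a made-up low-frequency cutoff: the fraction of the image power spectrum inside a small disc around (but excluding) DC.

    import numpy as np

    def low_freq_sharpness(image, cutoff_frac=0.1):
        # Power spectrum with DC shifted to the centre of the array.
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
        ny, nx = image.shape
        y, x = np.indices((ny, nx))
        r = np.hypot(y - ny / 2, x - nx / 2)
        mask = (r > 0) & (r < cutoff_frac * min(ny, nx) / 2)
        # Normalized low-spatial-frequency content used as the sharpness metric.
        return spectrum[mask].sum() / spectrum.sum()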

  6. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics.

    PubMed

    Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin

    2016-01-01

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window introduces large dynamic aberrations that vary with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, the aberration is captured using Lukosz modes, and we use the low-spatial-frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by a few low-order Lukosz modes. To optimize the dynamic correction, only the dominant Lukosz modes need to be corrected, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window at a scanning rate of 10 degrees per second, and a 52-channel DM is used for correction. For a 128 × 128 image, the mean image sharpness during dynamic correction is 1.436 × 10−5 with optimized correction and 1.427 × 10−5 with un-optimized correction. We also demonstrate that model-based WSLAO can converge two times faster than the traditional stochastic parallel gradient descent (SPGD) method. PMID:27598161

  7. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics

    PubMed Central

    Dong, Bing; Li, Yan; Han, Xin-li; Hu, Bin

    2016-01-01

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window introduces large dynamic aberrations that vary with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, the aberration is captured using Lukosz modes, and we use the low-spatial-frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by a few low-order Lukosz modes. To optimize the dynamic correction, only the dominant Lukosz modes need to be corrected, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window at a scanning rate of 10 degrees per second, and a 52-channel DM is used for correction. For a 128 × 128 image, the mean image sharpness during dynamic correction is 1.436 × 10−5 with optimized correction and 1.427 × 10−5 with un-optimized correction. We also demonstrate that model-based WSLAO can converge two times faster than the traditional stochastic parallel gradient descent (SPGD) method. PMID:27598161

  8. Model based correction of placement error in EBL and its verification

    NASA Astrophysics Data System (ADS)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-05-01

    In maskmaking, the main source of placement error is charging. DISPLACE software corrects the placement error for any layout based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, the EBL system setup, the resist, and the writing order, as well as other factors such as fogging and proximity effects correction. The output of the software is the data for placement correction. One important step is the calibration of the physical model; a test layout on a single calibration mask was used for this purpose. The extracted model parameters were then used to verify the correction. As an ultimate test, a sophisticated layout that was very different from the calibration mask was used for verification. The placement corrections were predicted by DISPLACE, and a good correlation between the measured and predicted values confirmed the high accuracy of the charging-induced placement error correction.

  9. Sandmeier model based topographic correction to lunar spectral profiler (SP) data from KAGUYA satellite.

    PubMed

    Chen, Sheng-Bo; Wang, Jing-Ran; Guo, Peng-Ju; Wang, Ming-Chang

    2014-09-01

    The Moon may be considered the frontier base for deep space exploration. Spectral analysis is one of the key techniques for determining the rock and mineral composition of the lunar surface, but the lunar topographic relief is much more pronounced than that of the Earth, so topographic correction must be applied to lunar spectral data before they are used to retrieve compositions. In the present paper, a lunar Sandmeier model was proposed by considering the radiance effects of the macro and ambient topographic relief, and a reflectance correction model was derived based on the Sandmeier model. The Spectral Profiler (SP) data from the KAGUYA satellite over the Sinus Iridum quadrangle were taken as an example, and digital elevation data from the Lunar Orbiter Laser Altimeter were used to calculate the slope, aspect, incidence and emergence angles, and terrain-viewing factor for the topographic correction. The lunar surface reflectance from the SP data was then corrected by the proposed model after the direct component of the irradiance on a horizontal surface was derived. As a result, the high spectral reflectance of sun-facing slopes is decreased and the low spectral reflectance of slopes facing away from the sun is compensated. The statistical histogram of the corrected-reflectance pixel counts follows a Gaussian distribution. The model is therefore robust for correcting the lunar topographic effect and estimating lunar surface reflectance.
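
    A highly simplified sketch of this type of topographic correction: a full Sandmeier-style model separates direct and diffuse irradiance and includes a terrain-viewing factor, whereas the stand-in below applies only the direct (cosine-of-incidence) normalization, with all angles in radians and an assumed lower bound to avoid division by near-zero values at grazing incidence.

    import numpy as np

    def cos_incidence(slope, aspect, sun_zenith, sun_azimuth):
        # Cosine of the local solar incidence angle on a tilted facet.
        return (np.cos(slope) * np.cos(sun_zenith)
                + np.sin(slope) * np.sin(sun_zenith) * np.cos(sun_azimuth - aspect))

    def correct_reflectance(reflectance, slope, aspect, sun_zenith, sun_azimuth):
        cos_i = cos_incidence(slope, aspect, sun_zenith, sun_azimuth)
        # Normalize to an equivalent flat surface under the same sun.
        return reflectance * np.cos(sun_zenith) / np.clip(cos_i, 0.05, None)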

  10. Correcting encoder interpolation error on the Green Bank Telescope using an iterative model based identification algorithm

    NASA Astrophysics Data System (ADS)

    Franke, Timothy; Weadon, Tim; Ford, John; Garcia-Sanz, Mario

    2015-10-01

    Various forms of measurement error limit telescope tracking performance in practice. A new method for identifying the correction coefficients for encoder interpolation error is developed. The algorithm corrects the encoder measurement by identifying a harmonic model of the system and using that model to compute the necessary correction parameters. The approach improves upon others by explicitly modeling the unknown dynamics of the structure and controller and by not requiring a separate system identification to be performed. Experience gained from pinpointing the source of encoder error on the Green Bank Radio Telescope (GBT) is presented, and several tell-tale indicators of encoder error are discussed. Experimental data from the telescope, tested with two different encoders, are presented. Demonstration of the identification methodology on the GBT, as well as details of its implementation, is discussed. A root-mean-square tracking error reduction from 0.68 arcsec to 0.21 arcsec was achieved by changing encoders, and the error was further reduced to 0.10 arcsec with the calibration algorithm. In particular, the ubiquity of this error source is shown, along with how careful correction makes it possible to go beyond the advertised accuracy of an encoder.
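
    The correction has the general form sketched below (a hedged illustration, not the authors' implementation): interpolation error is modelled as a short harmonic series of the raw encoder reading, and the identified coefficients are subtracted from each measurement. The harmonic number and coefficient values are placeholders.

    import numpy as np

    def correct_encoder(theta_raw, coeffs):
        # theta_raw in radians; coeffs is a list of (harmonic k, a_k, b_k).
        error = sum(a * np.sin(k * theta_raw) + b * np.cos(k * theta_raw)
                    for k, a, b in coeffs)
        return theta_raw - error

    # Example: a dominant error component at the 4th harmonic.
    identified = [(4, 2.0e-6, -1.5e-6)]                # radians
    corrected = correct_encoder(np.deg2rad(12.3456), identified)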

  11. A systematic review of model-based economic evaluations of diagnostic and therapeutic strategies for lower extremity artery disease.

    PubMed

    Vaidya, Anil; Joore, Manuela A; ten Cate-Hoek, Arina J; Kleinegris, Marie-Claire; ten Cate, Hugo; Severens, Johan L

    2014-01-01

    Lower extremity artery disease (LEAD) is a sign of widespread atherosclerosis that also affects the coronary, cerebral and renal arteries, and it is associated with an increased risk of cardiovascular events. Many economic evaluations have been published for LEAD because of its clinical, social and economic importance. The aim of this systematic review was to assess the modelling methods used in published economic evaluations in the field of LEAD. Our review appraised and compared the general characteristics, model structure and methodological quality of the published models. The electronic databases MEDLINE and EMBASE were searched until February 2013 via the OVID interface. The Cochrane Database of Systematic Reviews, the Health Technology Assessment database hosted by the National Institute for Health Research, and the NHS Economic Evaluation Database (NHS EED) were also searched. The methodological quality of the included studies was assessed using the Philips checklist. Sixteen model-based economic evaluations were identified and included. Eleven models compared therapeutic health technologies, three compared diagnostic tests, and two compared a combination of diagnostic and therapeutic options for LEAD. The results of this systematic review revealed an acceptable to low methodological quality of the included studies. Methodological diversity and insufficient information posed a challenge for valid comparison of the included studies. In conclusion, there is a need for transparent, methodologically comparable and scientifically credible model-based economic evaluations in the field of LEAD. Future modelling studies should include clinically and economically important cardiovascular outcomes to reflect the wider impact of LEAD on individual patients and on society.

  12. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror

    PubMed Central

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-01-01

    In this paper, the actuation behavior of a two-axis electrothermal MEMS (Microelectromechanical Systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that an actuation characteristics difference of as much as 4.0% exists among the four actuators due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror solely depends on the model prediction and does not need the real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system. PMID:26690432
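
    To illustrate the open-loop idea with a stand-in model (not the paper's): if small-signal actuator displacements respond approximately linearly to drive voltages around an operating point, z = K v, then the four drive voltages for a target mirror pose follow from solving the coupled 4 x 4 system. The gain matrix below is invented for illustration.

    import numpy as np

    K = np.array([[1.00, 0.06, 0.02, 0.06],   # um per volt, with weak coupling
                  [0.06, 0.98, 0.06, 0.02],
                  [0.02, 0.06, 1.02, 0.06],
                  [0.06, 0.02, 0.06, 0.99]])

    def voltages_for_target(z_target_um):
        # Solve the four-input/four-output model for the actuator voltages.
        return np.linalg.solve(K, z_target_um)

    # Target: tilt the plate by raising one actuator and lowering its opposite.
    v = voltages_for_target(np.array([5.0, 0.0, -5.0, 0.0]))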

  13. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror.

    PubMed

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-01-01

    In this paper, the actuation behavior of a two-axis electrothermal MEMS (Microelectromechanical Systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that an actuation characteristics difference of as much as 4.0% exists among the four actuators due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror solely depends on the model prediction and does not need the real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system.

  14. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror.

    PubMed

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-01-01

    In this paper, the actuation behavior of a two-axis electrothermal MEMS (Microelectromechanical Systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that an actuation characteristics difference of as much as 4.0% exists among the four actuators due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror solely depends on the model prediction and does not need the real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system. PMID:26690432

  15. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking.

    PubMed

    Lee, Min Su; Ju, Hojin; Song, Jin Woo; Park, Chan Gook

    2015-11-06

    In this paper, we present a method for finding the enhanced heading and position of pedestrians by fusing the Zero velocity UPdaTe (ZUPT)-based pedestrian dead reckoning (PDR) and the kinematic constraints of the lower human body. ZUPT is a well known algorithm for PDR, and provides a sufficiently accurate position solution for short term periods, but it cannot guarantee a stable and reliable heading because it suffers from magnetic disturbance in determining heading angles, which degrades the overall position accuracy as time passes. The basic idea of the proposed algorithm is integrating the left and right foot positions obtained by ZUPTs with the heading and position information from an IMU mounted on the waist. To integrate this information, a kinematic model of the lower human body, which is calculated by using orientation sensors mounted on both thighs and calves, is adopted. We note that the position of the left and right feet cannot be apart because of the kinematic constraints of the body, so the kinematic model generates new measurements for the waist position. The Extended Kalman Filter (EKF) on the waist data that estimates and corrects error states uses these measurements and magnetic heading measurements, which enhances the heading accuracy. The updated position information is fed into the foot mounted sensors, and reupdate processes are performed to correct the position error of each foot. The proposed update-reupdate technique consequently ensures improved observability of error states and position accuracy. Moreover, the proposed method provides all the information about the lower human body, so that it can be applied more effectively to motion tracking. The effectiveness of the proposed algorithm is verified via experimental results, which show that a 1.25% Return Position Error (RPE) with respect to walking distance is achieved.
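
    A minimal sketch of the stance-phase (zero-velocity) detection that ZUPT-based PDR relies on, under assumed thresholds and window length: stance is flagged where windowed accelerometer and gyroscope energy both drop below thresholds, and the navigation filter applies zero-velocity pseudo-measurements there.

    import numpy as np

    def detect_zupt(acc, gyro, fs, win_s=0.1, acc_thr=0.5, gyro_thr=0.3):
        # acc, gyro: (N, 3) arrays in m/s^2 and rad/s; returns a boolean stance mask.
        win = max(1, int(win_s * fs))
        kernel = np.ones(win) / win
        acc_dev = np.linalg.norm(acc, axis=1) - 9.81       # deviation from gravity
        gyro_mag = np.linalg.norm(gyro, axis=1)
        acc_energy = np.convolve(acc_dev ** 2, kernel, mode="same")
        gyro_energy = np.convolve(gyro_mag ** 2, kernel, mode="same")
        return (acc_energy < acc_thr) & (gyro_energy < gyro_thr)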

  16. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking

    PubMed Central

    Lee, Min Su; Ju, Hojin; Song, Jin Woo; Park, Chan Gook

    2015-01-01

    In this paper, we present a method for finding the enhanced heading and position of pedestrians by fusing the Zero velocity UPdaTe (ZUPT)-based pedestrian dead reckoning (PDR) and the kinematic constraints of the lower human body. ZUPT is a well known algorithm for PDR, and provides a sufficiently accurate position solution for short term periods, but it cannot guarantee a stable and reliable heading because it suffers from magnetic disturbance in determining heading angles, which degrades the overall position accuracy as time passes. The basic idea of the proposed algorithm is integrating the left and right foot positions obtained by ZUPTs with the heading and position information from an IMU mounted on the waist. To integrate this information, a kinematic model of the lower human body, which is calculated by using orientation sensors mounted on both thighs and calves, is adopted. We note that the position of the left and right feet cannot be apart because of the kinematic constraints of the body, so the kinematic model generates new measurements for the waist position. The Extended Kalman Filter (EKF) on the waist data that estimates and corrects error states uses these measurements and magnetic heading measurements, which enhances the heading accuracy. The updated position information is fed into the foot mounted sensors, and reupdate processes are performed to correct the position error of each foot. The proposed update-reupdate technique consequently ensures improved observability of error states and position accuracy. Moreover, the proposed method provides all the information about the lower human body, so that it can be applied more effectively to motion tracking. The effectiveness of the proposed algorithm is verified via experimental results, which show that a 1.25% Return Position Error (RPE) with respect to walking distance is achieved. PMID:26561814

  17. Physiologically corrected coupled motion during gait analysis using a model-based approach.

    PubMed

    Bonnechère, Bruno; Sholukha, Victor; Salvia, Patrick; Rooze, Marcel; Van Sint Jan, Serge

    2015-01-01

    Gait analysis is used in daily clinical practice for patient evaluation and follow-up. Stereophotogrammetric devices are the most widely used tool for performing these analyses. Although these devices are accurate, results must be analyzed carefully because of relatively poor reproducibility; one of the major issues is skin displacement artifacts. Motion representation is recognized as reliable for the main plane of motion, but secondary or combined motions are less reliable because of these artifacts. A model-based approach (MBA) combining accurate joint kinematics and motion data was previously developed based on a double-step registration method. This study presents an extensive validation of the MBA method by comparing its results with a conventional motion representation model. Thirty-five healthy subjects participated in this study. Gait motion data were obtained from a stereophotogrammetric system, the Plug-in Gait model (PiG) and the MBA were applied to the raw data, and the results were compared. Ranges of motion were computed for the pelvis, hip, knee and ankle joints, and the differences between PiG and MBA were computed. Paired-sample t-tests were used to compare both methods, normalized root-mean-square errors were computed, and the shapes of the curves were compared using coefficients of multiple correlation. The MBA and PiG approaches show similar results for the main plane of motion, but statistically significant discrepancies appear for the combined motions. The MBA appears to be usable in applications (such as musculoskeletal modeling) requiring better approximations of the joints of interest, thanks to the integration of validated joint mechanisms.

  18. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities

    ERIC Educational Resources Information Center

    Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining high quality social climates in prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms, and peer correction of behavior contrary to TC norms, will lead to…

  19. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    PubMed

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  20. Splicing-correcting therapeutic approaches for retinal dystrophies: where endogenous gene regulation and specificity matter.

    PubMed

    Bacchi, Niccolò; Casarosa, Simona; Denti, Michela A

    2014-05-27

    Splicing is an important and highly regulated step in gene expression. The ability to modulate it can offer a therapeutic option for many genetic disorders. Antisense-mediated splicing-correction approaches have recently been successfully exploited for some genetic diseases, and are currently demonstrating safety and efficacy in different clinical trials. Their application to the treatment of retinal dystrophies could potentially address a vast panel of cases, given the abundance of mutations that could be targeted and the versatility of the technique. In this review, we give an insight into the different therapeutic strategies, focusing on the current status of their application to retinal dystrophies.

  1. [Hydrofluoric acid burns of the hands in the home environment: correct therapeutic approach].

    PubMed

    Nicoletti, Giovanni; Pellegatta, Tommaso

    2014-01-01

    The broad market penetration of products containing components primarily used in the industrial sector requires precise knowledge of their mechanisms of action in order to apply the correct therapeutic approach. This article reports three cases of domestic hydrofluoric acid burns that presented to our Plastic Surgery Unit over the last three years. The treatment options are discussed in detail, with emphasis on the importance of staying constantly up to date on such emerging conditions.

  2. Efficient model-based dummy-fill OPC correction flow for deep sub-micron technology nodes

    NASA Astrophysics Data System (ADS)

    Hamouda, Ayman; Salama, Mohamed

    2014-09-01

    Dummy fill insertion is a necessary step in modern semiconductor technologies to achieve homogeneous pattern density per layer. This benefits several fabrication process steps, including but not limited to chemical mechanical polishing (CMP), etching, and packaging. As the technology keeps shrinking, fill shapes become more challenging to pattern and require aggressive model-based optical proximity correction (MBOPC) to achieve better design fidelity. MBOPC on fill is a challenge for mask data prep runtime and final mask shot count, which affect the total turnaround time (TAT) and mask cost. In this work, we introduce a novel flow that provides a robust and computationally efficient fill-handling methodology during mask data prep, keeping both the runtime and the shot count within acceptable levels. In this flow, fill shapes undergo a smart MBOPC step which improves the final wafer printing quality and topography uniformity without degrading the final shot count or the OPC cycle runtime. The flow is tested on both front-end-of-line (FEOL) and back-end-of-line (BEOL) layers, and results in improved final printing of the fill patterns while consuming less than 2% of the full MBOPC flow runtime.

  3. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities.

    PubMed

    Warren, Keith L; Doogan, Nathan; De Leon, George; Phillips, Gary S; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining a high-quality social climate on prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms and peer correction of behavior contrary to TC norms will lead to increased resident prosocial behavior. Laboratory experiments have demonstrated that such peer monitoring can lead to cooperation, but there has been no quantitative test of this hypothesis in an actual TC. In this article we test this assumption by using the affirmations that residents of three different TCs send as a measure of prosocial behavior following the reception of peer affirmations and corrections. At all three facilities residents send more affirmations following the reception of both affirmations and corrections, with this relationship being stronger and longer lasting after receiving affirmations. No other variable consistently predicts the number of affirmations that residents send to peers. These findings imply that mutual monitoring among TC residents can lead to increased levels of prosocial behavior within the facility, and that prosocial behavior in response to peer affirmations plays a key role. PMID:23935258

  4. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities

    PubMed Central

    Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining a high-quality social climate on prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms and peer correction of behavior contrary to TC norms will lead to increased resident prosocial behavior. Laboratory experiments have demonstrated that such peer monitoring can lead to cooperation, but there has been no quantitative test of this hypothesis in an actual TC. In this article we test this assumption by using the affirmations that residents of three different TCs send as a measure of prosocial behavior following the reception of peer affirmations and corrections. At all three facilities residents send more affirmations following the reception of both affirmations and corrections, with this relationship being stronger and longer lasting after receiving affirmations. No other variable consistently predicts the number of affirmations that residents send to peers. These findings imply that mutual monitoring among TC residents can lead to increased levels of prosocial behavior within the facility, and that prosocial behavior in response to peer affirmations plays a key role. PMID:23935258

  5. A three-dimensional model-based partial volume correction strategy for gated cardiac mouse PET imaging

    NASA Astrophysics Data System (ADS)

    Dumouchel, Tyler; Thorn, Stephanie; Kordos, Myra; DaSilva, Jean; Beanlands, Rob S. B.; deKemp, Robert A.

    2012-07-01

    Quantification in cardiac mouse positron emission tomography (PET) imaging is limited by the imaging spatial resolution. Spillover of left ventricle (LV) myocardial activity into adjacent organs results in partial volume (PV) losses leading to underestimation of myocardial activity. A PV correction method was developed to restore accuracy of the activity distribution for FDG mouse imaging. The PV correction model was based on convolving an LV image estimate with a 3D point spread function. The LV model was described regionally by a five-parameter profile including myocardial, background and blood activities which were separated into three compartments by the endocardial radius and myocardium wall thickness. The PV correction was tested with digital simulations and a physical 3D mouse LV phantom. In vivo cardiac FDG mouse PET imaging was also performed. Following imaging, the mice were sacrificed and the tracer biodistribution in the LV and liver tissue was measured using a gamma-counter. The PV correction algorithm improved recovery from 50% to within 5% of the truth for the simulated and measured phantom data and image uniformity by 5-13%. The PV correction algorithm improved the mean myocardial LV recovery from 0.56 (0.54) to 1.13 (1.10) without (with) scatter and attenuation corrections. The mean image uniformity was improved from 26% (26%) to 17% (16%) without (with) scatter and attenuation corrections applied. Scatter and attenuation corrections were not observed to significantly impact PV-corrected myocardial recovery or image uniformity. Image-based PV correction algorithm can increase the accuracy of PET image activity and improve the uniformity of the activity distribution in normal mice. The algorithm may be applied using different tracers, in transgenic models that affect myocardial uptake, or in different species provided there is sufficient image quality and similar contrast between the myocardium and surrounding structures.
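
    A toy illustration of the forward model behind this kind of correction, with invented geometry and resolution: a thin "myocardial" shell convolved with a 3D Gaussian stand-in for the scanner point spread function loses apparent activity, and the recovery coefficient quantifies the loss that the PV correction aims to restore.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    voxel_mm = 0.4
    z, y, x = np.indices((64, 64, 64))
    r_mm = np.sqrt((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2) * voxel_mm

    lv_model = ((r_mm > 2.0) & (r_mm < 3.0)).astype(float)   # 1 mm thick shell
    fwhm_mm = 1.4                                            # assumed resolution
    blurred = gaussian_filter(lv_model, fwhm_mm / 2.355 / voxel_mm)

    recovery = blurred[lv_model > 0].mean() / lv_model[lv_model > 0].mean()
    print(f"myocardial recovery coefficient ~ {recovery:.2f}")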

  6. Validation of model-based pelvis bone segmentation from MR images for PET/MR attenuation correction

    NASA Astrophysics Data System (ADS)

    Renisch, S.; Blaffert, T.; Tang, J.; Hu, Z.

    2012-02-01

    With the recent introduction of combined Magnetic Resonance Imaging (MRI) / Positron Emission Tomography (PET) systems, the generation of attenuation maps for PET based on MR images has gained substantial attention. One approach to this problem is the segmentation of structures on the MR images with subsequent filling of the segments with the respective attenuation values. Structures of particular interest for the segmentation are the pelvis bones, since they are among the most heavily absorbing structures for many applications and can at the same time serve as valuable landmarks for further structure identification. In this work the model-based segmentation of the pelvis bones on gradient-echo MR images is investigated. A processing chain for the detection and segmentation of the pelvic bones is introduced, and the results are evaluated using CT-generated "ground truth" data. The results indicate that model-based segmentation of the pelvis bones is feasible with moderate requirements on the pre- and postprocessing steps of the segmentation.

  7. Model-Based Assessment of Plasma Citrate Flux Into the Liver: Implications for NaCT as a Therapeutic Target.

    PubMed

    Li, Z; Erion, D M; Maurer, T S

    2016-03-01

    Cytoplasmic citrate serves as an important regulator of gluconeogenesis and carbon source for de novo lipogenesis in the liver. For this reason, the sodium-coupled citrate transporter (NaCT), a plasma membrane transporter that governs hepatic influx of plasma citrate in human, is being explored as a potential therapeutic target for metabolic disorders. As cytoplasmic citrate also originates from intracellular mitochondria, the relative contribution of these two pathways represents critical information necessary to underwrite confidence in this target. In this work, hepatic influx of plasma citrate was quantified via pharmacokinetic modeling of published clinical data. The influx was then compared to independent literature estimates of intracellular citrate flux in human liver. The results indicate that, under normal conditions, <10% of hepatic citrate originates from plasma. Similar estimates were determined experimentally in mice and rats. This suggests that NaCT inhibition will have a limited impact on hepatic citrate concentrations across species. PMID:27069776

  8. A Model-based approach for microvasculature structure distortion correction in two-photon fluorescence microscopy images.

    PubMed

    Dao, Lam; Glancy, Brian; Lucotte, Bertrand; Chang, Lin-Ching; Balaban, Robert S; Hsu, Li-Yueh

    2015-11-01

    This paper investigates a postprocessing approach for correcting spatial distortion in two-photon fluorescence microscopy images for vascular network reconstruction. It is aimed at in vivo, large field-of-view, deep-tissue imaging studies of vascular structures. Based on simple geometric modelling of the object of interest, a distortion function is estimated directly from the image volume by deconvolution analysis. This distortion function is then applied to subvolumes of the image stack to adaptively adjust for spatially varying distortion and to reduce image blurring through blind deconvolution. The proposed technique was first evaluated in phantom imaging of fluorescent microspheres that are comparable in size to the underlying capillary vascular structures. The effectiveness of restoring the three-dimensional (3D) spherical geometry of the microspheres using the estimated distortion function was compared with that obtained using an empirically measured point-spread function. Next, the proposed approach was applied to in vivo vascular imaging of mouse skeletal muscle to reduce the image distortion of the capillary structures. We show that the proposed method effectively improves image quality and reduces the spatially varying distortion that occurs in large field-of-view, deep-tissue vascular datasets. The proposed method will help in the qualitative interpretation and quantitative analysis of vascular structures from fluorescence microscopy images.
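
    The deblurring step can be pictured with a generic Richardson-Lucy iteration (a hedged sketch, not the authors' code), where `psf` stands for the distortion function estimated from the volume and is assumed to be non-negative and normalized.

    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(blurred, psf, n_iter=20, eps=1e-12):
        estimate = np.full_like(blurred, blurred.mean())
        psf_flipped = psf[::-1, ::-1, ::-1]          # mirrored PSF for the adjoint
        for _ in range(n_iter):
            reblurred = fftconvolve(estimate, psf, mode="same")
            ratio = blurred / np.maximum(reblurred, eps)
            estimate = estimate * fftconvolve(ratio, psf_flipped, mode="same")
        return estimate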

  9. A Correction for the IRI Topside Electron Density Model Based on Alouette/ISIS Topside Sounder Data

    NASA Technical Reports Server (NTRS)

    Bilitza, D.

    2004-01-01

    The topside segment of the International Reference Ionosphere (IRI) electron density model (and also of the Bent model) is based on the limited amount of topside data available at the time (40,000 Alouette 1 profiles). Having been established from such a small database, it is not surprising that the models have well-known shortcomings, for example, at high solar activities. Meanwhile, a large database of close to 200,000 topside profiles from Alouette 1, 2 and ISIS 1, 2 has become available online, and a program of automated scaling and inversion of a large volume of digitized ionograms adds continuously to this data pool. We have used the currently available ISIS/Alouette topside profiles to evaluate the IRI topside model and to investigate ways of improving it. The IRI model performs generally well at middle latitudes and shows discrepancies at low and high latitudes, and these discrepancies are largest during high solar activity. In the upper topside, IRI consistently overestimates the measurements. Based on averages of the data-to-model ratios, we have established correction factors for the IRI model. These factors vary with altitude, modified dip latitude, and local time.
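
    The correction factors can be pictured as a binned table of data-to-model ratios, as in the sketch below; the bin edges and inputs (matched sounder densities `ne_obs`, IRI densities `ne_iri`, altitude, modified dip latitude, local time) are assumptions for illustration.

    import numpy as np

    def correction_factors(ne_obs, ne_iri, alt_km, modip_deg, local_time_h):
        ratios = ne_obs / ne_iri
        alt_bins, modip_bins, lt_bins = (np.arange(500, 2001, 250),
                                         np.arange(-90, 91, 30),
                                         np.arange(0, 25, 6))
        idx = (np.digitize(alt_km, alt_bins),
               np.digitize(modip_deg, modip_bins),
               np.digitize(local_time_h, lt_bins))
        shape = (len(alt_bins) + 1, len(modip_bins) + 1, len(lt_bins) + 1)
        sums, counts = np.zeros(shape), np.zeros(shape)
        np.add.at(sums, idx, ratios)
        np.add.at(counts, idx, 1)
        table = np.ones(shape)
        filled = counts > 0
        table[filled] = sums[filled] / counts[filled]  # mean data/model ratio per bin
        return table    # multiply IRI densities by the matching factor to correct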

  10. Therapeutic NOTCH3 cysteine correction in CADASIL using exon skipping: in vitro proof of concept.

    PubMed

    Rutten, Julie W; Dauwerse, Hans G; Peters, Dorien J M; Goldfarb, Andrew; Venselaar, Hanka; Haffner, Christof; van Ommen, Gert-Jan B; Aartsma-Rus, Annemieke M; Lesnik Oberstein, Saskia A J

    2016-04-01

    Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy, or CADASIL, is a hereditary cerebral small vessel disease caused by characteristic cysteine altering missense mutations in the NOTCH3 gene. NOTCH3 mutations in CADASIL result in an uneven number of cysteine residues in one of the 34 epidermal growth factor like-repeat (EGFr) domains of the NOTCH3 protein. The consequence of an unpaired cysteine residue in an EGFr domain is an increased multimerization tendency of mutant NOTCH3, leading to toxic accumulation of the protein in the (cerebro)vasculature, and ultimately reduced cerebral blood flow, recurrent stroke and vascular dementia. There is no therapy to delay or alleviate symptoms in CADASIL. We hypothesized that exclusion of the mutant EGFr domain from NOTCH3 would abolish the detrimental effect of the unpaired cysteine and thus prevent toxic NOTCH3 accumulation and the negative cascade of events leading to CADASIL. To accomplish this NOTCH3 cysteine correction by EGFr domain exclusion, we used pre-mRNA antisense-mediated skipping of specific NOTCH3 exons. Selection of these exons was achieved using in silico studies and based on the criterion that skipping of a particular exon or exon pair would modulate the protein in such a way that the mutant EGFr domain is eliminated, without otherwise corrupting NOTCH3 structure and function. Remarkably, we found that this strategy closely mimics evolutionary events, where the elimination and fusion of NOTCH EGFr domains led to the generation of four functional NOTCH homologues. We modelled a selection of exon skip strategies using cDNA constructs and show that the skip proteins retain normal protein processing, can bind ligand and be activated by ligand. We then determined the technical feasibility of targeted NOTCH3 exon skipping, by designing antisense oligonucleotides targeting exons 2-3, 4-5 and 6, which together harbour the majority of distinct CADASIL-causing mutations

  11. Correction.

    PubMed

    2015-11-01

    In the article by Heuslein et al, which published online ahead of print on September 3, 2015 (DOI: 10.1161/ATVBAHA.115.305775), a correction was needed. Brett R. Blackman was added as the penultimate author of the article. The article has been corrected for publication in the November 2015 issue. PMID:26490278

  12. [Beat therapeutic inertia in dyslipidemic patient management: A challenge in daily clinical practice] [corrected].

    PubMed

    Morales, Clotilde; Mauri, Marta; Vila, Lluís

    2014-01-01

    Beat therapeutic inertia in dyslipidemic patient management: a challenge in daily clinical practice. In patients with dyslipidemia, therapeutic goals need to be reached in order to obtain the maximum benefit in reducing the risk of cardiovascular events, especially myocardial infarction. Even with guidelines and some powerful hypolipidemic drugs available, the low-density lipoprotein cholesterol (LDL-c) goals are often not reached, which is of particular concern in patients at high cardiovascular risk. One of the causes is therapeutic inertia. There are tools to plan the treatment and make decisions easier. One of the challenges in everyday clinical practice is knowing the percentage reduction in LDL-c that is needed; moreover, it is hard to know which treatment to use, both at the start of treatment and when the desired objective is not reached. This article proposes a practical method that can help solve these questions.

  13. Optimal Model-Based Fault Estimation and Correction for Particle Accelerators and Industrial Plants Using Combined Support Vector Machines and First Principles Models

    SciTech Connect

    Sayyar-Rodsari, Bijan; Schweiger, Carl; /SLAC /Pavilion Technologies, Inc., Austin, TX

    2010-08-25

    Timely estimation of deviations from optimal performance in complex systems and the ability to identify corrective measures in response to the estimated parameter deviations has been the subject of extensive research over the past four decades. The implications in terms of lost revenue from costly industrial processes, operation of large-scale public works projects and the volume of the published literature on this topic clearly indicates the significance of the problem. Applications range from manufacturing industries (integrated circuits, automotive, etc.), to large-scale chemical plants, pharmaceutical production, power distribution grids, and avionics. In this project we investigated a new framework for building parsimonious models that are suited for diagnosis and fault estimation of complex technical systems. We used Support Vector Machines (SVMs) to model potentially time-varying parameters of a First-Principles (FP) description of the process. The combined SVM & FP model was built (i.e. model parameters were trained) using constrained optimization techniques. We used the trained models to estimate faults affecting simulated beam lifetime. In the case where a large number of process inputs are required for model-based fault estimation, the proposed framework performs an optimal nonlinear principal component analysis of the large-scale input space, and creates a lower dimension feature space in which fault estimation results can be effectively presented to the operation personnel. To fulfill the main technical objectives of the Phase I research, our Phase I efforts have focused on: (1) SVM Training in a Combined Model Structure - We developed the software for the constrained training of the SVMs in a combined model structure, and successfully modeled the parameters of a first-principles model for beam lifetime with support vectors. (2) Higher-order Fidelity of the Combined Model - We used constrained training to ensure that the output of the SVM (i.e. the

  14. Whole-Body PET/MR Imaging: Quantitative Evaluation of a Novel Model-Based MR Attenuation Correction Method Including Bone

    PubMed Central

    Paulus, Daniel H.; Quick, Harald H.; Geppert, Christian; Fenchel, Matthias; Zhan, Yiqiang; Hermosillo, Gerardo; Faul, David; Boada, Fernando; Friedman, Kent P.; Koesters, Thomas

    2016-01-01

    In routine whole-body PET/MR hybrid imaging, attenuation correction (AC) is usually performed by segmentation methods based on a Dixon MR sequence providing up to 4 different tissue classes. Because of the lack of bone information with the Dixon-based MR sequence, bone is currently considered as soft tissue. Thus, the aim of this study was to evaluate a novel model-based AC method that considers bone in whole-body PET/MR imaging. Methods: The new method (“Model”) is based on a regular 4-compartment segmentation from a Dixon sequence (“Dixon”). Bone information is added using a model-based bone segmentation algorithm, which includes a set of prealigned MR image and bone mask pairs for each major body bone individually. Model was quantitatively evaluated on 20 patients who underwent whole-body PET/MR imaging. As a standard of reference, CT-based μ-maps were generated for each patient individually by nonrigid registration to the MR images based on PET/CT data. This step allowed for a quantitative comparison of all μ-maps based on a single PET emission raw dataset of the PET/MR system. Volumes of interest were drawn on normal tissue, soft-tissue lesions, and bone lesions; standardized uptake values were quantitatively compared. Results: In soft-tissue regions with background uptake, the average bias of SUVs in background volumes of interest was 2.4% ± 2.5% and 2.7% ± 2.7% for Dixon and Model, respectively, compared with CT-based AC. For bony tissue, the −25.5% ± 7.9% underestimation observed with Dixon was reduced to −4.9% ± 6.7% with Model. In bone lesions, the average underestimation was −7.4% ± 5.3% and −2.9% ± 5.8% for Dixon and Model, respectively. For soft-tissue lesions, the biases were 5.1% ± 5.1% for Dixon and 5.2% ± 5.2% for Model. Conclusion: The novel MR-based AC method for whole-body PET/MR imaging, combining Dixon-based soft-tissue segmentation and model-based bone estimation, improves PET quantification in whole-body hybrid PET

  15. Model-based correction for scatter and tailing effects in simultaneous 99mTc and 123I imaging for a CdZnTe cardiac SPECT camera

    NASA Astrophysics Data System (ADS)

    Holstensson, M.; Erlandsson, K.; Poludniowski, G.; Ben-Haim, S.; Hutton, B. F.

    2015-04-01

    An advantage of semiconductor-based dedicated cardiac single photon emission computed tomography (SPECT) cameras over conventional Anger cameras is their superior energy resolution. This provides the potential for improved separation of the photopeaks in dual-radionuclide imaging, such as the combined use of 99mTc and 123I. There is, however, the added complexity of tailing effects in the detectors that must be accounted for. In this paper we present a model-based correction algorithm which extracts the useful primary counts of 99mTc and 123I from projection data. Equations describing the in-patient scatter and the tailing effects in the detectors are iteratively solved for both radionuclides simultaneously using a maximum a posteriori probability algorithm with one-step-late evaluation. Energy-window-dependent parameters for the equations describing in-patient scatter are estimated using Monte Carlo simulations. Parameters for the equations describing tailing effects are estimated using virtually scatter-free experimental measurements on a dedicated cardiac SPECT camera with CdZnTe detectors. When applied to a phantom study with both 99mTc and 123I, the results show that the estimated spatial distribution of events from 99mTc in the 99mTc photopeak energy window is very similar to that measured in a single-99mTc phantom study. The extracted images of primary events display increased cold-lesion contrast for both 99mTc and 123I.
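
    As a greatly simplified stand-in for the separation problem (the paper's method is an iterative maximum a posteriori estimate, not this direct inversion): if the fractions of true 99mTc and 123I events recorded in each energy window, including downscatter and tailing, were known, the primary contributions could be recovered pixel-by-pixel from a small linear system. The mixing fractions below are invented.

    import numpy as np

    # rows: Tc-99m window, I-123 window; columns: true Tc-99m, true I-123 events
    A = np.array([[0.85, 0.20],
                  [0.05, 0.80]])

    def unmix(counts_tc_window, counts_i_window):
        # counts_* are same-shape projection arrays; returns (tc_primary, i_primary).
        measured = np.stack([counts_tc_window.ravel(), counts_i_window.ravel()])
        primaries = np.linalg.solve(A, measured)
        return (primaries[0].reshape(counts_tc_window.shape),
                primaries[1].reshape(counts_i_window.shape))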

  16. Corrections

    NASA Astrophysics Data System (ADS)

    2012-09-01

    The feature article "Material advantage?" on the effects of technology and rule changes on sporting performance (July pp28-30) stated that sprinters are less affected by lower oxygen levels at high altitudes because they run "aerobically". They run anaerobically. The feature about the search for the Higgs boson (August pp22-26) incorrectly gave the boson's mass as roughly 125 MeV; it is 125 GeV, as correctly stated elsewhere in the issue. The article also gave a wrong value for the intended collision energy of the Superconducting Super Collider, which was designed to collide protons with a total energy of 40 TeV.

  17. Correction.

    PubMed

    2015-05-22

    The Circulation Research article by Keith and Bolli (“String Theory” of c-kitpos Cardiac Cells: A New Paradigm Regarding the Nature of These Cells That May Reconcile Apparently Discrepant Results. Circ Res. 2015;116:1216-1230. doi: 10.1161/CIRCRESAHA.116.305557) states that van Berlo et al (2014) observed that large numbers of fibroblasts and adventitial cells, some smooth muscle and endothelial cells, and rare cardiomyocytes originated from c-kit positive progenitors. However, van Berlo et al reported that only occasional fibroblasts and adventitial cells derived from c-kit positive progenitors in their studies. Accordingly, the review has been corrected to indicate that van Berlo et al (2014) observed that large numbers of endothelial cells, with some smooth muscle cells and fibroblasts, and more rarely cardiomyocytes, originated from c-kit positive progenitors in their murine model. The authors apologize for this error, and the error has been noted and corrected in the online version of the article, which is available at http://circres.ahajournals.org/content/116/7/1216.full. PMID:25999426

  18. Gene transfer corrects acute GM2 gangliosidosis--potential therapeutic contribution of perivascular enzyme flow.

    PubMed

    Cachón-González, M Begoña; Wang, Susan Z; McNair, Rosamund; Bradley, Josephine; Lunn, David; Ziegler, Robin; Cheng, Seng H; Cox, Timothy M

    2012-08-01

    The GM2 gangliosidoses are fatal lysosomal storage diseases principally affecting the brain. Absence of β-hexosaminidase A and B activities in the Sandhoff mouse causes neurological dysfunction and recapitulates the acute Tay-Sachs (TSD) and Sandhoff diseases (SD) in infants. Intracranial coinjection of recombinant adeno-associated viral vectors (rAAV), serotype 2/1, expressing human β-hexosaminidase α (HEXA) and β (HEXB) subunits into 1-month-old Sandhoff mice gave unprecedented survival to 2 years and prevented disease throughout the brain and spinal cord. Classical manifestations of disease, including spasticity-as opposed to tremor-ataxia-were resolved by localized gene transfer to the striatum or cerebellum, respectively. Abundant biosynthesis of β-hexosaminidase isozymes and their global distribution via axonal, perivascular, and cerebrospinal fluid (CSF) spaces, as well as diffusion, account for the sustained phenotypic rescue-long-term protein expression by transduced brain parenchyma, choroid plexus epithelium, and dorsal root ganglia neurons supplies the corrective enzyme. Prolonged survival permitted expression of cryptic disease in organs not accessed by intracranial vector delivery. We contend that infusion of rAAV into CSF space and intraparenchymal administration by convection-enhanced delivery at a few strategic sites will optimally treat neurodegeneration in many diseases affecting the nervous system. PMID:22453766

  19. Gene Transfer Corrects Acute GM2 Gangliosidosis—Potential Therapeutic Contribution of Perivascular Enzyme Flow

    PubMed Central

    Cachón-González, M Begoña; Wang, Susan Z; McNair, Rosamund; Bradley, Josephine; Lunn, David; Ziegler, Robin; Cheng, Seng H; Cox, Timothy M

    2012-01-01

    The GM2 gangliosidoses are fatal lysosomal storage diseases principally affecting the brain. Absence of β-hexosaminidase A and B activities in the Sandhoff mouse causes neurological dysfunction and recapitulates the acute Tay–Sachs (TSD) and Sandhoff diseases (SD) in infants. Intracranial coinjection of recombinant adeno-associated viral vectors (rAAV), serotype 2/1, expressing human β-hexosaminidase α (HEXA) and β (HEXB) subunits into 1-month-old Sandhoff mice gave unprecedented survival to 2 years and prevented disease throughout the brain and spinal cord. Classical manifestations of disease, including spasticity—as opposed to tremor-ataxia—were resolved by localized gene transfer to the striatum or cerebellum, respectively. Abundant biosynthesis of β-hexosaminidase isozymes and their global distribution via axonal, perivascular, and cerebrospinal fluid (CSF) spaces, as well as diffusion, account for the sustained phenotypic rescue—long-term protein expression by transduced brain parenchyma, choroid plexus epithelium, and dorsal root ganglia neurons supplies the corrective enzyme. Prolonged survival permitted expression of cryptic disease in organs not accessed by intracranial vector delivery. We contend that infusion of rAAV into CSF space and intraparenchymal administration by convection-enhanced delivery at a few strategic sites will optimally treat neurodegeneration in many diseases affecting the nervous system. PMID:22453766

  20. Tafenoquine at therapeutic concentrations does not prolong Fridericia-corrected QT interval in healthy subjects

    PubMed Central

    Green, Justin A; Patel, Apurva K; Patel, Bela R; Hussaini, Azra; Harrell, Emma J; McDonald, Mirna J; Carter, Nick; Mohamed, Khadeeja; Duparc, Stephan; Miller, Ann K

    2014-01-01

    Tafenoquine is being developed for relapse prevention in Plasmodium vivax malaria. This Phase I, single-blind, randomized, placebo- and active-controlled parallel group study investigated whether tafenoquine at supratherapeutic and therapeutic concentrations prolonged cardiac repolarization in healthy volunteers. Subjects aged 18–65 years were randomized to one of five treatment groups (n = 52 per group) to receive placebo, tafenoquine 300, 600, or 1200 mg, or moxifloxacin 400 mg (positive control). Lack of effect was demonstrated if the upper 90% CI of the change from baseline in QTcF following supratherapeutic tafenoquine 1200 mg versus placebo (ΔΔQTcF) was <10 milliseconds for all pre-defined time points. The maximum ΔΔQTcF with tafenoquine 1200 mg (n = 50) was 6.39 milliseconds (90% CI 2.85, 9.94) at 72 hours post-final dose; that is, lack of effect for prolongation of cardiac repolarization was demonstrated. Tafenoquine 300 mg (n = 48) or 600 mg (n = 52) had no effect on ΔΔQTcF. Pharmacokinetic/pharmacodynamic modeling of the tafenoquine–QTcF concentration–effect relationship demonstrated a shallow slope (0.5 ms/μg mL–1) over a wide concentration range. For moxifloxacin (n = 51), maximum ΔΔQTcF was 8.52 milliseconds (90% CI 5.00, 12.04), demonstrating assay sensitivity. In this thorough QT/QTc study, tafenoquine did not have a clinically meaningful effect on cardiac repolarization. PMID:24700490
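
    As background to the QTcF endpoint above, the following is a minimal sketch of how a Fridericia-corrected QT and a placebo-corrected change from baseline (ΔΔQTcF) could be computed at a single time point. The function names and the numbers in the example are illustrative assumptions, not values or methods taken from the study.

```python
import numpy as np

def qtcf(qt_ms, rr_s):
    """Fridericia-corrected QT: QTcF = QT / RR^(1/3), with QT in ms and RR in s."""
    return qt_ms / np.cbrt(rr_s)

def delta_delta_qtcf(drug_base, drug_post, placebo_base, placebo_post):
    """Placebo-corrected change from baseline in QTcF at one time point."""
    d_drug = np.mean(drug_post) - np.mean(drug_base)
    d_placebo = np.mean(placebo_post) - np.mean(placebo_base)
    return d_drug - d_placebo

# Illustrative synthetic data only (not study values)
rng = np.random.default_rng(0)
drug_base = qtcf(rng.normal(400, 10, 50), rng.normal(0.9, 0.05, 50))
drug_post = drug_base + rng.normal(5, 8, 50)
plc_base = qtcf(rng.normal(400, 10, 50), rng.normal(0.9, 0.05, 50))
plc_post = plc_base + rng.normal(1, 8, 50)
print(delta_delta_qtcf(drug_base, drug_post, plc_base, plc_post))
```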

  1. Influence of the partial volume correction method on 18F-fluorodeoxyglucose brain kinetic modelling from dynamic PET images reconstructed with resolution model based OSEM

    NASA Astrophysics Data System (ADS)

    Bowen, Spencer L.; Byars, Larry G.; Michel, Christian J.; Chonde, Daniel B.; Catana, Ciprian

    2013-10-01

    Kinetic parameters estimated from dynamic 18F-fluorodeoxyglucose (18F-FDG) PET acquisitions have been used frequently to assess brain function in humans. Neglecting partial volume correction (PVC) for a dynamic series has been shown to produce significant bias in model estimates. Accurate PVC requires a space-variant model describing the reconstructed image spatial point spread function (PSF) that accounts for resolution limitations, including non-uniformities across the field of view due to the parallax effect. For ordered subsets expectation maximization (OSEM), image resolution convergence is local and influenced significantly by the number of iterations, the count density, and background-to-target ratio. As both count density and background-to-target values for a brain structure can change during a dynamic scan, the local image resolution may also concurrently vary. When PVC is applied post-reconstruction the kinetic parameter estimates may be biased when neglecting the frame-dependent resolution. We explored the influence of the PVC method and implementation on kinetic parameters estimated by fitting 18F-FDG dynamic data acquired on a dedicated brain PET scanner and reconstructed with and without PSF modelling in the OSEM algorithm. The performance of several PVC algorithms was quantified with a phantom experiment, an anthropomorphic Monte Carlo simulation, and a patient scan. Using the last frame reconstructed image only for regional spread function (RSF) generation, as opposed to computing RSFs for each frame independently, and applying perturbation geometric transfer matrix PVC with PSF based OSEM produced the lowest magnitude bias kinetic parameter estimates in most instances, although at the cost of increased noise compared to the PVC methods utilizing conventional OSEM. Use of the last frame RSFs for PVC with no PSF modelling in the OSEM algorithm produced the lowest bias in cerebral metabolic rate of glucose estimates, although by less than 5% in most
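
    To make the regional spread function (RSF) idea above concrete, the sketch below shows a basic geometric transfer matrix (GTM) partial volume correction under simplifying assumptions (normalized RSFs and piecewise-constant regional activity); it is a generic illustration, not the perturbation GTM variant evaluated in the study.

```python
import numpy as np

def gtm_pvc(observed_means, rsfs, region_masks):
    """
    Geometric-transfer-matrix partial volume correction (simplified sketch).
    observed_means[i] : measured mean activity in region i
    rsfs[j]           : regional spread function of region j (region mask
                        convolved with the scanner PSF), on the same grid as the masks
    region_masks[i]   : boolean mask of region i
    Returns PVC-corrected regional activities.
    """
    n = len(region_masks)
    W = np.zeros((n, n))
    for i, mask in enumerate(region_masks):
        for j, rsf in enumerate(rsfs):
            # fraction of region j's signal observed inside region i
            W[i, j] = rsf[mask].mean()
    return np.linalg.solve(W, np.asarray(observed_means, float))
```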

  2. Influence of the partial volume correction method on (18)F-fluorodeoxyglucose brain kinetic modelling from dynamic PET images reconstructed with resolution model based OSEM.

    PubMed

    Bowen, Spencer L; Byars, Larry G; Michel, Christian J; Chonde, Daniel B; Catana, Ciprian

    2013-10-21

    Kinetic parameters estimated from dynamic (18)F-fluorodeoxyglucose ((18)F-FDG) PET acquisitions have been used frequently to assess brain function in humans. Neglecting partial volume correction (PVC) for a dynamic series has been shown to produce significant bias in model estimates. Accurate PVC requires a space-variant model describing the reconstructed image spatial point spread function (PSF) that accounts for resolution limitations, including non-uniformities across the field of view due to the parallax effect. For ordered subsets expectation maximization (OSEM), image resolution convergence is local and influenced significantly by the number of iterations, the count density, and background-to-target ratio. As both count density and background-to-target values for a brain structure can change during a dynamic scan, the local image resolution may also concurrently vary. When PVC is applied post-reconstruction the kinetic parameter estimates may be biased when neglecting the frame-dependent resolution. We explored the influence of the PVC method and implementation on kinetic parameters estimated by fitting (18)F-FDG dynamic data acquired on a dedicated brain PET scanner and reconstructed with and without PSF modelling in the OSEM algorithm. The performance of several PVC algorithms was quantified with a phantom experiment, an anthropomorphic Monte Carlo simulation, and a patient scan. Using the last frame reconstructed image only for regional spread function (RSF) generation, as opposed to computing RSFs for each frame independently, and applying perturbation geometric transfer matrix PVC with PSF based OSEM produced the lowest magnitude bias kinetic parameter estimates in most instances, although at the cost of increased noise compared to the PVC methods utilizing conventional OSEM. Use of the last frame RSFs for PVC with no PSF modelling in the OSEM algorithm produced the lowest bias in cerebral metabolic rate of glucose estimates, although by less than 5% in

  3. Allometric Scaling of Therapeutic Monoclonal Antibodies Using Antigen Concentration as a Correction Factor: Application to the Human Clearance Prediction.

    PubMed

    Wang, Lei; Qiang, Wei; Cheng, Zeneng

    2016-03-01

    Allometric scaling has been widely used for predictions of human pharmacokinetic (PK) parameters in the development of monoclonal antibody (mAb) drugs, and some correction factors have been proposed to improve the estimations. However, classic correction factors fail to offer a complete explanation of the additional differences among species besides the body weight and, thus, lack enough power to further improve the predictions. In this study, the antigen concentration was initially set as a new correction factor to predict the human clearance (CL) of mAbs. Bevacizumab was intravenously injected into 2 animal species and humans to obtain PK data to predict human CL from the animal data. Additionally, a new approach was also validated with data from 3 other mAbs which were collected through a literature review of published work. Accordingly, allometric scaling with a correction factor of the antigen concentration generated accurate estimations of the human CL of 4 mAbs, which were superior to the results obtained by other classic scaling methods. More importantly, the proposed method also achieved good predictions of individual human CL of bevacizumab. In conclusion, the potential of this method as a powerful tool for human PK estimation of mAbs in species translation has been demonstrated. PMID:26886347
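
    The abstract does not give the exact functional form of the correction, so the sketch below shows one plausible implementation of allometric scaling with the antigen concentration used as a multiplicative correction factor; the regression form, variable names, and data layout are assumptions made purely for illustration.

```python
import numpy as np

def predict_human_cl(body_weights, clearances, antigen_conc, human_bw, human_antigen):
    """
    Allometric scaling of mAb clearance with antigen concentration as a
    correction factor (illustrative form: CL*Ag regressed on body weight).
    body_weights, clearances, antigen_conc: per-species arrays from animal data.
    """
    x = np.log(np.asarray(body_weights, float))
    y = np.log(np.asarray(clearances, float) * np.asarray(antigen_conc, float))
    b, log_a = np.polyfit(x, y, 1)              # log(CL*Ag) = log(a) + b*log(BW)
    corrected_product = np.exp(log_a) * human_bw ** b
    return corrected_product / human_antigen    # back out the human correction factor
```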

  4. A Budget Impact Analysis of Newly Available Hepatitis C Therapeutics and the Financial Burden on a State Correctional System.

    PubMed

    Nguyen, John T; Rich, Josiah D; Brockmann, Bradley W; Vohr, Fred; Spaulding, Anne; Montague, Brian T

    2015-08-01

    Hepatitis C virus (HCV) infection continues to disproportionately affect incarcerated populations. New HCV drugs present opportunities and challenges to address HCV in corrections. The goal of this study was to evaluate the impact of the treatment costs for HCV infection in a state correctional population through a budget impact analysis comparing differing treatment strategies. Electronic and paper medical records were reviewed to estimate the prevalence of hepatitis C within the Rhode Island Department of Corrections. Three treatment strategies were evaluated as follows: (1) treating all chronically infected persons, (2) treating only patients with demonstrated fibrosis, and (3) treating only patients with advanced fibrosis. Budget impact was computed as the percentage of pharmacy and overall healthcare expenditures accrued by total drug costs assuming entirely interferon-free therapy. Sensitivity analyses assessed potential variance in costs related to variability in HCV prevalence, genotype, estimated variation in market pricing, length of stay for the sentenced population, and uptake of newly available regimens. Chronic HCV prevalence was estimated at 17% of the total population. Treating all sentenced inmates with at least 6 months remaining of their sentence would cost about $34 million, 13 times the pharmacy budget and almost twice the overall healthcare budget. Treating inmates with advanced fibrosis would cost about $15 million. A hypothetical 50% reduction in total drug costs for future therapies could cost $17 million to treat all eligible inmates. With immense costs projected with new treatment, it is unlikely that correctional facilities will have the capacity to treat all those afflicted with HCV. Alternative payment strategies in collaboration with outside programs may be necessary to curb this epidemic. In order to improve care and treatment delivery, drug costs also need to be seriously reevaluated to be more accessible and equitable now that HCV
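
    The budget impact calculation described above is simple arithmetic; the sketch below reproduces its structure with purely illustrative inputs chosen to give multiples of the same order as those reported (they are not the study's actual figures).

```python
def budget_impact(n_treated, cost_per_course, pharmacy_budget, healthcare_budget):
    """Budget impact expressed as multiples of existing budgets (illustrative)."""
    total = n_treated * cost_per_course
    return {
        "total_drug_cost": total,
        "x_pharmacy_budget": total / pharmacy_budget,
        "x_healthcare_budget": total / healthcare_budget,
    }

# Hypothetical numbers for illustration only, not the study's inputs
print(budget_impact(n_treated=400, cost_per_course=85_000,
                    pharmacy_budget=2_600_000, healthcare_budget=18_000_000))
```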

  5. A Budget Impact Analysis of Newly Available Hepatitis C Therapeutics and the Financial Burden on a State Correctional System.

    PubMed

    Nguyen, John T; Rich, Josiah D; Brockmann, Bradley W; Vohr, Fred; Spaulding, Anne; Montague, Brian T

    2015-08-01

    Hepatitis C virus (HCV) infection continues to disproportionately affect incarcerated populations. New HCV drugs present opportunities and challenges to address HCV in corrections. The goal of this study was to evaluate the impact of the treatment costs for HCV infection in a state correctional population through a budget impact analysis comparing differing treatment strategies. Electronic and paper medical records were reviewed to estimate the prevalence of hepatitis C within the Rhode Island Department of Corrections. Three treatment strategies were evaluated as follows: (1) treating all chronically infected persons, (2) treating only patients with demonstrated fibrosis, and (3) treating only patients with advanced fibrosis. Budget impact was computed as the percentage of pharmacy and overall healthcare expenditures accrued by total drug costs assuming entirely interferon-free therapy. Sensitivity analyses assessed potential variance in costs related to variability in HCV prevalence, genotype, estimated variation in market pricing, length of stay for the sentenced population, and uptake of newly available regimens. Chronic HCV prevalence was estimated at 17% of the total population. Treating all sentenced inmates with at least 6 months remaining of their sentence would cost about $34 million, 13 times the pharmacy budget and almost twice the overall healthcare budget. Treating inmates with advanced fibrosis would cost about $15 million. A hypothetical 50% reduction in total drug costs for future therapies could cost $17 million to treat all eligible inmates. With immense costs projected with new treatment, it is unlikely that correctional facilities will have the capacity to treat all those afflicted with HCV. Alternative payment strategies in collaboration with outside programs may be necessary to curb this epidemic. In order to improve care and treatment delivery, drug costs also need to be seriously reevaluated to be more accessible and equitable now that HCV

  6. Concurrent progress of reprogramming and gene correction to overcome therapeutic limitation of mutant ALK2-iPSC

    PubMed Central

    Kim, Bu-Yeo; Jeong, SangKyun; Lee, Seo-Young; Lee, So Min; Gweon, Eun Jeong; Ahn, Hyunjun; Kim, Janghwan; Chung, Sun-Ku

    2016-01-01

    Fibrodysplasia ossificans progressiva (FOP) syndrome is caused by mutation of the gene ACVR1, encoding a constitutively active bone morphogenetic protein type I receptor (also called ALK2), which induces heterotopic ossification in the patient. To genetically correct it, we attempted to generate the mutant ALK2-iPSCs (mALK2-iPSCs) from FOP-human dermal fibroblasts. However, the mALK2 leads to inhibitory pluripotency maintenance or impaired clonogenic potential after single-cell dissociation, an inevitable step when applying gene-correction tools to induced pluripotent stem cells (iPSCs). Thus, the current iPSC-based gene therapy approach reveals a limitation: it is not readily applicable to iPSCs with the ALK2 mutation. Here we developed a simplified one-step procedure by simultaneously introducing reprogramming and gene-editing components into human fibroblasts derived from a patient with FOP syndrome, and genetically treated them. The mixtures of reprogramming and gene-editing components are composed of reprogramming episomal vectors, CRISPR/Cas9-expressing vectors and a single-stranded oligodeoxynucleotide harboring the normal base to correct ALK2 c.617G>A. The one-step-mediated ALK2 gene-corrected iPSCs restored the global gene expression pattern, as well as mineralization, to the extent of normal iPSCs. This procedure not only helps save time, labor and costs but also opens up a new paradigm that is beyond the current application of gene-editing methodologies, which is hampered by inhibitory pluripotency-maintenance requirements, or vulnerability of single-cell-dissociated iPSCs. PMID:27256111

  7. Determination of the quenching correction factors for plastic scintillation detectors in therapeutic high-energy proton beams

    PubMed Central

    Wang, L L W; Perles, L A; Archambault, L; Sahoo, N; Mirkovic, D; Beddar, S

    2013-01-01

    Plastic scintillation detectors (PSDs) have many advantages over other detectors in small field dosimetry due to their high spatial resolution, excellent water equivalence and instantaneous readout. However, in proton beams, PSDs undergo a quenching effect which significantly reduces the signal level when the detector is close to the Bragg peak, where the linear energy transfer (LET) for protons is very high. This study measures the quenching correction factor (QCF) for a PSD in clinical passive-scattering proton beams and investigates the feasibility of using PSDs in depth-dose measurements in proton beams. A polystyrene based PSD (BCF-12, ϕ0.5mm×4mm) was used to measure the depth-dose curves in a water phantom for monoenergetic unmodulated proton beams of nominal energies 100, 180 and 250 MeV. A Markus plane-parallel ion chamber was also used to get the dose distributions for the same proton beams. From these results, the QCF as a function of depth was derived for these proton beams. Next, the LET depth distributions for these proton beams were calculated by using the MCNPX Monte Carlo code, based on the experimentally validated nozzle models for these passive-scattering proton beams. Then the relationship between the QCF and the proton LET could be derived as an empirical formula. Finally, the obtained empirical formula was applied to the PSD measurements to get the corrected depth-dose curves and they were compared to the ion chamber measurements. A linear relationship between QCF and LET, i.e. Birks' formula, was obtained for the proton beams studied. The result is in agreement with the literature. The PSD measurements after the quenching corrections agree with ion chamber measurements within 5%. PSDs are good dosimeters for proton beam measurement if the quenching effect is corrected appropriately. PMID:23128412
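
    A minimal sketch of the correction step described above follows: a linear (Birks-type) quenching correction factor as a function of LET is applied to the PSD signal. In practice the coefficients would be fitted against ion-chamber data; the parameter names here are assumptions and no fitted values are implied.

```python
import numpy as np

def quenching_correction(psd_signal, let, a, b):
    """
    Apply a linear (Birks-type) quenching correction factor to a PSD signal:
    QCF(LET) = a + b * LET, corrected dose = measured signal * QCF.
    a and b must be fitted against reference (e.g. ion chamber) measurements.
    """
    qcf = a + b * np.asarray(let, float)
    return np.asarray(psd_signal, float) * qcf
```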

  8. Key factors which concur to the correct therapeutic evaluation of herbal products in free radical-induced diseases

    PubMed Central

    Mancuso, Cesare

    2015-01-01

    For many years now the world’s scientific literature has been perfused with articles on the therapeutic potential of natural products, the vast majority of which have herbal origins, as in the case of free radical-induced diseases. What is often overlooked is the effort of researchers who take into consideration the preclinical and clinical evaluation of these herbal products, in order to demonstrate the therapeutic efficacy and safety. The first critical issue to be addressed in the early stages of the preclinical studies is related to the pharmacokinetics of some of these products, which is sometimes not very favorable and limits their bioavailability after oral intake. In this regard, it is worth underlining that it is often unethical to propose the therapeutic efficacy of a compound on the basis of preclinical results obtained with concentrations far higher than those which, hopefully, could be achieved in the organs and tissues of subjects taking these products by mouth. The most widely used approach to overcome the problem related to the low bioavailability involves the complexation of the active ingredients of herbal products with non-toxic carriers that facilitate the absorption and distribution. Even the induction or inhibition of drug-metabolizing enzymes by herbal products, and the consequent variations of plasma concentrations of co-administered drugs, are phenomena to be carefully evaluated as they can give rise to side-effects. This risk is even greater when considering that people lack the perception of the risk arising from an overuse of herbal products that, by their very nature, are considered risk-free. PMID:25954201

  9. Increasing the Endoplasmic Reticulum Pool of the F508del Allele of the Cystic Fibrosis Transmembrane Conductance Regulator Leads to Greater Folding Correction by Small Molecule Therapeutics

    PubMed Central

    Chung, W. Joon; Goeckeler-Fried, Jennifer L.; Havasi, Viktoria; Chiang, Annette; Rowe, Steven M.; Plyler, Zackery E.; Hong, Jeong S.; Mazur, Marina; Piazza, Gary A.; Keeton, Adam B.; White, E. Lucile; Rasmussen, Lynn; Weissman, Allan M.; Denny, R. Aldrin; Brodsky, Jeffrey L.; Sorscher, Eric J.

    2016-01-01

    Small molecules that correct the folding defects and enhance surface localization of the F508del mutation in the Cystic Fibrosis Transmembrane conductance Regulator (CFTR) comprise an important therapeutic strategy for cystic fibrosis lung disease. However, compounds that rescue the F508del mutant protein to wild type (WT) levels have not been identified. In this report, we consider obstacles to obtaining robust and therapeutically relevant levels of F508del CFTR. For example, markedly diminished steady state amounts of F508del CFTR compared to WT CFTR are present in recombinant bronchial epithelial cell lines, even when much higher levels of mutant transcript are present. In human primary airway cells, the paucity of Band B F508del is even more pronounced, although F508del and WT mRNA concentrations are comparable. Therefore, to augment levels of “repairable” F508del CFTR and identify small molecules that then correct this pool, we developed compound library screening protocols based on automated protein detection. First, cell-based imaging measurements were used to semi-quantitatively estimate distribution of F508del CFTR by high content analysis of two-dimensional images. We evaluated ~2,000 known bioactive compounds from the NIH Roadmap Molecular Libraries Small Molecule Repository in a pilot screen and identified agents that increase the F508del protein pool. Second, we analyzed ~10,000 compounds representing diverse chemical scaffolds for effects on total CFTR expression using a multi-plate fluorescence protocol and describe compounds that promote F508del maturation. Together, our findings demonstrate proof of principle that agents identified in this fashion can augment the level of endoplasmic reticulum (ER) resident “Band B” F508del CFTR suitable for pharmacologic correction. As further evidence in support of this strategy, PYR-41—a compound that inhibits the E1 ubiquitin activating enzyme—was shown to synergistically enhance F508del rescue by C

  10. CORRECTED ERROR VIDEO VERSUS A PHYSICAL THERAPIST INSTRUCTED HOME EXERCISE PROGRAM: ACCURACY OF PERFORMING THERAPEUTIC SHOULDER EXERCISES

    PubMed Central

    Krishnamurthy, Kamesh; Hopp, Jennifer; Stanley, Laura; Spores, Ken; Braunreiter, David

    2016-01-01

    Background and Purpose The accurate performance of physical therapy exercises can be difficult. In this evolving healthcare climate it is important to continually look for better methods to educate patients. The use of handouts, in-person demonstration, and video instruction are all potential avenues used to teach proper exercise form. The purpose of this study was to examine if a corrected error video (CEV) would be as effective as a single visit with a physical therapist (PT) to teach healthy subjects how to properly perform four different shoulder rehabilitation exercises. Study Design This was a prospective, single-blinded interventional trial. Methods Fifty-eight subjects with no shoulder complaints were recruited from two institutions and randomized into one of two groups: the CEV group (30 subjects) was given a CEV comprised of four shoulder exercises, while the physical therapy group (28 subjects) had one session with a PT as well as a handout of how to complete the exercises. Each subject practiced the exercises for one week and was then videotaped performing them during a return visit. Videos were scored with the shoulder exam assessment tool (SEAT) created by the authors. Results There was no difference between the groups on total SEAT score (13.66 ± 0.29 vs 13.46 ± 0.30 for CEV vs PT, p = 0.64, 95% CI [−0.06, 0.037]). Average scores for individual exercises also showed no significant difference. Conclusion/Clinical Relevance These results demonstrate that the inexpensive and accessible CEV is as beneficial as direct instruction in teaching subjects to properly perform shoulder rehabilitation exercises. Level of Evidence 1b PMID:27757288

  11. [The role of therapeutic exercises in the correction of the static component of the motor stereotype in the patients presenting with cervical myofascial pain syndrome].

    PubMed

    Mel'nikova, E Iu; Khodasevich, L S; Poliakova, A V; Bartashevich, V V

    2014-01-01

    The present study was designed to estimate the influence of various modalities of therapeutic exercises on the static component of the motor stereotype in 200 patients presenting with cervical myofascial pain syndrome. The study was performed with the use of computed optical topography. The course of therapeutic exercises included 10 sessions of the total duration of 14 days. It is concluded based on the data obtained in this study that remedial gymnastics based on the understanding of the internal body model ("body scheme") with the use of static symmetric exercises is 2.7 times as effective as the traditional approach.

  12. [The role of therapeutic exercises in the correction of the static component of the motor stereotype in the patients presenting with cervical myofascial pain syndrome].

    PubMed

    Mel'nikova, E Iu; Khodasevich, L S; Poliakova, A V; Bartashevich, V V

    2014-01-01

    The present study was designed to estimate the influence of various modalities of therapeutic exercises on the static component of the motor stereotype in 200 patients presenting with cervical myofascial pain syndrome. The study was performed with the use of computed optical topography. The course of therapeutic exercises included 10 sessions of the total duration of 14 days. It is concluded based on the data obtained in this study that remedial gymnastics based on the understanding of the internal body model ("body scheme") with the use of static symmetric exercises is 2.7 times as effective as the traditional approach. PMID:24665596

  13. Model based manipulator control

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1989-01-01

    The feasibility of using model based control (MBC) for robotic manipulators was investigated. A double inverted pendulum system was constructed as the experimental system for a general study of dynamically stable manipulation. The original interest in dynamically stable systems was driven by the objective of high vertical reach (balancing), and the planning of inertially favorable trajectories for force and payload demands. The model-based control approach is described and the results of experimental tests are summarized. Results directly demonstrate that MBC can provide stable control at all speeds of operation and support operations requiring dynamic stability such as balancing. The application of MBC to systems with flexible links is also discussed.

  14. Therapeutic efficacy of antibodies lacking Fcγ receptor binding against lethal dengue virus infection is due to neutralizing potency and blocking of enhancing antibodies [corrected].

    PubMed

    Williams, Katherine L; Sukupolvi-Petty, Soila; Beltramello, Martina; Johnson, Syd; Sallusto, Federica; Lanzavecchia, Antonio; Diamond, Michael S; Harris, Eva

    2013-02-01

    Dengue hemorrhagic fever and dengue shock syndrome (DHF/DSS) are life-threatening complications following infection with one of the four serotypes of dengue virus (DENV). At present, no vaccine or antiviral therapies are available against dengue. Here, we characterized a panel of eight human or mouse-human chimeric monoclonal antibodies (MAbs) and their modified variants lacking effector function and dissected the mechanism by which some protect against antibody-enhanced lethal DENV infection. We found that neutralizing modified MAbs that recognize the fusion loop or the A strand epitopes on domains II and III of the envelope protein, respectively, act therapeutically by competing with and/or displacing enhancing antibodies. By analyzing these relationships, we developed a novel in vitro suppression-of-enhancement assay that predicts the ability of modified MAbs to act therapeutically against antibody-enhanced disease in vivo. These studies provide new insight into the biology of DENV pathogenesis and the requirements for antibodies to treat lethal DENV disease.

  15. Intelligent model-based OPC

    NASA Astrophysics Data System (ADS)

    Huang, W. C.; Lai, C. M.; Luo, B.; Tsai, C. K.; Chih, M. H.; Lai, C. W.; Kuo, C. C.; Liu, R. G.; Lin, H. T.

    2006-03-01

    Optical proximity correction is the technique of pre-distorting mask layouts so that the printed patterns are as close to the desired shapes as possible. For model-based optical proximity correction, a lithographic model to predict the edge position (contour) of patterns on the wafer after lithographic processing is needed. Generally, segmentation of edges is performed prior to the correction. Pattern edges are dissected into several small segments with corresponding target points. During the correction, the edges are moved back and forth from the initial drawn position, assisted by the lithographic model, to finally settle on the proper positions. When the correction converges, the intensity predicted by the model in every target point hits the model-specific threshold value. Several iterations are required to achieve the convergence, and the computation time increases with the number of required iterations. An artificial neural network is an information-processing paradigm inspired by biological nervous systems, such as how the brain processes information. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. A neural network can be a powerful data-modeling tool that is able to capture and represent complex input/output relationships. The network can accurately predict the behavior of a system via the learning procedure. A radial basis function network, a variant of artificial neural network, is an efficient function approximator. In this paper, a radial basis function network was used to build a mapping from the segment characteristics to the edge shift from the drawn position. This network can provide a good initial guess for each segment before OPC is carried out. The good initial guess reduces the required iterations. Consequently, cycle time can be shortened effectively. The optimization of the radial basis function network for this system was performed using a genetic algorithm.
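
    The mapping described above, from segment characteristics to an initial edge shift, can be illustrated with a small Gaussian radial basis function regressor. This is a generic sketch under assumed feature and training-data names, not the network, features, or genetic-algorithm tuning used in the paper.

```python
import numpy as np

class RBFNet:
    """Minimal Gaussian radial-basis-function regressor (illustrative)."""
    def __init__(self, centers, sigma):
        self.centers = np.asarray(centers, float)
        self.sigma = sigma

    def _phi(self, X):
        # pairwise distances between samples and centers -> Gaussian activations
        d = np.linalg.norm(X[:, None, :] - self.centers[None, :, :], axis=2)
        return np.exp(-(d / self.sigma) ** 2)

    def fit(self, X, y):
        self.w, *_ = np.linalg.lstsq(self._phi(np.asarray(X, float)), y, rcond=None)
        return self

    def predict(self, X):
        return self._phi(np.asarray(X, float)) @ self.w

# Hypothetical usage: per-segment features (e.g. local pattern density,
# spacing to neighbors) taken from previously converged OPC runs, and the
# converged edge shift as the regression target.
# net = RBFNet(centers=feature_subset, sigma=0.5).fit(features, shifts)
# initial_guess = net.predict(new_segment_features)
```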

  16. [Severe and prolonged post-dural puncture headache: from pathological basis to therapeutic role and correct timing for epidural blood patch].

    PubMed

    Wetzl, R G; Taglione, G; Ceresa, F; D'Agostino, R; Foresta, S; Guarnerio, C; Ladiana, N; Megaro, F; Zanesi, R; De Vietro, A; Pavani, M

    2001-09-01

    Believed to be due to an imbalance between the cerebrospinal fluid (CSF) production rate and its loss through the spinal dural puncture hole, post-dural puncture headache (PDPH) is often considered a physiological syndrome, usually reversible without pathological sequelae after the dural hole's closure. The clinical case presented here (incapacitating headache associated with diagnostic dural puncture in a leukaemic young female patient who underwent bone marrow transplantation) shows potentially fatal pathological sequelae following prolonged headache (untreated, due to the severe post-transplant immunodeficiency and coagulopathy). The observed MRI lesions suggest interesting conclusions about the clinical indications and correct timing of autologous epidural blood patch (EBP). We also suggest ways of preventing rebound intracranial hypertension following autologous epidural blood patch in patients suffering from incapacitating and prolonged headache.

  17. Model-Based Systems

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    2007-01-01

    Engineers who design systems using text specification documents focus their work upon the completed system to meet performance, time, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system and more difficult to maintain as several systems are combined into higher-level systems, are maintained over decades, and evolve technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in time and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data. They interrelate all of the data to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. The consistency and integrity of the model is assured; therefore, the consistency and integrity of the various specification documents is also assured. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed and how needs are being addressed by international standards writing teams.

  18. Model Based Definition

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  19. Political Correctness--Correct?

    ERIC Educational Resources Information Center

    Boase, Paul H.

    1993-01-01

    Examines the phenomenon of political correctness, its roots and objectives, and its successes and failures in coping with the conflicts and clashes of multicultural campuses. Argues that speech codes indicate failure in academia's primary mission to civilize and educate through talk, discussion, thought, and persuasion. (SR)

  20. Principles of models based engineering

    SciTech Connect

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  1. Are therapeutic communities therapeutic for women?

    PubMed Central

    Eliason, Michele J

    2006-01-01

    This paper addresses the growing phenomenon of therapeutic community (TC) treatment approaches for women in correctional settings. Although such programs are rapidly increasing in number across the country, there is very little empirical research to support the effectiveness of TC treatment for women. Therefore, the literature on the efficacy and effectiveness of TC treatment for women is reviewed in relation to the literature on women's treatment issues. The literature review highlights the gaps where TC treatment ignores or exacerbates issues that are common to addicted women, or uses methods that may be contradictory to women's recovery. PMID:16722560

  2. Model-Based Fault Tolerant Control

    NASA Technical Reports Server (NTRS)

    Kumar, Aditya; Viassolo, Daniel

    2008-01-01

    The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. The identification of the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in the presence of these faults. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
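
    The MBFTC algorithms themselves are not reproduced here, but residual-based detection from Kalman-filter innovations, the common ingredient mentioned above, can be illustrated with a simple chi-square test on the innovation sequence. The threshold value and dimensions below are assumptions for the sketch only.

```python
import numpy as np

def detect_fault(residuals, innovation_cov, threshold=16.27):
    """
    Chi-square test on Kalman-filter innovations, a common residual-based
    fault-detection scheme (not the specific NASA MBFTC implementation).
    residuals       : (T, m) innovation sequence from the filter
    innovation_cov  : (m, m) innovation covariance
    threshold       : assumed ~ chi2.ppf(0.999, df=3) for a 3-dim residual
    Returns a boolean fault flag per time step.
    """
    S_inv = np.linalg.inv(innovation_cov)
    d2 = np.einsum('ti,ij,tj->t', residuals, S_inv, residuals)  # squared Mahalanobis distance
    return d2 > threshold
```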

  3. Model-based tomographic reconstruction

    DOEpatents

    Chambers, David H.; Lehman, Sean K.; Goodman, Dennis M.

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
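
    The core estimation step described above, choosing wall parameters that minimize the mean-square error between a predicted signal and the data, can be sketched as a simple grid search. Here `forward_model` is a placeholder for the (unspecified) radar propagation model, and the whole example is an assumption-laden illustration rather than the authors' layer-stripping/stacking algorithm.

```python
import numpy as np

def fit_wall(times, signal, forward_model, positions, thicknesses):
    """
    Model-based estimation sketch: scan candidate wall position/thickness and
    keep the pair whose predicted signal minimizes mean-square error vs data.
    forward_model(times, position, thickness) must return a predicted signal
    on the same time grid (user-supplied, assumed).
    """
    best, best_err = None, np.inf
    for p in positions:
        for th in thicknesses:
            err = np.mean((signal - forward_model(times, p, th)) ** 2)
            if err < best_err:
                best, best_err = (p, th), err
    return best, best_err
```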

  4. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  5. Model-based machine learning.

    PubMed

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications. PMID:23277612

  6. Model-based machine learning.

    PubMed

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.

  7. MACROMOLECULAR THERAPEUTICS

    PubMed Central

    Yang, Jiyuan; Kopeček, Jindřich

    2014-01-01

    This review covers water-soluble polymer-drug conjugates and macromolecules that possess biological activity without attached low molecular weight drugs. The main design principles of traditional and backbone degradable polymer-drug conjugates as well as the development of a new paradigm in nanomedicines – (low molecular weight) drug-free macromolecular therapeutics are discussed. To address the biological features of cancer, macromolecular therapeutics directed to stem/progenitor cells and the tumor microenvironment are deliberated. Finally, the future perspectives of the field are briefly debated. PMID:24747162

  8. The Challenge of Configuring Model-Based Space Mission Planners

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy D.; Clement, Bradley J.; Chachere, John M.; Smith, Tristan B.; Swanson, Keith J.

    2011-01-01

    Mission planning is central to space mission operations, and has benefited from advances in model-based planning software. Constraints arise from many sources, including simulators and engineering specification documents, and ensuring that constraints are correctly represented in the planner is a challenge. As mission constraints evolve, planning domain modelers need help with modeling constraints efficiently using the available source data, catching errors quickly, and correcting the model. This paper describes the current state of the practice in designing model-based mission planning tools, the challenges facing model developers, and a proposed Interactive Model Development Environment (IMDE) to configure mission planning systems. We describe current and future technology developments that can be integrated into an IMDE.

  9. Electroweak Corrections

    NASA Astrophysics Data System (ADS)

    Barbieri, Riccardo

    2016-10-01

    The test of the electroweak corrections has played a major role in providing evidence for the gauge and the Higgs sectors of the Standard Model. At the same time the consideration of the electroweak corrections has given significant indirect information on the masses of the top and the Higgs boson before their discoveries and important orientation/constraints on the searches for new physics, still highly valuable in the present situation. The progression of these contributions is reviewed.

  10. Therapeutic Nanodevices

    NASA Astrophysics Data System (ADS)

    Lee, Stephen; Ruegsegger, Mark; Barnes, Philip; Smith, Bryan; Ferrari, Mauro

    Therapeutic nanotechnology offers minimally invasive therapies with high densities of function concentrated in small volumes, features that may reduce patient morbidity and mortality. Unlike other areas of nanotechnology, novel physical properties associated with nanoscale dimensionality are not the raison d'être of therapeutic nanotechnology, whereas the aggregation of multiple biochemical (or comparably precise) functions into controlled nanoarchitectures is. Multifunctionality is a hallmark of emerging nanotherapeutic devices, and multifunctionality can allow nanotherapeutic devices to perform multistep work processes, with each functional component contributing to one or more nanodevice subroutine such that, in aggregate, subroutines sum to a cogent work process. Canonical nanotherapeutic subroutines include tethering (targeting) to sites of disease, dispensing measured doses of drug (or bioactive compound), detection of residual disease after therapy and communication with an external clinician/operator. Emerging nanotherapeutics thus blur the boundaries between medical devices and traditional pharmaceuticals. Assembly of therapeutic nanodevices generally exploits either (bio)material self-assembly properties or chemoselective bioconjugation techniques, or both. Given the complexity, composition, and the necessity for their tight chemical and structural definition inherent in the nature of nanotherapeutics, their cost of goods (COGs) might exceed that of (already expensive) biologics. Early therapeutic nanodevices will likely be applied to disease states which exhibit significant unmet patient need (cancer and cardiovascular disease), while application to other disease states well-served by conventional therapy may await perfection of nanotherapeutic design and assembly protocols.

  11. Therapeutic Nanodevices

    NASA Astrophysics Data System (ADS)

    Lee, Stephen C.; Ruegsegger, Mark; Barnes, Philip D.; Smith, Bryan R.; Ferrari, Mauro

    Therapeutic nanotechnology offers minimally invasive therapies with high densities of function concentrated in small volumes, features that may reduce patient morbidity and mortality. Unlike other areas of nanotechnology, novel physical properties associated with nanoscale dimensionality are not the raison d'etre of therapeutic nanotechnology, whereas the aggregation of multiple biochemical (or comparably precise) functions into controlled nanoarchitectures is. Multifunctionality is a hallmark of emerging nanotherapeutic devices, and multifunctionality can allow nanotherapeutic devices to perform multi-step work processes, with each functional component contributing to one or more nanodevice subroutine such that, in aggregate, subroutines sum to a cogent work process. Canonical nanotherapeutic subroutines include tethering (targeting) to sites of disease, dispensing measured doses of drug (or bioactive compound), detection of residual disease after therapy and communication with an external clinician/operator. Emerging nanotherapeutics thus blur the boundaries between medical devices and traditional pharmaceuticals. Assembly of therapeutic nanodevices generally exploits either (bio)material self assembly properties or chemoselective bioconjugation techniques, or both. Given the complexity, composition, and the necessity for their tight chemical and structural definition inherent in the nature of nanotherapeutics, their cost of goods (COGs) might exceed that of (already expensive) biologics. Early therapeutic nanodevices will likely be applied to disease states which exhibit significant unmet patient need (cancer and cardiovascular disease), while application to other disease states well-served by conventional therapy may await perfection of nanotherapeutic design and assembly protocols.

  12. Platelet-delivered therapeutics.

    PubMed

    Lyde, R; Sabatino, D; Sullivan, S K; Poncz, M

    2015-06-01

    We have proposed that modified platelets could potentially be used to correct intrinsic platelet defects as well as for targeted delivery of therapeutic molecules to sites of vascular injury. Ectopic expression of proteins within α-granules prior to platelet activation has been achieved for several proteins, including urokinase, factor (F) VIII, and partially for FIX. Potential uses of platelet-directed therapeutics will be discussed, focusing on targeted delivery of urokinase as a thromboprophylactic agent and FVIII for the treatment of hemophilia A patients with intractable inhibitors. This presentation will discuss new strategies that may be useful in the care of patients with vascular injury as well as remaining challenges and limitations of these approaches.

  13. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  14. Therapeutic perspectives

    PubMed Central

    Fiore, Carmelo E.; Pennisi, Pietra; Tinè, Marianna

    2008-01-01

    Osteoporosis and atherosclerosis are linked by biological association. This encourages the search for therapeutic strategies having both cardiovascular and skeletal beneficial effects. Among drugs that may concordantly enhance bone density and reduce the progression of atherosclerosis we can include bisphosphonates (BP), statins, β-blockers, and possibly anti-RANKL antibodies. Available data come from experimental animals and human studies. All these treatments, however, lack controlled clinical studies designed to demonstrate dual-action effects. PMID:22460845

  15. Therapeutic insemination.

    PubMed

    Alexander, N J; Ackerman, S

    1987-12-01

    Except in special circumstances, therapeutic insemination with a husband's sample has a low success rate. Couples in whom oligozoospermia has been identified as the principal cause of infertility do not benefit from therapeutic insemination by husband. Because of this low success rate, intrauterine insemination to provide sperm in closer proximity to the egg has become popular, but intrauterine insemination also has a low success rate. We suggest that intrauterine insemination should be approached aggressively in cases of male factor infertility. The recipient should be stimulated to enhance egg production and closely monitored for ovulation. A semen specimen of not less than 1 X 10(6) motile sperm with antibiotics added should be placed in the uterus the day after ovulation. If no pregnancies occur within four cycles, alternate approaches should be considered. Therapeutic insemination by donor involves careful donor selection to avoid inheritance of malformations and familial diseases. Because of the possibilities of sexually transmitted diseases, careful and repeated screening should be conducted. A complete sexual history should be obtained, and donors should be excluded if they have had any homosexual contact since 1978, if they have been an intravenous drug user, if they come from a geographic area where the sex ratio of AIDS is close to 1:1, or if they have recently had multiple sexual partners. A permanent record preserving the confidentiality but allowing the tracing of genetic anomalies, even if not present at birth, should be kept. PMID:3328130

  16. Model-based phase-shifting interferometer

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A model-based phase-shifting interferometer (MPI) is developed, in which a novel calculation technique is proposed instead of the traditional complicated system structure, to achieve versatile, high precision and quantitative surface tests. In the MPI, the partial null lens (PNL) is employed to implement the non-null test. With some alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling technique, a reverse iterative optimizing construction (ROR) method is employed for the retrace error correction of non-null test, as well as figure error reconstruction. A self-compiled ray-tracing program is set up for the accurate system modeling and reverse ray tracing. The surface figure error then can be easily extracted from the wavefront data in forms of Zernike polynomials by the ROR method. Experiments of the spherical and aspherical tests are presented to validate the flexibility and accuracy. The test results are compared with those of Zygo interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI would possess large potential in modern optical shop testing.
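
    As a small illustration of the final step above, extracting figure error in the form of Zernike polynomials, the sketch below performs a least-squares fit of wavefront samples to a few low-order Zernike terms. The ROR retrace-error correction and the ray-tracing model are not reproduced; term selection and names are assumptions.

```python
import numpy as np

def fit_zernike(rho, theta, wavefront, n_terms=6):
    """
    Least-squares fit of measured wavefront samples (polar coordinates rho,
    theta, normalized to the unit pupil) to low-order Zernike terms:
    piston, x/y tilt, defocus, and the two astigmatism terms.
    Returns the fitted coefficients.
    """
    Z = np.column_stack([
        np.ones_like(rho),            # piston
        rho * np.cos(theta),          # tilt x
        rho * np.sin(theta),          # tilt y
        2 * rho**2 - 1,               # defocus
        rho**2 * np.cos(2 * theta),   # astigmatism 0/90
        rho**2 * np.sin(2 * theta),   # astigmatism 45
    ])[:, :n_terms]
    coeffs, *_ = np.linalg.lstsq(Z, wavefront, rcond=None)
    return coeffs
```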

  17. Jitter Correction

    NASA Technical Reports Server (NTRS)

    Waegell, Mordecai J.; Palacios, David M.

    2011-01-01

    Jitter_Correct.m is a MATLAB function that automatically measures and corrects inter-frame jitter in an image sequence to a user-specified precision. In addition, the algorithm dynamically adjusts the image sample size to increase the accuracy of the measurement. The Jitter_Correct.m function takes an image sequence with unknown frame-to-frame jitter and computes the translations of each frame (column and row, in pixels) relative to a chosen reference frame with sub-pixel accuracy. The translations are measured using a Cross Correlation Fourier transformation method in which the relative phase of the two transformed images is fit to a plane. The measured translations are then used to correct the inter-frame jitter of the image sequence. The function also dynamically expands the image sample size over which the cross-correlation is measured to increase the accuracy of the measurement. This increases the robustness of the measurement to variable magnitudes of inter-frame jitter.
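
    A minimal Python analogue of the whole-pixel part of this measurement is sketched below using Fourier-domain phase correlation; the plane fit to the cross-power phase that gives Jitter_Correct.m its sub-pixel accuracy, and the dynamic sample-size adjustment, are omitted, so this is an illustration rather than a port of the MATLAB function.

```python
import numpy as np

def estimate_shift(ref, frame):
    """
    Phase-correlation estimate of the translation of `frame` relative to `ref`,
    to whole-pixel accuracy. Both inputs are 2-D arrays of the same shape.
    Returns (row_shift, col_shift) in pixels.
    """
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(frame)
    cross = F1 * np.conj(F2)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12))  # normalized cross-power
    row, col = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # wrap shifts larger than half the image size to negative values
    if row > ref.shape[0] // 2:
        row -= ref.shape[0]
    if col > ref.shape[1] // 2:
        col -= ref.shape[1]
    return row, col
```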

  18. Speech Correction in the Schools.

    ERIC Educational Resources Information Center

    Eisenson, Jon; Ogilvie, Mardel

    An introduction to the problems and therapeutic needs of school age children whose speech requires remedial attention, the text is intended for both the classroom teacher and the speech correctionist. General considerations include classification and incidence of speech defects, speech correction services, the teacher as a speaker, the mechanism…

  19. Model-based Utility Functions

    NASA Astrophysics Data System (ADS)

    Hibbard, Bill

    2012-05-01

    Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment, so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that, if provided with the possibility to modify their utility functions, agents will not choose to do so under some usual assumptions.
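
    A toy sketch of this two-step formulation, under strong simplifying assumptions (a single Bernoulli environment parameter standing in for the learned model), might look as follows; it is only meant to show utility being computed from the inferred model rather than from the raw interaction history, and is not Hibbard's formalism.

```python
# Toy two-step utility: (1) infer an environment model, (2) compute utility on the model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ModelBasedAgent:
    observations: List[int] = field(default_factory=list)

    def update_model(self, obs: int) -> None:
        # Step 1: accumulate interactions and infer a model (here, a Bernoulli estimate).
        self.observations.append(obs)

    def environment_model(self) -> float:
        return sum(self.observations) / max(len(self.observations), 1)

    def utility(self) -> float:
        # Step 2: utility is a function of the inferred model, not of the raw history.
        return self.environment_model()

agent = ModelBasedAgent()
for obs in [1, 0, 1, 1]:
    agent.update_model(obs)
print(agent.utility())  # 0.75
```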

  20. Therapeutic alliance.

    PubMed

    Fox, Valerie

    2002-01-01

    I have been very fortunate in my journey of mental illness. I respond well to medication, but I don't think that is the complete answer to living successfully with serious, persistent mental illness. I believe a person's environment is also of utmost importance, enabling the person suffering with mental illness to continually grow in life. I found early in my struggle with mental illness a psychiatrist with whom I have always had a very good rapport. Until recently I didn't know that what I have with this psychiatrist is professionally known as a therapeutic alliance. Over the years, when I need someone to talk over anything that is troubling to me, I seek my psychiatrist. A therapeutic alliance is non-judgmental; it is nourishing; and finally it is a relationship of complete trust. Perhaps persons reading this article who have never experienced this alliance will seek it. I believe it can make an insecure person secure; a frightened person less frightened; and allow a person to continue the journey of mental health with a sense of belief in oneself. PMID:12433224

  1. Fast model-based estimation of ancestry in unrelated individuals.

    PubMed

    Alexander, David H; Novembre, John; Lange, Kenneth

    2009-09-01

    Population stratification has long been recognized as a confounding factor in genetic association studies. Estimated ancestries, derived from multi-locus genotype data, can be used to perform a statistical correction for population stratification. One popular technique for estimation of ancestry is the model-based approach embodied by the widely applied program structure. Another approach, implemented in the program EIGENSTRAT, relies on Principal Component Analysis rather than model-based estimation and does not directly deliver admixture fractions. EIGENSTRAT has gained in popularity in part owing to its remarkable speed in comparison to structure. We present a new algorithm and a program, ADMIXTURE, for model-based estimation of ancestry in unrelated individuals. ADMIXTURE adopts the likelihood model embedded in structure. However, ADMIXTURE runs considerably faster, solving problems in minutes that take structure hours. In many of our experiments, we have found that ADMIXTURE is almost as fast as EIGENSTRAT. The runtime improvements of ADMIXTURE rely on a fast block relaxation scheme using sequential quadratic programming for block updates, coupled with a novel quasi-Newton acceleration of convergence. Our algorithm also runs faster and with greater accuracy than the implementation of an Expectation-Maximization (EM) algorithm incorporated in the program FRAPPE. Our simulations show that ADMIXTURE's maximum likelihood estimates of the underlying admixture coefficients and ancestral allele frequencies are as accurate as structure's Bayesian estimates. On real-world data sets, ADMIXTURE's estimates are directly comparable to those from structure and EIGENSTRAT. Taken together, our results show that ADMIXTURE's computational speed opens up the possibility of using a much larger set of markers in model-based ancestry estimation and that its estimates are suitable for use in correcting for population stratification in association studies.
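
    For orientation, the sketch below evaluates the standard admixture log-likelihood that such model-based estimators maximize, assuming genotypes coded 0/1/2, ancestry fractions Q, and population allele frequencies F; it does not reproduce ADMIXTURE's block-relaxation or quasi-Newton optimization.

```python
# Binomial log-likelihood of the admixture model (evaluation only, no optimization).
import numpy as np

def admixture_loglik(G, Q, F, eps=1e-12):
    """G: (I, J) genotypes in {0,1,2}; Q: (I, K) ancestry fractions; F: (K, J) allele freqs."""
    P = Q @ F                        # expected allele frequency per individual and SNP
    P = np.clip(P, eps, 1 - eps)
    return np.sum(G * np.log(P) + (2 - G) * np.log(1 - P))
```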

  2. A CORRECTION.

    PubMed

    Johnson, D

    1940-03-22

    IN a recently published volume on "The Origin of Submarine Canyons" the writer inadvertently credited to A. C. Veatch an excerpt from a submarine chart actually contoured by P. A. Smith, of the U. S. Coast and Geodetic Survey. The chart in question is Chart IVB of Special Paper No. 7 of the Geological Society of America entitled "Atlantic Submarine Valleys of the United States and the Congo Submarine Valley, by A. C. Veatch and P. A. Smith," and the excerpt appears as Plate III of the volume first cited above. In view of the heavy labor involved in contouring the charts accompanying the paper by Veatch and Smith and the beauty of the finished product, it would be unfair to Mr. Smith to permit the error to go uncorrected. Excerpts from two other charts are correctly ascribed to Dr. Veatch. PMID:17839404

  3. A CORRECTION.

    PubMed

    Johnson, D

    1940-03-22

    IN a recently published volume on "The Origin of Submarine Canyons" the writer inadvertently credited to A. C. Veatch an excerpt from a submarine chart actually contoured by P. A. Smith, of the U. S. Coast and Geodetic Survey. The chart in question is Chart IVB of Special Paper No. 7 of the Geological Society of America entitled "Atlantic Submarine Valleys of the United States and the Congo Submarine Valley, by A. C. Veatch and P. A. Smith," and the excerpt appears as Plate III of the volume first cited above. In view of the heavy labor involved in contouring the charts accompanying the paper by Veatch and Smith and the beauty of the finished product, it would be unfair to Mr. Smith to permit the error to go uncorrected. Excerpts from two other charts are correctly ascribed to Dr. Veatch.

  4. Model-based reconstruction for x-ray diffraction imaging

    NASA Astrophysics Data System (ADS)

    Sridhar, Venkatesh; Kisner, Sherman J.; Skatter, Sondre; Bouman, Charles A.

    2016-05-01

    In this paper, we propose a novel 4D model-based iterative reconstruction (MBIR) algorithm for low-angle scatter X-ray diffraction (XRD) that can substantially increase the SNR. Our forward model is based on a Poisson photon-counting model that incorporates a spatial point-spread function, the detector energy response, and energy-dependent attenuation correction. Our prior model uses a Markov random field (MRF) together with a reduced spectral basis set determined using non-negative matrix factorization. We demonstrate the effectiveness of our method with real data sets.
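
    A stripped-down sketch of the kind of objective such an MBIR scheme minimizes is given below: a Poisson negative log-likelihood plus a simple quadratic neighbor penalty standing in for the MRF prior. The system matrix A, data y, and regularization weight beta are assumed inputs, and the 4D, spectral-basis structure of the actual algorithm is not modeled here.

```python
# Generic Poisson MBIR cost function (not the authors' 4D XRD model).
import numpy as np

def mbir_cost(x, A, y, beta, eps=1e-12):
    lam = np.maximum(A @ x, eps)                   # expected photon counts
    neg_loglik = np.sum(lam - y * np.log(lam))     # Poisson negative log-likelihood
    roughness = np.sum(np.diff(x) ** 2)            # 1-D stand-in for the MRF prior
    return neg_loglik + beta * roughness
```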

  5. Hot blast stove process model and model-based controller

    SciTech Connect

    Muske, K.R.; Howse, J.W.; Hansen, G.A.; Cagliostro, D.J.; Chaubal, P.C.

    1998-12-31

    This paper describes the process model and model-based control techniques implemented on the hot blast stoves for the No. 7 Blast Furnace at the Inland Steel facility in East Chicago, Indiana. A detailed heat transfer model of the stoves is developed and verified using plant data. This model is used as part of a predictive control scheme to determine the minimum amount of fuel necessary to achieve the blast air requirements. The model is also used to predict maximum and minimum temperature constraint violations within the stove so that the controller can take corrective actions while still achieving the required stove performance.

  6. Analysis of Massive Emigration from Poland: The Model-Based Clustering Approach

    NASA Astrophysics Data System (ADS)

    Witek, Ewa

    The model-based approach assumes that the data are generated by a finite mixture of probability distributions, such as multivariate normal distributions. In finite mixture models, each component probability distribution corresponds to a cluster. The problems of determining the number of clusters and of choosing an appropriate clustering method become the problem of statistical model choice. Hence, the model-based approach provides a key advantage over heuristic clustering algorithms, because it selects both the correct model and the number of clusters.
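
    As a concrete illustration of this idea (not tied to the emigration data analyzed in the paper), the snippet below fits Gaussian mixture models with different covariance structures and numbers of components to synthetic data and selects among them with BIC.

```python
# Model-based clustering with model selection via BIC, on synthetic data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 0.5, (80, 2))])

best = None
for covariance_type in ("full", "diag", "spherical"):
    for k in range(1, 6):
        gmm = GaussianMixture(n_components=k, covariance_type=covariance_type,
                              random_state=0).fit(X)
        bic = gmm.bic(X)
        if best is None or bic < best[0]:
            best = (bic, covariance_type, k)

print("selected:", best[1], "covariance,", best[2], "clusters")
```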

  7. The Therapeutic School.

    ERIC Educational Resources Information Center

    Rice, John Steadman

    2002-01-01

    Contributes to the recent research on specific institutional carriers of the therapeutic culture, such as the state, the corporation, and the self- help movement, defining therapeutic discourse and discussing the therapeutic ethic, the therapeutic school, schools of education and their critics, and disappointing results of therapeutic schooling.…

  8. Kitaev models based on unitary quantum groupoids

    SciTech Connect

    Chang, Liang

    2014-04-15

    We establish a generalization of Kitaev models based on unitary quantum groupoids. In particular, when inputting a Kitaev-Kong quantum groupoid H{sub C}, we show that the ground state manifold of the generalized model is canonically isomorphic to that of the Levin-Wen model based on a unitary fusion category C. Therefore, the generalized Kitaev models provide realizations of the target space of the Turaev-Viro topological quantum field theory based on C.

  9. Therapeutic Drug Monitoring

    MedlinePlus

    What is therapeutic drug monitoring? Therapeutic drug monitoring is the measurement …

  10. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  11. Model-Based Prognostics of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhury, Indranil; Bregon, Anibal

    2015-01-01

    Model-based prognostics has become a popular approach to solving the prognostics problem. However, almost all work has focused on prognostics of systems with continuous dynamics. In this paper, we extend the model-based prognostics framework to hybrid systems models that combine both continuous and discrete dynamics. In general, most systems are hybrid in nature, including those that combine physical processes with software. We generalize the model-based prognostics formulation to hybrid systems, and describe the challenges involved. We present a general approach for modeling hybrid systems, and overview methods for solving estimation and prediction in hybrid systems. As a case study, we consider the problem of conflict (i.e., loss of separation) prediction in the National Airspace System, in which the aircraft models are hybrid dynamical systems.

  12. 77 FR 72199 - Technical Corrections; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    The U.S. Nuclear Regulatory Commission (NRC) is correcting a final rule that was published in the Federal Register on July 6, 2012 (77 FR 39899) and effective on August 6, 2012. That final rule amended the NRC regulations (10 CFR Part 171, RIN 3150-AJ16) to make technical corrections.

  13. Multimode model based defect characterization in composites

    NASA Astrophysics Data System (ADS)

    Roberts, R.; Holland, S.; Gregory, E.

    2016-02-01

    A newly-initiated research program for model-based defect characterization in CFRP composites is summarized. The work utilizes computational models of the interaction of NDE probing energy fields (ultrasound and thermography), to determine 1) the measured signal dependence on material and defect properties (forward problem), and 2) an assessment of performance-critical defect properties from analysis of measured NDE signals (inverse problem). Work is reported on model implementation for inspection of CFRP laminates containing delamination and porosity. Forward predictions of measurement response are presented, as well as examples of model-based inversion of measured data for the estimation of defect parameters.

  14. Model-based internal wave processing

    SciTech Connect

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane-wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  15. Model-Based Systems Engineering Approach to Managing Mass Margin

    NASA Technical Reports Server (NTRS)

    Chung, Seung H.; Bayer, Todd J.; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Christopher; Lam, Doris

    2012-01-01

    When designing a flight system from concept through implementation, one of the fundamental systems engineering tasks is managing the mass margin and a mass equipment list (MEL) of the flight system. While generating a MEL and computing a mass margin is conceptually a trivial task, maintaining consistent and correct MELs and mass margins can be challenging due to the current practice of maintaining duplicate information in various forms, such as diagrams and tables, and in various media, such as files and emails. We have overcome this challenge through a model-based systems engineering (MBSE) approach within which we allow only a single source of truth. In this paper we describe the modeling patterns used to capture the single source of truth and the views that have been developed for the Europa Habitability Mission (EHM) project, a mission concept study, at the Jet Propulsion Laboratory (JPL).
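
    A minimal illustration of the underlying bookkeeping, using one common convention (margin computed against the allocation after applying per-item contingency), is sketched below; the line items and numbers are hypothetical, not from the EHM study.

```python
# Mass-margin bookkeeping against a single source of truth for the MEL (toy example).
current_best_estimate_kg = {"structure": 120.0, "avionics": 45.0, "propulsion": 210.0}
contingency = {"structure": 0.30, "avionics": 0.25, "propulsion": 0.15}
allocation_kg = 520.0

# Maximum expected value: current best estimate grown by per-item contingency.
mev = sum(cbe * (1 + contingency[item]) for item, cbe in current_best_estimate_kg.items())
margin = (allocation_kg - mev) / allocation_kg
print(f"MEV = {mev:.1f} kg, mass margin = {margin:.1%}")
```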

  16. Corrective Jaw Surgery

    MedlinePlus

    Orthognathic surgery is performed to correct the misalignment …

  17. Sandboxes for Model-Based Inquiry

    ERIC Educational Resources Information Center

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-01-01

    In this article, we introduce a class of constructionist learning environments that we call "Emergent Systems Sandboxes" ("ESSs"), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual…

  18. Model-Based Inquiries in Chemistry

    ERIC Educational Resources Information Center

    Khan, Samia

    2007-01-01

    In this paper, instructional strategies for sustaining model-based inquiry in an undergraduate chemistry class were analyzed through data collected from classroom observations, a student survey, and in-depth problem-solving sessions with the instructor and students. Analysis of teacher-student interactions revealed a cyclical pattern in which…

  19. A novel methodology for model-based OPC verification

    NASA Astrophysics Data System (ADS)

    Huang, Tengyen; Liao, ChunCheng; Chou, Ryan; Liao, Hung-Yueh; Schacht, Jochen

    2008-03-01

    Model-based optical proximity correction (OPC) is an indispensable production tool enabling the successful extension of photolithography down to the sub-80 nm regime. Commercial OPC software has established clear procedures to produce accurate OPC models at the best-focus condition. However, OPC models calibrated at best focus sometimes fail to prevent catastrophic circuit failure due to patterning shorts and opens caused by accidental shifts of dose or focus within the corners of the allowed process window. A novel model-based OPC verification methodology is presented in this work, which precisely pinpoints post-OPC photolithography failures in VLSI circuits through the entire lithographic process window. By applying a critical photolithography process window model in OPC verification software, we successfully uncovered all weak points of a design prior to tape-out, eliminating the high risk of circuit opens and shorts at the extreme corners of the lithographic process window in any complex circuit layout environment. Process window-related information is usually not taken into consideration when running OPC verification procedures with models calibrated at the nominal process condition. Intensive review of the critical dimension (CD) and top-view SEM micrographs from the weak points indicates good agreement between post-OPC simulation and measurement. Using a single, highly accurate process window resist model provides a reliable OPC verification methodology when used in a field- or grid-based simulation engine, ensuring manufacturability within the largest possible process window for any modern critical design.

  20. Model-based image processing using snakes and mutual information

    NASA Astrophysics Data System (ADS)

    von Klinski, Sebastian; Derz, Claus; Weese, David; Tolxdorff, Thomas

    2000-06-01

    Any segmentation approach assumes certain knowledge concerning data modalities, relevant organs and their imaging characteristics. These assumptions are necessary for developing criteria by which to separate the organ in question from the surrounding tissue. Typical assumptions are that the organs have homogeneous gray-value characteristics (region growing, region merging, etc.), specific gray-value patterns (classification methods), continuous edges (edge-based approaches), smooth and strong edges (snake approaches), or any combination of these. In most cases, such assumptions are invalid, at least locally. Consequently, these approaches prove to be time consuming either in their parameterization or execution. Further, the low result quality makes post-processing necessary. Our aim was to develop a segmentation approach for large 3D data sets (e.g., CT and MRI) that requires a short interaction time and that can easily be adapted to different organs and data materials. This has been achieved by exploiting available knowledge about data material and organ topology using anatomical models that have been constructed from previously segmented data sets. In the first step, the user manually specifies the general context of the data material and specifies anatomical landmarks. Then this information is used to automatically select a corresponding reference model, which is geometrically adjusted to the current data set. In the third step, a model-based snake approach is applied to determine the correct segmentation of the organ in question. Analogously, this approach can be used for model-based interpolation and registration.

  1. Model-based Processing of Microcantilever Sensor Arrays

    SciTech Connect

    Tringe, J W; Clague, D S; Candy, J V; Sinensky, A K; Lee, C L; Rudd, R E; Burnham, A K

    2005-04-27

    We have developed a model-based processor (MBP) for a microcantilever-array sensor to detect target species in solution. We perform a proof-of-concept experiment, fit model parameters to the measured data and use them to develop a Gauss-Markov simulation. We then investigate two cases of interest, averaged deflection data and multi-channel data. For this evaluation we extract model parameters via a model-based estimation, perform a Gauss-Markov simulation, design the optimal MBP and apply it to measured experimental data. The performance of the MBP in the multi-channel case is evaluated by comparison to a "smoother" (averager) typically used for microcantilever signal analysis. It is shown that the MBP not only provides a significant gain (approximately 80 dB) in signal-to-noise ratio (SNR), but also consistently outperforms the smoother by 40-60 dB. Finally, we apply the processor to the smoothed experimental data and demonstrate its capability for chemical detection. The MBP performs quite well, apart from a correctable systematic bias error.
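
    To illustrate why a model-based processor can beat simple averaging, the sketch below compares a moving-average smoother with a scalar Kalman filter on a synthetic random-walk deflection signal in white noise. It is a generic illustration under assumed noise statistics, not the LLNL processor.

```python
# Moving-average "smoother" vs. a model-based (Kalman) processor on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n = 500
truth = np.cumsum(rng.normal(0, 0.01, n))       # slowly drifting (random-walk) deflection
y = truth + rng.normal(0, 0.5, n)               # noisy measurements

# Simple moving-average smoother
window = 25
smoothed = np.convolve(y, np.ones(window) / window, mode="same")

# Scalar Kalman filter matched to the random-walk model
q, r = 0.01**2, 0.5**2
x_hat, p = 0.0, 1.0
kalman = np.empty(n)
for k in range(n):
    p = p + q                                   # predict
    gain = p / (p + r)                          # update
    x_hat = x_hat + gain * (y[k] - x_hat)
    p = (1 - gain) * p
    kalman[k] = x_hat

print("smoother RMSE:", np.sqrt(np.mean((smoothed - truth) ** 2)))
print("Kalman   RMSE:", np.sqrt(np.mean((kalman - truth) ** 2)))
```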

  2. Radiometric terrain correction of SPOT5 image

    NASA Astrophysics Data System (ADS)

    Feng, Xiuli; Zhang, Feng; Wang, Ke

    2007-06-01

    The terrain correction model based on the rationale of moment matching is more effective at reducing shading effects than the traditional C correction approach, especially in complex, strongly undulating mountainous areas with extensive shading. In other words, the traditional C correction approach gives better results in plain areas with little shading. Besides, the accuracy of the DEM data and the registration accuracy between the image and the DEM data also influence the final correction accuracy. To achieve better radiometric terrain correction, high-spatial-resolution DEM data are preferred.
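
    For reference, the traditional C correction mentioned above can be written compactly; the sketch below is a generic implementation (not the authors' moment-matching model), assuming per-pixel illumination cosines, the solar zenith angle, and one radiance band as inputs.

```python
# Empirical C correction for topographic shading (generic sketch).
import numpy as np

def c_correction(radiance, cos_i, sun_zenith_rad):
    """Topographically correct a radiance band using the empirical C factor."""
    # Regress radiance against the illumination cosine: L = a + b * cos(i)
    b, a = np.polyfit(cos_i.ravel(), radiance.ravel(), 1)
    c = a / b
    return radiance * (np.cos(sun_zenith_rad) + c) / (cos_i + c)
```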

  3. Integrated Image Reconstruction and Gradient Nonlinearity Correction

    PubMed Central

    Tao, Shengzhen; Trzasko, Joshua D.; Shu, Yunhong; Huston, John; Bernstein, Matt A.

    2014-01-01

    Purpose To describe a model-based reconstruction strategy for routine magnetic resonance imaging (MRI) that accounts for gradient nonlinearity (GNL) during rather than after transformation to the image domain, and demonstrate that this approach reduces the spatial resolution loss that occurs during strictly image-domain GNL-correction. Methods After reviewing conventional GNL-correction methods, we propose a generic signal model for GNL-affected MRI acquisitions, discuss how it incorporates into contemporary image reconstruction platforms, and describe efficient non-uniform fast Fourier transform (NUFFT)-based computational routines for these. The impact of GNL-correction on spatial resolution by the conventional and proposed approaches is investigated on phantom data acquired at varying offsets from gradient isocenter, as well as on fully-sampled and (retrospectively) undersampled in vivo acquisitions. Results Phantom results demonstrate that resolution loss that occurs during GNL-correction is significantly less for the proposed strategy than for the standard approach at distances >10 cm from isocenter with a 35 cm FOV gradient coil. The in vivo results suggest that the proposed strategy better preserves fine anatomical detail than retrospective GNL-correction while offering comparable geometric correction. Conclusion Accounting for GNL during image reconstruction allows geometric distortion to be corrected with less spatial resolution loss than is typically observed with the conventional image domain correction strategy. PMID:25298258

  4. American Therapeutic Recreation Association

    MedlinePlus

    American Therapeutic Recreation Association: Empowering Recreational Therapists.

  5. Therapeutic drug levels

    MedlinePlus

    Therapeutic drug levels are lab tests to look for the presence … (medlineplus.gov/ency/article/003430.htm)

  6. Model-based clustered-dot screening

    NASA Astrophysics Data System (ADS)

    Kim, Sang Ho

    2006-01-01

    I propose a halftone screen design method based on a human visual system model and the characteristics of the electro-photographic (EP) printer engine. Generally, screen design methods based on human visual models produce dispersed-dot type screens while design methods considering EP printer characteristics generate clustered-dot type screens. In this paper, I propose a cost function balancing the conflicting characteristics of the human visual system and the printer. By minimizing the obtained cost function, I design a model-based clustered-dot screen using a modified direct binary search algorithm. Experimental results demonstrate the superior quality of the model-based clustered-dot screen compared to a conventional clustered-dot screen.

  7. Model Based Testing for Agent Systems

    NASA Astrophysics Data System (ADS)

    Zhang, Zhiyong; Thangarajah, John; Padgham, Lin

    Although agent technology is gaining world wide popularity, a hindrance to its uptake is the lack of proper testing mechanisms for agent based systems. While many traditional software testing methods can be generalized to agent systems, there are many aspects that are different and which require an understanding of the underlying agent paradigm. In this paper we present certain aspects of a testing framework that we have developed for agent based systems. The testing framework is a model based approach using the design models of the Prometheus agent development methodology. In this paper we focus on model based unit testing and identify the appropriate units, present mechanisms for generating suitable test cases and for determining the order in which the units are to be tested, present a brief overview of the unit testing process and an example. Although we use the design artefacts from Prometheus the approach is suitable for any plan and event based agent system.

  8. Systems Engineering Interfaces: A Model Based Approach

    NASA Technical Reports Server (NTRS)

    Fosse, Elyse; Delp, Christopher

    2013-01-01

    Currently: Ops Rev developed and maintains a framework that includes interface-specific language, patterns, and Viewpoints. Ops Rev implements the framework to design MOS 2.0 and its 5 Mission Services. Implementation de-couples interfaces and instances of interaction Future: A Mission MOSE implements the approach and uses the model based artifacts for reviews. The framework extends further into the ground data layers and provides a unified methodology.

  9. Efficient Model-Based Diagnosis Engine

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Vatan, Farrokh; Barrett, Anthony; James, Mark; Mackey, Ryan; Williams, Colin

    2009-01-01

    An efficient diagnosis engine - a combination of mathematical models and algorithms - has been developed for identifying faulty components in a possibly complex engineering system. This model-based diagnosis engine embodies a twofold approach to reducing, relative to prior model-based diagnosis engines, the amount of computation needed to perform a thorough, accurate diagnosis. The first part of the approach involves a reconstruction of the general diagnostic engine to reduce the complexity of the mathematical-model calculations and of the software needed to perform them. The second part of the approach involves algorithms for computing a minimal diagnosis (the term "minimal diagnosis" is defined below). A somewhat lengthy background discussion is prerequisite to a meaningful summary of the innovative aspects of the present efficient model-based diagnosis engine. In model-based diagnosis, the function of each component and the relationships among all the components of the engineering system to be diagnosed are represented as a logical system denoted the system description (SD). Hence, the expected normal behavior of the engineering system is the set of logical consequences of the SD. Faulty components lead to inconsistencies between the observed behaviors of the system and the SD (see figure). Diagnosis - the task of finding faulty components - is reduced to finding those components, the abnormalities of which could explain all the inconsistencies. The solution of the diagnosis problem should be a minimal diagnosis, which is a minimal set of faulty components. A minimal diagnosis stands in contradistinction to the trivial solution, in which all components are deemed to be faulty, and which, therefore, always explains all inconsistencies.

  10. Enzyme therapeutics for systemic detoxification.

    PubMed

    Liu, Yang; Li, Jie; Lu, Yunfeng

    2015-08-01

    Life relies on numerous biochemical processes working synergistically and correctly. Certain substances disrupt these processes, inducing in the living organism an abnormal state termed intoxication. Managing intoxication usually requires interventions, referred to as detoxification. Decades of development in detoxification reveal the potential of enzymes as ideal therapeutics and antidotes, because their high substrate specificity and catalytic efficiency are essential for clearing intoxicating substances without adverse effects. However, intrinsic shortcomings of enzymes, including low stability and high immunogenicity, are major hurdles, which could be overcome by delivering enzymes with specially designed nanocarriers. Extensive investigations on protein delivery indicate three types of enzyme-nanocarrier architectures that show more promise than others for systemic detoxification: liposome-wrapped enzymes, polymer-enzyme conjugates, and polymer-encapsulated enzymes. This review highlights recent advances in these nano-architectures and discusses their applications in systemic detoxification. The therapeutic potential of various enzymes as well as associated challenges in achieving effective delivery of therapeutic enzymes will also be discussed.

  11. Model-based neuroimaging for cognitive computing.

    PubMed

    Poznanski, Roman R

    2009-09-01

    The continuity of the mind is suggested to mean the continuous spatiotemporal dynamics arising from the electrochemical signature of the neocortex: (i) globally through volume transmission in the gray matter as fields of neural activity, and (ii) locally through extrasynaptic signaling between fine distal dendrites of cortical neurons. If the continuity of dynamical systems across spatiotemporal scales defines a stream of consciousness then intentional metarepresentations as templates of dynamic continuity allow qualia to be semantically mapped during neuroimaging of specific cognitive tasks. When interfaced with a computer, such model-based neuroimaging requiring new mathematics of the brain will begin to decipher higher cognitive operations not possible with existing brain-machine interfaces.

  12. Model-based vision using geometric hashing

    NASA Astrophysics Data System (ADS)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

    The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed upon the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm -- invariant under translation, scale, and 3D rotations of the target -- hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture, and is thus potentially implementable in real time.

  13. Model-based Tomographic Reconstruction Literature Search

    SciTech Connect

    Chambers, D H; Lehman, S K

    2005-11-30

    In the process of preparing a proposal for internal research funding, a literature search was conducted on the subject of model-based tomographic reconstruction (MBTR). The purpose of the search was to ensure that the proposed research would not replicate any previous work. We found that the overwhelming majority of work on MBTR which used parameterized models of the object was theoretical in nature. Only three researchers had applied the technique to actual data. In this note, we summarize the findings of the literature search.

  14. Student Modeling Based on Problem Solving Times

    ERIC Educational Resources Information Center

    Pelánek, Radek; Jarušek, Petr

    2015-01-01

    Student modeling in intelligent tutoring systems is mostly concerned with modeling correctness of students' answers. As interactive problem solving activities become increasingly common in educational systems, it is useful to focus also on timing information associated with problem solving. We argue that the focus on timing is natural for certain…

  15. Treatment Ideology and Correctional Bureaucracy: A Study of Organizational Change.

    ERIC Educational Resources Information Center

    Martinson, Robert Magnus

    A study was made of organizational change induced by a staff training project in six correctional institutions for youth in the California system, which is currently engaged in introducing "therapeutic community" into correctional facilities. Part I described and evaluated a federally financed training project. The "resource model" of training was…

  16. Eyeglasses for Vision Correction

    MedlinePlus

    Wearing eyeglasses is an easy way to correct refractive errors. Improving your vision with eyeglasses offers the opportunity to select from …

  17. Illinois Corrections Project Report

    ERIC Educational Resources Information Center

    Hungerford, Jack

    1974-01-01

    The Illinois Corrections Project for Law-Focused Education, which brings law-focused curriculum into corrections institutions, was initiated in 1973 with a summer institute and includes programs in nine participating institutions. (JH)

  18. Sandboxes for Model-Based Inquiry

    NASA Astrophysics Data System (ADS)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-04-01

    In this article, we introduce a class of constructionist learning environments that we call Emergent Systems Sandboxes ( ESSs), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual construction environment that support students in creating, exploring, and sharing computational models of dynamic systems that exhibit emergent phenomena. They provide learners with "entity"-level construction primitives that reflect an underlying scientific model. These primitives can be directly "painted" into a sandbox space, where they can then be combined, arranged, and manipulated to construct complex systems and explore the emergent properties of those systems. We argue that ESSs offer a means of addressing some of the key barriers to adopting rich, constructionist model-based inquiry approaches in science classrooms at scale. Situating the ESS in a large-scale science modeling curriculum we are implementing across the USA, we describe how the unique "entity-level" primitive design of an ESS facilitates knowledge system refinement at both an individual and social level, we describe how it supports flexible modeling practices by providing both continuous and discrete modes of executability, and we illustrate how it offers students a variety of opportunities for validating their qualitative understandings of emergent systems as they develop.

  19. An overview of correctional psychiatry.

    PubMed

    Metzner, Jeffrey; Dvoskin, Joel

    2006-09-01

    Supermax facilities may be an unfortunate and unpleasant necessity in modern corrections. Because of the serious dangers posed by prison gangs, they are unlikely to disappear completely from the correctional landscape any time soon. But such units should be carefully reserved for those inmates who pose the most serious danger to the prison environment. Further, the constitutional duty to provide medical and mental health care does not end at the supermax door. There is a great deal of common ground between the opponents of such environments and those who view them as a necessity. No one should want these expensive beds to be used for people who could be more therapeutically and safely managed in mental health treatment environments. No one should want people with serious mental illnesses to be punished for their symptoms. Finally, no one wants these units to make people more, instead of less, dangerous. It is in everyone's interests to learn as much as possible about the potential of these units for good and for harm. Corrections is a profession, and professions base their practices on data. If we are to avoid the most egregious and harmful effects of supermax confinement, we need to understand them far better than we currently do. Though there is a role for advocacy from those supporting or opposed to such environments, there is also a need for objective, scientifically rigorous study of these units and the people who live there.

  20. Model-based Processing of Micro-cantilever Sensor Arrays

    SciTech Connect

    Tringe, J W; Clague, D S; Candy, J V; Lee, C L; Rudd, R E; Burnham, A K

    2004-11-17

    We develop a model-based processor (MBP) for a micro-cantilever array sensor to detect target species in solution. After discussing the generalized framework for this problem, we develop the specific model used in this study. We perform a proof-of-concept experiment, fit the model parameters to the measured data and use them to develop a Gauss-Markov simulation. We then investigate two cases of interest: (1) averaged deflection data, and (2) multi-channel data. In both cases the evaluation proceeds by first performing a model-based parameter estimation to extract the model parameters, next performing a Gauss-Markov simulation, designing the optimal MBP and finally applying it to measured experimental data. The simulation is used to evaluate the performance of the MBP in the multi-channel case and compare it to a "smoother" ("averager") typically used in this application. It was shown that the MBP not only provides a significant gain (approximately 80 dB) in signal-to-noise ratio (SNR), but also consistently outperforms the smoother by 40-60 dB. Finally, we apply the processor to the smoothed experimental data and demonstrate its capability for chemical detection. The MBP performs quite well, though it includes a correctable systematic bias error. The project's primary accomplishment was the successful application of model-based processing to signals from micro-cantilever arrays: 40-60 dB improvement vs. the smoother algorithm was demonstrated. This result was achieved through the development of appropriate mathematical descriptions for the chemical and mechanical phenomena, and incorporation of these descriptions directly into the model-based signal processor. A significant challenge was the development of the framework which would maximize the usefulness of the signal processing algorithms while ensuring the accuracy of the mathematical description of the chemical-mechanical signal. Experimentally, the difficulty was to identify and characterize the non

  1. Neonaticide: an appropriate application for therapeutic jurisprudence?

    PubMed

    Schwartz, L L; Isser, N K

    2001-01-01

    Might therapeutic jurisprudence, a perspective that attempts to study interaction between the legal and mental health disciplines, be brought to bear effectively with respect to neonaticide, the murder of a newborn infant in the first 24 hours of its life? This is a crime that leads to sentencing that is now rarely therapeutic, rehabilitative, or corrective. An examination of the crime, its motives, and its perpetrators precedes a discussion of ways in which the mental health viewpoint in this matter might be brought to the active attention of the courts in order to promote sentencing that is appropriate to both the crime and the transgressor.

  2. Teaching Politically Correct Language

    ERIC Educational Resources Information Center

    Tsehelska, Maryna

    2006-01-01

    This article argues that teaching politically correct language to English learners provides them with important information and opportunities to be exposed to cultural issues. The author offers a brief review of how political correctness became an issue and how being politically correct influences the use of language. The article then presents…

  3. Research in Correctional Rehabilitation.

    ERIC Educational Resources Information Center

    Rehabilitation Services Administration (DHEW), Washington, DC.

    Forty-three leaders in corrections and rehabilitation participated in the seminar, planned to provide an indication of the status of research in correctional rehabilitation. Papers include: (1) "Program Trends in Correctional Rehabilitation" by John P. Conrad, (2) "Federal Offenders Rehabilitation Program" by Percy B. Bell and Merlyn Mathews, (3)…

  4. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden-state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  5. Model-based vision for space applications

    NASA Technical Reports Server (NTRS)

    Chaconas, Karen; Nashman, Marilyn; Lumia, Ronald

    1992-01-01

    This paper describes a method for tracking moving image features by combining spatial and temporal edge information with model based feature information. The algorithm updates the two-dimensional position of object features by correlating predicted model features with current image data. The results of the correlation process are used to compute an updated model. The algorithm makes use of a high temporal sampling rate with respect to spatial changes of the image features and operates in a real-time multiprocessing environment. Preliminary results demonstrate successful tracking for image feature velocities between 1.1 and 4.5 pixels every image frame. This work has applications for docking, assembly, retrieval of floating objects and a host of other space-related tasks.

  6. Model-based reconfiguration: Diagnosis and recovery

    NASA Technical Reports Server (NTRS)

    Crow, Judy; Rushby, John

    1994-01-01

    We extend Reiter's general theory of model-based diagnosis to a theory of fault detection, identification, and reconfiguration (FDIR). The generality of Reiter's theory readily supports an extension in which the problem of reconfiguration is viewed as a close analog of the problem of diagnosis. Using a reconfiguration predicate 'rcfg' analogous to the abnormality predicate 'ab,' we derive a strategy for reconfiguration by transforming the corresponding strategy for diagnosis. There are two obvious benefits of this approach: algorithms for diagnosis can be exploited as algorithms for reconfiguration and we have a theoretical framework for an integrated approach to FDIR. As a first step toward realizing these benefits we show that a class of diagnosis engines can be used for reconfiguration and we discuss algorithms for integrated FDIR. We argue that integrating recovery and diagnosis is an essential next step if this technology is to be useful for practical applications.

  7. Fast Algorithms for Model-Based Diagnosis

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Barrett, Anthony; Vatan, Farrokh; Mackey, Ryan

    2005-01-01

    Two improved new methods for automated diagnosis of complex engineering systems involve the use of novel algorithms that are more efficient than prior algorithms used for the same purpose. Both the recently developed algorithms and the prior algorithms in question are instances of model-based diagnosis, which is based on exploring the logical inconsistency between an observation and a description of a system to be diagnosed. As engineering systems grow more complex and increasingly autonomous in their functions, the need for automated diagnosis increases concomitantly. In model-based diagnosis, the function of each component and the interconnections among all the components of the system to be diagnosed (for example, see figure) are represented as a logical system, called the system description (SD). Hence, the expected behavior of the system is the set of logical consequences of the SD. Faulty components lead to inconsistency between the observed behaviors of the system and the SD. The task of finding the faulty components (diagnosis) reduces to finding the components, the abnormalities of which could explain all the inconsistencies. Of course, the meaningful solution should be a minimal set of faulty components (called a minimal diagnosis), because the trivial solution, in which all components are assumed to be faulty, always explains all inconsistencies. Although the prior algorithms in question implement powerful methods of diagnosis, they are not practical because they essentially require exhaustive searches among all possible combinations of faulty components and therefore entail the amounts of computation that grow exponentially with the number of components of the system.
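
    The combinatorial core being avoided is easy to state: a diagnosis is a hitting set of the conflicts, and a minimal diagnosis is a minimal hitting set. The brute-force sketch below illustrates the concept on a hypothetical four-component example; it deliberately performs the exhaustive search that the improved algorithms are designed to avoid.

```python
# Minimal diagnoses as minimal hitting sets of conflict sets (brute-force illustration).
from itertools import combinations

def minimal_diagnoses(components, conflicts):
    """All minimal component sets that intersect ('hit') every conflict set."""
    diagnoses = []
    for size in range(len(components) + 1):
        for candidate in combinations(components, size):
            cand = set(candidate)
            if any(d <= cand for d in diagnoses):
                continue                      # a smaller diagnosis already covers this set
            if all(cand & conflict for conflict in conflicts):
                diagnoses.append(cand)
    return diagnoses

# Hypothetical conflicts among components A..D
print(minimal_diagnoses(list("ABCD"), [{"A", "B"}, {"B", "C"}, {"A", "C", "D"}]))
```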

  8. Therapeutic Recreation Practicum Manual.

    ERIC Educational Resources Information Center

    Schneegas, Kay

    This manual provides information on the practicum program offered by Moraine Valley Community College (MVCC) for students in its therapeutic recreation program. Sections I and II outline the rationale and goals for providing practical, on-the-job work experiences for therapeutic recreation students. Section III specifies MVCC's responsibilities…

  9. Cannabis: its therapeutic use.

    PubMed

    Wall, J; Davis, S; Ridgway, S

    This article provides an overview of the issues surrounding the use of cannabis for therapeutic purposes. Examples of some of the ethical issues related to professional practice are discussed. The authors do not advocate legalising cannabis for all, but the therapeutic advantages and disadvantages of using cannabis are highlighted.

  10. Cytokines and therapeutic oligonucleotides.

    PubMed

    Hartmann, G; Bidlingmaier, M; Eigler, A; Hacker, U; Endres, S

    1997-12-01

    Therapeutic oligonucleotides - short strands of synthetic nucleic acids - encompass antisense and aptamer oligonucleotides. Antisense oligonucleotides are designed to bind to target RNA by complementary base pairing and to inhibit translation of the target protein. Antisense oligonucleotides enable specific inhibition of cytokine synthesis. In contrast, aptamer oligonucleotides are able to bind directly to specific proteins. This binding depends on the sequence of the oligonucleotide. Aptamer oligonucleotides with CpG motifs can exert strong immunostimulatory effects. Both kinds of therapeutic oligonucleotides - antisense and aptamer oligonucleotides - provide promising tools to modulate immunological functions. Recently, therapeutic oligonucleotides have moved towards clinical application. An antisense oligonucleotide directed against the proinflammatory intercellular adhesion molecule 1 (ICAM-1) is currently being tested in clinical trials for therapy of inflammatory disease. Immunostimulatory aptamer oligonucleotides are in preclinical development for immunotherapy. In the present review we summarize the application of therapeutic oligonucleotides to modulate immunological functions. We include technological aspects as well as current therapeutic concepts and clinical studies.

  11. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.

  12. [Diagnostic-therapeutic approach for retroperitoneal tumors].

    PubMed

    Cariati, A

    1993-12-01

    After a careful review of the Literature, diagnostic and therapeutic strategies for Primary Retroperitoneal Tumours (PRT) are reported. The Author analyzes the experience of the Institute of Clinica Chirurgica "R" (Chief: Prof. E. Tosatti) as well as that of Anatomia Chirurgica (Chief: Prof. E. Cariati),--University of Genoa--in the management of PRT, stressing the importance of preoperative staging for a correct surgical approach.

  13. Development of explicit diffraction corrections for absolute measurements of acoustic nonlinearity parameters in the quasilinear regime.

    PubMed

    Jeong, Hyunjo; Zhang, Shuzeng; Cho, Sungjong; Li, Xiongbing

    2016-08-01

    In absolute measurements of acoustic nonlinearity parameters, amplitudes of harmonics must be corrected for diffraction effects. In this study, we develop explicit multi-Gaussian beam (MGB) model-based diffraction corrections for the first three harmonics in weakly nonlinear, axisymmetric sound beams. The effects of making diffraction corrections on nonlinearity parameter estimation are investigated by defining "total diffraction correction (TDC)". The results demonstrate that TDC cannot be neglected even for harmonic generation experiments in the nearfield region. PMID:27186964
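
    For orientation only, the plane-wave relation commonly used for absolute nonlinearity-parameter estimation in the quasilinear regime is shown below; in practice the measured amplitudes must first be diffraction-corrected, which is what the MGB-based corrections of this work provide. The notation here is assumed, not taken from the paper.

```latex
% A_1, A_2: diffraction-corrected fundamental and second-harmonic amplitudes,
% k: wavenumber, x: propagation distance.
\beta = \frac{8\,A_2}{k^{2}\,x\,A_1^{2}}
```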

  14. Reporting therapeutic discourse in a therapeutic community.

    PubMed

    Chapman, G E

    1988-03-01

    Research in nurses' communications has concentrated on nurse to patient interactions. Those few studies which focus on nurse to nurse communications seem to be generated by a pragmatic and normative concern with effective information sharing. In this paper, which describes one aspect of a larger case study of a hospital-based therapeutic community, the description and analysis of nurses' reports flows not from a normative model of professional practice, but rather an exploration of how professional practice is articulated as discourse in nurses' written accounts. Foucault's ideas about therapeutic discourse inform the theoretical framework of the research. Ethnomethodological concerns with the importance of documentary analysis provide the methodological rationale for examining nurses' 24-hour report documents, as official discourse, reflecting therapeutic practice in this setting. A content analysis of nurses' reports, collected over a period of 4 months, demonstrated the importance of domesticity and ordinary everyday activities in nurses' accounts of hospital life. Disruption to the 'life as usual' domesticity in the community seemed to be associated with admission to and discharge from the hospital when interpersonal and interactional changes between patients occur. It is suggested that nurses in general hospital wards and more orthodox psychiatric settings might usefully consider the impact of admissions and discharges on the group of patients they manage, and make this a discursive focus of their work. PMID:3372900

  15. Biomimetic Particles as Therapeutics

    PubMed Central

    Green, Jordan J.

    2015-01-01

    In recent years, there have been major advances in the development of novel nanoparticle and microparticle-based therapeutics. An emerging paradigm is the incorporation of biomimetic features into these synthetic therapeutic constructs to enable them to better interface with biological systems. Through the control of size, shape, and material consistency, particle cores have been generated that better mimic natural cells and viruses. In addition, there have been significant advances in biomimetic surface functionalization of particles through the integration of bio-inspired artificial cell membranes and naturally derived cell membranes. Biomimetic technologies enable therapeutic particles to have increased potency to benefit human health. PMID:26277289

  16. Model-based estimation of knee stiffness.

    PubMed

    Pfeifer, Serge; Vallery, Heike; Hardegger, Michael; Riener, Robert; Perreault, Eric J

    2012-09-01

    During natural locomotion, the stiffness of the human knee is modulated continuously and subconsciously according to the demands of activity and terrain. Given modern actuator technology, powered transfemoral prostheses could theoretically provide a similar degree of sophistication and function. However, experimentally quantifying knee stiffness modulation during natural gait is challenging. Alternatively, joint stiffness could be estimated in a less disruptive manner using electromyography (EMG) combined with kinetic and kinematic measurements to estimate muscle force, together with models that relate muscle force to stiffness. Here we present the first step in that process, where we develop such an approach and evaluate it in isometric conditions, where experimental measurements are more feasible. Our EMG-guided modeling approach allows us to consider conditions with antagonistic muscle activation, a phenomenon commonly observed in physiological gait. Our validation shows that model-based estimates of knee joint stiffness coincide well with experimental data obtained using conventional perturbation techniques. We conclude that knee stiffness can be accurately estimated in isometric conditions without applying perturbations, which presents an important step toward our ultimate goal of quantifying knee stiffness during gait.

  17. Model-Based Estimation of Knee Stiffness

    PubMed Central

    Pfeifer, Serge; Vallery, Heike; Hardegger, Michael; Riener, Robert; Perreault, Eric J.

    2013-01-01

    During natural locomotion, the stiffness of the human knee is modulated continuously and subconsciously according to the demands of activity and terrain. Given modern actuator technology, powered transfemoral prostheses could theoretically provide a similar degree of sophistication and function. However, experimentally quantifying knee stiffness modulation during natural gait is challenging. Alternatively, joint stiffness could be estimated in a less disruptive manner using electromyography (EMG) combined with kinetic and kinematic measurements to estimate muscle force, together with models that relate muscle force to stiffness. Here we present the first step in that process, where we develop such an approach and evaluate it in isometric conditions, where experimental measurements are more feasible. Our EMG-guided modeling approach allows us to consider conditions with antagonistic muscle activation, a phenomenon commonly observed in physiological gait. Our validation shows that model-based estimates of knee joint stiffness coincide well with experimental data obtained using conventional perturbation techniques. We conclude that knee stiffness can be accurately estimated in isometric conditions without applying perturbations, which presents an important step towards our ultimate goal of quantifying knee stiffness during gait. PMID:22801482

  18. 3-D model-based vehicle tracking.

    PubMed

    Lou, Jianguang; Tan, Tieniu; Hu, Weiming; Yang, Hao; Maybank, Steven J

    2005-10-01

    This paper aims at tracking vehicles from monocular intensity image sequences and presents an efficient and robust approach to three-dimensional (3-D) model-based vehicle tracking. Under the weak perspective assumption and the ground-plane constraint, the movements of model projection in the two-dimensional image plane can be decomposed into two motions: translation and rotation. They are the results of the corresponding movements of 3-D translation on the ground plane (GP) and rotation around the normal of the GP, which can be determined separately. A new metric based on point-to-line segment distance is proposed to evaluate the similarity between an image region and an instantiation of a 3-D vehicle model under a given pose. Based on this, we provide an efficient pose refinement method to refine the vehicle's pose parameters. An improved EKF is also proposed to track and to predict vehicle motion with a precise kinematics model. Experimental results with both indoor and outdoor data show that the algorithm obtains desirable performance even under severe occlusion and clutter.
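    The point-to-line-segment distance underlying the proposed similarity metric is straightforward to compute; a minimal sketch follows (the aggregation of these distances into the paper's full evaluation function is omitted).

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Euclidean distance from 2-D image point p to the segment with endpoints a, b."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    denom = float(np.dot(ab, ab))
    if denom == 0.0:                                   # degenerate segment
        return float(np.linalg.norm(p - a))
    t = np.clip(np.dot(p - a, ab) / denom, 0.0, 1.0)   # clamp projection onto the segment
    return float(np.linalg.norm(p - (a + t * ab)))

# Example: how far an extracted edge point lies from a projected model edge.
print(point_to_segment_distance((3, 4), (0, 0), (10, 0)))   # -> 4.0
```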

  19. Model based systems engineering for astronomical projects

    NASA Astrophysics Data System (ADS)

    Karban, R.; Andolfato, L.; Bristow, P.; Chiozzi, G.; Esselborn, M.; Schilling, M.; Schmid, C.; Sommer, H.; Zamparelli, M.

    2014-08-01

    Model Based Systems Engineering (MBSE) is an emerging field of systems engineering for which the System Modeling Language (SysML) is a key enabler for descriptive, prescriptive and predictive models. This paper surveys some of the capabilities, expectations and peculiarities of tool-assisted MBSE experienced in real-life astronomical projects. The examples range in depth and scope across a wide spectrum of applications (for example documentation, requirements, analysis, trade studies) and purposes (addressing a particular development need, or accompanying a project throughout many - if not all - of its lifecycle phases, fostering reuse and minimizing ambiguity). From the beginnings of the Active Phasing Experiment, through VLT instrumentation, VLTI infrastructure, and the Telescope Control System for the E-ELT, to Wavefront Control for the E-ELT, we show how stepwise refinements of tools, processes and methods have provided tangible benefits to customary systems engineering activities like requirement flow-down, design trade studies, interface definition, and validation, by means of a variety of approaches (like Model Checking, Simulation, Model Transformation) and methodologies (like OOSEM, State Analysis).

  20. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Consequently, these methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
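    A toy sketch of the ARR idea, purely for illustration (the relations, threshold and inference below are invented for the example, not taken from the NASA method): each relation should be near zero when the sensors it involves are healthy, so a violated relation implicates only those sensors.

```python
THRESHOLD = 0.05  # residual tolerance (placeholder value)

def check_arrs(readings):
    """readings: dict of sensor name -> value; returns the suspect sets of sensors."""
    arrs = [
        # ARR1: two redundant temperature sensors should agree.
        (("T1", "T2"), lambda r: r["T1"] - r["T2"]),
        # ARR2: inflow minus outflow should match the measured accumulation rate.
        (("F_in", "F_out", "dV_dt"), lambda r: r["F_in"] - r["F_out"] - r["dV_dt"]),
    ]
    return [set(sensors) for sensors, residual in arrs
            if abs(residual(readings)) > THRESHOLD]

readings = {"T1": 20.00, "T2": 20.02, "F_in": 1.0, "F_out": 0.7, "dV_dt": 0.1}
print(check_arrs(readings))   # ARR2 violated -> fault among {F_in, F_out, dV_dt}
```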

  1. 3-D model-based vehicle tracking.

    PubMed

    Lou, Jianguang; Tan, Tieniu; Hu, Weiming; Yang, Hao; Maybank, Steven J

    2005-10-01

    This paper aims at tracking vehicles from monocular intensity image sequences and presents an efficient and robust approach to three-dimensional (3-D) model-based vehicle tracking. Under the weak perspective assumption and the ground-plane constraint, the movements of model projection in the two-dimensional image plane can be decomposed into two motions: translation and rotation. They are the results of the corresponding movements of 3-D translation on the ground plane (GP) and rotation around the normal of the GP, which can be determined separately. A new metric based on point-to-line segment distance is proposed to evaluate the similarity between an image region and an instantiation of a 3-D vehicle model under a given pose. Based on this, we provide an efficient pose refinement method to refine the vehicle's pose parameters. An improved EKF is also proposed to track and to predict vehicle motion with a precise kinematics model. Experimental results with both indoor and outdoor data show that the algorithm obtains desirable performance even under severe occlusion and clutter. PMID:16238061

  2. [Fast spectral modeling based on Voigt peaks].

    PubMed

    Li, Jin-rong; Dai, Lian-kui

    2012-03-01

    Indirect hard modeling (IHM) is a recently introduced method for quantitative spectral analysis, which has been applied to the analysis of nonlinear relations between mixture spectra and component concentrations. In addition, IHM is an effective technique for analyzing components of mixtures with molecular interactions and strongly overlapping bands. Before the regression model is established, IHM needs to model the measured spectrum as a sum of Voigt peaks. The precision of the spectral model has an immediate impact on the accuracy of the regression model. A spectrum often includes dozens or even hundreds of Voigt peaks, which means that spectral modeling is in fact a high-dimensional optimization problem. Consequently, a large computational overhead is required, and the solution may not be numerically unique because the optimization problem is ill-conditioned. An improved spectral modeling method is presented in this paper, which reduces the dimensionality of the optimization problem by determining the overlapped peaks in the spectrum. Experimental results show that spectral modeling based on the new method is more accurate and needs a much shorter running time than the conventional method. PMID:22582612

  3. Engineering antibody therapeutics.

    PubMed

    Chiu, Mark L; Gilliland, Gary L

    2016-06-01

    The successful introduction of antibody-based protein therapeutics into the arsenal of treatments for patients has within a few decades fostered intense innovation in the production and engineering of antibodies. Reviewed here are the methods currently used to produce antibodies, along with how our knowledge of the structural and functional characterization of immunoglobulins has resulted in the engineering of antibodies to produce protein therapeutics with unique properties, both biological and biophysical, that are leading to novel therapeutic approaches. Antibody engineering includes the introduction of the antibody combining site (variable regions) into a host of architectures, including bi- and multi-specific formats, that further shape the therapeutic properties, leading to additional advantages and successes in patient treatment. PMID:27525816

  4. Model Based Autonomy for Robust Mars Operations

    NASA Technical Reports Server (NTRS)

    Kurien, James A.; Nayak, P. Pandurang; Williams, Brian C.; Lau, Sonie (Technical Monitor)

    1998-01-01

    Space missions have historically relied upon a large ground staff, numbering in the hundreds for complex missions, to maintain routine operations. When an anomaly occurs, this small army of engineers attempts to identify and work around the problem. A piloted Mars mission, with its multiyear duration, cost pressures, half-hour communication delays and two-week blackouts, cannot be closely controlled by a battalion of engineers on Earth. Flight crew involvement in routine system operations must also be minimized to maximize science return. It may also be unrealistic to require that the crew have the expertise in each mission subsystem needed to diagnose a system failure and effect a timely repair, as engineers did for Apollo 13. Enter model-based autonomy, which allows complex systems to autonomously maintain operation despite failures or anomalous conditions, contributing to safe, robust, and minimally supervised operation of spacecraft, life support, In Situ Resource Utilization (ISRU) and power systems. Autonomous reasoning is central to the approach. A reasoning algorithm uses a logical or mathematical model of a system to infer how to operate the system, diagnose failures and generate appropriate behavior to repair or reconfigure the system in response. The 'plug and play' nature of the models enables low-cost development of autonomy for multiple platforms. Declarative, reusable models capture relevant aspects of the behavior of simple devices (e.g. valves or thrusters). Reasoning algorithms combine device models to create a model of the system-wide interactions and behavior of a complex, unique artifact such as a spacecraft. Rather than requiring engineers to anticipate all possible interactions and failures at design time or to perform analysis during the mission, the reasoning engine generates the appropriate response to the current situation, taking into account its system-wide knowledge, the current state, and even sensor failures or unexpected behavior.

  5. Global orbit corrections

    SciTech Connect

    Symon, K.

    1987-11-01

    There are various reasons for preferring local (e.g., three-bump) orbit correction methods to global corrections. One is the difficulty of solving the mN equations for the required mN correcting bumps, where N is the number of superperiods and m is the number of bumps per superperiod. The latter is not a valid reason for avoiding global corrections, since we can take advantage of the superperiod symmetry to reduce the mN simultaneous equations to N separate problems, each involving only m simultaneous equations. Previously, I have shown how to solve the general problem when the machine contains unknown magnet errors of known probability distribution; we make measurements of known precision of the orbit displacements at a set of points, and we wish to apply correcting bumps to minimize the weighted rms orbit deviations. In this report, we consider two simpler problems, using similar methods. We consider the case when we make M beam position measurements per superperiod and wish to apply an equal number M of orbit correcting bumps to reduce the measured position errors to zero. We also consider the problem when the number of correcting bumps is less than the number of measurements, and we wish to minimize the weighted rms position errors. We will see that the latter problem involves solving equations of a different form, but involving the same matrices as the former problem.
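    For the case of fewer bumps than measurements, the weighted least-squares formulation can be written down directly; the sketch below (toy response matrix and weights, not the report's superperiod-symmetry reduction) picks corrector strengths that minimize the weighted rms residual orbit.

```python
import numpy as np

def correct_orbit(response, orbit, weights):
    """Corrector settings minimizing the weighted rms of (orbit + response @ theta)."""
    w = np.sqrt(np.asarray(weights, dtype=float))
    theta, *_ = np.linalg.lstsq(w[:, None] * response, w * orbit, rcond=None)
    return -theta                               # apply the negative to cancel the orbit

rng = np.random.default_rng(0)
R = rng.normal(size=(12, 4))                    # 12 monitors, 4 correctors (toy numbers)
d = R @ np.array([0.3, -0.1, 0.2, 0.05]) + 0.01 * rng.normal(size=12)
theta = correct_orbit(R, d, np.ones(12))
print("rms before:", d.std(), " after:", (d + R @ theta).std())
```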

  6. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  7. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  8. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  9. Contrast image correction method

    NASA Astrophysics Data System (ADS)

    Schettini, Raimondo; Gasparini, Francesca; Corchs, Silvia; Marini, Fabrizio; Capra, Alessandro; Castorina, Alfio

    2010-04-01

    A method for contrast enhancement is proposed. The algorithm is based on a local and image-dependent exponential correction. The technique aims to correct images that simultaneously present overexposed and underexposed regions. To prevent halo artifacts, the bilateral filter is used as the mask of the exponential correction. Depending on the characteristics of the image (piloted by histogram analysis), an automated parameter-tuning step is introduced, followed by stretching, clipping, and saturation preserving treatments. Comparisons with other contrast enhancement techniques are presented. The Mean Opinion Score (MOS) experiment on grayscale images gives the greatest preference score for our algorithm.
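    A minimal sketch of a local exponential correction of this kind, with the exponent driven by a bilateral-filtered luminance mask so that halos around strong edges are suppressed; the mapping from mask value to exponent and all parameter values are illustrative, not those tuned in the paper.

```python
import cv2
import numpy as np

def local_exponential_correction(gray_u8, sigma_color=40, sigma_space=15):
    """Gamma-like correction whose exponent varies with an edge-preserving mask."""
    img = gray_u8.astype(np.float32) / 255.0
    mask = cv2.bilateralFilter(gray_u8, 9, sigma_color, sigma_space).astype(np.float32) / 255.0
    # Dark surroundings -> exponent < 1 (brighten); bright surroundings -> exponent > 1.
    gamma = np.exp2(2.0 * mask - 1.0)            # roughly in [0.5, 2.0]
    return np.clip(np.power(img, gamma) * 255.0, 0, 255).astype(np.uint8)

# corrected = local_exponential_correction(cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE))
```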

  10. MACE: model based analysis of ChIP-exo.

    PubMed

    Wang, Liguo; Chen, Junsheng; Wang, Chen; Uusküla-Reimand, Liis; Chen, Kaifu; Medina-Rivera, Alejandra; Young, Edwin J; Zimmermann, Michael T; Yan, Huihuang; Sun, Zhifu; Zhang, Yuji; Wu, Stephen T; Huang, Haojie; Wilson, Michael D; Kocher, Jean-Pierre A; Li, Wei

    2014-11-10

    Understanding the role of a given transcription factor (TF) in regulating gene expression requires precise mapping of its binding sites in the genome. Chromatin immunoprecipitation-exo, an emerging technique using λ exonuclease to digest TF unbound DNA after ChIP, is designed to reveal transcription factor binding site (TFBS) boundaries with near-single nucleotide resolution. Although ChIP-exo promises deeper insights into transcription regulation, no dedicated bioinformatics tool exists to leverage its advantages. Most ChIP-seq and ChIP-chip analytic methods are not tailored for ChIP-exo, and thus cannot take full advantage of high-resolution ChIP-exo data. Here we describe a novel analysis framework, termed MACE (model-based analysis of ChIP-exo) dedicated to ChIP-exo data analysis. The MACE workflow consists of four steps: (i) sequencing data normalization and bias correction; (ii) signal consolidation and noise reduction; (iii) single-nucleotide resolution border peak detection using the Chebyshev Inequality and (iv) border matching using the Gale-Shapley stable matching algorithm. When applied to published human CTCF, yeast Reb1 and our own mouse ONECUT1/HNF6 ChIP-exo data, MACE is able to define TFBSs with high sensitivity, specificity and spatial resolution, as evidenced by multiple criteria including motif enrichment, sequence conservation, direct sequence pileup, nucleosome positioning and open chromatin states. In addition, we show that the fundamental advance of MACE is the identification of two boundaries of a TFBS with high resolution, whereas other methods only report a single location of the same event. The two boundaries help elucidate the in vivo binding structure of a given TF, e.g. whether the TF may bind as dimers or in a complex with other co-factors.
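    As a sketch of the distribution-free thresholding idea behind step (iii), Chebyshev's inequality bounds the probability of a value lying k standard deviations above the mean by 1/k^2, whatever the underlying distribution; the toy detector below flags such positions (it is not the MACE border-detection code).

```python
import numpy as np

def chebyshev_peaks(signal, p_value=0.01):
    """Indices whose value is improbably high under Chebyshev's bound 1/k**2."""
    x = np.asarray(signal, dtype=float)
    k = 1.0 / np.sqrt(p_value)                   # p = 1/k^2  ->  k = 1/sqrt(p)
    return np.flatnonzero(x > x.mean() + k * x.std())

rng = np.random.default_rng(1)
cov = np.concatenate([rng.poisson(3, 500), [60, 65, 58], rng.poisson(3, 500)]).astype(float)
print(chebyshev_peaks(cov))                      # -> indices around 500-502
```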

  11. Stellar population models based on new generation stellar library

    NASA Astrophysics Data System (ADS)

    Koleva, M.; Vazdekis, A.

    The spectral predictions of stellar population models are not as accurate in the ultraviolet (UV) as in the optical wavelength domain. One of the reasons is the lack of high-quality stellar libraries. The New Generation Stellar Library (NGSL), recently released, represents a significant step towards improving this situation. To prepare NGSL for population synthesis, we determined the atmospheric parameters of its stars, assessed the precision of the wavelength calibration and characterised its intrinsic resolution. We also measured the Galactic extinction for each of the NGSL stars. For our analyses we used ULySS, a full-spectrum fitting package, fitting the NGSL spectra against the MILES interpolator. As a second step we built preliminary single stellar population models using the Vazdekis (2003) synthesis code. We find that the wavelength calibration is precise to within 0.1 px, after correcting a systematic effect in the optical range. The spectral resolution varies from 3 Å in the UV to 10 Å in the near-infrared (NIR), corresponding to a roughly constant reciprocal resolution R = λ/Δλ ≈ 1000 and an instrumental velocity dispersion σ_ins ≈ 130 km/s. We derived the atmospheric parameters homogeneously. The precision for the FGK stars is 42 K, 0.24 dex and 0.09 dex for Teff, log g and [Fe/H], respectively. The corresponding mean errors are 150 K, 0.50 dex and 0.48 dex for the M stars, and for the OBA stars they are 4.5 percent, 0.44 dex and 0.18 dex. The comparison with the literature shows that our results are not biased. Our first version of the models compares well with models based on optical libraries, with the advantage of being free from artifacts due to the atmosphere. In the future we will fine-tune our models by comparing them to different models and to observations of globular clusters.

  12. Correcting Illumina data.

    PubMed

    Molnar, Michael; Ilie, Lucian

    2015-07-01

    Next-generation sequencing technologies revolutionized the ways in which genetic information is obtained and have opened the door for many essential applications in biomedical sciences. Hundreds of gigabytes of data are being produced, and all applications are affected by the errors in the data. Many programs have been designed to correct these errors, most of them targeting the data produced by the dominant technology of Illumina. We present a thorough comparison of these programs. Both HiSeq and MiSeq types of Illumina data are analyzed, and correcting performance is evaluated as the gain in depth and breadth of coverage, as given by correct reads and k-mers. Time and memory requirements, scalability and parallelism are considered as well. Practical guidelines are provided for the effective use of these tools. We also evaluate the efficiency of the current state-of-the-art programs for correcting Illumina data and provide research directions for further improvement.
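    In the spirit of the k-mer-based evaluation described above (a simplification, not the review's exact gain metric), one can measure the fraction of read k-mers that also occur in the reference; correction should raise it, since erroneous k-mers are largely absent from the reference.

```python
def kmers(seq, k):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def correct_kmer_fraction(reads, reference, k=21):
    """Fraction of distinct read k-mers that are present in the reference."""
    ref = kmers(reference, k)
    read_km = set().union(*(kmers(r, k) for r in reads)) if reads else set()
    return len(read_km & ref) / len(read_km) if read_km else 0.0

reference = "ACGTACGTTGCAACGTTAGCCGTAACGTTGCAACGGTTAACCGGT" * 4
raw       = [reference[5:40], reference[10:45].replace("G", "T", 1)]  # second read has one error
fixed     = [reference[5:40], reference[10:45]]
print(correct_kmer_fraction(raw, reference), correct_kmer_fraction(fixed, reference))
```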

  13. 75 FR 68405 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    Correction to the Presidential Determination of 2010--Continuation of U.S. Drug Interdiction Assistance to the Government of Colombia. (Presidential Sig.) [FR Doc. C1-2010-27668 Filed 11-5-10; 8:45 am] Billing Code 1505-01-D

  14. 78 FR 73377 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-06

    Correction--Continuation of U.S. Drug Interdiction Assistance to the Government of Colombia. (Presidential Sig.) [FR Doc. C1-2013-...]

  15. Correcting Hubble Vision.

    ERIC Educational Resources Information Center

    Shaw, John M.; Sheahen, Thomas P.

    1994-01-01

    Describes the theory behind the workings of the Hubble Space Telescope, the spherical aberration in the primary mirror that caused a reduction in image quality, and the corrective device that compensated for the error. (JRH)

  16. Pluristem Therapeutics, Inc.

    PubMed

    Prather, William

    2008-01-01

    Pluristem Therapeutics, Inc., based in Haifa, Israel, is a regenerative biotherapeutics company dedicated to the commercialization of non-personalized (allogeneic) cell therapy products. The company is expanding non-controversial placenta-derived mesenchymal stem cells, via a proprietary 3D process named PluriX, into therapeutics for a variety of degenerative, malignant and autoimmune disorders. Pluristem will be conducting Phase I trials in the USA with its first product, PLX-I, which addresses the global shortfall of matched tissue for bone marrow transplantation by improving the engraftment of hematopoietic stem cells contained in umbilical cord blood. PMID:18154467

  17. DELIVERY OF THERAPEUTIC PROTEINS

    PubMed Central

    Pisal, Dipak S.; Kosloski, Matthew P.; Balu-Iyer, Sathy V.

    2009-01-01

    The safety and efficacy of protein therapeutics are limited by three interrelated pharmaceutical issues: in vitro and in vivo instability, immunogenicity, and short half-lives. Novel drug modifications for overcoming these issues are under investigation and include covalent attachment of poly(ethylene glycol) (PEG), polysialic acid, or glycolic acid, as well as development of new formulations containing nanoparticulate or colloidal systems (e.g. liposomes, polymeric microspheres, polymeric nanoparticles). Such strategies have the potential to yield the next generation of protein therapeutics. This review includes a general discussion of these delivery approaches. PMID:20049941

  18. Advances in Therapeutic Cholangioscopy

    PubMed Central

    Moura, Renata Nobre; de Moura, Eduardo Guimarães Hourneaux

    2016-01-01

    Nowadays, cholangioscopy is an established modality in the diagnosis and treatment of pancreaticobiliary diseases. More widespread use and the recent development of new technologies and accessories have renewed interest in endoscopic visualization of the biliary tract, increasing the range of indications and therapeutic procedures, such as the diagnosis of indeterminate biliary strictures, lithotripsy of difficult bile duct stones, ablative techniques for intraductal malignancies, removal of foreign bodies and gallbladder drainage. These endoscopic interventions will probably be the last frontier in the near future. This paper presents the new advances in therapeutic cholangioscopy, focusing on current clinical applications and on research areas. PMID:27403156

  19. Adaptable DC offset correction

    NASA Technical Reports Server (NTRS)

    Golusky, John M. (Inventor); Muldoon, Kelly P. (Inventor)

    2009-01-01

    Methods and systems for adaptable DC offset correction are provided. An exemplary adaptable DC offset correction system evaluates an incoming baseband signal to determine an appropriate DC offset removal scheme; removes a DC offset from the incoming baseband signal based on the appropriate DC offset scheme in response to the evaluated incoming baseband signal; and outputs a reduced DC baseband signal in response to the DC offset removed from the incoming baseband signal.
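    One simple removal scheme such a system might select is subtraction of a slowly tracked (exponentially weighted) mean; the sketch below shows that scheme only, with a placeholder smoothing constant, and omits the evaluation logic that chooses among schemes.

```python
import numpy as np

def remove_dc(samples, alpha=0.01):
    """Subtract a first-order IIR estimate of the DC offset from a baseband signal."""
    out = np.empty(len(samples), dtype=float)
    dc = 0.0
    for i, x in enumerate(samples):
        dc += alpha * (x - dc)        # slowly tracking offset estimate
        out[i] = x - dc
    return out

t = np.arange(2000)
baseband = np.sin(2 * np.pi * 0.01 * t) + 0.4     # tone with a +0.4 DC offset
print(abs(remove_dc(baseband)[500:].mean()))       # residual offset is small
```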

  20. Quantum Error Correction

    NASA Astrophysics Data System (ADS)

    Lidar, Daniel A.; Brun, Todd A.

    2013-09-01

    Prologue; Preface; Part I. Background: 1. Introduction to decoherence and noise in open quantum systems Daniel Lidar and Todd Brun; 2. Introduction to quantum error correction Dave Bacon; 3. Introduction to decoherence-free subspaces and noiseless subsystems Daniel Lidar; 4. Introduction to quantum dynamical decoupling Lorenza Viola; 5. Introduction to quantum fault tolerance Panos Aliferis; Part II. Generalized Approaches to Quantum Error Correction: 6. Operator quantum error correction David Kribs and David Poulin; 7. Entanglement-assisted quantum error-correcting codes Todd Brun and Min-Hsiu Hsieh; 8. Continuous-time quantum error correction Ognyan Oreshkov; Part III. Advanced Quantum Codes: 9. Quantum convolutional codes Mark Wilde; 10. Non-additive quantum codes Markus Grassl and Martin Rötteler; 11. Iterative quantum coding systems David Poulin; 12. Algebraic quantum coding theory Andreas Klappenecker; 13. Optimization-based quantum error correction Andrew Fletcher; Part IV. Advanced Dynamical Decoupling: 14. High order dynamical decoupling Zhen-Yu Wang and Ren-Bao Liu; 15. Combinatorial approaches to dynamical decoupling Martin Rötteler and Pawel Wocjan; Part V. Alternative Quantum Computation Approaches: 16. Holonomic quantum computation Paolo Zanardi; 17. Fault tolerance for holonomic quantum computation Ognyan Oreshkov, Todd Brun and Daniel Lidar; 18. Fault tolerant measurement-based quantum computing Debbie Leung; Part VI. Topological Methods: 19. Topological codes Héctor Bombín; 20. Fault tolerant topological cluster state quantum computing Austin Fowler and Kovid Goyal; Part VII. Applications and Implementations: 21. Experimental quantum error correction Dave Bacon; 22. Experimental dynamical decoupling Lorenza Viola; 23. Architectures Jacob Taylor; 24. Error correction in quantum communication Mark Wilde; Part VIII. Critical Evaluation of Fault Tolerance: 25. Hamiltonian methods in QEC and fault tolerance Eduardo Novais, Eduardo Mucciolo and

  1. [Correction of intracranial pressure in patients with traumatic intracranial hemorrhages].

    PubMed

    Virozub, I D; Chipko, S S; Chernovskiĭ, V I; Cherniaev, V A

    1986-01-01

    Therapeutic correction of intracranial pressure was carried out in 14 patients suffering from traumatic intracranial hematomas by endolumbar administration of physiological solution. The distinguishing feature of this method is the possibility of continuous control of the intracranial pressure level by means of long-term graphic recording of epidural pressure. This makes it possible to perform endolumbar administration of physiological solution in a dose determined by the initial level of epidural intracranial pressure. Therapeutic correction of intracranial pressure by endolumbar injection of physiological solution proved successful in the initial stages of dislocation of the brain and in stable intracranial hypotension.

  2. Antibody Therapeutics in Oncology

    PubMed Central

    Wold, Erik D; Smider, Vaughn V; Felding, Brunhilde H

    2016-01-01

    One of the newer classes of targeted cancer therapeutics is monoclonal antibodies. Monoclonal antibody therapeutics are a successful and rapidly expanding drug class due to their high specificity, activity, favourable pharmacokinetics, and standardized manufacturing processes. Antibodies are capable of recruiting the immune system to attack cancer cells through complement-dependent cytotoxicity or antibody dependent cellular cytotoxicity. In an ideal scenario the initial tumor cell destruction induced by administration of a therapeutic antibody can result in uptake of tumor associated antigens by antigen-presenting cells, establishing a prolonged memory effect. Mechanisms of direct tumor cell killing by antibodies include antibody recognition of cell surface bound enzymes to neutralize enzyme activity and signaling, or induction of receptor agonist or antagonist activity. Both approaches result in cellular apoptosis. In another and very direct approach, antibodies are used to deliver drugs to target cells and cause cell death. Such antibody drug conjugates (ADCs) direct cytotoxic compounds to tumor cells, after selective binding to cell surface antigens, internalization, and intracellular drug release. Efficacy and safety of ADCs for cancer therapy has recently been greatly advanced based on innovative approaches for site-specific drug conjugation to the antibody structure. This technology enabled rational optimization of function and pharmacokinetics of the resulting conjugates, and is now beginning to yield therapeutics with defined, uniform molecular characteristics, and unprecedented promise to advance cancer treatment. PMID:27081677

  3. Developing Therapeutic Listening

    ERIC Educational Resources Information Center

    Lee, Billy; Prior, Seamus

    2013-01-01

    We present an experience-near account of the development of therapeutic listening in first year counselling students. A phenomenological approach was employed to articulate the trainees' lived experiences of their learning. Six students who had just completed a one-year postgraduate certificate in counselling skills were interviewed and the…

  4. Therapeutic Recombinant Monoclonal Antibodies

    ERIC Educational Resources Information Center

    Bakhtiar, Ray

    2012-01-01

    During the last two decades, the rapid growth of biotechnology-derived techniques has led to a myriad of therapeutic recombinant monoclonal antibodies with significant clinical benefits. Recombinant monoclonal antibodies can be obtained from a number of natural sources such as animal cell cultures using recombinant DNA engineering. In contrast to…

  5. Therapeutic cancer vaccines.

    PubMed

    Acres, Bruce; Paul, Stephane; Haegel-Kronenberger, Helene; Calmels, Bastien; Squiban, Patrick

    2004-02-01

    Therapeutic vaccination against cancer-associated antigens represents an attractive option for cancer therapy in view of the comparatively low toxicity and, so far, excellent safety profile of this treatment. Nevertheless, it is now recognized that the vaccination strategies used for prophylactic vaccinations against infectious diseases cannot necessarily be used for therapeutic cancer vaccination. Cancer patients are usually immunosuppressed, and most cancer-associated antigens are self antigens. Therefore, various immunostimulation techniques are under investigation in an effort to bolster immune systems and to overcome immune tolerance to self antigens. Various strategies to stimulate antigen presentation, T-cell reactivity and innate immune activity are under investigation. Similarly, strategies to produce an immunological 'danger signal' at the site of the tumor itself are under evaluation, as it is recognized that while tumor-specific T-cells can be activated at the site of vaccination, they require appropriate signals to be attracted to a tumor. The detection, evaluation and quantification of specific immune responses generated by vaccination with cancer-associated antigens is another important area of therapeutic cancer vaccine evaluation receiving much attention and novel strategies. Multiple clinical trials have been undertaken to evaluate therapeutic vaccines in patients. Aggressive protocols such as those combining specific stimulation of T-cells and chemotherapy or strategies to block immune regulation are having some success. PMID:15011780

  6. Geological Corrections in Gravimetry

    NASA Astrophysics Data System (ADS)

    Mikuška, J.; Marušiak, I.

    2015-12-01

    Applying corrections for the known geology to gravity data can be traced back to the first quarter of the 20th century. Later on, mostly in areas with sedimentary cover, at local and regional scales, the correction known as gravity stripping has been in use since the mid 1960s, provided that there was enough geological information. Stripping at regional to global scales became possible after the release of the CRUST 2.0 and CRUST 1.0 models in the years 2000 and 2013, respectively. Especially the latter model provides quite a new view on the relevant geometries and on the topographic and crustal densities, as well as on the crust/mantle density contrast. Thus, the isostatic corrections, which have often been used in the past, can now be replaced by procedures working with independent information interpreted primarily from seismic studies. We have developed software for performing geological corrections in the space domain, based on a priori geometry and density grids, which can be of either rectangular or spherical/ellipsoidal type with cells in the shape of rectangles, tesseroids or triangles. It enables us to calculate the required gravitational effects not only in the form of surface maps or profiles but, for instance, also along vertical lines, which can shed some additional light on the nature of the geological correction. The software can work at a variety of scales and considers the input information out to an optional distance from the calculation point, up to the antipodes. Our main objective is to treat the geological correction as an alternative to accounting for the topography with varying densities, since the bottoms of the topographic masses, namely the geoid or ellipsoid, generally do not represent geological boundaries. We would also like to call attention to the possible distortions of the corrected gravity anomalies. This work was supported by the Slovak Research and Development Agency under the contract APVV-0827-12.
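    As a crude illustration of how a cell-by-cell geological correction is assembled (a point-mass approximation with z positive downward, not the rectangle/tesseroid formulas the software actually uses):

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def vertical_effect_mgal(station, cells):
    """Sum point-mass contributions G*rho*V*dz/r^3 of grid cells at a station.

    station: (x, y, z) in metres, z positive downward.
    cells: iterable of (x, y, z, volume_m3, density_kg_m3).
    """
    sx, sy, sz = station
    gz = 0.0
    for x, y, z, vol, rho in cells:
        dx, dy, dz = x - sx, y - sy, z - sz
        r = np.sqrt(dx * dx + dy * dy + dz * dz)
        gz += G * rho * vol * dz / r ** 3
    return gz * 1e5                               # m/s^2 -> mGal

# A 1 km^3 block of +300 kg/m^3 anomalous density centred 2 km below the station:
print(vertical_effect_mgal((0, 0, 0), [(0, 0, 2000.0, 1e9, 300.0)]))   # ~0.5 mGal
```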

  7. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  8. Learning of Chemical Equilibrium through Modelling-Based Teaching

    ERIC Educational Resources Information Center

    Maia, Poliana Flavia; Justi, Rosaria

    2009-01-01

    This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students learning…

  9. Model-Based Software Testing for Object-Oriented Software

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  10. Models-Based Practice: Great White Hope or White Elephant?

    ERIC Educational Resources Information Center

    Casey, Ashley

    2014-01-01

    Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…

  11. Emerging Mitochondrial Therapeutic Targets in Optic Neuropathies.

    PubMed

    Lopez Sanchez, M I G; Crowston, J G; Mackey, D A; Trounce, I A

    2016-09-01

    Optic neuropathies are an important cause of blindness worldwide. The study of the most common inherited mitochondrial optic neuropathies, Leber hereditary optic neuropathy (LHON) and autosomal dominant optic atrophy (ADOA) has highlighted a fundamental role for mitochondrial function in the survival of the affected neuron-the retinal ganglion cell. A picture is now emerging that links mitochondrial dysfunction to optic nerve disease and other neurodegenerative processes. Insights gained from the peculiar susceptibility of retinal ganglion cells to mitochondrial dysfunction are likely to inform therapeutic development for glaucoma and other common neurodegenerative diseases of aging. Despite it being a fast-evolving field of research, a lack of access to human ocular tissues and limited animal models of mitochondrial disease have prevented direct retinal ganglion cell experimentation and delayed the development of efficient therapeutic strategies to prevent vision loss. Currently, there are no approved treatments for mitochondrial disease, including optic neuropathies caused by primary or secondary mitochondrial dysfunction. Recent advances in eye research have provided important insights into the molecular mechanisms that mediate pathogenesis, and new therapeutic strategies including gene correction approaches are currently being investigated. Here, we review the general principles of mitochondrial biology relevant to retinal ganglion cell function and provide an overview of the major optic neuropathies with mitochondrial involvement, LHON and ADOA, whilst highlighting the emerging link between mitochondrial dysfunction and glaucoma. The pharmacological strategies currently being trialed to improve mitochondrial dysfunction in these optic neuropathies are discussed in addition to emerging therapeutic approaches to preserve retinal ganglion cell function. PMID:27288727

  12. Aureolegraph internal scattering correction.

    PubMed

    DeVore, John; Villanucci, Dennis; LePage, Andrew

    2012-11-20

    Two methods of determining instrumental scattering for correcting aureolegraph measurements of particulate solar scattering are presented. One involves subtracting measurements made with and without an external occluding ball and the other is a modification of the Langley Plot method and involves extrapolating aureolegraph measurements collected through a large range of solar zenith angles. Examples of internal scattering correction determinations using the latter method show similar power-law dependencies on scattering, but vary by roughly a factor of 8 and suggest that changing aerosol conditions during the determinations render this method problematic. Examples of corrections of scattering profiles using the former method are presented for a range of atmospheric particulate layers from aerosols to cumulus and cirrus clouds.
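    A bare-bones sketch of the extrapolation idea in the second method: a linear fit against relative airmass whose zero-airmass intercept is taken as the instrumental term. The paper's modified Langley procedure is more involved, and the assumption of constant aerosol conditions is exactly what the authors flag as problematic.

```python
import numpy as np

def instrumental_component(zenith_deg, signal):
    """Extrapolate measurements to zero airmass; the intercept estimates internal scattering."""
    airmass = 1.0 / np.cos(np.radians(np.asarray(zenith_deg, dtype=float)))
    slope, intercept = np.polyfit(airmass, np.asarray(signal, dtype=float), 1)
    return intercept

zen = np.array([30, 40, 50, 60, 70, 75])
sig = 0.02 + 0.10 / np.cos(np.radians(zen))       # toy data with a 0.02 instrumental offset
print(instrumental_component(zen, sig))           # ~0.02
```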

  13. Aureolegraph internal scattering correction.

    PubMed

    DeVore, John; Villanucci, Dennis; LePage, Andrew

    2012-11-20

    Two methods of determining instrumental scattering for correcting aureolegraph measurements of particulate solar scattering are presented. One involves subtracting measurements made with and without an external occluding ball and the other is a modification of the Langley Plot method and involves extrapolating aureolegraph measurements collected through a large range of solar zenith angles. Examples of internal scattering correction determinations using the latter method show similar power-law dependencies on scattering, but vary by roughly a factor of 8 and suggest that changing aerosol conditions during the determinations render this method problematic. Examples of corrections of scattering profiles using the former method are presented for a range of atmospheric particulate layers from aerosols to cumulus and cirrus clouds. PMID:23207299

  14. Correction coil cable

    DOEpatents

    Wang, S.T.

    1994-11-01

    A wire cable assembly adapted for the winding of electrical coils is taught. A primary intended use is in particle tube assemblies for the Superconducting Super Collider. The correction coil cables have wires collected in a wire array with a center rib sandwiched therebetween to form a core assembly. The core assembly is surrounded by an assembly housing having an inner spiral wrap and a counter-wound outer spiral wrap. An alternate embodiment of the invention is rolled into a keystoned shape to improve radial alignment of the correction coil cable on a particle tube in a particle tube assembly. 7 figs.

  15. Corrections and clarifications.

    PubMed

    1994-11-11

    The 1994 and 1995 federal science budget appropriations for two of the activities were inadvertently transposed in a table that accompanied the article "Hitting the President's target is mixed blessing for agencies" by Jeffrey Mervis (News & Comment, 14 Oct., p. 211). The correct figures for Defense Department spending on university research are $1.460 billion in 1994 and $1.279 billion in 1995; for research and development at NASA, the correct figures are $9.455 billion in 1994 and $9.824 billion in 1995.

  16. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1979-01-01

    Optical measurements of range and elevation angle are distorted by the earth's atmosphere. High precision refraction correction equations are presented which are ideally suited for surveying because their inputs are optically measured range and optically measured elevation angle. The outputs are true straight line range and true geometric elevation angle. The 'short distances' used in surveying allow the calculations of true range and true elevation angle to be quickly made using a programmable pocket calculator. Topics covered include the spherical form of Snell's Law; ray path equations; and integrating the equations. Short-, medium-, and long-range refraction corrections are presented in tables.
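    The report's high-precision equations are tabulated rather than reproduced here, but the familiar first-order combined curvature-and-refraction correction gives a feel for the magnitudes involved (a standard textbook approximation with refraction coefficient k ≈ 0.13, not the report's formulas):

```python
def curvature_and_refraction_correction_m(distance_km):
    """Approximate drop (metres) of a sight line over distance_km kilometres.

    ~0.0785*d^2 for Earth curvature minus ~0.011*d^2 for standard refraction.
    """
    return 0.0675 * distance_km ** 2

for d in (0.5, 1.0, 2.0, 5.0):
    print(f"{d:4.1f} km -> {curvature_and_refraction_correction_m(d):6.3f} m")
```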

  17. Therapeutic antibodies against cancer

    PubMed Central

    Adler, Mark J.; Dimitrov, Dimiter S.

    2012-01-01

    Antibody-based therapeutics against cancer are highly successful in clinic and currently enjoy unprecedented recognition of their potential; 13 monoclonal antibodies (mAbs) have been approved for clinical use in the European Union and in the United States (one, mylotarg, was withdrawn from market in 2010). Three of the mAbs (bevacizumab, rituximab, trastuzumab) are in the top six selling protein therapeutics with sales in 2010 of more than $5 bln each. Hundreds of mAbs including bispecific mAbs and multispecific fusion proteins, mAbs conjugated with small molecule drugs and mAbs with optimized pharmacokinetics are in clinical trials. However, challenges remain and it appears that deeper understanding of mechanisms is needed to overcome major problems including resistance to therapy, access to targets, complexity of biological systems and individual variations. PMID:22520975

  18. Therapeutic cancer vaccines.

    PubMed

    Melief, Cornelis J M; van Hall, Thorbald; Arens, Ramon; Ossendorp, Ferry; van der Burg, Sjoerd H

    2015-09-01

    The clinical benefit of therapeutic cancer vaccines has been established. Whereas regression of lesions was shown for premalignant lesions caused by HPV, clinical benefit in cancer patients was mostly noted as prolonged survival. Suboptimal vaccine design and an immunosuppressive cancer microenvironment are the root causes of the lack of cancer eradication. Effective cancer vaccines deliver concentrated antigen to both HLA class I and II molecules of DCs, promoting both CD4 and CD8 T cell responses. Optimal vaccine platforms include DNA and RNA vaccines and synthetic long peptides. Antigens of choice include mutant sequences, selected cancer testis antigens, and viral antigens. Drugs or physical treatments can mitigate the immunosuppressive cancer microenvironment and include chemotherapeutics, radiation, indoleamine 2,3-dioxygenase (IDO) inhibitors, inhibitors of T cell checkpoints, agonists of selected TNF receptor family members, and inhibitors of undesirable cytokines. The specificity of therapeutic vaccination combined with such immunomodulation offers an attractive avenue for the development of future cancer therapies.

  19. Therapeutic Hypothermia for Neuroprotection

    PubMed Central

    Karnatovskaia, Lioudmila V.; Wartenberg, Katja E.

    2014-01-01

    The earliest recorded application of therapeutic hypothermia in medicine spans about 5000 years; however, its use has become widespread since 2002, following the demonstration of both safety and efficacy of regimens requiring only a mild (32°C-35°C) degree of cooling after cardiac arrest. We review the mechanisms by which hypothermia confers neuroprotection as well as its physiological effects by body system and its associated risks. With regard to clinical applications, we present evidence on the role of hypothermia in traumatic brain injury, intracranial pressure elevation, stroke, subarachnoid hemorrhage, spinal cord injury, hepatic encephalopathy, and neonatal peripartum encephalopathy. Based on the current knowledge and areas undergoing or in need of further exploration, we feel that therapeutic hypothermia holds promise in the treatment of patients with various forms of neurologic injury; however, additional quality studies are needed before its true role is fully known. PMID:24982721

  20. Toxicity of therapeutic nanoparticles.

    PubMed

    Maurer-Jones, Melissa A; Bantz, Kyle C; Love, Sara A; Marquis, Bryce J; Haynes, Christy L

    2009-02-01

    A total of six nanotherapeutic formulations are already approved for medical use and more are in the approval pipeline currently. Despite the massive research effort in nanotherapeutic materials, there is relatively little information about the toxicity of these materials or the tools needed to assess this toxicity. Recently, the scientific community has begun to respond to the paucity of information by investing in the field of nanoparticle toxicology. This review is intended to provide an overview of the techniques needed to assess toxicity of these therapeutic nanoparticles and to summarize the current state of the field. We begin with background on the toxicological assessment techniques used currently as well as considerations in nanoparticle dosing. The toxicological research overview is divided into the most common applications of therapeutic nanoparticles: drug delivery, photodynamic therapy and bioimaging. We end with a perspective section discussing the current technological gaps and promising research aimed at addressing those gaps.

  1. Therapeutic advances in immunosuppression.

    PubMed Central

    Thomson, A W; Forrester, J V

    1994-01-01

    Immunosuppressive therapy is appropriate for the prevention or reversal of allograft rejection, and for the treatment of autoimmune disorders and allergic disease. Recent advances in our understanding of the cellular and molecular mechanisms that regulate immune responses have paralleled elucidation of the modes of action of a variety of therapeutic immunosuppressive agents, both 'old' and new. These developments have identified potential targets for more refined and specific intervention strategies that are now being tested in the clinic. PMID:7994898

  2. Proteases as therapeutics

    PubMed Central

    Craik, Charles S.; Page, Michael J.; Madison, Edwin L.

    2015-01-01

    Proteases are an expanding class of drugs that hold great promise. The U.S. FDA (Food and Drug Administration) has approved 12 protease therapies, and a number of next generation or completely new proteases are in clinical development. Although they are a well-recognized class of targets for inhibitors, proteases themselves have not typically been considered as a drug class despite their application in the clinic over the last several decades; initially as plasma fractions and later as purified products. Although the predominant use of proteases has been in treating cardiovascular disease, they are also emerging as useful agents in the treatment of sepsis, digestive disorders, inflammation, cystic fibrosis, retinal disorders, psoriasis and other diseases. In the present review, we outline the history of proteases as therapeutics, provide an overview of their current clinical application, and describe several approaches to improve and expand their clinical application. Undoubtedly, our ability to harness proteolysis for disease treatment will increase with our understanding of protease biology and the molecular mechanisms responsible. New technologies for rationally engineering proteases, as well as improved delivery options, will expand greatly the potential applications of these enzymes. The recognition that proteases are, in fact, an established class of safe and efficacious drugs will stimulate investigation of additional therapeutic applications for these enzymes. Proteases therefore have a bright future as a distinct therapeutic class with diverse clinical applications. PMID:21406063

  3. Model-Based Reasoning in Humans Becomes Automatic with Training.

    PubMed

    Economides, Marcos; Kurth-Nelson, Zeb; Lübbert, Annika; Guitart-Masip, Marc; Dolan, Raymond J

    2015-09-01

    Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders. PMID:26379239

  4. Model-Based Reasoning in Humans Becomes Automatic with Training

    PubMed Central

    Lübbert, Annika; Guitart-Masip, Marc; Dolan, Raymond J.

    2015-01-01

    Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load—a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders. PMID:26379239

  5. Issues in Correctional Training and Casework. Correctional Monograph.

    ERIC Educational Resources Information Center

    Wolford, Bruce I., Ed.; Lawrenz, Pam, Ed.

    The eight papers contained in this monograph were drawn from two national meetings on correctional training and casework. Titles and authors are: "The Challenge of Professionalism in Correctional Training" (Michael J. Gilbert); "A New Perspective in Correctional Training" (Jack Lewis); "Reasonable Expectations in Correctional Officer Training:…

  6. Space charge stopband correction

    SciTech Connect

    Huang, Xiaobiao; Lee, S.Y.; /Indiana U.

    2005-09-01

    It is speculated that the space charge effect causes beam emittance growth through resonant envelope oscillation. Based on this theory, we propose an approach, called space charge stopband correction, to reduce such emittance growth by compensating the half-integer stopband width of the resonant oscillation. It is illustrated with the Fermilab Booster model.

  7. Counselor Education for Corrections.

    ERIC Educational Resources Information Center

    Parsigian, Linda

    Counselor education programs most often prepare their graduates to work in either a school setting, anywhere from the elementary level through higher education, or a community agency. There is little indication that counselor education programs have seriously undertaken the task of training counselors to enter the correctional field. If…

  8. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1980-01-01

    Optical measurements of range and elevation angles are distorted by refraction of Earth's atmosphere. Theoretical discussion of effect, along with equations for determining exact range and elevation corrections, is presented in report. Potentially useful in optical site surveying and related applications, analysis is easily programmed on pocket calculator. Input to equation is measured range and measured elevation; output is true range and true elevation.

  9. 75 FR 68409 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    Presidential Determination No. 2010-14 of September 3, 2010--Unexpected Urgent Refugee And... On page 67015 in the issue of Monday, November 1, 2010, make the following correction: the Presidential Determination number should read ``2010-14''. (Presidential Sig.) [FR Doc....]

  10. 75 FR 68407 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... Documents ] Presidential Determination No. 2010-12 of August 26, 2010--Unexpected Urgent Refugee and... beginning on page 67013 in the issue of Monday, November 1, 2010, make the following correction: On page 67013, the Presidential Determination number should read "2010-12" (Presidential Sig.) [FR Doc....

  11. Clinical Utility and Safety of a Model-Based Patient-Tailored Dose of Vancomycin in Neonates

    PubMed Central

    Leroux, Stéphanie; Jacqz-Aigrain, Evelyne; Biran, Valérie; Lopez, Emmanuel; Madeleneau, Doriane; Wallon, Camille; Zana-Taïeb, Elodie; Virlouvet, Anne-Laure; Rioualen, Stéphane

    2016-01-01

    Pharmacokinetic modeling has often been applied to evaluate vancomycin pharmacokinetics in neonates. However, clinical application of the model-based personalized vancomycin therapy is still limited. The objective of the present study was to evaluate the clinical utility and safety of a model-based patient-tailored dose of vancomycin in neonates. A model-based vancomycin dosing calculator, developed from a population pharmacokinetic study, has been integrated into the routine clinical care in 3 neonatal intensive care units (Robert Debré, Cochin Port Royal, and Clocheville hospitals) between 2012 and 2014. The target attainment rate, defined as the percentage of patients with a first therapeutic drug monitoring serum vancomycin concentration achieving the target window of 15 to 25 mg/liter, was selected as an endpoint for evaluating the clinical utility. The safety evaluation was focused on nephrotoxicity. The clinical application of the model-based patient-tailored dose of vancomycin has been demonstrated in 190 neonates. The mean (standard deviation) gestational and postnatal ages of the study population were 31.1 (4.9) weeks and 16.7 (21.7) days, respectively. The target attainment rate increased from 41% to 72% without any case of vancomycin-related nephrotoxicity. This proof-of-concept study provides evidence for integrating model-based antimicrobial therapy in neonatal routine care. PMID:26787690

  12. Atmospheric Correction Algorithm for Hyperspectral Imagery

    SciTech Connect

    R. J. Pollina

    1999-09-01

    In December 1997, the US Department of Energy (DOE) established a Center of Excellence (Hyperspectral-Multispectral Algorithm Research Center, HyMARC) for promoting the research and development of algorithms to exploit spectral imagery. This center is located at the DOE Remote Sensing Laboratory in Las Vegas, Nevada, and is operated for the DOE by Bechtel Nevada. This paper presents the results to date of a research project begun at the center during 1998 to investigate the correction of hyperspectral data for atmospheric aerosols. Results of a project conducted by the Rochester Institute of Technology to define, implement, and test procedures for absolute calibration and correction of hyperspectral data to absolute units of high spectral resolution imagery will be presented. Hybrid techniques for atmospheric correction using image or spectral scene data coupled through radiative propagation models will be specifically addressed. Results of this effort to analyze HYDICE sensor data will be included. Preliminary results based on studying the performance of standard routines, such as Atmospheric Pre-corrected Differential Absorption and Nonlinear Least Squares Spectral Fit, in retrieving reflectance spectra show overall reflectance retrieval errors of approximately one to two reflectance units in the 0.4- to 2.5-micron-wavelength region (outside of the absorption features). These results are based on HYDICE sensor data collected from the Southern Great Plains Atmospheric Radiation Measurement site during overflights conducted in July of 1997. Results of an upgrade made in the model-based atmospheric correction techniques, which take advantage of updates made to the moderate resolution atmospheric transmittance model (MODTRAN 4.0) software, will also be presented. Data will be shown to demonstrate how the reflectance retrieval in the shorter wavelengths of the blue-green region will be improved because of enhanced modeling of multiple scattering effects.

  13. Distributed real-time model-based diagnosis

    NASA Technical Reports Server (NTRS)

    Barrett, A. C.; Chung, S. H.

    2003-01-01

    This paper presents an approach to onboard anomaly diagnosis that combines the simplicity and real-time guarantee of a rule-based diagnosis system with the specification ease and coverage guarantees of a model-based diagnosis system.

  14. Model-Based Development of Automotive Electronic Climate Control Software

    NASA Astrophysics Data System (ADS)

    Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan

    With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task, and an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on their domain of expertise rather than on writing large amounts of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. A back-to-back testing approach is presented that ensures a flawless and smooth transition from legacy designs to model-based development. The Simulink report generator, used to create design documents from the models, is presented along with its use to run the simulation model and capture the results in the test report. Test automation using a model-based development tool, which supports the use of a unique set of test cases across several testing levels and a test procedure that is independent of the software and hardware platform, is also presented.

  15. Using Ground Spectral Irradiance for Model Correction of AVIRIS Data

    NASA Technical Reports Server (NTRS)

    Goetz, Alexander F. H.; Heidebrecht, Kathleen B.; Kindel, Bruce; Boardman, Joseph W.

    1998-01-01

    Over the last decade a series of techniques has been developed to correct hyperspectral imaging sensor data to apparent surface reflectance. The techniques range from the empirical line method that makes use of ground target measurements to model-based methods such as ATREM that derive parameters from the data themselves to convert radiance to reflectance, and combinations of the above. Here we describe a technique that combines ground measurements of spectral irradiance with existing radiative transfer models to derive the model equivalent of an empirical line method correction without the need for uniform ground targets of different reflectance.
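
    For orientation, the classical empirical line idea mentioned above can be sketched as a per-band linear fit of reflectance against radiance from calibration targets; the arrays and values below are assumptions for illustration, and the hybrid model-plus-irradiance method the abstract actually describes is not reproduced here.

```python
# Empirical line sketch for a single band (hypothetical calibration data).
import numpy as np

radiance    = np.array([12.0, 55.0, 140.0])   # at-sensor radiance of targets
reflectance = np.array([0.03, 0.18, 0.48])    # field-measured reflectance

gain, offset = np.polyfit(radiance, reflectance, deg=1)  # least-squares fit

def to_reflectance(image_band):
    """Convert a radiance image band to apparent surface reflectance."""
    return gain * image_band + offset
```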

  16. Psychodynamic Perspective on Therapeutic Boundaries

    PubMed Central

    Bridges, Nancy A.

    1999-01-01

    Discussion of boundaries in therapeutic work most often focuses on boundary maintenance, risk management factors, and boundary violations. The psychodynamic meaning and clinical management of boundaries in therapeutic relationships remains a neglected area of discourse. Clinical vignettes will illustrate a psychodynamic, developmental-relational perspective using boundary dilemmas to deepen and advance the therapeutic process. This article contributes to the dialogue about the process of making meaning and constructing therapeutically useful and creative boundaries that further the psychotherapeutic process. PMID:10523432

  17. Frankincense--therapeutic properties.

    PubMed

    Al-Yasiry, Ali Ridha Mustafa; Kiczorowska, Bożena

    2016-01-01

    Recently, increasing interest in natural dietary and therapeutic preparations used as dietary supplements has been observed. One of them is frankincense. This traditional medicine of the East is believed to have anti-inflammatory, expectorant, antiseptic, and even anxiolytic and anti-neurotic effects. The present study aims to verify the reported therapeutic properties of Boswellia resin and describe its chemical composition based on available scientific studies. The main component of frankincense is oil (60%). It contains mono- (13%) and diterpenes (40%) as well as ethyl acetate (21.4%), octyl acetate (13.4%) and methylanisole (7.6%). The highest biological activity among the terpenes is characteristic of 11-keto-β-boswellic acid, acetyl-11-keto-β-boswellic acid and acetyl-α-boswellic acid. Contemporary studies have shown that the resin indeed has analgesic, tranquilising and anti-bacterial effects. From the point of view of therapeutic properties, extracts from Boswellia serrata and Boswellia carterii are reported to be particularly useful. They reduce inflammatory conditions in the course of rheumatism by inhibiting leukocyte elastase and degrading glycosaminoglycans. Boswellia preparations inhibit 5-lipoxygenase and prevent the release of leukotrienes, thus having an anti-inflammatory effect in ulcerative colitis, irritable bowel syndrome, bronchitis and sinusitis. Inhalation and consumption of Boswellia olibanum reduces the risk of asthma. In addition, boswellic acids have an antiproliferative effect on tumours. They inhibit proliferation of tumour cells of the leukaemia and glioblastoma subset. They have an anti-tumour effect since they inhibit topoisomerase I and II-alpha and stimulate programmed cell death (apoptosis). PMID:27117114

  18. Cystic Fibrosis Therapeutics

    PubMed Central

    Ramsey, Bonnie W.

    2013-01-01

    A great deal of excitement and hope has followed the successful trials and US Food and Drug Administration approval of the drug ivacaftor (Kalydeco), the first therapy available that targets the underlying defect that causes cystic fibrosis (CF). Although this drug has currently demonstrated a clinical benefit for a small minority of the CF population, the developmental pathway established by ivacaftor paves the way for other CF transmembrane conductance regulator (CFTR) modulators that may benefit many more patients. In addition to investigating CFTR modulators, researchers are actively developing numerous other innovative CF therapies. In this review, we use the catalog of treatments currently under evaluation with the support of the Cystic Fibrosis Foundation, known as the Cystic Fibrosis Foundation Therapeutics Pipeline, as a platform to discuss the variety of candidate treatments for CF lung disease that promise to improve CF care. Many of these approaches target the individual components of the relentless cycle of airway obstruction, inflammation, and infection characteristic of lung disease in CF, whereas others are aimed directly at the gene defect, or the resulting dysfunctional protein, that instigates this cycle. We discuss how new findings from the laboratory have informed not only the development of novel therapeutics, but also the rationales for their use and the outcomes used to measure their effects. By reviewing the breadth of candidate treatments currently in development, as well as the recent progress in CF therapies reflected by the evolution of the therapeutics pipeline over the past few years, we hope to build upon the optimism and anticipation generated by the recent success of Kalydeco. PMID:23276843

  19. Dispersive wave processing: a model-based solution

    SciTech Connect

    Candy, J.V.; Chambers, D.C.

    1996-10-01

    Wave propagation through various media represents a significant problem in many applications in acoustics and electromagnetics, especially when the medium is dispersive. We posit a general dispersive wave propagation model that could easily represent many classes of dispersive waves and proceed to develop a model-based processor employing this underlying structure. The general solution to the model-based dispersive wave estimation problem is developed using the Bayesian maximum a posteriori approach, which leads to a nonlinear extended Kalman filter processor.
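
    The processor described above is built around an extended Kalman filter; a generic EKF predict/update cycle is sketched below under stated assumptions (the dispersive-wave state-space model itself is left abstract as f, h and their Jacobians, since the paper's specific model is not reproduced here).

```python
# Generic extended Kalman filter step (model functions are placeholders).
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One predict/update cycle: x, P prior estimate and covariance; z measurement;
    f, h state-transition and measurement functions; F_jac, H_jac their Jacobians;
    Q, R process and measurement noise covariances."""
    # predict
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # update
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```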

  20. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  1. [Is therapeutic deadlock inevitable?].

    PubMed

    Vignat, Jean-Pierre

    2016-01-01

    Many long-term treatments appear to be an expression of therapeutic deadlock. This situation leads to a questioning of the concept of chronicity and to identifying the determining factors of situations which appear blocked, marked by the search for solutions taking a back seat to action. The interaction between patients' mental apparatus and the care apparatus lies at the heart of the question, interpreted from an institutional, collective and individual perspective, supported by the clinical and psychopathological approach and by a return to the prioritisation of thought. PMID:27389427

  2. Revitalizing Psychiatric Therapeutics

    PubMed Central

    Hyman, Steven E

    2014-01-01

    Despite high prevalence and enormous unmet medical need, the pharmaceutical industry has recently de-emphasized neuropsychiatric disorders as 'too difficult' a challenge to warrant major investment. Here I describe major obstacles to drug discovery and development including a lack of new molecular targets, shortcomings of current animal models, and the lack of biomarkers for clinical trials. My major focus, however, is on new technologies and scientific approaches to neuropsychiatric disorders that give promise for revitalizing therapeutics and may thus answer industry's concerns. PMID:24317307

  3. Therapeutic approaches to cellulite.

    PubMed

    Green, Jeremy B; Cohen, Joel L; Kaufman, Joely; Metelitsa, Andrei I; Kaminer, Michael S

    2015-09-01

    Cellulite is a condition that affects the vast majority of women. Although it is of no danger to one's overall health, cellulite can be psychosocially debilitating. Consequently, much research has been devoted to understanding cellulite and its etiopathogenesis. With additional insights into the underlying causes of its clinical presentation, therapeutic modalities have been developed that offer hope to cellulite sufferers. This review examines evidence for topical treatments, noninvasive energy-based devices, and recently developed minimally invasive interventions that may finally provide a solution.

  4. Novel topical therapeutics.

    PubMed

    Bleier, Benjamin S

    2010-06-01

    Intranasal drug delivery is a rapidly growing field that offers the potential for enhanced treatment of local and systemic disease. Novel preclinical screening tools such as in vitro assays and 3-dimensional imaging are currently being used to improve drug design and delivery. In addition, new evidence has emerged underlining the importance of surgical marsupialization of the sinuses to allow for improved topical delivery. Although multiple barriers to administration and absorption exist, implantable therapeutics using new classes of drug-eluting polymers allow for prolonged, site-specific drug delivery and hold great promise in overcoming these obstacles.

  5. [Therapeutic plasmas available worldwide].

    PubMed

    Martinaud, C; Cauet, A; Sailliol, A

    2013-05-01

    Therapeutic plasma is a commonly used product; the French guidelines were revised in 2012. Exchanges between countries, whether close to one another or not, are frequent, during disaster relief as well as in war settings, and this is associated with the increasing use of plasma in the management of casualties. Additionally, the real possibility of a shortage of plasma supply in some countries makes knowledge of foreign blood supply organizations of fundamental interest. We present here the main differences and common points between the plasmas available worldwide, together with the main characteristics of each product.

  6. microRNA Therapeutics

    PubMed Central

    Broderick, JA; Zamore, PD

    2011-01-01

    MicroRNAs (miRNAs) provide new therapeutic targets for many diseases, while their myriad roles in development and cellular processes make them fascinating to study. We still do not fully understand the molecular mechanisms by which miRNAs regulate gene expression nor do we know the complete repertoire of mRNAs each miRNA regulates. However, recent progress in the development of effective strategies to block miRNAs suggests that anti-miRNA drugs may soon be used in the clinic. PMID:21525952

  7. Therapeutic Endoscopic Ultrasound

    PubMed Central

    Cheriyan, Danny

    2015-01-01

    Endoscopic ultrasound (EUS) technology has evolved dramatically over the past 20 years, from being a supplementary diagnostic aid available only in large medical centers to being a core diagnostic and therapeutic tool that is widely available. Although formal recommendations and practice guidelines have not been developed, there are considerable data supporting the use of EUS for its technical accuracy in diagnosing pancreaticobiliary and gastrointestinal pathology. Endosonography is now routine practice not only for pathologic diagnosis and tumor staging but also for drainage of cystic lesions and celiac plexus neurolysis. In this article, we cover the use of EUS in biliary and pancreatic intervention, ablative therapy, enterostomy, and vascular intervention. PMID:27118942

  8. The Therapeutic Roller Coaster

    PubMed Central

    CHU, JAMES A.

    1992-01-01

    Survivors of severe childhood abuse often encounter profound difficulties. In addition to posttraumatic and dissociative symptomatology, abuse survivors frequently have characterologic problems, particularly regarding self-care and maintaining relationships. Backgrounds of abuse, abandonment, and betrayal are often recapitulated and reenacted in therapy, making the therapeutic experience arduous and confusing for therapists and patients. Efforts must be directed at building an adequate psychotherapeutic foundation before undertaking exploration and abreaction of past traumatic experiences. This discussion sets out a model for treatment of childhood abuse survivors, describing stages of treatment and suggesting interventions. Common treatment dilemmas or "traps" are discussed, with recommendations for their resolution. PMID:22700116

  9. Reduced model-based decision-making in schizophrenia.

    PubMed

    Culbreth, Adam J; Westbrook, Andrew; Daw, Nathaniel D; Botvinick, Matthew; Barch, Deanna M

    2016-08-01

    Individuals with schizophrenia have a diminished ability to use reward history to adaptively guide behavior. However, tasks traditionally used to assess such deficits often rely on multiple cognitive and neural processes, leaving etiology unresolved. In the current study, we adopted recent computational formalisms of reinforcement learning to distinguish between model-based and model-free decision-making in hopes of specifying mechanisms associated with reinforcement-learning dysfunction in schizophrenia. Under this framework, decision-making is model-free to the extent that it relies solely on prior reward history, and model-based if it relies on prospective information such as motivational state, future consequences, and the likelihood of obtaining various outcomes. Model-based and model-free decision-making was assessed in 33 schizophrenia patients and 30 controls using a 2-stage 2-alternative forced choice task previously demonstrated to discern individual differences in reliance on the 2 forms of reinforcement-learning. We show that, compared with controls, schizophrenia patients demonstrate decreased reliance on model-based decision-making. Further, parameter estimates of model-based behavior correlate positively with IQ and working memory measures, suggesting that model-based deficits seen in schizophrenia may be partially explained by higher-order cognitive deficits. These findings demonstrate specific reinforcement-learning and decision-making deficits and thereby provide valuable insights for understanding disordered behavior in schizophrenia.
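
    Reliance on the two systems in this kind of two-stage task is typically quantified with a weighting parameter; as a hedged illustration (the abstract does not spell out the computational model), choices can be assumed to be driven by a mixture of the two value estimates,

$$Q_{\mathrm{hybrid}}(s,a) = w\,Q_{\mathrm{MB}}(s,a) + (1-w)\,Q_{\mathrm{MF}}(s,a),\qquad 0 \le w \le 1,$$

    where estimates of w near 1 indicate predominantly model-based choice and values near 0 predominantly model-free choice.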

  10. Aberration corrected emittance exchange

    NASA Astrophysics Data System (ADS)

    Nanni, E. A.; Graves, W. S.

    2015-08-01

    Full exploitation of emittance exchange (EEX) requires aberration-free performance of a complex imaging system including active radio-frequency (rf) elements which can add temporal distortions. We investigate the performance of an EEX line where the exchange occurs between two dimensions with normalized emittances which differ by multiple orders of magnitude. The transverse emittance is exchanged into the longitudinal dimension using a double dogleg emittance exchange setup with a five cell rf deflector cavity. Aberration correction is performed on the four most dominant aberrations. These include temporal aberrations that are corrected with higher order magnetic optical elements located where longitudinal and transverse emittance are coupled. We demonstrate aberration-free performance of an EEX line with emittances differing by four orders of magnitude, i.e., an initial transverse emittance of 1 pm-rad is exchanged with a longitudinal emittance of 10 nm-rad.

  11. Correction coil cable

    DOEpatents

    Wang, Sou-Tien

    1994-11-01

    A wire cable assembly (10, 310) adapted for the winding of electrical coils is taught. A primary intended use is for use in particle tube assemblies (532) for the superconducting super collider. The correction coil cables (10, 310) have wires (14, 314) collected in wire arrays (12, 312) with a center rib (16, 316) sandwiched therebetween to form a core assembly (18, 318 ). The core assembly (18, 318) is surrounded by an assembly housing (20, 320) having an inner spiral wrap (22, 322) and a counter wound outer spiral wrap (24, 324). An alternate embodiment (410) of the invention is rolled into a keystoned shape to improve radial alignment of the correction coil cable (410) on a particle tube (733) in a particle tube assembly (732).

  12. Surgical correction of brachymetatarsia.

    PubMed

    Bartolomei, F J

    1990-02-01

    Brachymetatarsia describes the condition of an abnormally short metatarsal. Although the condition has been recorded since antiquity, surgical options to correct the deformity have been available for only two decades. Most published procedures involve metaphyseal lengthening with autogenous grafts from different donor sites. The author discusses one such surgical technique. In addition, the author proposes specific criteria for the objective diagnosis of brachymetatarsia. PMID:2406417

  13. Nitrones as Therapeutics

    PubMed Central

    Floyd, Robert A.; Kopke, Richard D.; Choi, Chul-Hee; Foster, Steven B.; Doblas, Sabrina; Towner, Rheal A.

    2008-01-01

    Nitrones have the general chemical formula X-CH=NO-Y. They were first used to trap free radicals in chemical systems and then subsequently in biochemical systems. More recently several nitrones including PBN (α-phenyl-tert-butylnitrone) have been shown to have potent biological activity in many experimental animal models. Many diseases of aging including stroke, cancer development, Parkinson’s disease and Alzheimer’s disease are known to have enhanced levels of free radicals and oxidative stress. Some derivatives of PBN are significantly more potent than PBN and have undergone extensive commercial development in stroke. Recent research has shown that PBN-related nitrones also have anti-cancer activity in several experimental cancer models and have potential as therapeutics in some cancers. Also in recent observations nitrones have been shown to act synergistically in combination with antioxidants in the prevention of acute acoustic noise induced hearing loss. The mechanistic basis of the potent biological activity of PBN-related nitrones is not known. Even though PBN-related nitrones do decrease oxidative stress and oxidative damage, their potent biological anti-inflammatory activity and their ability to alter cellular signaling processes can not readily be explained by conventional notions of free radical trapping biochemistry. This review is focused on our observations and others where the use of selected nitrones as novel therapeutics have been evaluated in experimental models in the context of free radical biochemical and cellular processes considered important in pathologic conditions and age-related diseases. PMID:18793715

  14. Mechanisms of Plasma Therapeutics

    NASA Astrophysics Data System (ADS)

    Graves, David

    2015-09-01

    In this talk, I address research directed towards biomedical applications of atmospheric pressure plasma such as sterilization, surgery, wound healing and anti-cancer therapy. The field has seen remarkable growth in the last 3-5 years, but the mechanisms responsible for the biomedical effects have remained mysterious. It is known that plasmas readily create reactive oxygen species (ROS) and reactive nitrogen species (RNS). ROS and RNS (or RONS), in addition to a suite of other radical and non-radical reactive species, are essential actors in an important sub-field of aerobic biology termed "redox" (or oxidation-reduction) biology. It is postulated that cold atmospheric plasma (CAP) can trigger a therapeutic shielding response in tissue in part by creating a time- and space-localized, burst-like form of oxy-nitrosative stress on near-surface exposed cells through the flux of plasma-generated RONS. RONS-exposed surface layers of cells communicate to the deeper levels of tissue via a form of the "bystander effect," similar to responses to other forms of cell stress. In this proposed model of CAP therapeutics, the plasma stimulates a cellular survival mechanism through which aerobic organisms shield themselves from infection and other challenges.

  15. Person-centered Therapeutics

    PubMed Central

    Cloninger, C. Robert; Cloninger, Kevin M.

    2015-01-01

    A clinician’s effectiveness in treatment depends substantially on his or her attitude toward -- and understanding of -- the patient as a person endowed with self-awareness and the will to direct his or her own future. The assessment of personality in the therapeutic encounter is a crucial foundation for forming an effective working alliance with shared goals. Helping a person to reflect on their personality provides a mirror image of their strengths and weaknesses in adapting to life’s many challenges. The Temperament and Character Inventory (TCI) provides an effective way to describe personality thoroughly and to predict both the positive and negative aspects of health. Strengths and weaknesses in TCI personality traits allow strong predictions of individual differences of all aspects of well-being. Diverse therapeutic techniques, such as diet, exercise, mood self-regulation, meditation, or acts of kindness, influence health and personality development in ways that are largely indistinguishable from one another or from effective allopathic treatments. Hence the development of well-being appears to be the result of activating a synergistic set of mechanisms of well-being, which are expressed as fuller functioning, plasticity, and virtue in adapting to life’s challenges. PMID:26052429

  16. Epigenomes as therapeutic targets.

    PubMed

    Hamm, Christopher A; Costa, Fabricio F

    2015-07-01

    Epigenetics is a molecular phenomenon that pertains to heritable changes in gene expression that do not involve changes in the DNA sequence. Epigenetic modifications in a whole genome, known as the epigenome, play an essential role in the regulation of gene expression in both normal development and disease. Traditional epigenetic changes include DNA methylation and histone modifications. Recent evidence reveals that other players, such as non-coding RNAs, may have an epigenetic regulatory role. Aberrant epigenetic signaling is coming to be known as a central component of human disease, and the reversible nature of the epigenetic modifications provides an exciting opportunity for the development of clinically relevant therapeutics. Current epigenetic therapies provide a clinical benefit through disrupting DNA methyltransferases or histone deacetylases. However, the emergence of next-generation epigenetic therapies provides an opportunity to more effectively disrupt epigenetic disease states. Novel epigenetic therapies may improve drug targeting and drug delivery, optimize dosing schedules, and improve the efficacy of preexisting treatment modalities (chemotherapy, radiation, and immunotherapy). This review discusses the epigenetic mechanisms that contribute to the disease, available epigenetic therapies, epigenetic therapies currently in development, and the potential future use of epigenetic therapeutics in a clinical setting.

  17. AMUM LECTURE: Therapeutic ultrasound

    NASA Astrophysics Data System (ADS)

    Crum, Lawrence A.

    2004-01-01

    The use of ultrasound in medicine is now quite commonplace, especially with the recent introduction of small, portable and relatively inexpensive, hand-held diagnostic imaging devices. Moreover, ultrasound has expanded beyond the imaging realm, with methods and applications extending to novel therapeutic and surgical uses. These applications broadly include: tissue ablation, acoustocautery, lipoplasty, site-specific and ultrasound mediated drug activity, extracorporeal lithotripsy, and the enhancement of natural physiological functions such as wound healing and tissue regeneration. A particularly attractive aspect of this technology is that diagnostic and therapeutic systems can be combined to produce totally non-invasive, image-guided therapy. This general lecture will review a number of these exciting new applications of ultrasound and address some of the basic scientific questions and future challenges in developing these methods and technologies for general use in our society. We shall particularly emphasize the use of High Intensity Focused Ultrasound (HIFU) in the treatment of benign and malignant tumors as well as the introduction of acoustic hemostasis, especially in organs which are difficult to treat using conventional medical and surgical techniques.

  18. Therapeutic Community in a California Prison: Treatment Outcomes after 5 Years

    ERIC Educational Resources Information Center

    Zhang, Sheldon X.; Roberts, Robert E. L.; McCollister, Kathryn E.

    2011-01-01

    Therapeutic communities have become increasingly popular among correctional agencies with drug-involved offenders. This quasi-experimental study followed a group of inmates who participated in a prison-based therapeutic community in a California state prison, with a comparison group of matched offenders, for more than 5 years after their initial…

  19. [Therapeutic contact lenses and the advantages of high Dk materials].

    PubMed

    Coral-Ghanem, Cleusa; Ghanem, Vinícius Coral; Ghanem, Ramon Coral

    2008-01-01

    Therapeutic contact lenses are useful in a variety of ocular surface diseases. Their main indications are: to relieve pain; protect the ocular surface; promote corneal healing and epithelial regeneration; seal a leaking corneal wound and deliver ophthalmic drugs to the ocular surface. There are several kinds of lens designs and materials, and their choice depends on the specific disease to be treated, the duration of treatment and the physiologic needs of the diseased cornea. Bullous keratopathy, recurrent epithelial erosion syndrome, dry eye and postoperative epithelial defects are amongst their indications. Therapeutic contact lenses should not be indicated in the presence of active infectious keratitis or when the patient is not compliant. Corneal neovascularization, giant papillary conjunctivitis and infectious keratitis are serious complications, which can be prevented by correctly fitting and maintaining the therapeutic contact lenses. Silicone hydrogel therapeutic contact lenses, due to their higher oxygen permeability, allow extended wear schedules, decreasing the need for frequent lens replacement. PMID:19274406

  1. Hypoxic Conditioning as a New Therapeutic Modality

    PubMed Central

    Verges, Samuel; Chacaroun, Samarmar; Godin-Ribuot, Diane; Baillieul, Sébastien

    2015-01-01

    Preconditioning refers to a procedure by which a single noxious stimulus below the threshold of damage is applied to the tissue in order to increase resistance to the same or even different noxious stimuli given above the threshold of damage. Hypoxic preconditioning relies on complex and active defenses that organisms have developed to counter the adverse consequences of oxygen deprivation. The protection it confers against ischemic attack, for instance, as well as the underlying biological mechanisms have been extensively investigated in animal models. Based on these data, hypoxic conditioning (consisting of recurrent exposure to hypoxia) has been suggested as a potential non-pharmacological therapeutic intervention to enhance some physiological functions in individuals in whom acute or chronic pathological events are anticipated or existing. In addition to healthy subjects, some benefits have been reported in patients with cardiovascular and pulmonary diseases as well as in overweight and obese individuals. Hypoxic conditioning, consisting of sessions of intermittent exposure to moderate hypoxia repeated over several weeks, may induce hematological, vascular, metabolic, and neurological effects. This review addresses the existing evidence regarding the use of hypoxic conditioning as a potential therapeutic modality, and emphasizes the many remaining issues to be clarified and the future research to be performed in the field. PMID:26157787

  2. Therapeutic advances in dystonia.

    PubMed

    Albanese, Alberto; Romito, Luigi M; Calandrella, Daniela

    2015-09-15

    Knowledge on dystonia has greatly improved recently, because of a renewed effort in understanding its cause, pathophysiology, and clinical characterization. Different drug classes traditionally have been used for the symptomatic treatment of dystonia, more recently surpassed by the introduction of botulinum neurotoxins and deep brain stimulation. No curative or disease-modifying treatments are available. Recent knowledge regarding the pathophysiology of inherited dystonias is highlighting new potential treatment strategies. We review therapeutic advances in dystonia that have been published over the last 3 years, particularly regarding oral medications, local injections of botulinum neurotoxins, deep brain stimulation, and transcranial or epidural brain stimulations. We discuss evidence of efficacy, highlight recent advances, and focus on key areas under development. PMID:26301801

  3. [Therapeutic education didactic techniques].

    PubMed

    Valverde, Maite; Vidal, Mercè; Jansa, Margarida

    2012-10-01

    This article includes an introduction to the role of therapeutic education in diabetes treatment according to the recommendations of the American Diabetes Association (ADA), the Diabetes Education Study Group (DESG) of the European Association for the Study of Diabetes (EASD) and the Clinical Practice Guidelines (CPG) of the Spanish Ministry of Health. We analyze theoretical models and the differences between teaching and learning, as well as current trends (including the Internet) that can facilitate meaningful learning by people with diabetes and their families and relatives. We analyze the differences, similarities, advantages and disadvantages of individual and group education. Finally, we describe different educational techniques (metaplan, case method, brainstorming, role playing, games, seminars, autobiography, forums, chats, ...) applicable to individual, group or virtual education, and their application depending on the learning objective.

  4. Aptamers in Therapeutics.

    PubMed

    Parashar, Abhishek

    2016-06-01

    Aptamers are single-strand DNA or RNA molecules selected by an iterative process known as Systematic Evolution of Ligands by Exponential Enrichment (SELEX). Various advantages of aptamers, such as high temperature stability, animal-free and cost-effective production, and high affinity and selectivity for their targets, make them attractive alternatives to monoclonal antibodies for use in diagnostic and therapeutic applications. An aptamer has been generated against vascular endothelial growth factor 165, which is involved in age-related macular degeneration. Macugen was the first FDA-approved aptamer-based drug to be commercialized. Other aptamers were later developed against blood clotting proteins, cancer proteins, antibody E, agents involved in diabetic nephropathy, autoantibodies involved in autoimmune disorders, etc. Aptamers have also been developed against viruses and could work with other antiviral agents in treating infections. PMID:27504277

  6. Therapeutic showering in labor.

    PubMed

    Stark, Mary Ann

    2013-08-01

    While showering is thought to be an effective coping strategy during labor, research on this comfort measure is lacking. The purpose of this study was to measure effectiveness of therapeutic showering on pain, coping, tension, anxiety, relaxation, and fatigue in labor. A quasi-experimental pretest-posttest single group design was conducted in a community hospital. Participants were women who had singleton, uncomplicated pregnancies and were in active labor at term (N = 24). After completing pretest measures, participants took a 30 min shower where they were encouraged to be seated but could choose positions of comfort. There were significant reductions in tension and anxiety and increased relaxation and coping. Showering may be a safe and effective comfort measure for healthy, laboring women who are experiencing tension or anxiety, or having difficulty relaxing or coping with labor. Further research is needed to test the maternal and neonatal outcomes of this nonpharmacologic comfort measure.

  7. [Hypercholesterolemia: a therapeutic approach].

    PubMed

    Moráis López, A; Lama More, R A; Dalmau Serra, J

    2009-05-01

    High blood cholesterol levels represent an important cardiovascular risk factor. Hypercholesterolemia is defined as levels of total cholesterol and low-density lipoprotein cholesterol above the 95th percentile for age and gender. For the paediatric population, selective screening is recommended in children older than 2 years who are overweight, have a family history of early cardiovascular disease or whose parents have high cholesterol levels. The initial therapeutic approach includes diet therapy, appropriate physical activity and healthy lifestyle changes. Drug treatment should be considered in children from the age of 10 who, after having followed appropriate diet recommendations, still have very high LDL-cholesterol levels or moderately high levels with concomitant risk factors. In cases of extremely high LDL-cholesterol levels, drug treatment should be taken into consideration at earlier ages (8 years old). A modest response is usually observed with bile acid-binding resins. Statins can be considered first-choice drugs, once evidence on their efficacy and safety has been shown.

  9. Microfabricated therapeutic actuators

    DOEpatents

    Lee, Abraham P.; Northrup, M. Allen; Ciarlo, Dino R.; Krulevitch, Peter A.; Benett, William J.

    1999-01-01

    Microfabricated therapeutic actuators are fabricated using a shape memory polymer (SMP), a polyurethane-based material that undergoes a phase transformation at a specified temperature (Tg). At a temperature above Tg, the material is soft and can easily be reshaped into another configuration. As the temperature is lowered below Tg, the new shape is fixed and locked in as long as the material stays below Tg. Upon reheating the material to a temperature above Tg, it returns to its original shape. Using such SMP material, SMP microtubing can serve as a release actuator for the delivery of embolic coils through catheters into aneurysms, for example. The microtubing can be manufactured in various sizes, and the phase change temperature Tg is determined by the intended temperature target and intended use.

  10. Microfabricated therapeutic actuators

    DOEpatents

    Lee, A.P.; Northrup, M.A.; Ciarlo, D.R.; Krulevitch, P.A.; Benett, W.J.

    1999-06-15

    Microfabricated therapeutic actuators are fabricated using a shape memory polymer (SMP), a polyurethane-based material that undergoes a phase transformation at a specified temperature (Tg). At a temperature above Tg, the material is soft and can easily be reshaped into another configuration. As the temperature is lowered below Tg, the new shape is fixed and locked in as long as the material stays below Tg. Upon reheating the material to a temperature above Tg, it returns to its original shape. Using such SMP material, SMP microtubing can serve as a release actuator for the delivery of embolic coils through catheters into aneurysms, for example. The microtubing can be manufactured in various sizes, and the phase change temperature Tg is determined by the intended temperature target and intended use. 8 figs.

  11. On being therapeutic.

    PubMed

    Greben, S E

    1977-11-01

    Psychotherapy is both an art and a science. The art deserves as careful study as does the science. In this paper the author puts forward the view that the effectiveness of psychotherapy is dependent to a marked degree upon certain innate characteristics of the therapist: these include his character structure, his personal values, and his spontaneous personality style. In order to explore this thesis, the author examines what has been written about some successful and well-known psychotherapists, by their patients, their colleagues, and their friends. He concludes that these therapists strongly evidenced the following characteristics: empathy and concern, caring and protectiveness, warmth, therapeutic forcefulness, expectation of improvement, freedom from despair, reliability, friendliness and respectfulness. It is felt that such factors in the therapist must be taken into account in order to achieve a view of psychotherapy which is not reductionistic. PMID:589551

  12. Antibody Engineering and Therapeutics

    PubMed Central

    Almagro, Juan Carlos; Gilliland, Gary L; Breden, Felix; Scott, Jamie K; Sok, Devin; Pauthner, Matthias; Reichert, Janice M; Helguera, Gustavo; Andrabi, Raiees; Mabry, Robert; Bléry, Mathieu; Voss, James E; Laurén, Juha; Abuqayyas, Lubna; Barghorn, Stefan; Ben-Jacob, Eshel; Crowe, James E; Huston, James S; Johnston, Stephen Albert; Krauland, Eric; Lund-Johansen, Fridtjof; Marasco, Wayne A; Parren, Paul WHI; Xu, Kai Y

    2014-01-01

    The 24th Antibody Engineering & Therapeutics meeting brought together a broad range of participants who were updated on the latest advances in antibody research and development. Organized by IBC Life Sciences, the gathering is the annual meeting of The Antibody Society, which serves as the scientific sponsor. Preconference workshops on 3D modeling and delineation of clonal lineages were featured, and the conference included sessions on a wide variety of topics relevant to researchers, including systems biology; antibody deep sequencing and repertoires; the effects of antibody gene variation and usage on antibody response; directed evolution; knowledge-based design; antibodies in a complex environment; polyreactive antibodies and polyspecificity; the interface between antibody therapy and cellular immunity in cancer; antibodies in cardiometabolic medicine; antibody pharmacokinetics, distribution and off-target toxicity; optimizing antibody formats for immunotherapy; polyclonals, oligoclonals and bispecifics; antibody discovery platforms; and antibody-drug conjugates. PMID:24589717

  13. Actuating critical care therapeutics.

    PubMed

    Stone, David J; Csete, Marie

    2016-10-01

    Viewing the intensive care unit (ICU) as a control system with inputs (patients) and outputs (outcomes), we focus on actuation (therapies) of the system and how to enhance our understanding of status of patients and their trajectory in the ICU. To incorporate the results of these analytics meaningfully, we feel that a reassessment of predictive scoring systems and of ways to optimally characterize and display the patient's "state space" to clinicians is important. Advances in sensing (diagnostics) and computation have not yet led to significantly better actuation, and so we focus on ways that data can be used to improve actuation in the ICU, in particular by following therapeutic burden along with disease severity. This article is meant to encourage discussion about how the critical care community can best deal with the data they see each day, and prepare for recommendations that will inevitably arise from application of major federal and state initiatives in big data analytics and precision medicine.

  14. Race-based therapeutics.

    PubMed

    Yancy, Clyde W

    2008-08-01

    The issue of race in medicine is problematic. Race is not a physiologic grouping, and all persons of a given race do not necessarily share the same clinical phenotype or genetic substrate. Despite clear signals that certain risk factors and diseases vary as a function of race, translating those differences into race-based therapeutics has been awkward and has done little to change the natural history of cardiovascular disease as it affects special populations. Among the varied special populations, the African American population appears to have the most significant and adverse variances for cardiovascular disease as well as worrisome signals that drug responsiveness varies. Recent guideline statements have now acknowledged certain treatment options that are most appropriate for African Americans with cardiovascular disease, especially hypertension and heart failure. As more physiologic markers of disease and drug responsiveness become available, the need for racial designations in medicine may lessen, and therapies can be optimized for all patients without regard to race or ethnicity.

  15. Homocystinuria: Therapeutic approach.

    PubMed

    Kumar, Tarun; Sharma, Gurumayum Suraj; Singh, Laishram Rajendrakumar

    2016-07-01

    Homocystinuria is a disorder of sulfur metabolism pathway caused by deficiency of cystathionine β-synthase (CBS). It is characterized by increased accumulation of homocysteine (Hcy) in the cells and plasma. Increased homocysteine results in various vascular and neurological complications. Present strategies to lower cellular and plasma homocysteine levels include vitamin B6 intake, dietary methionine restriction, betaine supplementation, folate and vitamin B12 administration. However, these strategies are inefficient for treatment of homocystinuria. In recent years, advances have been made towards developing new strategies to treat homocystinuria. These mainly include functional restoration to mutant CBS, enhanced clearance of Hcy from the body, prevention of N-homocysteinylation-induced toxicity and inhibition of homocysteine-induced oxidative stress. In this review, we have exclusively discussed the recent advances that have been achieved towards the treatment of homocystinuria. The review is an attempt to help clinicians in developing effective therapeutic strategies and designing novel drugs against homocystinuria. PMID:27059523

  16. Conduct of therapeutic trials.

    PubMed

    Vaïsse

    1996-06-01

    Ambulatory blood pressure monitoring (ABPM) is now very useful for assessing the blood pressure response to antihypertensive drugs. It gives accurate information on blood pressure profiles and provides more detailed information on first-dose effects, dose-response relationships, and the direction of action of antihypertensive treatment. However, ABPM studies will also allow new questions to be addressed. The reliability of ABPM measurements must receive more attention: validation of the different ABP monitors, evaluation of missed data, standardization of activities during the monitoring period. Concerning these technical problems, it seems reasonable to propose quality control of ABPM data in therapeutic trials. As a result of previous studies, it might be argued that ABPM should not only be used to evaluate the effects of treatment, but also to improve the selection of patients for clinical trials who are hypertensive both in the clinic and during ABPM. Despite a generally good agreement between the effects of medication on clinic and ABP when analysed on a group basis, several studies have reported weak, insignificant correlations on an individual basis, indicating discrepancies between clinic and ambulatory pressures. Clinic pressures tend to overestimate the degree of blood pressure control during daily activities. Treatment produces a significant reduction in ABP in the 'true hypertensives', whereas in the other 'white-coat hypertensives' it has no effect. There is also a question of the duration of action of treatment: whether medication should be equally effective throughout the day and night or should be focused on moments when the pressure is highest. The value of blood pressure variability in therapeutic trials is not yet well known, and needs further evaluation. The definitions of hypertension and normotension have traditionally been difficult and arbitrary when based on clinic blood pressure measurements; the difficulty is not removed when trying

  17. Onboard image correction

    NASA Technical Reports Server (NTRS)

    Martin, D. R.; Smaulon, A. S.; Hamori, A. S.

    1980-01-01

    A processor architecture for performing onboard geometric and radiometric correction of LANDSAT imagery is described. The design uses a general purpose processor to calculate the distortion values at selected points in the image and a special purpose processor to resample (calculate distortion at each image point and interpolate the intensity) the sensor output data. A distinct special purpose processor is used for each spectral band. Because of the sensor's high output data rate, 80 Mbit per second, the special purpose processors use a pipeline architecture. Sizing has been done on both the general and special purpose hardware.
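
    The resampling step described above can be sketched generically as inverse mapping plus bilinear interpolation; the LANDSAT distortion model and the pipelined hardware are not reproduced, and the array names below are illustrative assumptions.

```python
# Generic resampling sketch: for each output pixel, look up its fractional
# source coordinates (from the distortion model) and interpolate intensity.
import numpy as np

def resample_bilinear(src, map_rows, map_cols):
    rows = np.clip(map_rows, 0.0, src.shape[0] - 1.001)
    cols = np.clip(map_cols, 0.0, src.shape[1] - 1.001)
    r0 = np.floor(rows).astype(int)
    c0 = np.floor(cols).astype(int)
    dr, dc = rows - r0, cols - c0
    top    = (1 - dc) * src[r0, c0]     + dc * src[r0, c0 + 1]
    bottom = (1 - dc) * src[r0 + 1, c0] + dc * src[r0 + 1, c0 + 1]
    return (1 - dr) * top + dr * bottom
```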

  18. Timebias corrections to predictions

    NASA Technical Reports Server (NTRS)

    Wood, Roger; Gibbs, Philip

    1993-01-01

    The importance of an accurate knowledge of the time bias corrections to predicted orbits to a satellite laser ranging (SLR) observer, especially for low satellites, is highlighted. Sources of time bias values and the optimum strategy for extrapolation are discussed from the viewpoint of the observer wishing to maximize the chances of getting returns from the next pass. What is said may be seen as a commercial encouraging wider and speedier use of existing data centers for mutually beneficial exchange of time bias data.

  19. Applying model-based diagnostics to space power distribution

    NASA Astrophysics Data System (ADS)

    Quinn, Todd M.; Schlegelmilch, Richard F.

    1994-03-01

    When engineers diagnose system failures, they often use models to confirm system operation. This concept has produced a class of advanced expert systems which perform model-based diagnosis. A model-based diagnostic expert system for a Space Station Freedom electrical power distribution testbed is currently being developed at the NASA Lewis Research Center. The objective of this expert system is to autonomously detect and isolate electrical fault conditions. Marple, a software package developed at TRW, provides a model-based environment utilizing constraint suspension. Originally, constraint suspension techniques were developed for digital systems. However, Marple provides the mechanisms for applying this approach to analog systems, such as the testbed, as well. The expert system was developed using Marple and Lucid Common Lisp running on Sun Sparc-2 workstation. The Marple modeling environment has proved to be a useful tool for investigating the various aspects of model-based diagnostics. This paper describes work completed to date and lessons learned while employing model-based diagnostics using constraint suspension within an analog system.

  20. Smooth eigenvalue correction

    NASA Astrophysics Data System (ADS)

    Hendrikse, Anne; Veldhuis, Raymond; Spreeuwers, Luuk

    2013-12-01

    Second-order statistics play an important role in data modeling. Nowadays, there is a tendency toward measuring more signals with higher resolution (e.g., high-resolution video), causing a rapid increase in the dimensionality of the measured samples, while the number of samples remains more or less the same. As a result, the eigenvalue estimates are significantly biased, as described by the Marčenko-Pastur equation in the limit of both the number of samples and their dimensionality going to infinity. By introducing a smoothness factor, we show that the Marčenko-Pastur equation can be used in practical situations where both the number of samples and their dimensionality remain finite. Based on this result we derive methods, one already known and one new to our knowledge, to estimate the sample eigenvalues when the population eigenvalues are known. However, usually the sample eigenvalues are known and the population eigenvalues are required. We therefore applied one of these methods in a feedback loop, resulting in an eigenvalue bias correction method. We compare this eigenvalue correction method with the state-of-the-art methods and show that our method outperforms other methods, particularly in real-life situations often encountered in biometrics: underdetermined configurations, high-dimensional configurations, and configurations where the eigenvalues are exponentially distributed.
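
    For reference, the limiting eigenvalue density referred to above is the standard Marčenko-Pastur law, quoted here in its usual textbook form for a population covariance σ²I and aspect ratio γ = p/n ≤ 1 (this notation is an assumption, not the paper's):

$$\rho(\lambda) = \frac{1}{2\pi\sigma^{2}\gamma\lambda}\sqrt{(\lambda_{+}-\lambda)(\lambda-\lambda_{-})},\qquad \lambda_{\pm} = \sigma^{2}\left(1 \pm \sqrt{\gamma}\right)^{2},\quad \lambda \in [\lambda_{-},\lambda_{+}],$$

    so the sample eigenvalues spread over the whole interval even when every population eigenvalue equals σ², which is the bias such correction methods address.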

  1. Complications of auricular correction

    PubMed Central

    Staindl, Otto; Siedek, Vanessa

    2008-01-01

    The risk of complications of auricular correction is underestimated. There is around a 5% risk of early complications (haematoma, infection, fistulae caused by stitches and granulomae, allergic reactions, pressure ulcers, feelings of pain and asymmetry in side comparison) and a 20% risk of late complications (recurrences, telephone ear, excessive edge formation, auricle fitting too closely, narrowing of the auditory canal, keloids and complete collapse of the ear). Deformities are evaluated less critically by patients than by the surgeons, provided they do not concern how the ear is positioned. The causes of complications and deformities are, in the vast majority of cases, incorrect diagnosis and wrong choice of operating procedure. The choice of operating procedure must be adapted to suit the individual ear morphology. Bandaging technique and inspections and, if necessary, early revision are of great importance for the occurrence and progress of early complications, in addition to operation techniques. In cases of late complications such as keloids and auricles that are too closely fitting, unfixed full-thickness skin flaps have proved to be the most successful. Large deformities can often only be corrected to a limited degree of satisfaction. PMID:22073079

  2. Complications of auricular correction.

    PubMed

    Staindl, Otto; Siedek, Vanessa

    2007-01-01

    The risk of complications of auricular correction is underestimated. There is around a 5% risk of early complications (haematoma, infection, fistulae caused by stitches and granulomae, allergic reactions, pressure ulcers, feelings of pain and asymmetry in side comparison) and a 20% risk of late complications (recurrences, telephone ear, excessive edge formation, auricle fitting too closely, narrowing of the auditory canal, keloids and complete collapse of the ear). Deformities are evaluated less critically by patients than by the surgeons, provided they do not concern how the ear is positioned. The causes of complications and deformities are, in the vast majority of cases, incorrect diagnosis and wrong choice of operating procedure. The choice of operating procedure must be adapted to suit the individual ear morphology. Bandaging technique and inspections and, if necessary, early revision are of great importance for the occurrence and progress of early complications, in addition to operation techniques. In cases of late complications such as keloids and auricles that are too closely fitting, unfixed full-thickness skin flaps have proved to be the most successful. Large deformities can often only be corrected to a limited degree of satisfaction. PMID:22073079

  3. Stem cells: The Next Therapeutic Frontier

    PubMed Central

    Humes, H. David

    2005-01-01

    Cell therapy is one of the most exciting fields in translational medicine. It stands at the intersection of a variety of rapidly developing scientific disciplines: stem cell biology, immunology, tissue engineering, molecular biology, biomaterials, transplantation biology, regenerative medicine, and clinical research. Cell-based therapy may develop into a new therapeutic platform to treat a vast array of clinical disorders. Blood transfusions and bone marrow transplantation are prime examples of the successful application of cell-based therapeutics; but recent advances in cellular and molecular biology have expanded the potential applications of this approach. Although recombinant genetic engineering to produce a variety of therapeutics such as human erythropoietin and insulin has proven successful, these treatments are unable to completely correct or reverse disease states, because most common disease processes are not due to the deficiency of a single protein but develop due to alterations in the complex interactions of a variety of cell components. In these complex situations, cell-based therapy may be a more successful strategy by providing a dynamic, interactive, and individualized therapeutic approach that responds to the pathophysiological condition of the patient. In this regard, cells may provide innovative methods for drug delivery of biologics, immunotherapy, and tissue regenerative or replacement engineering (1,2). The translation of this discipline to medical practice has tremendous potential, but in many applications technological issues need to be overcome. Since many cell-based indications are already being evaluated in the clinic, the field appears to be on the threshold of a number of successes. This review will focus on our group's use of human stem/progenitor cells in the treatment of acute and chronic renal failure as extensions to the current successful renal substitution processes of hemodialysis and hemofiltration. PMID:16555613

  4. Model-based hierarchical reinforcement learning and human action control.

    PubMed

    Botvinick, Matthew; Weinstein, Ari

    2014-11-01

    Recent work has reawakened interest in goal-directed or 'model-based' choice, where decisions are based on prospective evaluation of potential action outcomes. Concurrently, there has been growing attention to the role of hierarchy in decision-making and action control. We focus here on the intersection between these two areas of interest, considering the topic of hierarchical model-based control. To characterize this form of action control, we draw on the computational framework of hierarchical reinforcement learning, using this to interpret recent empirical findings. The resulting picture reveals how hierarchical model-based mechanisms might play a special and pivotal role in human decision-making, dramatically extending the scope and complexity of human behaviour.

  5. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    NASA Technical Reports Server (NTRS)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment not only supports system architecture development, but also supports Systems Safety, Reliability, and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for the comprehensive viewpoint from which to support Model Based Mission Assurance (MBMA).

  6. Model-based ocean acoustic passive localization. Revision 1

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1994-06-01

    A model-based approach is developed (theoretically) to solve the passive localization problem. Here the authors investigate the design of a model-based identifier for a shallow water ocean acoustic problem characterized by a normal-mode model. In this problem they show how the processor can be structured to estimate the vertical wave numbers directly from pressure-field and sound-speed measurements, thereby eliminating the need for synthetic aperture processing or even a propagation model solution. Finally, they investigate various special cases of the source localization problem, designing a model-based localizer for each and evaluating the underlying structure with the expectation of gaining more and more insight into the general problem.

  7. Contact Lenses for Vision Correction

    MedlinePlus

    ... Contact Lenses Colored Contact Lenses Contact Lenses for Vision Correction Written by: Kierstan Boyd Reviewed by: Brenda ... on the surface of the eye. They correct vision like eyeglasses do and are safe when used ...

  8. When Does Model-Based Control Pay Off?

    PubMed

    Kool, Wouter; Cushman, Fiery A; Gershman, Samuel J

    2016-08-01

    Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to "model-free" and "model-based" strategies in reinforcement learning. Model-free strategies are computationally cheap, but sometimes inaccurate, because action values can be accessed by inspecting a look-up table constructed through trial-and-error. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, does not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand. PMID:27564094
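
    A minimal sketch of the distinction the abstract draws (the toy transition and reward arrays are invented; this is not the authors' two-step task): a model-free learner updates a single look-up-table entry per experienced transition, while a model-based learner plans over an explicit model of the environment at a higher computational cost.

      import numpy as np

      n_states, n_actions, gamma = 3, 2, 0.9
      P = np.full((n_states, n_actions, n_states), 1.0 / n_states)   # hypothetical transition model
      R = np.array([[0.0, 1.0], [0.5, 0.0], [1.0, 0.2]])             # hypothetical rewards

      # Model-free: cheap incremental update of one Q entry after one observed transition
      Q = np.zeros((n_states, n_actions))
      s, a, r, s_next, alpha = 0, 1, 1.0, 2, 0.1
      Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])

      # Model-based: costlier planning (value iteration) over the full model
      for _ in range(200):
          Q = R + gamma * P @ Q.max(axis=1)
      print(Q)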

  9. [Corrected transposition of the great arteries].

    PubMed

    Alva-Espinosa, Carlos

    2016-01-01

    Corrected transposition of the great arteries is one of the most fascinating entities in congenital heart disease. The apparently corrected condition is only temporary. Over time, most patients develop systemic heart failure, even in the absence of associated lesions. With current imaging studies, precise visualization is achieved in each case, though the treatment strategy remains unresolved. In asymptomatic patients or cases without associated lesions, focused follow-up to assess systemic ventricular function and the degree of tricuspid valve regurgitation is important. In cases with normal ventricular function and mild tricuspid failure, it seems unreasonable to intervene surgically. In patients with significant associated lesions, surgery is indicated. In the long term, the traditional approach may not help tricuspid regurgitation and systemic ventricular failure. Anatomical correction is the proposed alternative to ease the right ventricle overload and to restore the systemic left ventricular function. However, this is a prolonged operation and not without risks and long-term complications. In this review the clinical, diagnostic, and therapeutic aspects are overviewed in the light of the most significant and recent literature.

  10. In silico model-based inference: a contemporary approach for hypothesis testing in network biology

    PubMed Central

    Klinke, David J.

    2014-01-01

    Inductive inference plays a central role in the study of biological systems, where one aims to increase understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman, and Pearson developed, in the early 1900s, the ideas that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach, integrating ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics, to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims. PMID:25139179
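
    As a small, hedged illustration of the model-based inference the record describes (the kinetic model, data, noise level, and prior range are all invented): a postulated first-order decay model y(t) = exp(-k t) is confronted with noisy observations, and a posterior over the rate constant k expresses how well the hypothesis explains the data.

      import numpy as np

      t = np.array([0.5, 1.0, 2.0, 4.0])                 # measurement times
      y_obs = np.array([0.62, 0.35, 0.14, 0.02])         # hypothetical noisy data, sigma = 0.05
      sigma = 0.05

      k_grid = np.linspace(0.01, 3.0, 600)               # uniform prior over this range
      pred = np.exp(-np.outer(k_grid, t))                # model prediction for every candidate k
      log_lik = -0.5 * np.sum(((y_obs - pred) / sigma) ** 2, axis=1)
      posterior = np.exp(log_lik - log_lik.max())
      posterior /= np.trapz(posterior, k_grid)           # normalize on the grid
      print("posterior mode for k:", k_grid[np.argmax(posterior)])   # near 1.0 for these data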

  11. Paediatric models in motion: requirements for model-based decision support at the bedside.

    PubMed

    Barrett, Jeffrey S

    2015-01-01

    Optimal paediatric pharmacotherapy is reliant on a detailed understanding of the individual patient including their developmental status and disease state as well as the pharmaceutical agents he/she is receiving for treatment or management of side effects. Our appreciation for size and maturation effects on the pharmacokinetic/pharmacodynamic (PK/PD) phenomenon has improved to the point that we can develop predictive models that permit us to individualize therapy, especially in the situation where we are monitoring drug effects or therapeutic concentrations. The growth of efforts to guide paediatric pharmacotherapy via model-based decision support necessitates a coordinated and systematic approach to ensuring reliable and robust output to caregivers that represents the current standard of care and adheres to governance imposed by the host institution or coalition responsible. Model-based systems which guide caregivers on dosing paediatric patients in a more comprehensive manner are in development at several institutions. Care must be taken that these systems provide robust guidance with the current best practice. These systems must evolve as new information becomes available and ultimately are best constructed from diverse data representing global input on demographics, ethnic / racial diversity, diet and other lifestyle factors. Multidisciplinary involvement at the project team level is key to the ultimate clinical valuation. Likewise, early engagement of clinical champions is also critical for the success of model-based tools. Adherence to regulatory requirements as well as best practices with respect to software development and testing are essential if these tools are to be used as part of the routine standard of care.

  12. Paediatric models in motion: requirements for model-based decision support at the bedside.

    PubMed

    Barrett, Jeffrey S

    2015-01-01

    Optimal paediatric pharmacotherapy is reliant on a detailed understanding of the individual patient including their developmental status and disease state as well as the pharmaceutical agents he/she is receiving for treatment or management of side effects. Our appreciation for size and maturation effects on the pharmacokinetic/pharmacodynamic (PK/PD) phenomenon has improved to the point that we can develop predictive models that permit us to individualize therapy, especially in the situation where we are monitoring drug effects or therapeutic concentrations. The growth of efforts to guide paediatric pharmacotherapy via model-based decision support necessitates a coordinated and systematic approach to ensuring reliable and robust output to caregivers that represents the current standard of care and adheres to governance imposed by the host institution or coalition responsible. Model-based systems which guide caregivers on dosing paediatric patients in a more comprehensive manner are in development at several institutions. Care must be taken that these systems provide robust guidance with the current best practice. These systems must evolve as new information becomes available and ultimately are best constructed from diverse data representing global input on demographics, ethnic / racial diversity, diet and other lifestyle factors. Multidisciplinary involvement at the project team level is key to the ultimate clinical valuation. Likewise, early engagement of clinical champions is also critical for the success of model-based tools. Adherence to regulatory requirements as well as best practices with respect to software development and testing are essential if these tools are to be used as part of the routine standard of care. PMID:24251868

  13. Paediatric models in motion: requirements for model-based decision support at the bedside

    PubMed Central

    Barrett, Jeffrey S

    2015-01-01

    Optimal paediatric pharmacotherapy is reliant on a detailed understanding of the individual patient including their developmental status and disease state as well as the pharmaceutical agents he/she is receiving for treatment or management of side effects. Our appreciation for size and maturation effects on the pharmacokinetic/pharmacodynamic (PK/PD) phenomenon has improved to the point that we can develop predictive models that permit us to individualize therapy, especially in the situation where we are monitoring drug effects or therapeutic concentrations. The growth of efforts to guide paediatric pharmacotherapy via model-based decision support necessitates a coordinated and systematic approach to ensuring reliable and robust output to caregivers that represents the current standard of care and adheres to governance imposed by the host institution or coalition responsible. Model-based systems which guide caregivers on dosing paediatric patients in a more comprehensive manner are in development at several institutions. Care must be taken that these systems provide robust guidance with the current best practice. These systems must evolve as new information becomes available and ultimately are best constructed from diverse data representing global input on demographics, ethnic / racial diversity, diet and other lifestyle factors. Multidisciplinary involvement at the project team level is key to the ultimate clinical valuation. Likewise, early engagement of clinical champions is also critical for the success of model-based tools. Adherence to regulatory requirements as well as best practices with respect to software development and testing are essential if these tools are to be used as part of the routine standard of care. PMID:24251868

  14. [Nuclear transfer and therapeutic cloning].

    PubMed

    Xu, Xiao-Ming; Lei, An-Min; Hua, Jin-Lian; Dou, Zhong-Ying

    2005-03-01

    Nuclear transfer and therapeutic cloning have widespread and attractive prospects in animal agriculture and biomedical applications. We review evidence that oocyte quality and incomplete nuclear reprogramming of somatic donor cells are the main causes of the abnormalities commonly seen in cloned animals and of the low efficiency of cloning, and we outline the problems facing therapeutic cloning and possible ways forward, noting that unresolved basic problems in nuclear transfer still limit its clinical application. Work on the isolation and culture of nuclear transfer embryonic stem (ntES) cells, and on the directed differentiation of ntES cells into important functional cell types, should be emphasized and could improve efficiency. Adult stem cells may help to treat some major diseases, but they cannot replace therapeutic cloning. Ethical concerns have also impeded the development of therapeutic cloning. Many techniques must be improved and basic research strengthened before somatic cell nuclear transfer and therapeutic cloning can be applied to agricultural reproduction and bring greater benefit to human life.

  15. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  16. Verification and Validation of Model-Based Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2001-01-01

    This paper presents a three year project (FY99 to FY01) on the verification and validation of model based autonomous systems. The topics include: 1) Project Profile; 2) Model-Based Autonomy; 3) The Livingstone MIR; 4) MPL2SMV; 5) Livingstone to SMV Translation; 6) Symbolic Model Checking; 7) From Livingstone Models to SMV Models; 8) Application In-Situ Propellant Production; 9) Closed-Loop Verification Principle; 10) Livingstone PathFinder (LPF); 11) Publications and Presentations; and 12) Future Directions. This paper is presented in viewgraph form.

  17. Radiation camera motion correction system

    DOEpatents

    Hoffer, P.B.

    1973-12-18

    The device determines the ratio of the intensity of radiation received by a radiation camera from two separate portions of the object. A correction signal is developed to maintain this ratio at a substantially constant value and this correction signal is combined with the camera signal to correct for object motion. (Official Gazette)

  18. Political Correctness and Cultural Studies.

    ERIC Educational Resources Information Center

    Carey, James W.

    1992-01-01

    Discusses political correctness and cultural studies, dealing with cultural studies and the left, the conservative assault on cultural studies, and political correctness in the university. Describes some of the underlying changes in the university, largely unaddressed in the political correctness debate, that provide the deep structure to the…

  19. Job Satisfaction in Correctional Officers.

    ERIC Educational Resources Information Center

    Diehl, Ron J.

    For more than a decade, correctional leaders throughout the country have attempted to come to grips with the basic issues involved in ascertaining and meeting the needs of correctional institutions. This study investigated job satisfaction in 122 correctional officers employed in both rural and urban prison locations for the State of Kansas…

  20. Yearbook of Correctional Education 1989.

    ERIC Educational Resources Information Center

    Duguid, Stephen, Ed.

    This yearbook contains conference papers, commissioned papers, reprints of earlier works, and research-in-progress. They offer a retrospective view as well as address the mission and perspective of correctional education, its international dimension, correctional education in action, and current research. Papers include "Correctional Education and…

  1. Model-Based Radiation Dose Correction for Yttrium-90 Microsphere Treatment of Liver Tumors With Central Necrosis

    SciTech Connect

    Liu, Ching-Sheng; Lin, Ko-Han; Lee, Rheun-Chuan; Tseng, Hsiou-Shan; Wang, Ling-Wei; Huang, Pin-I; Chao, Liung-Sheau; Chang, Cheng-Yen; Yen, Sang-Hue; Tung, Chuan-Jong; Wang, Syh-Jen; Oliver Wong, Ching-yee

    2011-11-01

    Purpose: The objectives of this study were to model and calculate the absorbed fraction φ of energy emitted from yttrium-90 (⁹⁰Y) microsphere treatment of necrotic liver tumors. Methods and Materials: The tumor necrosis model was proposed for the calculation of φ over the spherical shell region. Two approaches, the semianalytic method and the probabilistic method, were adopted. In the former method, the range-energy relationship and the sampling of electron paths were applied to calculate the energy deposition within the target region, using the straight-ahead and continuous-slowing-down approximation (CSDA) method. In the latter method, the Monte Carlo PENELOPE code was used to verify results from the first method. Results: The fraction of energy φ absorbed from ⁹⁰Y by a 1-cm thickness of tumor shell from the microsphere distribution, calculated by CSDA with the complete beta spectrum, was 0.832 ± 0.001 and 0.833 ± 0.001 for smaller (r_T = 5 cm) and larger (r_T = 10 cm) tumors, respectively, where r_T and r_N denote the radii of the tumor and the necrosis. The absorbed fraction depended mainly on the thickness of the tumor necrosis configuration, rather than on tumor necrosis size. The maximal absorbed fraction φ, which occurred in tumors without central necrosis, differed with tumor size: 0.950 ± 0.000 and 0.975 ± 0.000 for smaller (r_T = 5 cm) and larger (r_T = 10 cm) tumors, respectively (p < 0.0001). Conclusions: The tumor necrosis model was developed for dose calculation in ⁹⁰Y microsphere treatment of hepatic tumors with central necrosis. With this model, important information is provided regarding the absorbed fraction applicable to clinical ⁹⁰Y microsphere treatment.
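
    The following toy Monte Carlo sketch is only loosely inspired by the record's geometry and is not the CSDA or PENELOPE calculation: it assumes straight electron tracks of a fixed range with uniform energy deposition along the track, samples emission points uniformly in the tumor shell, and estimates the fraction of track length (and hence, under that assumption, of emitted energy) that stays inside the shell. All numbers are illustrative.

      import numpy as np

      rng = np.random.default_rng(1)
      r_necrosis, r_tumor, track_range_cm = 4.0, 5.0, 0.4   # hypothetical geometry, cm
      n_emissions, n_track_samples = 20000, 50

      # Emission points uniform in the shell r_necrosis < r < r_tumor (inverse CDF in r**3)
      u = rng.random(n_emissions)
      r = (r_necrosis**3 + u * (r_tumor**3 - r_necrosis**3)) ** (1.0 / 3.0)
      pos_dirs = rng.standard_normal((n_emissions, 3))
      pos_dirs /= np.linalg.norm(pos_dirs, axis=1, keepdims=True)
      origins = r[:, None] * pos_dirs

      # Isotropic emission directions; sample points along each straight track
      emit_dirs = rng.standard_normal((n_emissions, 3))
      emit_dirs /= np.linalg.norm(emit_dirs, axis=1, keepdims=True)
      steps = (np.arange(n_track_samples) + 0.5) / n_track_samples * track_range_cm
      points = origins[:, None, :] + steps[None, :, None] * emit_dirs[:, None, :]
      radii = np.linalg.norm(points, axis=2)
      inside_shell = (radii >= r_necrosis) & (radii <= r_tumor)
      print("toy absorbed fraction in the shell:", inside_shell.mean())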

  2. Therapeutic cloning in the mouse.

    PubMed

    Mombaerts, Peter

    2003-09-30

    Nuclear transfer technology can be applied to produce autologous differentiated cells for therapeutic purposes, a concept termed therapeutic cloning. Countless articles have been published on the ethics and politics of human therapeutic cloning, reflecting the high expectations from this new opportunity for rejuvenation of the aging or diseased body. Yet the research literature on therapeutic cloning, strictly speaking, is comprised of only four articles, all in the mouse. The efficiency of derivation of embryonic stem cell lines via nuclear transfer is remarkably consistent among these reports. However, the efficiency is so low that, in its present form, the concept is unlikely to become widespread in clinical practice.

  3. Therapeutic cloning: The ethical limits

    SciTech Connect

    Whittaker, Peter A. . E-mail: p.whittaker@lancaster.ac.uk

    2005-09-01

    A brief outline of stem cells, stem cell therapy and therapeutic cloning is given. The position of therapeutic cloning with regard to other embryonic manipulations - IVF-based reproduction, embryonic stem cell formation from IVF embryos and reproductive cloning - is indicated. The main ethically challenging stages in therapeutic cloning are considered to be the nuclear transfer process, including the source of eggs for this, and the destruction of an embryo to provide stem cells for therapeutic use. The extremely polarised nature of the debate regarding the status of an early human embryo is noted, and some potential alternative strategies for preparing immunocompatible pluripotent stem cells are indicated.

  4. Clinical applications of therapeutic phlebotomy

    PubMed Central

    Kim, Kyung Hee; Oh, Ki Young

    2016-01-01

    Phlebotomy is the removal of blood from the body, and therapeutic phlebotomy is the preferred treatment for blood disorders in which the removal of red blood cells or serum iron is the most efficient method for managing the symptoms and complications. Therapeutic phlebotomy is currently indicated for the treatment of hemochromatosis, polycythemia vera, porphyria cutanea tarda, sickle cell disease, and nonalcoholic fatty liver disease with hyperferritinemia. This review discusses therapeutic phlebotomy and the related disorders and also offers guidelines for establishing a therapeutic phlebotomy program. PMID:27486346

  5. EDITORIAL: Politically correct physics?

    NASA Astrophysics Data System (ADS)

    Pople Deputy Editor, Stephen

    1997-03-01

    If you were a caring, thinking, liberally minded person in the 1960s, you marched against the bomb, against the Vietnam war, and for civil rights. By the 1980s, your voice was raised about the destruction of the rainforests and the threat to our whole planetary environment. At the same time, you opposed discrimination against any group because of race, sex or sexual orientation. You reasoned that people who spoke or acted in a discriminatory manner should be discriminated against. In other words, you became politically correct. Despite its oft-quoted excesses, the political correctness movement sprang from well-founded concerns about injustices in our society. So, on balance, I am all for it. Or, at least, I was until it started to invade science. Biologists were the first to feel the impact. No longer could they refer to 'higher' and 'lower' orders, or 'primitive' forms of life. To the list of undesirable 'isms' - sexism, racism, ageism - had been added a new one: speciesism. Chemists remained immune to the PC invasion, but what else could you expect from a group of people so steeped in tradition that their principal unit, the mole, requires the use of the thoroughly unreconstructed gram? Now it is the turn of the physicists. This time, the offenders are not those who talk disparagingly about other people or animals, but those who refer to 'forms of energy' and 'heat'. Political correctness has evolved into physical correctness. I was always rather fond of the various forms of energy: potential, kinetic, chemical, electrical, sound and so on. My students might merge heat and internal energy into a single, fuzzy concept loosely associated with moving molecules. They might be a little confused at a whole new crop of energies - hydroelectric, solar, wind, geothermal and tidal - but they could tell me what devices turned chemical energy into electrical energy, even if they couldn't quite appreciate that turning tidal energy into geothermal energy wasn't part of the

  6. Temperature Corrected Bootstrap Algorithm

    NASA Technical Reports Server (NTRS)

    Comiso, Joey C.; Zwally, H. Jay

    1997-01-01

    A temperature corrected Bootstrap Algorithm has been developed using Nimbus-7 Scanning Multichannel Microwave Radiometer data in preparation for the upcoming AMSR instrument aboard ADEOS and EOS-PM. The procedure first calculates the effective surface emissivity using emissivities of ice and water at 6 GHz and a mixing formulation that utilizes ice concentrations derived using the current Bootstrap algorithm but using brightness temperatures from the 6 GHz and 37 GHz channels. These effective emissivities are then used to calculate surface ice temperatures, which in turn are used to convert the 18 GHz and 37 GHz brightness temperatures to emissivities. Ice concentrations are then derived using the same technique as with the Bootstrap algorithm but using emissivities instead of brightness temperatures. The results show significant improvement in areas where the ice temperature is expected to vary considerably, such as near the continental areas in the Antarctic, where the ice temperature is colder than average, and in marginal ice zones.
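
    A minimal sketch of the two conversion steps described above (the emissivity values, brightness temperatures, and concentration are invented placeholders, not the algorithm's tie points): an effective surface emissivity is formed as a concentration-weighted mix of ice and open-water emissivities, and brightness temperatures are converted to emissivities with the simple emission relation Tb = emissivity * T_surface.

      def effective_emissivity(ice_concentration, e_ice, e_water):
          """Concentration-weighted mixing of ice and open-water emissivities."""
          return ice_concentration * e_ice + (1.0 - ice_concentration) * e_water

      def brightness_temp_to_emissivity(tb_kelvin, surface_temp_kelvin):
          """Invert the simple emission model Tb = emissivity * T_surface."""
          return tb_kelvin / surface_temp_kelvin

      c_ice = 0.8                                        # concentration from a prior Bootstrap step
      e_eff_6ghz = effective_emissivity(c_ice, e_ice=0.92, e_water=0.55)
      t_surface = 230.0 / e_eff_6ghz                     # surface temperature from a 6 GHz Tb of 230 K
      e_37ghz = brightness_temp_to_emissivity(220.0, t_surface)
      print(e_eff_6ghz, t_surface, e_37ghz)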

  7. Electronic measurement correction devices

    SciTech Connect

    Mahns, R.R.

    1984-04-01

    The electronics semi-conductor revolution has touched every industry and home in the nation. The gas industry is no exception. Sophisticated gas measurement instrumentation has been with us for several decades now, but only in the last 10 years or so has it really begun to boom. First marketed were the flow computers dedicated to orifice meter measurement; but with steadily decreasing manufacturing costs, electronic instrumentation is now moving into the area of base volume, pressure and temperature correction previously handled almost solely by mechanical integrating instruments. This paper takes a brief look at some of the features of the newcomers on the market and how they stack up against the old standby mechanical base volume/pressure/temperature correctors.

  8. Phytonutrients as therapeutic agents.

    PubMed

    Gupta, Charu; Prakash, Dhan

    2014-09-01

    Nutrients present in various foods play an important role in maintaining the normal functions of the human body. The major nutrients present in foods include carbohydrates, proteins, lipids, vitamins, and minerals. Besides these, there are some bioactive food components known as "phytonutrients" that play an important role in human health. They have a tremendous impact on the health care system and may provide medical health benefits including the prevention and/or treatment of disease and various physiological disorders. Phytonutrients play a positive role by maintaining and modulating immune function to prevent specific diseases. Being natural products, they hold great promise in clinical therapy as they possess none of the side effects that are usually associated with chemotherapy or radiotherapy. They are also comparatively cheap and thus significantly reduce health care costs. Phytonutrients are the plant nutrients with specific biological activities that support human health. Some of the important bioactive phytonutrients include polyphenols, terpenoids, resveratrol, flavonoids, isoflavonoids, carotenoids, limonoids, glucosinolates, phytoestrogens, phytosterols, anthocyanins, ω-3 fatty acids, and probiotics. They exert specific pharmacological effects in human health, such as anti-microbial, anti-oxidant, anti-inflammatory, anti-allergic, anti-spasmodic, anti-cancer, anti-aging, hepatoprotective, hypolipidemic, neuroprotective, hypotensive, anti-diabetic, anti-osteoporosis, CNS-stimulant, analgesic, immuno-modulatory, and carminative effects, as well as protection from UVB-induced carcinogenesis. This mini-review attempts to summarize the major important types of phytonutrients and their role in promoting human health and as therapeutic agents, along with the current market trend and commercialization.

  9. Plasmids encoding therapeutic agents

    DOEpatents

    Keener, William K.

    2007-08-07

    Plasmids encoding anti-HIV and anti-anthrax therapeutic agents are disclosed. Plasmid pWKK-500 encodes a fusion protein containing DP178 as a targeting moiety, the ricin A chain, an HIV protease cleavable linker, and a truncated ricin B chain. N-terminal extensions of the fusion protein include the maltose binding protein and a Factor Xa protease site. C-terminal extensions include a hydrophobic linker, an L domain motif peptide, a KDEL ER retention signal, another Factor Xa protease site, an out-of-frame buforin II coding sequence, the lacZα peptide, and a polyhistidine tag. More than twenty derivatives of plasmid pWKK-500 are described. Plasmids pWKK-700 and pWKK-800 are similar to pWKK-500 wherein the DP178-encoding sequence is substituted by RANTES- and SDF-1-encoding sequences, respectively. Plasmid pWKK-900 is similar to pWKK-500 wherein the HIV protease cleavable linker is substituted by a lethal factor (LF) peptide-cleavable linker.

  10. Leech Therapeutic Applications

    PubMed Central

    Abdualkader, A. M.; Ghawi, A. M.; Alaama, M.; Awang, M.; Merzouk, A.

    2013-01-01

    Hematophagous animals including leeches have been known to possess biologically active compounds in their secretions, especially in their saliva. The blood-sucking annelids, leeches, have been used for therapeutic purposes since the beginning of civilization. Ancient Egyptian, Indian, Greek and Arab physicians used leeches for a wide range of diseases, from the conventional use for bleeding to systemic ailments such as skin diseases, nervous system abnormalities, urinary and reproductive system problems, inflammation, and dental problems. Recently, extensive research on leech saliva has unveiled the presence of a variety of bioactive peptides and proteins, including antithrombin agents (hirudin, bufrudin), antiplatelet agents (calin, saratin), factor Xa inhibitors (lefaxin), antibacterial peptides (theromacin, theromyzin) and others. Consequently, the leech has made a comeback as a new remedy for many chronic and life-threatening abnormalities, such as cardiovascular problems, cancer, metastasis, and infectious diseases. In the 20th century, leech therapy established itself in plastic and microsurgery as a protective tool against venous congestion and served to salvage replanted digits and flaps. Many clinics for plastic surgery all over the world have started to use leeches for cosmetic purposes. Despite the efficacious properties of leech therapy, its safety and complications are still controversial. PMID:24019559

  11. Therapeutic use of calcimimetics.

    PubMed

    Hebert, Steven C

    2006-01-01

    It has long been recognized that the secretion of PTH by chief cells in the parathyroid gland is regulated by extracellular ionized calcium. The molecular mechanism by which extracellular Ca2+ performs this feat was deduced by the cloning of the extracellular calcium-sensing receptor (CaSR) in 1993 in the laboratories of Brown and Hebert. The CaSR is a G protein-coupled cell surface receptor that belongs to family 3 of the GPCR superfamily. The CaSR senses the extracellular ionic activity of the divalent minerals Ca2+ and Mg2+ and translates this information, via a complex array of cellular signaling pathways, to modify cell and tissue function. Genetic studies have demonstrated that the activity of this receptor determines the steady-state plasma calcium concentration in humans by regulating key elements in the calcium homeostatic system. CaSR agonists (calcimimetics) and antagonists (calcilytics) have been identified and have provided both current and potential therapies for a variety of disorders. Calcimimetics can effectively reduce PTH secretion in all forms of hyperparathyroidism. They are likely to become a major therapy for secondary hyperparathyroidism associated with renal failure and for treatment of certain patients with primary hyperparathyroidism. On the therapeutic horizon are calcilytics that can transiently increase PTH and may prove useful in the treatment of osteoporosis. PMID:16409154

  12. Rethinking political correctness.

    PubMed

    Ely, Robin J; Meyerson, Debra E; Davidson, Martin N

    2006-09-01

    Legal and cultural changes over the past 40 years ushered unprecedented numbers of women and people of color into companies' professional ranks. Laws now protect these traditionally underrepresented groups from blatant forms of discrimination in hiring and promotion. Meanwhile, political correctness has reset the standards for civility and respect in people's day-to-day interactions. Despite this obvious progress, the authors' research has shown that political correctness is a double-edged sword. While it has helped many employees feel unlimited by their race, gender, or religion, the PC rule book can hinder people's ability to develop effective relationships across race, gender, and religious lines. Companies need to equip workers with skills--not rules--for building these relationships. The authors offer the following five principles for healthy resolution of the tensions that commonly arise over difference: Pause to short-circuit the emotion and reflect; connect with others, affirming the importance of relationships; question yourself to identify blind spots and discover what makes you defensive; get genuine support that helps you gain a broader perspective; and shift your mind-set from one that says, "You need to change," to one that asks, "What can I change?" When people treat their cultural differences--and related conflicts and tensions--as opportunities to gain a more accurate view of themselves, one another, and the situation, trust builds and relationships become stronger. Leaders should put aside the PC rule book and instead model and encourage risk taking in the service of building the organization's relational capacity. The benefits will reverberate through every dimension of the company's work.

  13. Duration Model-Based Post-processing for the Performance Improvement of a Keyword Spotting System

    NASA Astrophysics Data System (ADS)

    Lee, Min Ji; Yoon, Jae Sam; Oh, Yoo Rhee; Kim, Hong Kook; Choi, Song Ha; Kim, Ji Woon; Kim, Myeong Bo

    In this paper, we propose a post-processing method based on a duration model to improve the performance of a keyword spotting system. The proposed duration model-based post-processing method is performed after detecting a keyword. To detect the keyword, we first combine a keyword model, a non-keyword model, and a silence model. Using the information on the detected keyword, the proposed post-processing method is then applied to determine whether or not the correct keyword is detected. To this end, we generate the duration model using a Gaussian distribution in order to accommodate the different duration characteristics of each phoneme. Comparing the performance of the proposed method with those of conventional anti-keyword scoring methods, it is shown that the false acceptance and the false rejection rates are reduced.
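
    A small sketch of the verification idea (the phoneme set, duration statistics, and acceptance threshold are hypothetical): each phoneme of the detected keyword is scored under its Gaussian duration model, and detections whose average duration log-likelihood falls below a threshold are rejected as false alarms.

      import math

      # Hypothetical per-phoneme duration statistics (mean, standard deviation) in frames
      duration_model = {"k": (8.0, 2.0), "ae": (12.0, 3.0), "t": (7.0, 2.5)}

      def duration_log_likelihood(phoneme, frames):
          mean, std = duration_model[phoneme]
          return -0.5 * math.log(2 * math.pi * std**2) - (frames - mean) ** 2 / (2 * std**2)

      def accept_keyword(decoded_durations, threshold=-3.5):
          scores = [duration_log_likelihood(p, d) for p, d in decoded_durations]
          return sum(scores) / len(scores) > threshold

      print(accept_keyword([("k", 7), ("ae", 13), ("t", 8)]))   # plausible durations -> True
      print(accept_keyword([("k", 2), ("ae", 40), ("t", 1)]))   # implausible durations -> False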

  14. An Integrated Framework for Model-Based Distributed Diagnosis and Prognosis

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Daigle, Matthew J.; Roychoudhury, Indranil

    2012-01-01

    Diagnosis and prognosis are necessary tasks for system reconfiguration and fault-adaptive control in complex systems. Diagnosis consists of detection, isolation and identification of faults, while prognosis consists of prediction of the remaining useful life of systems. This paper presents a novel integrated framework for model-based distributed diagnosis and prognosis, where system decomposition is used to enable the diagnosis and prognosis tasks to be performed in a distributed way. We show how different submodels can be automatically constructed to solve the local diagnosis and prognosis problems. We illustrate our approach using a simulated four-wheeled rover for different fault scenarios. Our experiments show that our approach correctly performs distributed fault diagnosis and prognosis in an efficient and robust manner.

  15. Educational Value and Models-Based Practice in Physical Education

    ERIC Educational Resources Information Center

    Kirk, David

    2013-01-01

    A models-based approach has been advocated as a means of overcoming the serious limitations of the traditional approach to physical education. One of the difficulties with this approach is that physical educators have sought to use it to achieve diverse and sometimes competing educational benefits, and these wide-ranging aspirations are rarely if…

  16. Expediting model-based optoacoustic reconstructions with tomographic symmetries

    SciTech Connect

    Lutzweiler, Christian; Deán-Ben, Xosé Luís; Razansky, Daniel

    2014-01-15

    Purpose: Image quantification in optoacoustic tomography implies the use of accurate forward models of excitation, propagation, and detection of optoacoustic signals, while inversions with high spatial resolution usually involve very large matrices, leading to unreasonably long computation times. The development of fast and memory-efficient model-based approaches therefore represents an important challenge for advancing the quantitative and dynamic imaging capabilities of tomographic optoacoustic imaging. Methods: Herein, a method for simplification and acceleration of model-based inversions, relying on inherent symmetries present in common tomographic acquisition geometries, has been introduced. The method is showcased for the case of cylindrical symmetries by using polar image discretization of the time-domain optoacoustic forward model combined with efficient storage and inversion strategies. Results: The suggested methodology is shown to render fast and accurate model-based inversions in both numerical simulations and post mortem small animal experiments. In the case of a full-view detection scheme, the memory requirements are reduced by one order of magnitude while high-resolution reconstructions are achieved at video rate. Conclusions: By considering the rotational symmetry present in many tomographic optoacoustic imaging systems, the proposed methodology allows exploiting the advantages of model-based algorithms with feasible computational requirements and fast reconstruction times, so that its convenience and general applicability in optoacoustic imaging systems with tomographic symmetries is anticipated.

  17. Model-based choices involve prospective neural activity

    PubMed Central

    Doll, Bradley B.; Duncan, Katherine D.; Simon, Dylan A.; Shohamy, Daphna; Daw, Nathaniel D.

    2015-01-01

    Decisions may arise via “model-free” repetition of previously reinforced actions, or by “model-based” evaluation, which is widely thought to follow from prospective anticipation of action consequences using a learned map or model. While choices and neural correlates of decision variables sometimes reflect knowledge of their consequences, it remains unclear whether this actually arises from prospective evaluation. Using functional MRI and a sequential reward-learning task in which paths contained decodable object categories, we found that humans’ model-based choices were associated with neural signatures of future paths observed at decision time, suggesting a prospective mechanism for choice. Prospection also covaried with the degree of model-based influences on neural correlates of decision variables, and was inversely related to prediction error signals thought to underlie model-free learning. These results dissociate separate mechanisms underlying model-based and model-free evaluation and support the hypothesis that model-based influences on choices and neural decision variables result from prospection. PMID:25799041

  18. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  19. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    SciTech Connect

    Noonan, Nicholas James

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  20. Data Entry Errors and Design for Model-Based Tight Glycemic Control in Critical Care

    PubMed Central

    Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey

    2012-01-01

    Introduction Tight glycemic control (TGC) has shown benefits but has been difficult to achieve consistently. Model-based methods and computerized protocols offer the opportunity to improve TGC quality but require human data entry, particularly of blood glucose (BG) values, which can be significantly prone to error. This study presents the design and optimization of data entry methods to minimize error for a computerized and model-based TGC method prior to pilot clinical trials. Method To minimize data entry error, two tests were carried out to optimize a method with errors less than the 5%-plus reported in other studies. Four initial methods were tested on 40 subjects in random order, and the best two were tested more rigorously on 34 subjects. The tests measured entry speed and accuracy. Errors were reported as corrected and uncorrected errors, with the sum comprising a total error rate. The first set of tests used randomly selected values, while the second set used the same values for all subjects to allow comparisons across users and direct assessment of the magnitude of errors. These research tests were approved by the University of Canterbury Ethics Committee. Results The final data entry method tested reduced errors to less than 1–2%, a 60–80% reduction from reported values. The magnitude of errors was clinically significant, typically off by 10.0 mmol/liter or an order of magnitude, but only for extreme values of BG < 2.0 mmol/liter or BG > 15.0–20.0 mmol/liter, both of which could be easily corrected with automated checking of extreme values for safety. Conclusions The data entry method selected significantly reduced data entry errors in the limited design tests presented, and is in use on a clinical pilot TGC study. The overall approach and testing methods are easily performed and generalizable to other applications and protocols. PMID:22401331
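
    A minimal sketch of the automated extreme-value check suggested in the conclusions (the thresholds echo the values quoted in the abstract, while the function name and the confirmation behaviour are invented for illustration):

      def enter_blood_glucose(raw_value, low=2.0, high=15.0):
          """Parse an entered BG value and flag implausible readings for confirmation."""
          value = float(raw_value)
          if not (low <= value <= high):
              # An order-of-magnitude slip (e.g. 55 entered instead of 5.5) lands here
              raise ValueError(
                  f"BG {value} mmol/L is outside {low}-{high} mmol/L; please re-enter or confirm"
              )
          return value

      print(enter_blood_glucose("5.5"))    # accepted
      # enter_blood_glucose("55")          # would raise and request confirmation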

  1. Model-based coding of facial images based on facial muscle motion through isodensity maps

    NASA Astrophysics Data System (ADS)

    So, Ikken; Nakamura, Osamu; Minami, Toshi

    1991-11-01

    A model-based coding system has come under serious consideration for the next generation of image coding schemes, aimed at greater efficiency in TV telephone and TV conference systems. In this model-based coding system, the sender's model image is transmitted and stored at the receiving side before the start of the conversation. During the conversation, feature points are extracted from the facial image of the sender and are transmitted to the receiver. The facial expression of the sender is reconstructed from the received feature points and a wireframe model constructed at the receiving side. However, the conventional methods have the following problems: (1) Extreme changes of the gray level, such as in wrinkles caused by change of expression, cannot be reconstructed at the receiving side. (2) Extraction of stable feature points from facial images with irregular features such as spectacles or facial hair is very difficult. To cope with the first problem, a new algorithm based on isodensity lines, which can represent detailed changes in expression by density correction, has already been proposed, with good results. As for the second problem, we propose in this paper a new algorithm to reconstruct facial images by transmitting other feature points extracted from isodensity maps.

  2. GOLD: Integration of model-based control systems with artificial intelligence and workstations

    SciTech Connect

    Lee, M.; Clearwater, S.

    1987-08-01

    Our experience with model based accelerator control started at SPEAR. Since that time nearly all accelerator beam lines have been controlled using model-based application programs, for example, PEP and SLC at SLAC. In order to take advantage of state-of-the-art hardware and software technology, the design and implementation of the accelerator control programs have undergone radical change with time. Consequently, SPEAR, PEP, and SLC all use different control programs. Since many of these application programs are imbedded deep into the control system, they had to be rewritten each time. Each time this rewriting has occurred a great deal of time and effort has been spent on training physicists and programmers to do the job. Now, we have developed these application programs for a fourth time. This time, however, the programs we are developing are generic so that we will not have to do it again. We have developed an integrated system called GOLD (Generic Orbit and Lattice Debugger) for debugging and correcting trajectory errors in accelerator lattices. The system consists of a lattice modeling program (COMFORT), a beam simulator (PLUS), a graphical workstation environment (micro-VAX) and an expert system (ABLE). This paper will describe some of the features and applications of our integrated system with emphasis on the automation offered by expert systems. 5 refs.

  3. GOLD: Integration of model-based control systems with artificial intelligence and workstations

    SciTech Connect

    Lee, M.; Clearwater, S.

    1987-08-01

    Our experience with model-based accelerator control started at SPEAR. Since that time nearly all accelerator beamlines have been controlled using model-based application programs, for example, PEP and SLC at SLAC. In order to take advantage of state-of-the-art hardware and software technology, the design and implementation of the accelerator control programs have undergone radical changes with time. Consequently, SPEAR, PEP and SLC all use different control programs. Since many of these application programs are embedded deep into the control system, they had to be rewritten each time. Each time this rewriting has occurred a great deal of time and effort has been spent on training physicists and programmers to do the job. Now, we have developed an integrated system called GOLD (Generic Orbit and Lattice Debugger) for debugging and correcting trajectory errors in accelerator lattices. The system consists of a lattice modeling program (COMFORT), a beam simulator (PLUS), a graphical workstation environment (micro-VAX) and an expert system (ABLE). This paper will describe some of the features and applications of our integrated system with emphasis on the automation offered by expert systems. 5 refs.
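
    A generic sketch of model-based trajectory correction of the kind GOLD automates (a plain response-matrix least-squares step with invented numbers, not the COMFORT/PLUS/ABLE implementation): the lattice model supplies an orbit response matrix, and corrector settings are chosen to cancel the measured orbit error.

      import numpy as np

      rng = np.random.default_rng(2)
      n_bpms, n_correctors = 12, 4
      R = rng.standard_normal((n_bpms, n_correctors))       # model orbit response matrix [mm/mrad]
      true_kick_error = np.array([0.3, -0.1, 0.0, 0.2])     # hidden error sources [mrad]
      measured_orbit = R @ true_kick_error                  # orbit readings at the BPMs [mm]

      # Least-squares corrector settings that cancel the measured orbit
      corrector_settings, *_ = np.linalg.lstsq(R, -measured_orbit, rcond=None)
      residual_orbit = measured_orbit + R @ corrector_settings
      print("rms orbit before/after:", measured_orbit.std(), residual_orbit.std())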

  4. Adaptive model-based control systems and methods for controlling a gas turbine

    NASA Technical Reports Server (NTRS)

    Brunell, Brent Jerome (Inventor); Mathews, Jr., Harry Kirk (Inventor); Kumar, Aditya (Inventor)

    2004-01-01

    Adaptive model-based control systems and methods are described so that performance and/or operability of a gas turbine in an aircraft engine, power plant, marine propulsion, or industrial application can be optimized under normal, deteriorated, faulted, failed and/or damaged operation. First, a model of each relevant system or component is created, and the model is adapted to the engine. Then, if/when deterioration, a fault, a failure or some kind of damage to an engine component or system is detected, that information is input to the model-based control as changes to the model, constraints, objective function, or other control parameters. With all the information about the engine condition and state, together with directives on the control goals in terms of an objective function and constraints, the control then solves an optimization so that the optimal control action can be determined and taken. This model and control may be updated in real-time to account for engine-to-engine variation, deterioration, damage, faults and/or failures using optimal corrective control action command(s).
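
    A hedged sketch of the control step described in this record (the component models, limits, and numbers are invented stand-ins, not the patented controller): the adapted model predicts engine behaviour, and the control input is chosen by solving a small constrained optimization against the objective and constraints.

      import numpy as np
      from scipy.optimize import minimize

      health = 0.9                                      # model adaptation: 1.0 = nominal, < 1.0 = deteriorated

      def thrust_model(fuel_flow):
          return health * 50.0 * np.sqrt(fuel_flow)     # toy thrust response

      def temperature_model(fuel_flow):
          return 600.0 + 180.0 * fuel_flow              # toy turbine temperature

      demanded_thrust = 120.0

      def objective(u):
          return (thrust_model(u[0]) - demanded_thrust) ** 2

      constraints = [{"type": "ineq", "fun": lambda u: 1100.0 - temperature_model(u[0])}]

      result = minimize(objective, x0=[1.0], bounds=[(0.1, 5.0)], constraints=constraints)
      print("fuel flow:", result.x[0], "achieved thrust:", thrust_model(result.x[0]))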

  5. Model-based Roentgen stereophotogrammetry of orthopaedic implants.

    PubMed

    Valstar, E R; de Jong, F W; Vrooman, H A; Rozing, P M; Reiber, J H

    2001-06-01

    Attaching tantalum markers to prostheses for Roentgen stereophotogrammetry (RSA) may be difficult and is sometimes even impossible. In this study, a model-based RSA method that avoids the attachment of markers to prostheses is presented and validated. This model-based RSA method uses a triangulated surface model of the implant. A projected contour of this model is calculated and this calculated model contour is matched onto the detected contour of the actual implant in the RSA radiograph. The difference between the two contours is minimized by variation of the position and orientation of the model. When a minimal difference between the contours is found, an optimal position and orientation of the model has been obtained. The method was validated by means of a phantom experiment. Three prosthesis components were used in this experiment: the femoral and tibial components of an Interax total knee prosthesis (Stryker Howmedica Osteonics Corp., Rutherford, USA) and the femoral component of a Profix total knee prosthesis (Smith & Nephew, Memphis, USA). For the prosthesis components used in this study, the accuracy of the model-based method is lower than the accuracy of traditional RSA. For the Interax femoral and tibial components, significant dimensional tolerances were found that were probably caused by the casting process and manual polishing of the component surfaces. The largest standard deviation for any translation was 0.19 mm and for any rotation it was 0.52 degrees. For the Profix femoral component, which had no large dimensional tolerances, the largest standard deviation for any translation was 0.22 mm and for any rotation it was 0.22 degrees. From this study we may conclude that the accuracy of the current model-based RSA method is sensitive to dimensional tolerances of the implant. Research is now being conducted to make model-based RSA less sensitive to dimensional tolerances and thereby improve its accuracy. PMID:11470108
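
    A simplified two-dimensional sketch of the contour-matching step (real model-based RSA projects a triangulated 3-D surface model; here a planar elliptical contour and a rigid 2-D pose stand in for it, and all values are invented): the pose is varied until the transformed model contour best overlays the detected contour.

      import numpy as np
      from scipy.optimize import minimize

      theta = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
      model_contour = np.stack([30.0 * np.cos(theta), 20.0 * np.sin(theta)], axis=1)

      def transform(contour, pose):
          tx, ty, rot = pose
          c, s = np.cos(rot), np.sin(rot)
          return contour @ np.array([[c, -s], [s, c]]).T + np.array([tx, ty])

      # "Detected" contour: in practice extracted from the radiograph; here simulated
      true_pose = np.array([4.0, -2.0, 0.1])               # tx [mm], ty [mm], rotation [rad]
      detected_contour = transform(model_contour, true_pose)

      def contour_mismatch(pose):
          moved = transform(model_contour, pose)
          d = np.linalg.norm(moved[:, None, :] - detected_contour[None, :, :], axis=2)
          return d.min(axis=1).mean()                      # mean nearest-point distance

      fit = minimize(contour_mismatch, x0=np.zeros(3), method="Nelder-Mead")
      print("recovered pose:", fit.x)                      # should approach [4.0, -2.0, 0.1]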

  6. When Does Model-Based Control Pay Off?

    PubMed Central

    2016-01-01

    Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to “model-free” and “model-based” strategies in reinforcement learning. Model-free strategies are computationally cheap, but sometimes inaccurate, because action values can be accessed by inspecting a look-up table constructed through trial-and-error. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, do not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand. PMID:27564094
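
    The distinction drawn here can be made concrete with a toy Markov decision process: a model-based strategy plans with the known transition and reward model, while a model-free strategy fills in a look-up table from sampled experience. The sketch below is only an illustration of that contrast (states, rewards and learning parameters are arbitrary), not the task used in the paper.

    # Illustrative sketch: a tiny MDP solved two ways -- model-free Q-learning with a
    # look-up table vs. model-based value iteration that plans in a known causal model.
    import numpy as np

    rng = np.random.default_rng(0)
    n_states, n_actions, gamma = 3, 2, 0.9
    # Known model: P[s, a, s'] transition probabilities and R[s, a] expected rewards.
    P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))
    R = rng.uniform(0, 1, size=(n_states, n_actions))

    # Model-based strategy: value iteration using the model directly.
    Q_mb = np.zeros((n_states, n_actions))
    for _ in range(500):
        V = Q_mb.max(axis=1)
        Q_mb = R + gamma * P @ V

    # Model-free strategy: Q-learning from sampled experience, no access to P or R.
    Q_mf = np.zeros((n_states, n_actions))
    alpha, epsilon = 0.1, 0.1
    s = 0
    for _ in range(50_000):
        a = rng.integers(n_actions) if rng.random() < epsilon else int(Q_mf[s].argmax())
        s_next = rng.choice(n_states, p=P[s, a])
        Q_mf[s, a] += alpha * (R[s, a] + gamma * Q_mf[s_next].max() - Q_mf[s, a])
        s = s_next

    print("model-based Q:\n", np.round(Q_mb, 2))
    print("model-free  Q:\n", np.round(Q_mf, 2))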

  7. Therapeutics in Huntington's Disease.

    PubMed

    Killoran, Annie; Biglan, Kevin M

    2012-02-01

    OPINION STATEMENT: There is no specific treatment for Huntington's disease (HD). Its many symptoms of motor, psychiatric, and cognitive deterioration are managed with symptomatic relief, rehabilitation, and support. The only drug approved by the US Food and Drug Administration (FDA) for the treatment of HD is an antichoreic agent, tetrabenazine, but this drug is used sparingly because of uneasiness regarding its propensity to cause depression and suicidality in this population, which is already at risk for these complications. Neuroleptics are still first-line treatments for chorea accompanied by comorbid depression and/or behavioral or psychotic symptoms, as is often the case. Psychiatric features, which have a significant impact on a patient's professional and personal life, often become the major focus of management. In addition to neuroleptics, commonly used medications include antidepressants, mood stabilizers, anxiolytics, and psychostimulants. In contrast, few treatment options are available for cognitive impairment in HD; this remains an important and largely unmet therapeutic need. HD patients typically lack insight into their disease manifestations, failing to recognize their need for treatment, and possibly even arguing against it. Multipurpose medications are employed advantageously to simplify the medication regimen, so as to facilitate compliance and not overwhelm the patient. For example, haloperidol can be prescribed for a patient with chorea, agitation, and anorexia, rather than targeting each symptom with a different drug. This approach also limits the potential for adverse effects, which can be difficult to distinguish from the features of the disease itself. With HD's complexity, it is best managed with a multidisciplinary approach that includes a movement disorders specialist, a genetic counselor, a mental health professional, a physical therapist, and a social worker for support and coordination of services. As the disease progresses, there

  8. Therapeutic Devices for Epilepsy

    PubMed Central

    Fisher, Robert S.

    2011-01-01

    Therapeutic devices provide new options for treating drug-resistant epilepsy. These devices act by a variety of mechanisms to modulate neuronal activity. Only vagus nerve stimulation, which continues to develop new technology, is approved for use in the United States. Deep brain stimulation (DBS) of anterior thalamus for partial epilepsy recently was approved in Europe and several other countries. Responsive neurostimulation, which delivers stimuli to one or two seizure foci in response to a detected seizure, recently completed a successful multicenter trial. Several other trials of brain stimulation are in planning or underway. Transcranial magnetic stimulation (TMS) may provide a noninvasive method to stimulate cortex. Controlled studies of TMS are split on efficacy, which may depend on whether a seizure focus is near a possible region for stimulation. Seizure detection devices in the form of “shake” detectors via portable accelerometers can provide notification of an ongoing tonic-clonic seizure, or peace of mind in the absence of notification. Prediction of seizures from various aspects of EEG is in early stages. Prediction appears to be possible in a subpopulation of people with refractory seizures and a clinical trial of an implantable prediction device is underway. Cooling of neocortex or hippocampus can reversibly attenuate epileptiform EEG activity and seizures, but engineering problems remain in its implementation. Optogenetics is a new technique that can control excitability of specific populations of neurons with light. Inhibition of epileptiform activity has been demonstrated in hippocampal slices, but use in humans will require more work. In general, devices provide useful palliation for otherwise uncontrollable seizures, but with a different risk profile than with most drugs. Optimizing the place of devices in therapy for epilepsy will require further development and clinical experience. PMID:22367987

  9. [Our experience with the correction of large protruding ears].

    PubMed

    Ézrokhin, V M; Nikitin, A A; Bezdenezhnykh, D S; Givirovskaia, N E

    2012-01-01

    The objective of the present work was to improve the results of correction of large protruding auricles. The results of long-term observations and the authors' many years of experience provided the basis for the development of a method that makes it possible to simultaneously diminish the size of a protruding ear and correct an auricular defect. A detailed description of selected steps of the proposed approach is presented together with relevant illustrations. The follow-up analysis of the outcomes of the treatment gives evidence of its high therapeutic efficacy.

  10. The Evolution of Therapeutic Recreation.

    ERIC Educational Resources Information Center

    Riley, Bob; Skalko, Thomas K.

    1998-01-01

    Reviews elements that impact the delivery of therapeutic recreation services, emphasizing elements that are external to the discipline and influence practice and elements that are internal to the discipline and must be addressed if therapeutic recreation is to continue its evolution as a competitive health and human service discipline.…

  11. Therapeutic touch coming of age.

    PubMed

    Straneva, J A

    2000-04-01

    Therapeutic Touch, a meditative healing practice created by Dolores Krieger and Dora Kunz, is adapted from the "laying on of hands" for the purpose of helping or healing others. The history of the technique and its influence on the health care system are chronicled in an effort to establish the role Therapeutic Touch has played in transforming people's lives. PMID:12119623

  12. [Lithiasis and ectopic pelvic kidney. Therapeutic aspects].

    PubMed

    Aboutaieb, R; Rabii, R; el Moussaoui, A; Joual, A; Sarf, I; el Mrini, M; Benjelloun, S

    1996-01-01

    A kidney in an ectopic position is dysplastic and associated with other malformations. The development of a lithiasis under these conditions raises questions about the therapeutic options. We report on five cases of pelvic ectopic kidney with urinary lithiasis. Patients were aged from 16 to 42 years. The kidney was nonfunctional in two cases or had a normal appearance, measuring 10 to 12 cm, in the others. We performed total nephrectomy in two cases and pyelolithotomy in the other cases. The surgical approach was subperitoneal via an iliac route. A dismembered pyeloplasty was performed in addition in one case. All patients did well. Radiologic follow-up at 6 and 12 months showed no recurrence in a well-functioning kidney. Surgical lithotomy is advocated as a treatment for urinary lithiasis affecting an ectopic kidney. It is a straightforward procedure that also permits correction of other associated malformations.

  13. When not to trust therapeutic drug monitoring

    PubMed Central

    Westergreen-Thorne, Mathew; Lee, Sook Yan; Shah, Nilesh; Dodd, Alan

    2016-01-01

    Therapeutic drug monitoring (TDM) is the measurement of serum or plasma drug concentration to allow the individualization of dosing. We describe the case of a patient who was prescribed inappropriately large doses of vancomycin due to inaccurate TDM. Specifically, our laboratory reported progressively lower vancomycin concentrations despite dose increases. Eventually, when duplicate samples were sent to a different laboratory vancomycin concentrations were found to be in the toxic range. We hypothesize this was due to the patient generating immunoglobulin antibodies against her infection that interfered with the original TDM immunoassay. Immunogenic TDM interference has been known to rarely occur in patients with immune related comorbidities; however, if we are correct, this is a unique case as this patient did not have such a background. This case illustrates the importance of using clinical judgement when interpreting TDM as, in this case, substantial harm to the patient was likely only narrowly avoided. PMID:27606069

  14. When not to trust therapeutic drug monitoring

    PubMed Central

    Westergreen-Thorne, Mathew; Lee, Sook Yan; Shah, Nilesh; Dodd, Alan

    2016-01-01

    Therapeutic drug monitoring (TDM) is the measurement of serum or plasma drug concentration to allow the individualization of dosing. We describe the case of a patient who was prescribed inappropriately large doses of vancomycin due to inaccurate TDM. Specifically, our laboratory reported progressively lower vancomycin concentrations despite dose increases. Eventually, when duplicate samples were sent to a different laboratory vancomycin concentrations were found to be in the toxic range. We hypothesize this was due to the patient generating immunoglobulin antibodies against her infection that interfered with the original TDM immunoassay. Immunogenic TDM interference has been known to rarely occur in patients with immune related comorbidities; however, if we are correct, this is a unique case as this patient did not have such a background. This case illustrates the importance of using clinical judgement when interpreting TDM as, in this case, substantial harm to the patient was likely only narrowly avoided.

  15. When not to trust therapeutic drug monitoring.

    PubMed

    Westergreen-Thorne, Mathew; Lee, Sook Yan; Shah, Nilesh; Dodd, Alan

    2016-09-01

    Therapeutic drug monitoring (TDM) is the measurement of serum or plasma drug concentration to allow the individualization of dosing. We describe the case of a patient who was prescribed inappropriately large doses of vancomycin due to inaccurate TDM. Specifically, our laboratory reported progressively lower vancomycin concentrations despite dose increases. Eventually, when duplicate samples were sent to a different laboratory vancomycin concentrations were found to be in the toxic range. We hypothesize this was due to the patient generating immunoglobulin antibodies against her infection that interfered with the original TDM immunoassay. Immunogenic TDM interference has been known to rarely occur in patients with immune related comorbidities; however, if we are correct, this is a unique case as this patient did not have such a background. This case illustrates the importance of using clinical judgement when interpreting TDM as, in this case, substantial harm to the patient was likely only narrowly avoided. PMID:27606069

  16. Bacteriophage Procurement for Therapeutic Purposes.

    PubMed

    Weber-Dąbrowska, Beata; Jończyk-Matysiak, Ewa; Żaczek, Maciej; Łobocka, Małgorzata; Łusiak-Szelachowska, Marzanna; Górski, Andrzej

    2016-01-01

    Bacteriophages (phages), discovered 100 years ago, are able to infect and destroy only bacterial cells. In the current crisis of antibiotic efficacy, phage therapy is considered as a supplementary or even alternative therapeutic approach. Evolution of multidrug-resistant and pandrug-resistant bacterial strains poses a real threat, so it is extremely important to be able to isolate new phages for therapeutic purposes. Our phage laboratory and therapy center has extensive experience with phage isolation, characterization, and therapeutic application. In this article we present current progress in bacteriophage isolation and use for therapeutic purposes, our experience in this field and its practical implications for phage therapy. We attempt to summarize the state of the art: properties of phages, the methods for their isolation, criteria of phage selection for therapeutic purposes and limitations of their use. Perspectives for the use of genetically engineered phages to specifically target bacterial virulence-associated genes are also briefly presented. PMID:27570518

  17. Metrics for antibody therapeutics development.

    PubMed

    Reichert, Janice M

    2010-01-01

    A wide variety of full-size monoclonal antibodies (mAbs) and therapeutics derived from alternative antibody formats can be produced through genetic and biological engineering techniques. These molecules are now filling the preclinical and clinical pipelines of every major pharmaceutical company and many biotechnology firms. Metrics for the development of antibody therapeutics, including averages for the number of candidates entering clinical study and development phase lengths for mAbs approved in the United States, were derived from analysis of a dataset of over 600 therapeutic mAbs that entered clinical study sponsored, at least in part, by commercial firms. The results presented provide an overview of the field and context for the evaluation of on-going and prospective mAb development programs. The expansion of therapeutic antibody use through supplemental marketing approvals and the increase in the study of therapeutics derived from alternative antibody formats are discussed.

  18. Transdermal delivery of therapeutic agent

    NASA Technical Reports Server (NTRS)

    Kwiatkowski, Krzysztof C. (Inventor); Hayes, Ryan T. (Inventor); Magnuson, James W. (Inventor); Giletto, Anthony (Inventor)

    2008-01-01

    A device for the transdermal delivery of a therapeutic agent to a biological subject that includes a first electrode comprising a first array of electrically conductive microprojections for providing electrical communication through a skin portion of the subject to a second electrode comprising a second array of electrically conductive microprojections. Additionally, a reservoir for holding the therapeutic agent surrounding the first electrode and a pulse generator for providing an exponential decay pulse between the first and second electrodes may be provided. A method includes the steps of piercing a stratum corneum layer of skin with two arrays of conductive microprojections, encapsulating the therapeutic agent into biocompatible charged carriers, surrounding the conductive microprojections with the therapeutic agent, generating an exponential decay pulse between the two arrays of conductive microprojections to create a non-uniform electrical field and electrokinetically driving the therapeutic agent through the stratum corneum layer of skin.

  19. Bacteriophage Procurement for Therapeutic Purposes

    PubMed Central

    Weber-Dąbrowska, Beata; Jończyk-Matysiak, Ewa; Żaczek, Maciej; Łobocka, Małgorzata; Łusiak-Szelachowska, Marzanna; Górski, Andrzej

    2016-01-01

    Bacteriophages (phages), discovered 100 years ago, are able to infect and destroy only bacterial cells. In the current crisis of antibiotic efficacy, phage therapy is considered as a supplementary or even alternative therapeutic approach. Evolution of multidrug-resistant and pandrug-resistant bacterial strains poses a real threat, so it is extremely important to be able to isolate new phages for therapeutic purposes. Our phage laboratory and therapy center has extensive experience with phage isolation, characterization, and therapeutic application. In this article we present current progress in bacteriophage isolation and use for therapeutic purposes, our experience in this field and its practical implications for phage therapy. We attempt to summarize the state of the art: properties of phages, the methods for their isolation, criteria of phage selection for therapeutic purposes and limitations of their use. Perspectives for the use of genetically engineered phages to specifically target bacterial virulence-associated genes are also briefly presented. PMID:27570518

  20. Therapeutic cloning: promises and issues

    PubMed Central

    Kfoury, Charlotte

    2007-01-01

    Advances in biotechnology necessitate both an understanding of scientific principles and ethical implications to be clinically applicable in medicine. In this regard, therapeutic cloning offers significant potential in regenerative medicine by circumventing immunorejection, and in the cure of genetic disorders when used in conjunction with gene therapy. Therapeutic cloning in the context of cell replacement therapy holds a huge potential for de novo organogenesis and the permanent treatment of Parkinson’s disease, Duchenne muscular dystrophy, and diabetes mellitus as shown by in vivo studies. Scientific roadblocks impeding advancement in therapeutic cloning are tumorigenicity, epigenetic reprogramming, mitochondrial heteroplasmy, interspecies pathogen transfer, and low oocyte availability. Therapeutic cloning is also often tied to ethical considerations concerning the source, destruction and moral status of IVF embryos based on the argument of potential. Legislative and funding issues are also addressed. Future considerations would include a distinction between therapeutic and reproductive cloning in legislative formulations. PMID:18523539

  1. Thermodynamics of Error Correction

    NASA Astrophysics Data System (ADS)

    Sartori, Pablo; Pigolotti, Simone

    2015-10-01

    Information processing at the molecular scale is limited by thermal fluctuations. This can cause undesired consequences in copying information since thermal noise can lead to errors that can compromise the functionality of the copy. For example, a high error rate during DNA duplication can lead to cell death. Given the importance of accurate copying at the molecular scale, it is fundamental to understand its thermodynamic features. In this paper, we derive a universal expression for the copy error as a function of entropy production and work dissipated by the system during wrong incorporations. Its derivation is based on the second law of thermodynamics; hence, its validity is independent of the details of the molecular machinery, be it any polymerase or artificial copying device. Using this expression, we find that information can be copied in three different regimes. In two of them, work is dissipated to either increase or decrease the error. In the third regime, the protocol extracts work while correcting errors, reminiscent of a Maxwell demon. As a case study, we apply our framework to study a copy protocol assisted by kinetic proofreading, and show that it can operate in any of these three regimes. We finally show that, for any effective proofreading scheme, error reduction is limited by the chemical driving of the proofreading reaction.

  2. Chaperones as potential therapeutics for Krabbe disease.

    PubMed

    Graziano, Adriana Carol Eleonora; Pannuzzo, Giovanna; Avola, Rosanna; Cardile, Venera

    2016-11-01

    Krabbe's disease (KD) is an autosomal recessive, neurodegenerative disorder. It is classified among the lysosomal storage diseases (LSDs). It was first described early in the twentieth century, but the genetic defect in the galactocerebrosidase (GALC) gene was not discovered until the beginning of the 1970s, 20 years before the cloning of GALC. Recently, in 2011, the crystal structures of the GALC enzyme and the GALC-product complex were obtained. For this reason, compared with other LSDs, research on possible therapeutic interventions is much more recent. Thus, it is not surprising that some treatment options are still under preclinical investigation, whereas their relevance for other pathologies of the same group has already been tested in clinical studies. This is specifically the case for pharmacological chaperone therapy (PCT), a promising strategy for selectively correcting defective protein folding and trafficking and for enhancing enzyme activity by small molecules. These compounds bind directly to a partially folded biosynthetic intermediate, stabilize the protein, and allow completion of the folding process to yield a functional protein. Here, we review the chaperones that have demonstrated therapeutic potential during preclinical studies for KD, underscoring the need to invigorate research on PCT for KD, which will benefit from recent insights into the molecular understanding of GALC structure, drug design, and development in cellular models. © 2016 Wiley Periodicals, Inc. PMID:27638605

  3. Development of Novel Activin-Targeted Therapeutics

    PubMed Central

    Chen, Justin L; Walton, Kelly L; Al-Musawi, Sara L; Kelly, Emily K; Qian, Hongwei; La, Mylinh; Lu, Louis; Lovrecz, George; Ziemann, Mark; Lazarus, Ross; El-Osta, Assam; Gregorevic, Paul; Harrison, Craig A

    2015-01-01

    Soluble activin type II receptors (ActRIIA/ActRIIB), via binding to diverse TGF-β proteins, can increase muscle and bone mass, correct anemia or protect against diet-induced obesity. While exciting, these multiple actions of soluble ActRIIA/IIB limit their therapeutic potential and highlight the need for new reagents that target specific ActRIIA/IIB ligands. Here, we modified the activin A and activin B prodomains, regions required for mature growth factor synthesis, to generate specific activin antagonists. Initially, the prodomains were fused to the Fc region of mouse IgG2A antibody and, subsequently, “fastener” residues (Lys45, Tyr96, His97, and Ala98; activin A numbering) that confer latency to other TGF-β proteins were incorporated. For the activin A prodomain, these modifications generated a reagent that potently (IC50 5 nmol/l) and specifically inhibited activin A signaling in vitro, and activin A-induced muscle wasting in vivo. Interestingly, the modified activin B prodomain inhibited both activin A and B signaling in vitro (IC50 ~2 nmol/l) and in vivo, suggesting it could serve as a general activin antagonist. Importantly, unlike soluble ActRIIA/IIB, the modified prodomains did not inhibit myostatin or GDF-11 activity. To underscore the therapeutic utility of specifically antagonising activin signaling, we demonstrate that the modified activin prodomains promote significant increases in muscle mass. PMID:25399825

  4. Therapeutic approaches for spinal cord injury

    PubMed Central

    Cristante, Alexandre Fogaça; de Barros Filho, Tarcísio Eloy Pessoa; Marcon, Raphael Martus; Letaif, Olavo Biraghi; da Rocha, Ivan Dias

    2012-01-01

    This study reviews the literature concerning possible therapeutic approaches for spinal cord injury. Spinal cord injury is a disabling and irreversible condition that has high economic and social costs. There are both primary and secondary mechanisms of damage to the spinal cord. The primary lesion is the mechanical injury itself. The secondary lesion results from one or more biochemical and cellular processes that are triggered by the primary lesion. The frustration of health professionals in treating a severe spinal cord injury was described in 1700 BC in an Egyptian surgical papyrus that was translated by Edwin Smith; the papyrus reported spinal fractures as a “disease that should not be treated.” Over the last two decades, several studies have been performed to obtain more effective treatments for spinal cord injury. Most of these studies approach a patient with acute spinal cord injury in one of four manners: corrective surgery or a physical, biological or pharmacological treatment method. Science is unraveling the mechanisms of cell protection and neuroregeneration, but clinically, we only provide supportive care for patients with spinal cord injuries. By combining these treatments, researchers attempt to enhance the functional recovery of patients with spinal cord injuries. Advances in the last decade have allowed us to encourage the development of experimental studies in the field of spinal cord regeneration. The combination of several therapeutic strategies should, at minimum, allow for partial functional recoveries for these patients, which could improve their quality of life. PMID:23070351

  5. Clinical application of therapeutic erythrocytapheresis (TEA).

    PubMed

    Valbonesi, M; Bruni, R

    2000-06-01

    Therapeutic erythrocytapheresis (TEA) has been used in different diseases such as polycythemia vera (PV), secondary erythrocytosis or hemochromatosis as a less cumbersome but more expensive alternative to phlebotomy. TEA is preferred in emergency conditions such as thrombocytosis or in conditions such as porphyria cutanea tarda (PCT) or erythropoietic porphyria, when plasma exchange (PEX) is often combined with TEA to reduce extracellular levels of uroporphyrin, which contribute to plasma hyperviscosity. TEA is often combined with drug therapy that varies from etoposide in PV to EPO and desferoxamine, which are used to mobilize and reduce iron stores in hemochromatosis. Benefits from this combination may be longer lasting than expected. Nonetheless, there is no standard protocol for TEA, and clinical experience with this therapy remains highly anecdotal. Therapeutic red cell exchange (TREX) has been used with much interest over the years, starting with the management of hemolytic disease of the newborn and later used to correct severe anemia in thalassemia patients, thereby preventing iron overload. It has also been used for the management of complications of sickle cell disease such as priapism, chest syndrome, stroke, and retinal, bone, splenic and hepatic infarction, or in preparation for surgery by reducing HbS to less than 30%. Automated apheresis has also favored the use of TREX in conditions such as paroxysmal nocturnal hemoglobinuria; aniline, arsenic, sodium chlorate and CO poisoning; hemoglobinopathies; autoimmune hemolytic anemia; reactions due to ABO incompatibility; preparation for ABO-incompatible bone marrow transplantation; or prevention of anti-D immunization after the transfusion of D(+) cells to D(-) recipients. Another field of application has been in the emergency management of intraerythrocytic parasite infections such as malaria and babesiosis. Application of TREX may be wide but its real use remains limited

  6. Image Guidance in Stem Cell Therapeutics: Unfolding the Blindfold.

    PubMed

    Bukhari, Amirali B; Dutta, Shruti; De, Abhijit

    2015-01-01

    Stem cell therapeutics is the future of regenerative medicine in the modern world. Many studies have been instigated with the hope of translating the outcome for the treatment of several disease conditions ranging from heart and neuronal disease to malignancies as grave as cancers. Stem cell therapeutics undoubtedly holds great promise on the front of regenerative medicine; however, the correct distribution and homing of these stem cells to the host site remained blinded until recent advances in the discipline of molecular imaging. Herein, we discuss the various imaging guidance approaches applied to determine the proper delivery of the various types of stem cells used as therapeutics for various maladies. Additionally, we scrutinize the use of several indirect labeling mechanisms for efficient tagging of the reporter entity for image guidance. Further, the promise of improving patient healthcare has led to the initiation of several clinical trials worldwide. However, in a number of cases, the benefits come at a price heavy enough to pose a serious health risk, one example being the formation of teratomas. Thus numerous challenges and methodological obstacles must be overcome before their eloquent clinical impact can be realized. Therefore, we also discuss several clinical trials that have taken into consideration the various imaging-guided protocols to monitor correct delivery and understand the distribution of therapeutic stem cells in real time.

  7. Identifying Model-Based Reconfiguration Goals through Functional Deficiencies

    NASA Technical Reports Server (NTRS)

    Benazera, Emmanuel; Trave-Massuyes, Louise

    2004-01-01

    Model-based diagnosis is now advanced to the point where autonomous systems can face some uncertain and faulty situations with success. The next step toward more autonomy is to have the system recover itself after faults occur, a process known as model-based reconfiguration. After faults occur, given a prediction of the nominal behavior of the system and the result of the diagnosis operation, this paper details how to automatically determine the functional deficiencies of the system. These deficiencies are characterized in the case of uncertain state estimates. A methodology is then presented to determine the reconfiguration goals based on the deficiencies. Finally, a recovery process interleaves planning and model predictive control to restore the functionalities in prioritized order.

  8. Outlier Identification in Model-Based Cluster Analysis

    PubMed Central

    Evans, Katie; Love, Tanzy; Thurston, Sally W.

    2015-01-01

    In model-based clustering based on normal-mixture models, a few outlying observations can influence the cluster structure and number. This paper develops a method to identify these; however, it does not attempt to identify clusters amidst a large field of noisy observations. We identify outliers as those observations in a cluster with minimal membership proportion or for which the cluster-specific variance with and without the observation is very different. Results from a simulation study demonstrate the ability of our method to detect true outliers without falsely identifying many non-outliers and improved performance over other approaches, under most scenarios. We use the contributed R package MCLUST for model-based clustering, but propose a modified prior for the cluster-specific variance which avoids degeneracies in estimation procedures. We also compare results from our outlier method to published results on National Hockey League data. PMID:26806993
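
    The paper works in R with MCLUST; as a rough Python analogue, the sketch below fits a normal-mixture model and flags observations whose maximum membership proportion is low or whose density under the fitted mixture is extreme. The two-cluster data and thresholds are illustrative assumptions, not the authors' criteria.

    # Hedged analogue of the idea (the paper uses the R package MCLUST): fit a normal-mixture
    # model and flag observations with unusually low membership or extremely low density.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    cluster_a = rng.normal(loc=[0, 0], scale=0.5, size=(100, 2))
    cluster_b = rng.normal(loc=[4, 4], scale=0.5, size=(100, 2))
    outliers = np.array([[2.0, 2.0], [8.0, -2.0]])          # points that belong to neither cluster
    X = np.vstack([cluster_a, cluster_b, outliers])

    gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
    membership = gmm.predict_proba(X)                       # posterior membership proportions
    max_membership = membership.max(axis=1)
    log_density = gmm.score_samples(X)                      # log-likelihood under the fitted mixture

    flagged = np.where((max_membership < 0.95) | (log_density < np.quantile(log_density, 0.02)))[0]
    print("flagged indices:", flagged)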

  9. MTK: An AI tool for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  10. Model-based inversion for a shallow ocean application

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1994-03-01

    A model-based approach to invert or estimate the sound speed profile (SSP) from noisy pressure-field measurements is discussed. The resulting model-based processor (MBP) is based on the state-space representation of the normal-mode propagation model. Using data obtained from the well-known Hudson Canyon experiment, a noisy shallow water ocean environment, the processor is designed and the results compared to those predicted using various propagation models and data. It is shown that the MBP not only predicts the sound speed quite well, but also is able to simultaneously provide enhanced estimates of both modal and pressure-field measurements which are useful for localization and rapid ocean environmental characterization.

  11. Model Based Document and Report Generation for Systems Engineering

    NASA Technical Reports Server (NTRS)

    Delp, Christopher; Lam, Doris; Fosse, Elyse; Lee, Cin-Young

    2013-01-01

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  12. REAL-TIME MODEL-BASED ELECTRICAL POWERED WHEELCHAIR CONTROL

    PubMed Central

    Wang, Hongwu; Salatin, Benjamin; Grindle, Garrett G.; Ding, Dan; Cooper, Rory A.

    2009-01-01

    The purpose of this study was to evaluate the effects of three different control methods on driving speed variation and wheel-slip of an electric-powered wheelchair (EPW). A kinematic model as well as a 3-D dynamic model was developed to control the velocity and traction of the wheelchair. A smart wheelchair platform was designed and built with a computerized controller and encoders to record wheel speeds and to detect the slip. A model-based, a proportional-integral-derivative (PID) and an open-loop controller were applied with the EPW driving on four different surfaces at three specified speeds. The speed errors, variation, rise time, settling time and slip coefficient were calculated and compared for a speed step-response input. Experimental results showed that model-based control performed best on all surfaces across the speeds. PMID:19733494
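
    The step-response metrics reported here (speed error, rise time, settling time) are straightforward to compute from a simulated or recorded wheel-speed trace. The sketch below does so for a toy first-order wheel model under a PID controller; the plant, gains and thresholds are assumptions for illustration, not the study's models or hardware.

    # Minimal sketch (plant model and gains are assumptions): simulate a first-order wheel-speed
    # model under a PID controller and compute simple step-response metrics.
    import numpy as np

    def simulate_pid(kp, ki, kd, setpoint=1.0, tau=0.4, dt=0.001, t_end=5.0):
        """First-order plant dv/dt = (u - v)/tau driven by a PID speed controller."""
        n = int(t_end / dt)
        v, integral, prev_err = 0.0, 0.0, setpoint
        speeds = np.empty(n)
        for i in range(n):
            err = setpoint - v
            integral += err * dt
            derivative = (err - prev_err) / dt
            u = kp * err + ki * integral + kd * derivative
            v += dt * (u - v) / tau
            speeds[i] = v
            prev_err = err
        return np.arange(n) * dt, speeds

    def step_metrics(t, y, setpoint=1.0, band=0.02):
        rise = t[np.argmax(y >= 0.9 * setpoint)]                 # time of first 90% crossing
        outside = np.abs(y - setpoint) > band * setpoint
        settle = t[np.where(outside)[0].max()] if outside.any() else 0.0
        ss_error = abs(setpoint - y[-1])
        return rise, settle, ss_error

    t, y = simulate_pid(kp=2.0, ki=1.5, kd=0.01)
    print("rise time %.3f s, settling time %.3f s, steady-state error %.4f" % step_metrics(t, y))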

  13. 3-D model-based tracking for UAV indoor localization.

    PubMed

    Teulière, Céline; Marchand, Eric; Eck, Laurent

    2015-05-01

    This paper proposes a novel model-based tracking approach for 3-D localization. One main difficulty of the standard model-based approach lies in the presence of low-level ambiguities between different edges. In this paper, given a 3-D model of the edges of the environment, we derive a multiple hypotheses tracker which retrieves the potential poses of the camera from the observations in the image. We also show how these candidate poses can be integrated into a particle filtering framework to guide the particle set toward the peaks of the distribution. Motivated by the UAV indoor localization problem where GPS signal is not available, we validate the algorithm on real image sequences from UAV flights.
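
    The particle filtering framework referred to here follows the usual predict / weight / resample cycle. The sketch below shows that generic cycle on a simple 2-D pose-tracking toy problem; it is not the paper's edge-based multiple-hypothesis tracker, and the motion and noise parameters are made up.

    # Generic particle-filter sketch (illustration of the filtering framework, not the paper's
    # model-based edge tracker): track a 2-D pose from noisy observations.
    import numpy as np

    rng = np.random.default_rng(2)
    n_particles, n_steps = 500, 50
    process_noise, obs_noise = 0.05, 0.2

    true_pose = np.array([0.0, 0.0])
    particles = rng.normal(0.0, 1.0, size=(n_particles, 2))
    weights = np.full(n_particles, 1.0 / n_particles)

    for _ in range(n_steps):
        true_pose = true_pose + np.array([0.1, 0.05])              # constant-velocity ground truth
        observation = true_pose + rng.normal(0.0, obs_noise, 2)    # noisy measurement

        # Predict: propagate particles through the motion model plus process noise.
        particles += np.array([0.1, 0.05]) + rng.normal(0.0, process_noise, particles.shape)

        # Weight: Gaussian likelihood of the observation given each particle.
        sq_err = np.sum((particles - observation) ** 2, axis=1)
        weights = np.exp(-0.5 * sq_err / obs_noise ** 2)
        weights /= weights.sum()

        # Resample: draw particles proportional to their weights.
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
        weights.fill(1.0 / n_particles)

    print("true pose:", np.round(true_pose, 3), "estimate:", np.round(particles.mean(axis=0), 3))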

  14. 3-D model-based tracking for UAV indoor localization.

    PubMed

    Teulière, Céline; Marchand, Eric; Eck, Laurent

    2015-05-01

    This paper proposes a novel model-based tracking approach for 3-D localization. One main difficulty of the standard model-based approach lies in the presence of low-level ambiguities between different edges. In this paper, given a 3-D model of the edges of the environment, we derive a multiple hypotheses tracker which retrieves the potential poses of the camera from the observations in the image. We also show how these candidate poses can be integrated into a particle filtering framework to guide the particle set toward the peaks of the distribution. Motivated by the UAV indoor localization problem where GPS signal is not available, we validate the algorithm on real image sequences from UAV flights. PMID:25099967

  15. Model-based hierarchical reinforcement learning and human action control

    PubMed Central

    Botvinick, Matthew; Weinstein, Ari

    2014-01-01

    Recent work has reawakened interest in goal-directed or ‘model-based’ choice, where decisions are based on prospective evaluation of potential action outcomes. Concurrently, there has been growing attention to the role of hierarchy in decision-making and action control. We focus here on the intersection between these two areas of interest, considering the topic of hierarchical model-based control. To characterize this form of action control, we draw on the computational framework of hierarchical reinforcement learning, using this to interpret recent empirical findings. The resulting picture reveals how hierarchical model-based mechanisms might play a special and pivotal role in human decision-making, dramatically extending the scope and complexity of human behaviour. PMID:25267822

  16. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  17. Fuzzy model-based observers for fault detection in CSTR.

    PubMed

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

    Under the vast variety of fuzzy model-based observers reported in the literature, what would be the proper one to use for fault detection in a class of chemical reactors? In this study four fuzzy model-based observers for sensor fault detection of a Continuous Stirred Tank Reactor were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) an Utkin fuzzy observer. A negative, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions.
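
    For readers unfamiliar with the observer-based idea underlying all four designs, the sketch below shows the simplest (non-fuzzy) case: a Luenberger observer whose output residual is thresholded to flag a sensor fault. The system matrices, gain, thresholds and fault profile are illustrative assumptions, not the CSTR model from the study.

    # Minimal sketch of residual-based sensor fault detection with a (non-fuzzy) Luenberger
    # observer; all matrices and the fault profile are illustrative, not the paper's CSTR.
    import numpy as np

    A = np.array([[0.95, 0.05],
                  [0.00, 0.90]])          # discrete-time plant
    B = np.array([[0.0], [0.1]])
    C = np.array([[1.0, 0.0]])
    L = np.array([[0.4], [0.2]])          # observer gain, chosen so A - L @ C is stable

    x = np.array([[0.3], [0.0]])          # true plant state
    x_hat = np.zeros((2, 1))              # observer estimate
    threshold, settle_steps = 0.15, 50    # ignore the residual while the observer converges

    for k in range(200):
        u = np.array([[1.0]])
        fault = 0.5 if k >= 120 else 0.0               # additive sensor fault injected at k = 120
        y = C @ x + fault

        residual = (y - C @ x_hat).item()              # innovation used as the fault indicator
        if k >= settle_steps and abs(residual) > threshold:
            print(f"k={k}: |residual| = {abs(residual):.3f} -> sensor fault flagged")
            break

        x = A @ x + B @ u                              # plant update
        x_hat = A @ x_hat + B @ u + L * residual       # Luenberger observer update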

  18. Model-based resolution: applying the theory in quantitative microscopy.

    PubMed

    Santos, A; Young, I T

    2000-06-10

    Model-based image processing techniques have been proposed as a way to increase the resolution of optical microscopes. Here a model based on the microscope's point-spread function is analyzed, and the resolution limits achieved with a proposed goodness-of-fit criterion are quantified. Several experiments were performed to evaluate the possibilities and limitations of this method: (a) experiments with an ideal (diffraction-limited) microscope, (b) experiments with simulated dots and a real microscope, and (c) experiments with real dots acquired with a real microscope. The results show that a threefold increase over classical resolution (e.g., Rayleigh) is possible. These results can be affected by model misspecifications, whereas model corruption, as seen in the effect of Poisson noise, seems to be unimportant. This research can be considered to be preliminary with the final goal being the accurate measurement of various cytogenetic properties, such as gene distributions, in labeled preparations.
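
    The essence of the approach is to compare how well competing source models explain the measured intensity pattern. The sketch below illustrates this with a 1-D toy example: a one-spot and a two-spot fixed-width Gaussian PSF model are fitted to a noisy profile of two closely spaced points, and the residual sum of squares serves as a stand-in goodness-of-fit measure; the PSF, noise level and decision rule are assumptions, not the paper's criterion.

    # Illustrative sketch of model-based resolution: compare one-spot and two-spot PSF fits.
    import numpy as np
    from scipy.optimize import curve_fit

    sigma_psf = 1.0                      # PSF width in pixels (assumed known)
    x = np.linspace(-8, 8, 161)

    def one_spot(x, a, mu):
        return a * np.exp(-(x - mu) ** 2 / (2 * sigma_psf ** 2))

    def two_spots(x, a1, mu1, a2, mu2):
        return one_spot(x, a1, mu1) + one_spot(x, a2, mu2)

    rng = np.random.default_rng(3)
    separation = 1.2                     # two sources closer than the classical limit for this PSF
    truth = two_spots(x, 1.0, -separation / 2, 1.0, +separation / 2)
    data = truth + rng.normal(0.0, 0.02, x.size)

    p1, _ = curve_fit(one_spot, x, data, p0=[1.0, 0.0])
    p2, _ = curve_fit(two_spots, x, data, p0=[0.8, -1.0, 0.8, 1.0])

    rss1 = np.sum((data - one_spot(x, *p1)) ** 2)      # goodness-of-fit stand-ins
    rss2 = np.sum((data - two_spots(x, *p2)) ** 2)
    print(f"RSS one-spot model: {rss1:.4f}, two-spot model: {rss2:.4f}")
    print("two unresolved sources inferred" if rss2 < 0.5 * rss1 else "single source suffices")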

  19. Hybrid and adaptive meta-model-based global optimization

    NASA Astrophysics Data System (ADS)

    Gu, J.; Li, G. Y.; Dong, Z.

    2012-01-01

    As an efficient and robust technique for global optimization, meta-model-based search methods have been increasingly used in solving complex and computation intensive design optimization problems. In this work, a hybrid and adaptive meta-model-based global optimization method that can automatically select appropriate meta-modelling techniques during the search process to improve search efficiency is introduced. The search initially applies three representative meta-models concurrently. Progress towards a better performing model is then introduced by selecting sample data points adaptively according to the calculated values of the three meta-models to improve modelling accuracy and search efficiency. To demonstrate the superior performance of the new algorithm over existing search methods, the new method is tested using various benchmark global optimization problems and applied to a real industrial design optimization example involving vehicle crash simulation. The method is particularly suitable for design problems involving computation intensive, black-box analyses and simulations.
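
    A stripped-down version of meta-model-based search, with a single radial-basis-function surrogate instead of the paper's three concurrent meta-models and adaptive selection, is sketched below: fit the surrogate to the evaluated samples, let it nominate the next point, evaluate the expensive function there, and refit. The objective, bounds and candidate sampling are illustrative assumptions.

    # Simplified sketch of meta-model-based optimization with a single RBF surrogate.
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def expensive_objective(x):
        """Stand-in for a computation-intensive black-box simulation (Rastrigin)."""
        return np.sum(x ** 2 - 10.0 * np.cos(2 * np.pi * x) + 10.0, axis=-1)

    rng = np.random.default_rng(4)
    dim, n_init, n_iter = 2, 15, 30
    X = rng.uniform(-5, 5, size=(n_init, dim))          # initial sample of design points
    y = expensive_objective(X)

    for _ in range(n_iter):
        surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")
        candidates = rng.uniform(-5, 5, size=(2000, dim))      # cheap to score on the surrogate
        best = candidates[np.argmin(surrogate(candidates))]
        X = np.vstack([X, best])                               # adaptive infill point
        y = np.append(y, expensive_objective(best))

    print("best point found:", np.round(X[np.argmin(y)], 3), "value:", round(float(np.min(y)), 3))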

  20. Fuzzy model-based observers for fault detection in CSTR.

    PubMed

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

    Under the vast variety of fuzzy model-based observers reported in the literature, what would be the properone to be used for fault detection in a class of chemical reactor? In this study four fuzzy model-based observers for sensor fault detection of a Continuous Stirred Tank Reactor were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) an Utkin fuzzy observer. A negative, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions. PMID:26521723

  1. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has been known for a long time, evolving from Wald's work in the late 1940s, followed by Middleton's classic exposition in the 1960s, and coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
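
    The classical sequential test that this chapter builds on is Wald's sequential probability ratio test: accumulate the log-likelihood ratio sample by sample and stop as soon as it crosses an upper or lower threshold. A minimal sketch for a Gaussian mean-shift problem is given below; the signal model and error targets are illustrative, and the model-based (state-space) processors discussed in the chapter are not included.

    # Sketch of Wald's sequential probability ratio test for a Gaussian mean shift.
    import numpy as np

    rng = np.random.default_rng(5)
    mu0, mu1, sigma = 0.0, 0.5, 1.0           # H0: mean mu0, H1: mean mu1, known variance
    alpha, beta = 0.01, 0.01                  # target false-alarm and miss probabilities
    upper = np.log((1 - beta) / alpha)        # Wald's approximate stopping thresholds
    lower = np.log(beta / (1 - alpha))

    def log_likelihood_ratio(x):
        """Gaussian LLR log p1(x) - log p0(x) with equal variances."""
        return ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)

    llr_sum, n = 0.0, 0
    while True:
        n += 1
        x = rng.normal(mu1, sigma)            # data actually drawn from H1 in this demo
        llr_sum += log_likelihood_ratio(x)
        if llr_sum >= upper:
            print(f"decide H1 after {n} samples")
            break
        if llr_sum <= lower:
            print(f"decide H0 after {n} samples")
            break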

  2. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

    Sequential detection theory has been known for a long time, evolving from Wald's work in the late 1940s, followed by Middleton's classic exposition in the 1960s, and coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  3. Model based document and report generation for systems engineering

    NASA Astrophysics Data System (ADS)

    Delp, C.; Lam, D.; Fosse, E.; Lee, Cin-Young

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  4. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) design databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques as well as an internal database of component descriptions to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  5. In Situ Mosaic Brightness Correction

    NASA Technical Reports Server (NTRS)

    Deen, Robert G.; Lorre, Jean J.

    2012-01-01

    In situ missions typically have pointable, mast-mounted cameras, which are capable of taking panoramic mosaics comprised of many individual frames. These frames are mosaicked together. While the mosaic software applies radiometric correction to the images, in many cases brightness/contrast seams still exist between frames. This is largely due to errors in the radiometric correction, and the absence of correction for photometric effects in the mosaic processing chain. The software analyzes the overlaps between adjacent frames in the mosaic and determines correction factors for each image in an attempt to reduce or eliminate these brightness seams.
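
    One simple way to realize the overlap analysis described here is to solve a small least-squares problem for one multiplicative correction factor per frame, using the mean brightness each frame reports in its overlap regions. The sketch below does exactly that in log space, anchored to the first frame; the overlap statistics are made up, and this is only a hedged illustration, not the JPL software.

    # Hedged sketch of the overlap-analysis idea: per-frame gains from overlap brightness.
    import numpy as np

    # overlaps[(i, j)] = (mean brightness of the shared region seen in frame i, seen in frame j)
    overlaps = {
        (0, 1): (102.0, 96.0),
        (1, 2): (88.0, 95.0),
        (2, 3): (110.0, 104.0),
    }
    n_frames = 4

    # Linear system in log-gains g: requiring gains[i]*m_i == gains[j]*m_j for each overlap
    # gives g_i - g_j = log(m_j / m_i); the constraint g_0 = 0 anchors the solution.
    rows, rhs = [], []
    for (i, j), (mi, mj) in overlaps.items():
        row = np.zeros(n_frames)
        row[i], row[j] = 1.0, -1.0
        rows.append(row)
        rhs.append(np.log(mj / mi))
    anchor = np.zeros(n_frames)
    anchor[0] = 1.0
    rows.append(anchor)
    rhs.append(0.0)

    g, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    gains = np.exp(g)
    print("per-frame correction factors:", np.round(gains, 4))
    # A corrected frame i would then be frame_i_pixels * gains[i].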

  6. QCD corrections to triboson production

    NASA Astrophysics Data System (ADS)

    Lazopoulos, Achilleas; Melnikov, Kirill; Petriello, Frank

    2007-07-01

    We present a computation of the next-to-leading order QCD corrections to the production of three Z bosons at the Large Hadron Collider. We calculate these corrections using a completely numerical method that combines sector decomposition to extract infrared singularities with contour deformation of the Feynman parameter integrals to avoid internal loop thresholds. The NLO QCD corrections to pp→ZZZ are approximately 50% and are badly underestimated by the leading order scale dependence. However, the kinematic dependence of the corrections is minimal in phase space regions accessible at leading order.

  7. Entropic Corrections to Coulomb's Law

    NASA Astrophysics Data System (ADS)

    Hendi, S. H.; Sheykhi, A.

    2012-04-01

    Two well-known quantum corrections to the area law have been introduced in the literature, namely logarithmic and power-law corrections. Logarithmic corrections arise from loop quantum gravity due to thermal equilibrium fluctuations and quantum fluctuations, while power-law corrections appear when dealing with the entanglement of quantum fields inside and outside the horizon. Inspired by Verlinde's argument on the entropic force, and assuming the quantum-corrected relation for the entropy, we propose an entropic origin for Coulomb's law in this note. We also investigate the Uehling potential as a radiative correction to the Coulomb potential at one-loop order and show that for some values of the distance the entropic corrections to Coulomb's law are compatible with the vacuum-polarization correction in QED. We thus derive the modified Coulomb's law as well as the entropy-corrected Poisson equation governing the evolution of the scalar potential ϕ. Our study further supports the unification of gravity and electromagnetic interactions based on the holographic principle.

  8. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    Systems engineering practices for complex systems and networks now require that requirements, architecture, and concept-of-operations product development teams simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, since the networks are geographically distributed, and so are its subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from their top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

  9. Model based control of dynamic atomic force microscope

    SciTech Connect

    Lee, Chibum; Salapaka, Srinivasa M.

    2015-04-15

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for the dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement when compared to the conventional proportional-integral designs and verified by experiments.

  10. Model based control of dynamic atomic force microscope.

    PubMed

    Lee, Chibum; Salapaka, Srinivasa M

    2015-04-01

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for the dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H(∞) control theory. This design yields a significant improvement when compared to the conventional proportional-integral designs and verified by experiments.

  11. Model-Based Detection in a Shallow Water Ocean Environment

    SciTech Connect

    Candy, J V

    2001-07-30

    A model-based detector is developed to process shallow water ocean acoustic data. The function of the detector is to adaptively monitor the environment and decide whether or not a change from normal has occurred. Here we develop a processor incorporating both a normal-mode ocean acoustic model and a vertical hydrophone array. The detector is applied to data acquired from the Hudson Canyon experiments at various ranges and its performance is evaluated.

  12. Calculation of the ionization state for LTE plasmas using a new relativistic-screened hydrogenic model based on analytical potentials

    NASA Astrophysics Data System (ADS)

    Rubiano, J. G.; Rodríguez, R.; Gil, J. M.; Martel, P.; Mínguez, E.

    2002-01-01

    In this work, the Saha equation is solved using atomic data provided by means of a new relativistic-screened hydrogenic model based on analytical potentials to calculate the ionization state and ion abundance for LTE iron plasmas. The plasma effects on the atomic structure are taken into account by including the classical continuum lowering correction of Stewart and Pyatt. For high densities, the Saha equation is modified to account for the degeneracy of free electrons using Fermi-Dirac statistics instead of the commonly used Maxwellian distribution. The results are compared with more sophisticated self-consistent codes.
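
    For orientation, the sketch below solves the classical, non-degenerate Saha balance for pure hydrogen, which is the starting point that the paper then refines with relativistic screened-hydrogenic atomic data, Stewart-Pyatt continuum lowering and Fermi-Dirac statistics (none of which are included here).

    # Minimal sketch of the classical, non-degenerate Saha ionization balance for hydrogen.
    import numpy as np

    k_B = 1.380649e-23       # J/K
    m_e = 9.1093837015e-31   # kg
    h = 6.62607015e-34       # J s
    chi_H = 13.598 * 1.602176634e-19   # hydrogen ionization energy, J

    def ionization_fraction(T, n_total):
        """Solve x^2 / (1 - x) = S(T) / n_total for the ionization fraction x."""
        # Saha factor; the statistical-weight ratio 2 g_II / g_I equals 1 for hydrogen.
        S = (2.0 * np.pi * m_e * k_B * T / h ** 2) ** 1.5 * np.exp(-chi_H / (k_B * T))
        A = S / n_total
        return (-A + np.sqrt(A ** 2 + 4.0 * A)) / 2.0

    for T in (5_000.0, 10_000.0, 20_000.0):
        x = ionization_fraction(T, n_total=1e20)    # total nuclei per m^3, illustrative
        print(f"T = {T:>7.0f} K -> ionization fraction {x:.3e}")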

  13. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation; specifically, to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, which results in a high learning curve for specification languages and associated tools, while increased schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulation of correctness properties for system models can be a difficult problem. This has relevance to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technology transfer potential, and the next steps.
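
    As an illustration of the kind of mapping being automated (this example is not taken from the presentation, and the predicate names are hypothetical), a natural-language requirement such as "whenever the system receives a command it shall eventually send an acknowledgement, and it shall never be in the FAULT state while a command is being processed" could be rendered in LTL as:

    % illustrative LTL rendering of the hypothetical requirement above
    \mathbf{G}\left( \mathit{cmd\_received} \rightarrow \mathbf{F}\, \mathit{ack\_sent} \right)
    \;\wedge\;
    \mathbf{G}\left( \mathit{processing} \rightarrow \neg\, \mathit{fault} \right)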

  14. Multiple Damage Progression Paths in Model-Based Prognostics

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Goebel, Kai Frank

    2011-01-01

    Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component wear is driven by several different degradation phenomena, each resulting in its own damage progression path, overlapping to contribute to the overall degradation of the component. We develop a model-based prognostics methodology using particle filters, in which the problem of characterizing multiple damage progression paths is cast as a joint state-parameter estimation problem. The estimate is represented as a probability distribution, allowing the prediction of end of life and remaining useful life within a probabilistic framework that supports uncertainty management. We also develop a novel variance control mechanism that maintains an uncertainty bound around the hidden parameters to limit the amount of estimation uncertainty and, consequently, reduce prediction uncertainty. We construct a detailed physics-based model of a centrifugal pump, to which we apply our model-based prognostics algorithms. We illustrate the operation of the prognostic solution with a number of simulation-based experiments and demonstrate the performance of the chosen approach when multiple damage mechanisms are active.
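
    The joint state-parameter estimation idea can be sketched with a toy bootstrap particle filter in which an unknown wear rate is estimated alongside the damage state. The wear model, noise levels, damage threshold, and parameter jitter used for variance control below are invented for illustration and are not the centrifugal pump model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy wear model (illustrative, not the paper's pump model):
# damage d grows at an unknown wear rate w; we observe d with noise.
def step(d, w):
    return d + w + rng.normal(0.0, 0.01, size=d.shape)

N = 2000
d = np.zeros(N)                      # damage-state particles
w = rng.uniform(0.0, 0.2, size=N)    # unknown wear-rate parameter particles
obs_sigma = 0.05

true_d, true_w = 0.0, 0.07
for t in range(50):
    true_d += true_w
    z = true_d + rng.normal(0.0, obs_sigma)    # noisy measurement

    # Propagate particles (joint state-parameter prediction)
    d = step(d, w)
    w = w + rng.normal(0.0, 0.002, size=N)     # small jitter, a crude variance control

    # Weight by measurement likelihood and resample
    weights = np.exp(-0.5 * ((z - d) / obs_sigma) ** 2)
    weights /= weights.sum()
    idx = rng.choice(N, size=N, p=weights)
    d, w = d[idx], w[idx]

print(f"estimated wear rate: {w.mean():.3f} +/- {w.std():.3f} (true {true_w})")
# A simple RUL estimate: time for each particle to reach an assumed damage threshold
threshold = 10.0
rul = (threshold - d) / np.maximum(w, 1e-6)
print(f"median RUL: {np.median(rul):.1f} steps")
```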

  15. A cloud model-based approach for water quality assessment.

    PubMed

    Wang, Dong; Liu, Dengfeng; Ding, Hao; Singh, Vijay P; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun

    2016-07-01

    Water quality assessment entails essentially a multi-criteria decision-making process accounting for qualitative and quantitative uncertainties and their transformation. Considering uncertainties of randomness and fuzziness in water quality evaluation, a cloud model-based assessment approach is proposed. The cognitive cloud model, derived from information science, can realize the transformation between qualitative concept and quantitative data, based on probability and statistics and fuzzy set theory. When applying the cloud model to practical assessment, three technical issues are considered before the development of a complete cloud model-based approach: (1) bilateral boundary formula with nonlinear boundary regression for parameter estimation, (2) hybrid entropy-analytic hierarchy process technique for calculation of weights, and (3) mean of repeated simulations for determining the degree of final certainty. The cloud model-based approach is tested by evaluating the eutrophication status of 12 typical lakes and reservoirs in China and comparing with other four methods, which are Scoring Index method, Variable Fuzzy Sets method, Hybrid Fuzzy and Optimal model, and Neural Networks method. The proposed approach yields information concerning membership for each water quality status which leads to the final status. The approach is found to be representative of other alternative methods and accurate. PMID:26995351
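
    A minimal sketch of the forward normal cloud generator that underlies this kind of assessment is given below: a qualitative grade is described by expectation (Ex), entropy (En), and hyper-entropy (He), and repeated simulation yields a mean certainty degree, mirroring the "mean of repeated simulations" step mentioned in the abstract. The grade parameters and the single-indicator example are assumptions for illustration, not the paper's calibrated grades or weighting scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def certainty_degree(sample, ex, en, he, n_runs=5000):
    """Mean certainty that `sample` belongs to the concept (Ex, En, He).

    The entropy is randomized by the hyper-entropy on every run, and the
    resulting certainty degrees are averaged over repeated simulations.
    """
    en_prime = np.abs(rng.normal(en, he, size=n_runs)) + 1e-12
    mu = np.exp(-(sample - ex) ** 2 / (2.0 * en_prime ** 2))
    return float(mu.mean())

# Illustrative grades for one water-quality indicator (Ex, En, He are assumed)
grades = {"I": (1.0, 0.4, 0.05), "II": (2.0, 0.5, 0.05), "III": (3.5, 0.6, 0.05)}
sample = 2.3  # measured indicator value (illustrative)
for name, (ex, en, he) in grades.items():
    print(f"grade {name}: certainty = {certainty_degree(sample, ex, en, he):.3f}")
```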

  17. A Model-Based Expert System For Digital Systems Design

    NASA Astrophysics Data System (ADS)

    Wu, J. G.; Ho, W. P. C.; Hu, Y. H.; Yun, D. Y. Y.; Parng, T. M.

    1987-05-01

    In this paper, we present a model-based expert system for automatic digital systems design. The goal of digital systems design is to generate a workable and efficient design from high level specifications. The formalization of the design process is a necessity for building an efficient automatic CAD system. Our approach combines model-based, heuristic best-first search, and meta-planning techniques from AI to facilitate the design process. The design process is decomposed into three subprocesses. First, the high-level behavioral specifications are translated into sequences of primitive behavioral operations. Next, primitive operations are grouped to form intermediate-level behavioral functions. Finally, structural function modules are selected to implement these functions. Using model-based reasoning on the primitive behavioral operations level extends the solution space considered in design and provides more opportunity for minimization. Heuristic best-first search and meta-planning techniques control the decision-making in the latter two subprocesses to optimize the final design. They also facilitate system maintenance by separating design strategy from design knowledge.

  18. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Goebel, Kai

    2011-01-01

    Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction; therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.

  19. New orbit correction method uniting global and local orbit corrections

    NASA Astrophysics Data System (ADS)

    Nakamura, N.; Takaki, H.; Sakai, H.; Satoh, M.; Harada, K.; Kamiya, Y.

    2006-01-01

    A new orbit correction method, called the eigenvector method with constraints (EVC), is proposed and formulated to unite global and local orbit corrections for ring accelerators, especially synchrotron radiation (SR) sources. The EVC can exactly correct the beam positions at arbitrarily selected ring positions such as light source points, simultaneously reducing closed orbit distortion (COD) around the whole ring. Computer simulations clearly demonstrate these features of the EVC for both cases of the Super-SOR light source and the Advanced Light Source (ALS) that have typical structures of high-brilliance SR sources. In addition, the effects of errors in beam position monitor (BPM) reading and steering magnet setting on the orbit correction are analytically expressed and also compared with the computer simulations. Simulation results show that the EVC is very effective and useful for orbit correction and beam position stabilization in SR sources.

  20. The Earned-Time System: A Performance-Based Correctional Management Model.

    ERIC Educational Resources Information Center

    Nosin, Jerome Alan

    Utilizing a social learning approach, the Georgia Department of Offender Rehabilitation has implemented a performance-based correctional management model based on the assumption that only self-rehabilitation is viable. The Earned Time System (ETS) provides resources and motivational opportunities for inmates to assume personal responsibility for their…

  1. Using rule-based shot dose assignment in model-based MPC applications

    NASA Astrophysics Data System (ADS)

    Bork, Ingo; Buck, Peter; Wang, Lin; Müller, Uwe

    2014-10-01

    Shrinking feature sizes and the need for tighter CD (Critical Dimension) control require the introduction of new technologies in mask making processes. One of those methods is the dose assignment of individual shots on VSB (Variable Shaped Beam) mask writers to compensate for CD non-linearity effects and improve dose edge slope. By applying increased dose levels only to the most critical features, generally the smallest CDs on a mask, the change in mask write time is kept minimal while the increase in image quality can be significant. This paper describes a method combining rule-based shot dose assignment with model-based shot size correction. This combination proves to be very efficient in correcting mask linearity errors while also improving dose edge slope of small features. Shot dose assignment is based on tables assigning certain dose levels to a range of feature sizes. The dose to feature size assignment is derived from mask measurements in such a way that shape corrections are kept to a minimum. For example, if a 50nm drawn line on mask results in a 45nm chrome line using nominal dose, a dose level is chosen which is closest to getting the line back on target. Since CD non-linearity is different for lines, line-ends and contacts, different tables are generated for the different shape categories. The actual dose assignment is done via DRC rules in a pre-processing step before executing the shape correction in the MPC engine. Dose assignment to line ends can be restricted to critical line/space dimensions since it might not be required for all line ends. In addition, adding dose assignment to a wide range of line ends might increase shot count, which is undesirable. The dose assignment algorithm is very flexible and can be adjusted based on the type of layer and the best balance between accuracy and shot count. These methods can be optimized for the number of dose levels available for specific mask writers. The MPC engine now needs to be able to handle different dose
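
    The table-driven dose assignment can be pictured with the short sketch below, which maps a shot's shape category and drawn CD to a dose level. The size ranges, categories, and dose values are invented placeholders, not measured mask data or values from any particular mask writer.

```python
# Minimal sketch of rule-based dose-level assignment by feature size.
# The size ranges, shape categories, and dose levels below are illustrative
# placeholders, not values from the paper or any specific mask writer.

DOSE_TABLES = {
    # (min_nm, max_nm, dose level); 1.0 = nominal dose
    "line":     [(0, 60, 1.15), (60, 80, 1.08), (80, 10_000, 1.00)],
    "line_end": [(0, 70, 1.20), (70, 90, 1.10), (90, 10_000, 1.00)],
    "contact":  [(0, 80, 1.25), (80, 100, 1.12), (100, 10_000, 1.00)],
}

def assign_dose(shape_category: str, drawn_cd_nm: float) -> float:
    """Return the dose level for a shot, chosen from the category's table."""
    for lo, hi, dose in DOSE_TABLES[shape_category]:
        if lo <= drawn_cd_nm < hi:
            return dose
    return 1.0  # fall back to nominal dose

shots = [("line", 50.0), ("line_end", 65.0), ("contact", 120.0)]
for category, cd in shots:
    print(category, cd, "->", assign_dose(category, cd))
```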

  2. How to Use Equipment Therapeutically.

    ERIC Educational Resources Information Center

    Bowne, Douglas

    1986-01-01

    Shares therapeutic and economic practices surrounding equipment used in New York's Higher Horizons adventure program of therapy for troubled youth. Encourages educators, therapists, and administrators to explore relationship between equipment selection, program goals, and clients. (NEC)

  3. Advances in Therapeutic Cancer Vaccines.

    PubMed

    Wong, Karrie K; Li, WeiWei Aileen; Mooney, David J; Dranoff, Glenn

    2016-01-01

    Therapeutic cancer vaccines aim to induce durable antitumor immunity that is capable of systemic protection against tumor recurrence or metastatic disease. Many approaches to therapeutic cancer vaccines have been explored, with varying levels of success. However, with the exception of sipuleucel-T, an ex vivo dendritic cell vaccine for prostate cancer, no therapeutic cancer vaccine has yet shown clinical efficacy in phase 3 randomized trials. Though disappointing, lessons learned from these studies have suggested new strategies to improve cancer vaccines. The clinical success of checkpoint blockade has underscored the role of peripheral tolerance mechanisms in limiting vaccine responses and highlighted the potential for combination therapies. Recent advances in transcriptome sequencing, computational modeling, and material engineering further suggest new opportunities to intensify cancer vaccines. This review will discuss the major approaches to therapeutic cancer vaccination and explore recent advances that inform the design of the next generation of cancer vaccines. PMID:26923002

  4. Targeted Strategies for Henipavirus Therapeutics

    PubMed Central

    Bossart, Katharine N; Bingham, John; Middleton, Deborah

    2007-01-01

    Hendra and Nipah viruses are related emergent paramyxoviruses that infect and cause disease in animals and humans. Disease manifests as a generalized vasculitis affecting multiple organs, but is most severe in the respiratory and central nervous systems. The high case fatality and person-to-person transmission associated with the most recent NiV outbreaks, and the recent re-emergence of HeV, emphasize the importance and necessity of effective therapeutics for these novel agents. In recent years, henipavirus research has revealed a more complete understanding of pathogenesis and, as a consequence, viable approaches towards vaccines and therapeutics have emerged. All strategies target early steps in viral replication, including receptor binding and membrane fusion. Animal models have been developed, some of which may prove more valuable than others for evaluating the efficacy of therapeutic agents and regimes. Assessments of protective host immunity and drug pharmacokinetics will be crucial to the further advancement of therapeutic compounds. PMID:19440455

  5. [Therapeutic touch and anorexia nervosa].

    PubMed

    Satori, Nadine

    2016-01-01

    An innovative practice, therapeutic touch has been used for around ten years in the treatment of eating disorders. Delivered by nurse clinicians having received specific training, this approach is based on nursing diagnoses which identify the major symptoms of this pathology. The support is built around the body and its perceptions. Through the helping relationship, it mobilises the patient's resources to favour a relationship of trust, a letting-go, physical, psychological and emotional relaxation, and improves the therapeutic alliance. PMID:27615696

  6. [Pathological horseshoe kidney. Therapeutic aspects].

    PubMed

    Bennani, S; Touijer, A; Aboutaieb, R; el Mrini, M; Benjelloun, S

    1994-01-01

    The authors report the various therapeutic modalities of uropathies associated with horseshoe kidney, based on a series of 20 pathologic horseshoe kidneys, associated with 12 cases of renal stones, 5 ureteropelvic junction obstructions, 3 kidney tumors, 2 cases of pyonephrosis and finally 1 traumatic horseshoe kidney. The specific anatomic and surgical features of this uncommon malformation are emphasized and the therapeutic features of each uropathy associated with horseshoe kidney are discussed. PMID:7825982

  7. Therapeutic Vaccines for Chronic Infections

    NASA Astrophysics Data System (ADS)

    Autran, Brigitte; Carcelain, Guislaine; Combadiere, Béhazine; Debre, Patrice

    2004-07-01

    Therapeutic vaccines aim to prevent severe complications of a chronic infection by reinforcing host defenses when some immune control, albeit insufficient, can already be demonstrated and when a conventional antimicrobial therapy either is not available or has limited efficacy. We focus on the rationale and challenges behind this still controversial strategy and provide examples from three major chronic infectious diseases (human immunodeficiency virus, hepatitis B virus, and human papillomavirus) for which the efficacy of therapeutic vaccines is currently being evaluated.

  8. Diamagnetic Corrections and Pascal's Constants

    ERIC Educational Resources Information Center

    Bain, Gordon A.; Berry, John F.

    2008-01-01

    Measured magnetic susceptibilities of paramagnetic substances must typically be corrected for their underlying diamagnetism. This correction is often accomplished by using tabulated values for the diamagnetism of atoms, ions, or whole molecules. These tabulated values can be problematic since many sources contain incomplete and conflicting data.…

  9. Barometric and Earth Tide Correction

    SciTech Connect

    Toll, Nathaniel J.

    2005-11-10

    BETCO corrects for barometric and earth tide effects in long-term water level records. A regression deconvolution method is used to solve a series of linear equations to determine an impulse response function for the well pressure head. Using the response function, a pressure head correction is calculated and applied.
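
    A simplified version of the regression deconvolution idea is sketched below: lagged differences of barometric pressure are regressed against differences of the water-level record to estimate an impulse response, which is then convolved with the pressure record to form a correction. This is only a schematic of the approach under assumed data and lag length, not the BETCO implementation, and earth tides are ignored.

```python
import numpy as np

def barometric_response(head, baro, n_lags=24):
    """Estimate an impulse response relating barometric pressure to well head.

    head   -- water-level time series (same units as baro after conversion)
    baro   -- barometric pressure time series
    n_lags -- assumed length of the impulse response in samples
    Simplified regression deconvolution: differences of head are regressed on
    lagged differences of barometric pressure.
    """
    dh, db = np.diff(head), np.diff(baro)
    rows = len(db) - n_lags + 1
    X = np.column_stack([db[i:i + rows] for i in range(n_lags)][::-1])
    y = dh[n_lags - 1:]
    irf, *_ = np.linalg.lstsq(X, y, rcond=None)
    return irf

def corrected_head(head, baro, irf):
    """Subtract the predicted barometric contribution from the record."""
    pred = np.convolve(np.diff(baro), irf, mode="full")[:len(head) - 1]
    correction = np.concatenate([[0.0], np.cumsum(pred)])
    return head - correction

# Synthetic demonstration with an assumed barometric loading coefficient
t = np.arange(500)
baro = 0.3 * np.sin(2 * np.pi * t / 24) + 0.05 * np.random.randn(500)
head = 10.0 - 0.4 * baro + 0.02 * np.random.randn(500)
irf = barometric_response(head, baro)
clean = corrected_head(head, baro, irf)
print("std before:", round(float(head.std()), 3), "after:", round(float(clean.std()), 3))
```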

  10. Atmospheric correction of satellite data

    NASA Astrophysics Data System (ADS)

    Shmirko, Konstantin; Bobrikov, Alexey; Pavlov, Andrey

    2015-11-01

    The atmosphere is responsible for more than 90% of all radiation measured by satellite. Because of this, atmospheric correction plays an important role in separating water-leaving radiance from the signal and in evaluating the concentration of various water pigments (chlorophyll-a, DOM, CDOM, etc.). The elimination of the atmosphere's intrinsic radiance from the remote sensing signal is referred to as atmospheric correction.

  11. Correcting Slightly Less Simple Movements

    ERIC Educational Resources Information Center

    Aivar, M. P.; Brenner, E.; Smeets, J. B. J.

    2005-01-01

    Many studies have analysed how goal directed movements are corrected in response to changes in the properties of the target. However, only simple movements to single targets have been used in those studies, so little is known about movement corrections under more complex situations. Evidence from studies that ask for movements to several targets…

  12. Fine-Tuning Corrective Feedback.

    ERIC Educational Resources Information Center

    Han, ZhaoHong

    2001-01-01

    Explores the notion of "fine-tuning" in connection with the corrective feedback process. Describes a longitudinal case study, conducted in the context of Norwegian as a second a language, that shows how fine-tuning and lack thereof in the provision of written corrective feedback differentially affects a second language learner's restructuring of…

  13. Elastic therapeutic tape: do they have the same material properties?

    PubMed Central

    Boonkerd, Chuanpis; Limroongreungrat, Weerawat

    2016-01-01

    [Purpose] Elastic therapeutic tape has been widely used for rehabilitation and treatment of sports injuries. Tapes with different elastic properties serve different treatment purposes, and inappropriate tension reduces tape effectiveness. Many tapes are available in the market, but studies on tape properties are limited. The aim of this study was to examine the material properties of elastic therapeutic tape. [Subjects and Methods] Brands of elastic therapeutic tape included KinesioTex®, ATex, Mueller, 3M, and ThaiTape. The Material Testing System Insight® 1 Electromechanical Testing Systems was used to apply a tensile force on elastic therapeutic tape. Ten specimens of each brand were tested. Stress, load, and Young’s modulus at 25%, 50%, 75%, 100%, and maximum point were collected. One-way analysis of variance with post hoc testing was used to analyze tape parameters. [Results] Maximum elongation and Young’s modulus at all percentages were significantly different between brands. There were no differences in maximum load and maximum stress. [Conclusion] Mechanical properties are different for commercial elastic therapeutic tapes. Physiotherapists and other clinicians should be aware of mechanical tape properties to correctly apply kinesio tape. PMID:27190472

  14. Elastic therapeutic tape: do they have the same material properties?

    PubMed

    Boonkerd, Chuanpis; Limroongreungrat, Weerawat

    2016-04-01

    [Purpose] Elastic therapeutic tape has been widely used for rehabilitation and treatment of sports injuries. Tapes with different elastic properties serve different treatment purposes, and inappropriate tension reduces tape effectiveness. Many tapes are available in the market, but studies on tape properties are limited. The aim of this study was to examine the material properties of elastic therapeutic tape. [Subjects and Methods] Brands of elastic therapeutic tape included KinesioTex(®), ATex, Mueller, 3M, and ThaiTape. The Material Testing System Insight(®) 1 Electromechanical Testing Systems was used to apply a tensile force on elastic therapeutic tape. Ten specimens of each brand were tested. Stress, load, and Young's modulus at 25%, 50%, 75%, 100%, and maximum point were collected. One-way analysis of variance with post hoc testing was used to analyze tape parameters. [Results] Maximum elongation and Young's modulus at all percentages were significantly different between brands. There were no differences in maximum load and maximum stress. [Conclusion] Mechanical properties are different for commercial elastic therapeutic tapes. Physiotherapists and other clinicians should be aware of mechanical tape properties to correctly apply kinesio tape.

  15. Multilevel and motion model-based ultrasonic speckle tracking algorithms.

    PubMed

    Yeung, F; Levinson, S F; Parker, K J

    1998-03-01

    A multilevel motion model-based approach to ultrasonic speckle tracking has been developed that addresses the inherent trade-offs associated with traditional single-level block matching (SLBM) methods. The multilevel block matching (MLBM) algorithm uses variable matching block and search window sizes in a coarse-to-fine scheme, preserving the relative immunity to noise associated with the use of a large matching block while preserving the motion field detail associated with the use of a small matching block. To decrease further the sensitivity of the multilevel approach to noise, speckle decorrelation and false matches, a smooth motion model-based block matching (SMBM) algorithm has been implemented that takes into account the spatial inertia of soft tissue elements. The new algorithms were compared to SLBM through a series of experiments involving manual translation of soft tissue phantoms, motion field computer simulations of rotation, compression and shear deformation, and an experiment involving contraction of human forearm muscles. Measures of tracking accuracy included mean squared tracking error, peak signal-to-noise ratio (PSNR) and blinded observations of optical flow. Measures of tracking efficiency included the number of sum squared difference calculations and the computation time. In the phantom translation experiments, the SMBM algorithm successfully matched the accuracy of SLBM using both large and small matching blocks while significantly reducing the number of computations and computation time when a large matching block was used. For the computer simulations, SMBM yielded better tracking accuracies and spatial resolution when compared with SLBM using a large matching block. For the muscle experiment, SMBM outperformed SLBM both in terms of PSNR and observations of optical flow. We believe that the smooth motion model-based MLBM approach represents a meaningful development in ultrasonic soft tissue motion measurement. PMID:9587997
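
    The coarse-to-fine matching idea can be illustrated with a bare-bones sum-squared-difference tracker that refines a displacement estimate over successively smaller blocks and search windows. It omits the smooth-motion model and the other elements of the MLBM/SMBM algorithms; the frame size, block sizes, and synthetic speckle data are assumptions.

```python
import numpy as np

def ssd_match(ref_block, frame, center, search):
    """Return the displacement minimizing sum-squared difference within a window."""
    h, w = ref_block.shape
    cy, cx = center
    best, best_d = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = cy + dy, cx + dx
            if y < 0 or x < 0:
                continue
            cand = frame[y:y + h, x:x + w]
            if cand.shape != ref_block.shape:
                continue
            ssd = np.sum((cand - ref_block) ** 2)
            if ssd < best:
                best, best_d = ssd, (dy, dx)
    return best_d

def multilevel_track(pre, post, point, block_sizes=(32, 16, 8), search=8):
    """Coarse-to-fine tracking of one point between two frames."""
    dy_tot = dx_tot = 0
    for b in block_sizes:
        y, x = point[0] - b // 2, point[1] - b // 2
        ref = pre[y:y + b, x:x + b]
        dy, dx = ssd_match(ref, post, (y + dy_tot, x + dx_tot), search)
        dy_tot += dy
        dx_tot += dx
        search = max(2, search // 2)   # shrink the search window at finer levels
    return dy_tot, dx_tot

# Synthetic speckle frames: the post frame is the pre frame shifted by (3, -2)
rng = np.random.default_rng(2)
pre = rng.random((128, 128))
post = np.roll(pre, shift=(3, -2), axis=(0, 1))
print(multilevel_track(pre, post, point=(64, 64)))   # expect roughly (3, -2)
```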

  16. The Design of Model-Based Training Programs

    NASA Technical Reports Server (NTRS)

    Polson, Peter; Sherry, Lance; Feary, Michael; Palmer, Everett; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper proposes a model-based training program for the skills necessary to operate advanced avionics systems that incorporate advanced autopilots and flight management systems. The training model is based on a formalism, the operational procedure model, that represents the mission model, the rules, and the functions of a modern avionics system. This formalism has been defined such that it can be understood and shared by pilots, the avionics software, and design engineers. Each element of the software is defined in terms of its intent (What?), the rationale (Why?), and the resulting behavior (How?). The Advanced Computer Tutoring project at Carnegie Mellon University has developed a type of model-based, computer aided instructional technology called cognitive tutors. They summarize numerous studies showing that training to a specified level of competence can be achieved in one third the time of conventional classroom instruction. We are developing a similar model-based training program for the skills necessary to operate the avionics. The model that underlies the instructional program and simulates the effects of pilots' entries and the behavior of the avionics is based on the operational procedure model. Pilots are given a series of vertical flightpath management problems. Entries that result in violations, such as failure to make a crossing restriction or violating the speed limits, result in error messages with instruction. At any time, the flightcrew can request suggestions on the appropriate set of actions. A similar and successful training program for basic skills for the FMS on the Boeing 737-300 was developed and evaluated. The results strongly support the claim that the training methodology can be adapted to the cockpit.

  17. A probabilistic choice model based on Tsallis’ statistics

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2007-12-01

    Decision under risk and uncertainty (probabilistic choice) has been attracting attention in econophysics and neuroeconomics. This paper proposes a probabilistic choice model based on a mathematical equivalence of delay and uncertainty in decision-making, and the deformed algebra developed in the Tsallis’ non-extensive thermodynamics. Furthermore, it is shown that this model can be utilized to quantify the degree of consistency in probabilistic choice in humans and animals. Future directions in the application of the model to studies in econophysics, neurofinance, neuroeconomics, and social physics are discussed.
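
    The deformed exponential at the heart of such Tsallis-based choice models is easy to state in code; the sketch below evaluates exp_q(x), which recovers the ordinary exponential as q approaches 1, and applies it as an illustrative value-decay function. The decay constant and the q values are arbitrary examples, not fitted behavioral parameters.

```python
import numpy as np

def q_exponential(x, q):
    """Tsallis deformed exponential exp_q(x); reduces to exp(x) as q -> 1."""
    x = np.asarray(x, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    out = np.zeros_like(base)
    positive = base > 0.0
    out[positive] = base[positive] ** (1.0 / (1.0 - q))
    return out  # defined as 0 where 1 + (1 - q) * x <= 0

# Illustrative subjective value decay with delay, exp_q(-k * delay);
# k = 0.3 and the q values below are assumed example parameters.
delays = np.linspace(0.0, 10.0, 6)
for q in (1.0, 0.5, 1.5):
    print(f"q = {q}:", np.round(q_exponential(-0.3 * delays, q), 3))
```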

  18. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time model based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
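
    The detection logic described here (modeling error, significance, persistence, threshold) can be mirrored by a minimal residual monitor such as the sketch below. The noise level, z-score threshold, persistence threshold, and injected fault are illustrative assumptions, not values from the patented system.

```python
import numpy as np

def detect_anomaly(measured, expected, noise_sigma,
                   z_threshold=3.0, persistence_threshold=5):
    """Flag a structural anomaly when a significant modeling error persists.

    measured, expected -- time series for one sensor location
    noise_sigma        -- assumed standard deviation of normal modeling error
    Thresholds are illustrative placeholders.
    """
    persistence = 0
    for k, (z_meas, z_exp) in enumerate(zip(measured, expected)):
        error = z_meas - z_exp                   # modeling error signal
        significance = abs(error) / noise_sigma  # statistical significance
        persistence = persistence + 1 if significance > z_threshold else 0
        if persistence >= persistence_threshold:
            return k                             # first sample of confirmed anomaly
    return None

rng = np.random.default_rng(3)
expected = np.sin(np.linspace(0, 20, 400))
measured = expected + rng.normal(0, 0.05, 400)
measured[250:] += 0.4                            # injected structural shift
print("anomaly confirmed at sample:", detect_anomaly(measured, expected, 0.05))
```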

  19. Temporal and contextual knowledge in model-based expert systems

    NASA Technical Reports Server (NTRS)

    Toth-Fejel, Tihamer; Heher, Dennis

    1987-01-01

    A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP, which when executed, verifies that the domain expert did not make any mistakes. The Achilles' heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.

  20. Model-based benefit-risk assessment: can Archimedes help?

    PubMed

    Krishna, R

    2009-03-01

    In December 2008, the US Food and Drug Administration issued a new draft Guidance for Industry on Diabetes Mellitus: evaluating cardiovascular risk in new antidiabetic therapies to treat Type 2 diabetes. This guidance comes at a time when recent discussions have focused on delineation of cardiovascular risk reduction for new antidiabetic drugs. Computational tools that can enable early prediction of cardiovascular risk are reviewed with specific reference to Archimedes (Kaiser Permanente), with an aim of proposing a model-based solution and enabling decisions to be made as early as possible in the drug development value chain.

  1. A model-based multisensor data fusion knowledge management approach

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-06-01

    A variety of approaches exist for combining data from multiple sensors. The model-based approach combines data based on its support for or refutation of elements of the model which in turn can be used to evaluate an experimental thesis. This paper presents a collection of algorithms for mapping various types of sensor data onto a thesis-based model and evaluating the truth or falsity of the thesis, based on the model. The use of this approach for autonomously arriving at findings and for prioritizing data are considered. Techniques for updating the model (instead of arriving at a true/false assertion) are also discussed.

  2. Model-Based Information Extraction From Synthetic Aperture Radar Signals

    NASA Astrophysics Data System (ADS)

    Matzner, Shari A.

    2011-07-01

    Synthetic aperture radar (SAR) is a remote sensing technology for imaging areas of the earth's surface. SAR has been successfully used for monitoring characteristics of the natural environment such as land cover type and tree density. With the advent of higher resolution sensors, it is now theoretically possible to extract information about individual structures such as buildings from SAR imagery. This information could be used for disaster response and security-related intelligence. SAR has an advantage over other remote sensing technologies for these applications because SAR data can be collected during the night and in rainy or cloudy conditions. This research presents a model-based method for extracting information about a building -- its height and roof slope -- from a single SAR image. Other methods require multiple images or ancillary data from specialized sensors, making them less practical. The model-based method uses simulation to match a hypothesized building to an observed SAR image. The degree to which a simulation matches the observed data is measured by mutual information. The success of this method depends on the accuracy of the simulation and on the reliability of the mutual information similarity measure. Electromagnetic theory was applied to relate a building's physical characteristics to the features present in a SAR image. This understanding was used to quantify the precision of building information contained in SAR data, and to identify the inputs needed for accurate simulation. A new SAR simulation technique was developed to meet the accuracy and efficiency requirements of model-based information extraction. Mutual information, a concept from information theory, has become a standard for measuring the similarity between medical images. Its performance in the context of matching a simulation image to a SAR image was evaluated in this research, and it was found to perform well under certain conditions. The factors that affect its performance
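
    The similarity measure used for matching a simulated image to an observed one can be illustrated with a histogram-based mutual information estimate. The sketch below is generic image code under an assumed bin count and synthetic data; it does not include the SAR simulation itself.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """Histogram estimate of mutual information between two images (in nats)."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])))

# Toy check: MI is higher for a well-matched simulation than for a mismatched one
rng = np.random.default_rng(4)
observed = rng.random((128, 128))
good_sim = observed + rng.normal(0, 0.1, observed.shape)   # close to the observation
bad_sim = rng.random((128, 128))                            # unrelated image
print("matched  :", round(mutual_information(observed, good_sim), 3))
print("mismatch :", round(mutual_information(observed, bad_sim), 3))
```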

  3. A model-based executive for commanding robot teams

    NASA Technical Reports Server (NTRS)

    Barrett, Anthony

    2005-01-01

    The paper presents a way to robustly command a system of systems as a single entity. Instead of modeling each component system in isolation and then manually crafting interaction protocols, this approach starts with a model of the collective population as a single system. By compiling the model into separate elements for each component system and utilizing a teamwork model for coordination, it circumvents the complexities of manually crafting robust interaction protocols. The resulting systems are both globally responsive by virtue of a team oriented interaction model and locally responsive by virtue of a distributed approach to model-based fault detection, isolation, and recovery.

  4. Kinetic modeling based probabilistic segmentation for molecular images.

    PubMed

    Saad, Ahmed; Hamarneh, Ghassan; Möller, Torsten; Smith, Ben

    2008-01-01

    We propose a semi-supervised, kinetic modeling based segmentation technique for molecular imaging applications. It is an iterative, self-learning algorithm based on uncertainty principles, designed to alleviate low signal-to-noise ratio (SNR) and partial volume effect (PVE) problems. Synthetic fluorodeoxyglucose (FDG) and simulated Raclopride dynamic positron emission tomography (dPET) brain images with excessive noise levels are used to validate our algorithm. We show, qualitatively and quantitatively, that our algorithm outperforms state-of-the-art techniques in identifying different functional regions and recovering the kinetic parameters.

  5. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Fu, Pei-hua; Yin, Hong-bo

    In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First, we present an evaluation index system that covers basic information, management level, technical strength, transport capacity, informatization level, market competition, and customer service. The index weights are determined according to the grades, and the integrated capability of the logistics enterprises is evaluated using the fuzzy cluster analysis method. The system evaluation module and the cluster analysis module are described in detail, together with how the two modules were implemented. Finally, the results produced by the system are given.

  6. A parametric vocal fold model based on magnetic resonance imaging.

    PubMed

    Wu, Liang; Zhang, Zhaoyan

    2016-08-01

    This paper introduces a parametric three-dimensional body-cover vocal fold model based on magnetic resonance imaging (MRI) of the human larynx. Major geometric features that are observed in the MRI images but missing in current vocal fold models are discussed, and their influence on vocal fold vibration is evaluated using eigenmode analysis. Proper boundary conditions for the model are also discussed. Based on control parameters corresponding to anatomic landmarks that can be easily measured, this model can be adapted toward a subject-specific vocal fold model for voice production research and clinical applications. PMID:27586774

  7. Evaluating model accuracy for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Roden, Joseph

    1992-01-01

    Described here is an approach to automatically assessing the accuracy of various components of a model. In this approach, actual data from the operation of a target system is used to drive statistical measures to evaluate the prediction accuracy of various portions of the model. We describe how these statistical measures of model accuracy can be used in model-based reasoning for monitoring and design. We then describe the application of these techniques to the monitoring and design of the water recovery system of the Environmental Control and Life Support System (ECLSS) of Space Station Freedom.

  8. A Cyber-Attack Detection Model Based on Multivariate Analyses

    NASA Astrophysics Data System (ADS)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via the quantification method IV, and collect similar audit event sequences into the same groups based on the cluster analysis. It is shown in simulation experiments that our model can improve the cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.

  9. Fiber optic displacement measurement model based on finite reflective surface

    NASA Astrophysics Data System (ADS)

    Li, Yuhe; Guan, Kaisen; Hu, Zhaohui

    2016-10-01

    We present a fiber optic displacement measurement model based on a finite reflective plate. The theoretical model was derived, and simulation analyses of the light intensity distribution, the reflective plate width, and the distance between the fiber probe and the reflective plate were conducted in detail. The three-dimensional received light intensity distribution and the characteristic curve of light intensity were studied as functions of the displacement of the finite reflective plate. Experiments were carried out to verify the established model. The physical fundamentals and the effect of operating parameters on measuring system performance are also discussed.

  10. Psychiatric stigma in correctional facilities.

    PubMed

    Miller, R D; Metzner, J L

    1994-01-01

    While legislatively sanctioned discrimination against the mentally ill in general society has largely disappeared, it persists in correctional systems where inmates are denied earn-time reductions in sentences, parole opportunities, placement in less restrictive facilities, and opportunities to participate in sentence-reducing programs because of their status as psychiatric patients or their need for psychotropic medications. The authors discuss the prevalence of such problems from detailed examinations of several correctional systems and from the results of a national survey of correctional medical directors.

  11. Evaluation of Model-Based Training for Vertical Guidance Logic

    NASA Technical Reports Server (NTRS)

    Feary, Michael; Palmer, Everett; Sherry, Lance; Polson, Peter; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper will summarize the results of a study which introduces a structured, model based approach to learning how the automated vertical guidance system works on a modern commercial air transport. The study proposes a framework to provide accurate and complete information in an attempt to eliminate confusion about 'what the system is doing'. This study will examine a structured methodology for organizing the ideas on which the system was designed, communicating this information through the training material, and displaying it in the airplane. Previous research on model-based, computer aided instructional technology has shown reductions in the amount of time to a specified level of competence. The lessons learned from the development of these technologies are well suited for use with the design methodology which was used to develop the vertical guidance logic for a large commercial air transport. The design methodology presents the model from which to derive the training material, and the content of information to be displayed to the operator. The study consists of a 2 X 2 factorial experiment which will compare a new method of training vertical guidance logic and a new type of display. The format of the material used to derive both the training and the display will be provided by the Operational Procedure Methodology. The training condition will compare current training material to the new structured format. The display condition will involve a change of the content of the information displayed into pieces that agree with the concepts with which the system was designed.

  12. Phase-field elasticity model based on mechanical jump conditions

    NASA Astrophysics Data System (ADS)

    Schneider, Daniel; Tschukin, Oleg; Choudhury, Abhik; Selzer, Michael; Böhlke, Thomas; Nestler, Britta

    2015-05-01

    Computational models based on the phase-field method typically operate on a mesoscopic length scale and resolve structural changes of the material and furthermore provide valuable information about microstructure and mechanical property relations. An accurate calculation of the stresses and mechanical energy at the transition region is therefore indispensable. We derive a quantitative phase-field elasticity model based on force balance and Hadamard jump conditions at the interface. Comparing the simulated stress profiles calculated with Voigt/Taylor (Annalen der Physik 274(12):573, 1889), Reuss/Sachs (Z Angew Math Mech 9:49, 1929) and the proposed model with the theoretically predicted stress fields in a plate with a round inclusion under hydrostatic tension, we show the quantitative characteristics of the model. In order to validate the elastic contribution to the driving force for phase transition, we demonstrate the absence of excess energy, calculated by Durga et al. (Model Simul Mater Sci Eng 21(5):055018, 2013), in a one-dimensional equilibrium condition of serial and parallel material chains. To validate the driving force for systems with curved transition regions, we relate simulations to the Gibbs-Thomson equilibrium condition (Johnson and Alexander, J Appl Phys 59(8):2735, 1986).
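
    For orientation, the two classical interpolation schemes that the proposed jump-condition model is compared against can be written down for a scalar stiffness, as in the sketch below. The stiffness values and the phase-fraction profile are illustrative assumptions, and the full tensorial phase-field model is not reproduced here.

```python
import numpy as np

def voigt_stiffness(phi, c1, c2):
    """Voigt/Taylor interpolation (equal strain): arithmetic average."""
    return phi * c1 + (1.0 - phi) * c2

def reuss_stiffness(phi, c1, c2):
    """Reuss/Sachs interpolation (equal stress): harmonic average."""
    return 1.0 / (phi / c1 + (1.0 - phi) / c2)

# Effective stiffness across a diffuse interface between two phases
# (c1 and c2, in GPa, are illustrative values)
phi = np.linspace(0.0, 1.0, 5)          # phase-fraction profile
c1, c2 = 200.0, 70.0
print("Voigt:", voigt_stiffness(phi, c1, c2))
print("Reuss:", reuss_stiffness(phi, c1, c2))
```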

  13. Towards model-based control of Parkinson's disease

    PubMed Central

    Schiff, Steven J.

    2010-01-01

    Modern model-based control theory has led to transformative improvements in our ability to track the nonlinear dynamics of systems that we observe, and to engineer control systems of unprecedented efficacy. In parallel with these developments, our ability to build computational models to embody our expanding knowledge of the biophysics of neurons and their networks is maturing at a rapid rate. In the treatment of human dynamical disease, our employment of deep brain stimulators for the treatment of Parkinson’s disease is gaining increasing acceptance. Thus, the confluence of these three developments—control theory, computational neuroscience and deep brain stimulation—offers a unique opportunity to create novel approaches to the treatment of this disease. This paper explores the relevant state of the art of science, medicine and engineering, and proposes a strategy for model-based control of Parkinson’s disease. We present a set of preliminary calculations employing basal ganglia computational models, structured within an unscented Kalman filter for tracking observations and prescribing control. Based upon these findings, we will offer suggestions for future research and development. PMID:20368246
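
    A generic tracking loop of the kind described, an unscented Kalman filter wrapped around a small nonlinear model, is sketched below using the filterpy package, which is assumed to be available. The two-variable oscillator stands in for a basal ganglia model, and all noise levels and parameters are illustrative, not the paper's preliminary calculations.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

DT = 0.01

def fx(x, dt):
    """Illustrative 2-state nonlinear oscillator standing in for a neural model."""
    v, w = x
    dv = v - v**3 / 3.0 - w
    dw = 0.08 * (v + 0.7 - 0.8 * w)
    return np.array([v + dt * dv, w + dt * dw])

def hx(x):
    """Assume only the first state (e.g., a field-potential proxy) is observed."""
    return np.array([x[0]])

points = MerweScaledSigmaPoints(n=2, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=1, dt=DT, hx=hx, fx=fx, points=points)
ukf.x = np.array([0.0, 0.0])
ukf.P *= 0.5
ukf.R *= 0.05**2
ukf.Q *= 1e-5

# Track a noisy simulated observation sequence
rng = np.random.default_rng(5)
x_true = np.array([1.0, 0.2])
for _ in range(500):
    x_true = fx(x_true, DT)
    z = hx(x_true) + rng.normal(0.0, 0.05, size=1)
    ukf.predict()
    ukf.update(z)

print("true state:", np.round(x_true, 3), " estimate:", np.round(ukf.x, 3))
```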

  14. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.

  15. Connectotyping: model based fingerprinting of the functional connectome.

    PubMed

    Miranda-Dominguez, Oscar; Mills, Brian D; Carpenter, Samuel D; Grant, Kathleen A; Kroenke, Christopher D; Nigg, Joel T; Fair, Damien A

    2014-01-01

    A better characterization of how an individual's brain is functionally organized will likely bring dramatic advances to many fields of study. Here we show a model-based approach toward characterizing resting state functional connectivity MRI (rs-fcMRI) that is capable of identifying a so-called "connectotype", or functional fingerprint in individual participants. The approach rests on a simple linear model that proposes the activity of a given brain region can be described by the weighted sum of its functional neighboring regions. The resulting coefficients correspond to a personalized model-based connectivity matrix that is capable of predicting the timeseries of each subject. Importantly, the model itself is subject specific and has the ability to predict an individual at a later date using a limited number of non-sequential frames. While we show that there is a significant amount of shared variance between models across subjects, the model's ability to discriminate an individual is driven by unique connections in higher order control regions in frontal and parietal cortices. Furthermore, we show that the connectotype is present in non-human primates as well, highlighting the translational potential of the approach.
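
    The linear model described here, in which each region's activity is a weighted sum of the other regions, can be fitted with ordinary least squares as in the sketch below. The synthetic data, region count, and the use of numpy's lstsq are illustrative assumptions rather than the authors' rs-fcMRI processing pipeline.

```python
import numpy as np

def connectotype(timeseries):
    """Fit a model-based connectivity matrix of the kind described in the abstract.

    timeseries -- array of shape (n_frames, n_regions)
    Each region's signal is modeled as a weighted sum of all other regions;
    row i of the returned matrix holds those weights (diagonal fixed at 0).
    """
    n_frames, n_regions = timeseries.shape
    beta = np.zeros((n_regions, n_regions))
    for i in range(n_regions):
        others = np.delete(np.arange(n_regions), i)
        coef, *_ = np.linalg.lstsq(timeseries[:, others], timeseries[:, i], rcond=None)
        beta[i, others] = coef
    return beta

def predict(timeseries, beta):
    """Predict each region's timeseries from the others using the fitted weights."""
    return timeseries @ beta.T

# Synthetic example: 200 frames, 10 regions
rng = np.random.default_rng(6)
data = rng.standard_normal((200, 10))
model = connectotype(data)
r = np.corrcoef(predict(data, model)[:, 0], data[:, 0])[0, 1]
print("prediction correlation for region 0:", round(float(r), 3))
```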

  16. Model-Based Diagnostics for Propellant Loading Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Foygel, Michael; Smelyanskiy, Vadim N.

    2011-01-01

    The loading of spacecraft propellants is a complex, risky operation. Therefore, diagnostic solutions are necessary to quickly identify when a fault occurs, so that recovery actions can be taken or an abort procedure can be initiated. Model-based diagnosis solutions, established using an in-depth analysis and understanding of the underlying physical processes, offer the advanced capability to quickly detect and isolate faults, identify their severity, and predict their effects on system performance. We develop a physics-based model of a cryogenic propellant loading system, which describes the complex dynamics of liquid hydrogen filling from a storage tank to an external vehicle tank, as well as the influence of different faults on this process. The model takes into account the main physical processes such as highly nonequilibrium condensation and evaporation of the hydrogen vapor, pressurization, and also the dynamics of liquid hydrogen and vapor flows inside the system in the presence of helium gas. Since the model incorporates multiple faults in the system, it provides a suitable framework for model-based diagnostics and prognostics algorithms. Using this model, we analyze the effects of faults on the system, derive symbolic fault signatures for the purposes of fault isolation, and perform fault identification using a particle filter approach. We demonstrate the detection, isolation, and identification of a number of faults using simulation-based experiments.

  17. Application of model based control to robotic manipulators

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1988-01-01

    A robot that can duplicate human motion capabilities in such activities as balancing, reaching, lifting, and moving has been built and tested. These capabilities are achieved through the use of real-time Model-Based Control (MBC) techniques, which have recently been demonstrated. MBC accounts for all manipulator inertial forces and provides stable manipulator motion control even at high speeds. To effectively demonstrate the unique capabilities of MBC, an experimental robotic manipulator was constructed, which stands upright, balancing on a two-wheel base. The mathematical modeling of dynamics inherent in MBC permits the control system to perform functions that are impossible with conventional non-model based methods. These capabilities include: (1) Stable control at all speeds of operation; (2) Operations requiring dynamic stability such as balancing; (3) Detection and monitoring of applied forces without the use of load sensors; (4) Manipulator safing via detection of abnormal loads. The full potential of MBC has yet to be realized. The experiments performed for this research are only an indication of the potential applications. MBC has no inherent stability limitations and its range of applicability is limited only by the attainable sampling rate, modeling accuracy, and sensor resolution. Manipulators could be designed to operate at the highest speed mechanically attainable without being limited by control inadequacies. Manipulators capable of operating many times faster than current machines would certainly increase productivity for many tasks.

  18. Internal wave signal processing: A model-based approach

    SciTech Connect

    Candy, J.V.; Chambers, D.H.

    1995-02-22

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile etc.) developed from the solution of the associated boundary value problem as well as the horizontal velocity components. These models are then generalized to the stochastic case where an approximate Gauss-Markov theory applies. The resulting Gauss-Markov representation, in principle, allows the inclusion of stochastic phenomena such as noise and modeling errors in a consistent manner. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves. In particular, a processor is designed that allows in situ recursive estimation of the required velocity functions. Finally, it is shown that the associated residual or so-called innovation sequence that ensues from the recursive nature of this formulation can be employed to monitor the model's fit to the data.

  19. Qualitative-Modeling-Based Silicon Neurons and Their Networks

    PubMed Central

    Kohno, Takashi; Sekikawa, Munehisa; Li, Jing; Nanami, Takuya; Aihara, Kazuyuki

    2016-01-01

    The ionic conductance models of neuronal cells can finely reproduce a wide variety of complex neuronal activities. However, the complexity of these models has prompted the development of qualitative neuron models. They are described by differential equations with a reduced number of variables and their low-dimensional polynomials, which retain the core mathematical structures. Such simple models form the foundation of a bottom-up approach in computational and theoretical neuroscience. We proposed a qualitative-modeling-based approach for designing silicon neuron circuits, in which the mathematical structures in the polynomial-based qualitative models are reproduced by differential equations with silicon-native expressions. This approach can realize low-power-consuming circuits that can be configured to realize various classes of neuronal cells. In this article, our qualitative-modeling-based silicon neuron circuits for analog and digital implementations are quickly reviewed. One of our CMOS analog silicon neuron circuits can realize a variety of neuronal activities with a power consumption less than 72 nW. The square-wave bursting mode of this circuit is explained. Another circuit can realize Class I and II neuronal activities with about 3 nW. Our digital silicon neuron circuit can also realize these classes. An auto-associative memory realized on an all-to-all connected network of these silicon neurons is also reviewed, in which the neuron class plays important roles in its performance. PMID:27378842

  20. Model-based patterns in stomach cancer mortality worldwide.

    PubMed

    Peleteiro, Bárbara; Severo, Milton; La Vecchia, Carlo; Lunet, Nuno

    2014-11-01

    The decrease in stomach cancer mortality was not because of specific interventions, and it is likely that different countries follow a similar model of variation. Here, we aimed to identify model-based patterns in the time trends of stomach cancer mortality worldwide. Stomach cancer mortality rates were retrieved for 62 countries from the WHO mortality database. Sex-specific mixed models were used to describe time trends in age-standardized rates between 1980 and 2010 (age group 35-74 years; World standard population). Three patterns, similar for men and women, were identified through model-based clustering. Pattern 1 presented the highest mortality rates in 1980 (median: men, 81.5/100 000; women, 34.4/100 000) and pattern 3 the lowest ones (median: men, 24.4/100 000; women, 12.4/100 000). The decrease in mortality rates was greater in 1980-1995 than during 1996-2010. Assuming that the patterns characterized by the highest rates precede temporally those with lower mortality, the overlap of model predictions supports a 20-year lag between adjacent patterns. We propose a model for the variation in stomach cancer mortality with three stages that develop sequentially through a period of ∼70 years. The countries with the lowest mortality had the highest proportional decrease in mortality rates.

  1. A probabilistic graphical model based stochastic input model construction

    SciTech Connect

    Wan, Jiang; Zabaras, Nicholas

    2014-09-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media.

  2. MODEL-BASED CLUSTERING OF LARGE NETWORKS

    PubMed Central

    Vu, Duy Q.; Hunter, David R.; Schweinberger, Michael

    2015-01-01

    We describe a network clustering framework, based on finite mixture models, that can be applied to discrete-valued networks with hundreds of thousands of nodes and billions of edge variables. Relative to other recent model-based clustering work for networks, we introduce a more flexible modeling framework, improve the variational-approximation estimation algorithm, discuss and implement standard error estimation via a parametric bootstrap approach, and apply these methods to much larger data sets than those seen elsewhere in the literature. The more flexible framework is achieved through introducing novel parameterizations of the model, giving varying degrees of parsimony, using exponential family models whose structure may be exploited in various theoretical and algorithmic ways. The algorithms are based on variational generalized EM algorithms, where the E-steps are augmented by a minorization-maximization (MM) idea. The bootstrapped standard error estimates are based on an efficient Monte Carlo network simulation idea. Last, we demonstrate the usefulness of the model-based clustering framework by applying it to a discrete-valued network with more than 131,000 nodes and 17 billion edge variables. PMID:26605002

  3. Model-based patterns in prostate cancer mortality worldwide

    PubMed Central

    Fontes, F; Severo, M; Castro, C; Lourenço, S; Gomes, S; Botelho, F; La Vecchia, C; Lunet, N

    2013-01-01

    Background: Prostate cancer mortality has been decreasing in several high-income countries and previous studies analysed the trends mostly according to geographical criteria. We aimed to identify patterns in the time trends of prostate cancer mortality across countries using a model-based approach. Methods: Model-based clustering was used to identify patterns of variation in prostate cancer mortality (1980–2010) across 37 European countries, five non-European high-income countries and four leading emerging economies. We characterised the patterns observed regarding the geographical distribution and gross national income of the countries, as well as the trends observed in mortality/incidence ratios. Results: We identified three clusters of countries with similar variation in prostate cancer mortality: pattern 1 ('no mortality decline'), characterised by a continued increase throughout the whole period; patterns 2 ('later mortality decline') and 3 ('earlier mortality decline') depict mortality declines, starting in the late and early 1990s, respectively. These clusters are also homogeneous regarding the variation in the prostate cancer mortality/incidence ratios, while they are heterogeneous with reference to the geographical region of the countries and distribution of the gross national income. Conclusion: We provide a general model for the description and interpretation of the trends in prostate cancer mortality worldwide, based on three main patterns. PMID:23660943

  4. Integrating Model-Based Transmission Reduction into a multi-tier architecture

    NASA Astrophysics Data System (ADS)

    Straub, J.

    A multi-tier architecture consists of numerous craft distributed across orbital, aerial, and surface tiers. Each tier is able to collect progressively greater levels of information. Generally, craft from lower-level tiers are deployed to a target of interest based on its identification by a higher-level craft. While the architecture promotes significant amounts of science being performed in parallel, this may overwhelm the computational and transmission capabilities of higher-tier craft and links (particularly the deep space link back to Earth). Because of this, a new paradigm in in-situ data processing is required. Model-based transmission reduction (MBTR) is such a paradigm. Under MBTR, each node (whether a single spacecraft in orbit around the Earth or another planet, or a member of a multi-tier network) is given an a priori model of the phenomenon that it is assigned to study. It performs activities to validate this model. If the model is found to be erroneous, corrective changes are identified, assessed to ensure they are significant enough to be passed on, and prioritized for transmission. A limited amount of verification data is sent with each MBTR assertion message to allow those that might rely on the data to validate the correct operation of the spacecraft and MBTR engine onboard. Integrating MBTR with a multi-tier framework creates an MBTR hierarchy. Higher levels of the MBTR hierarchy task lower levels with data collection and assessment tasks that are required to validate or correct elements of its model. A model of the expected conditions is sent to the lower-level craft, which then engages its own MBTR engine to validate or correct the model. This may include tasking a yet lower level of craft to perform activities. When the MBTR engine at a given level receives all of its component data (whether directly collected or from delegation), it randomly chooses some to validate (by reprocessing the validation data), performs analysis and sends its own results (v

  5. Novel therapeutic strategies for cardioprotection.

    PubMed

    Sluijter, Joost P G; Condorelli, Gianluigi; Davidson, Sean M; Engel, Felix B; Ferdinandy, Peter; Hausenloy, Derek J; Lecour, Sandrine; Madonna, Rosalinda; Ovize, Michel; Ruiz-Meana, Marisol; Schulz, Rainer; Van Laake, Linda W

    2014-10-01

    The morbidity and mortality from ischemic heart disease (IHD) remain significant worldwide. The treatment for acute myocardial infarction has improved over the past decades, including early reperfusion of occluded coronary arteries. Although it is essential to re-open the artery as soon as possible, paradoxically this leads to additional myocardial injury, called acute ischemia-reperfusion injury (IRI), for which currently no effective therapy is available. Therefore, novel therapeutic strategies are required to protect the heart from acute IRI in order to reduce myocardial infarction size, preserve cardiac function and improve clinical outcomes in patients with IHD. In this review article, we will first outline the pathophysiology of acute IRI and review promising therapeutic strategies for cardioprotection. These include novel aspects of mitochondrial function, epigenetics, circadian clocks, the immune system, microvesicles, growth factors, stem cell therapy and gene therapy. We discuss the therapeutic potential of these novel cardioprotective strategies in terms of pharmacological targeting and clinical application. PMID:24837132

  6. Two concepts of therapeutic optimism.

    PubMed

    Jansen, Lynn A

    2011-09-01

    Researchers and ethicists have long been concerned about the expectations for direct medical benefit expressed by participants in early phase clinical trials. Early work on the issue considered the possibility that participants misunderstand the purpose of clinical research or that they are misinformed about the prospects for medical benefit from these trials. Recently, however, attention has turned to the possibility that research participants are simply expressing optimism or hope about their participation in these trials. The ethical significance of this therapeutic optimism remains unclear. This paper argues that there are two distinct phenomena that can be associated with the term 'therapeutic optimism'-one is ethically benign and the other is potentially worrisome. Distinguishing these two phenomena is crucial for understanding the nature and ethical significance of therapeutic optimism. The failure to draw a distinction between these phenomena also helps to explain why different writers on the topic often speak past one another.

  7. Therapeutic cloning and reproductive liberty.

    PubMed

    Sparrow, Robert

    2009-04-01

    Concern for "reproductive liberty" suggests that decisions about embryos should normally be made by the persons who would be the genetic parents of the child that would be brought into existence if the embryo were brought to term. Therapeutic cloning would involve creating and destroying an embryo, which, if brought to term, would be the offspring of the genetic parents of the person undergoing therapy. I argue that central arguments in debates about parenthood and genetics therefore suggest that therapeutic cloning would be prima facie unethical unless it occurred with the consent of the parents of the person being cloned. Alternatively, if therapeutic cloning is thought to be legitimate, this undermines the case for some uses of reproductive cloning by implying that the genetic relation it establishes between clones and DNA donors does not carry the same moral weight as it does in cases of normal reproduction.

  8. Novel therapeutic strategies for cardioprotection.

    PubMed

    Sluijter, Joost P G; Condorelli, Gianluigi; Davidson, Sean M; Engel, Felix B; Ferdinandy, Peter; Hausenloy, Derek J; Lecour, Sandrine; Madonna, Rosalinda; Ovize, Michel; Ruiz-Meana, Marisol; Schulz, Rainer; Van Laake, Linda W

    2014-10-01

    The morbidity and mortality from ischemic heart disease (IHD) remain significant worldwide. The treatment for acute myocardial infarction has improved over the past decades, including early reperfusion of occluded coronary arteries. Although it is essential to re-open the artery as soon as possible, paradoxically this leads to additional myocardial injury, called acute ischemia-reperfusion injury (IRI), for which currently no effective therapy is available. Therefore, novel therapeutic strategies are required to protect the heart from acute IRI in order to reduce myocardial infarction size, preserve cardiac function and improve clinical outcomes in patients with IHD. In this review article, we will first outline the pathophysiology of acute IRI and review promising therapeutic strategies for cardioprotection. These include novel aspects of mitochondrial function, epigenetics, circadian clocks, the immune system, microvesicles, growth factors, stem cell therapy and gene therapy. We discuss the therapeutic potential of these novel cardioprotective strategies in terms of pharmacological targeting and clinical application.

  9. Enhancements of target detection using atmospheric correction preprocessing techniques in hyperspectral remote sensing

    NASA Astrophysics Data System (ADS)

    Yuen, Peter W. T.; Bishop, Gary J.

    2004-12-01

    This paper reports the result of a study on how atmospheric correction techniques (ACT) enhance target detection in hyperspectral remote sensing, using different sets of real data. Based on the data employed in this study, it has been shown that ACT can reduce the masking effect of the atmosphere and effectively improve spectral contrast. By using the standard K-means cluster-based unsupervised classifier, it has been shown that the accuracy of the classification obtained from the atmospherically corrected data is almost an order of magnitude better than that achieved using the radiance data. This enhancement is entirely due to the improved separability of the classes in the atmospherically corrected data. Moreover, it has been found that intrinsic information concerning the nature of the imaged surface can be retrieved from the atmospherically corrected data. This has been done to within an error of 5% by using the model-based atmospheric correction package ATCOR.
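    The comparison described (unsupervised K-means clustering on radiance versus atmospherically corrected data) can be sketched as follows; the two-class spectra and the multiplicative and additive 'atmosphere' are invented stand-ins, and the idealized correction below is not the ATCOR model.

```python
# Sketch: K-means clustering agreement on synthetic "radiance" vs. corrected data.
# The atmosphere here is a made-up band-dependent gain/offset plus per-pixel haze;
# real model-based atmospheric correction (e.g. ATCOR) is far more involved.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
bands, n_per_class = 50, 200
wl = np.linspace(0.4, 2.5, bands)                           # wavelengths, micrometres
refl_a = 0.30 + 0.05 * np.sin(3 * wl)                       # two target reflectance spectra
refl_b = 0.33 + 0.05 * np.sin(3 * wl + 0.4)
labels = np.repeat([0, 1], n_per_class)
refl = np.vstack([refl_a + 0.005 * rng.normal(size=(n_per_class, bands)),
                  refl_b + 0.005 * rng.normal(size=(n_per_class, bands))])

gain = 1.0 + 0.8 * np.exp(-wl)                              # fake band-dependent transmittance
offset = 0.2 * np.exp(-2 * wl)                              # fake path radiance
haze = 1.0 + 0.15 * rng.normal(size=(refl.shape[0], 1))     # per-pixel atmospheric variability
radiance = gain * haze * refl + offset
corrected = (radiance - offset) / (gain * haze)             # idealized "model-based" correction

def kmeans_agreement(X, y):
    pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    return max((pred == y).mean(), (pred != y).mean())      # account for label switching

print("radiance-domain agreement:   ", kmeans_agreement(radiance, labels))
print("reflectance-domain agreement:", kmeans_agreement(corrected, labels))
```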

  10. Development of therapeutic HPV vaccines

    PubMed Central

    Trimble, Cornelia L; Frazer, Ian H

    2011-01-01

    At least 15% of human malignant diseases are attributable to the consequences of persistent viral or bacterial infection. Chronic infection with oncogenic human papillomavirus (HPV) types is a necessary, but insufficient, cause in the development of more cancers than any other virus. Currently available prophylactic vaccines have no therapeutic effect for established infection or for disease. Early disease is characterised by tissue sequestration. However, because a proportion of intraepithelial HPV-associated disease undergoes immune-mediated regression, the development of immunotherapeutic strategies is an opportunity to determine proof-of-principle for therapeutic vaccines. In this Review, we discuss recent progress in this field and priorities for future clinical investigations. PMID:19796749

  11. SUICIDE, PSYCHIATRISTS AND THERAPEUTIC ABORTION.

    PubMed

    ROSENBERG, A J; SILVER, E

    1965-06-01

    Pressures for interruption of pregnancy by therapeutic abortion constantly increase, both for liberalization of laws and for interpreting existing law more broadly. There are wide variations and inconsistencies in psychiatric attitudes and practices about therapeutic abortion. Follow-up patient data are scant, but necessary. Results of questionnaires indicate that such data can be obtained, and convey the impression that patients seem to manage after pregnancy, regardless of outcome, much as they had before pregnancy. This study indicates that the incidence of suicide in pregnant women is approximately one-sixth that of the rate for non-pregnant women in comparable age groups, implying that perhaps pregnancy has a psychically protective role.

  12. [Cerebral oedema: new therapeutic ways].

    PubMed

    Quintard, H; Ichai, C

    2014-06-01

    Cerebral oedema (CO) after brain injury can arise in different ways. Vasogenic and cytotoxic oedema are the forms usually described, but osmotic and hydrostatic CO, secondary respectively to plasma hypotonia and to increased blood pressure, can also be encountered. The combination of these mechanisms can worsen injury. The consequences are major, leading quickly to death from intracranial hypertension and later to neuropsychic sequelae. Therapeutic control of this phenomenon is therefore essential, and osmotherapy is currently the only available option. A better understanding of the pathophysiological disturbances, in particular energy pathways (lactate), aquaporin function and inflammation, is leading to new therapeutic prospects. These promising experimental results now need to be confirmed by clinical data.

  13. Correction method for line extraction in vision measurement.

    PubMed

    Shao, Mingwei; Wei, Zhenzhong; Hu, Mengjie; Zhang, Guangjun

    2015-01-01

    Over-exposure and perspective distortion are two of the main factors underlying inaccurate feature extraction. First, based on Steger's method, we propose a method for correcting curvilinear structures (lines) extracted from over-exposed images. A new line model based on the Gaussian line profile is developed, and its description in the scale space is provided. The line position is analytically determined by the zero crossing of its first-order derivative, and the bias due to convolution with the normal Gaussian kernel function is eliminated on the basis of the related description. The model considers over-exposure features and is capable of detecting the line position in an over-exposed image. Simulations and experiments show that the proposed method is not significantly affected by the exposure level and is suitable for correcting lines extracted from an over-exposed image. In our experiments, the corrected result is found to be more precise than the uncorrected result by around 45.5%. Second, we analyze perspective distortion, which is inevitable during line extraction owing to the projective camera model. The perspective distortion can be rectified on the basis of the bias introduced as a function of related parameters. The properties of the proposed model and its application to vision measurement are discussed. In practice, the proposed model can be adopted to correct line extraction according to specific requirements by employing suitable parameters.
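    The core localization step, taking the line centre at the zero crossing of the first derivative of the Gaussian-smoothed profile, can be sketched in a few lines; the paper's over-exposure line model and analytic bias correction are not reproduced here, and the profile parameters are invented.

```python
# Sketch of the core localization step (in the spirit of Steger's method): the line
# centre is taken at the zero crossing of the first derivative of the Gaussian-
# smoothed profile, refined to sub-pixel precision by linear interpolation.
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(3)
x = np.arange(200, dtype=float)
center_true, width = 100.3, 4.0
profile = np.exp(-0.5 * ((x - center_true) / width) ** 2)   # Gaussian line profile
profile = np.minimum(profile, 0.8)                          # crude stand-in for over-exposure (clipping)
profile += 0.01 * rng.normal(size=x.size)

d1 = gaussian_filter1d(profile, sigma=3.0, order=1)         # smoothed first derivative

crossings = np.where((d1[:-1] > 0) & (d1[1:] <= 0))[0]      # +/- zero crossings of d1
k = crossings[np.argmin(np.abs(crossings - np.argmax(profile)))]
center_est = k + d1[k] / (d1[k] - d1[k + 1])                # sub-pixel refinement
print(f"true centre: {center_true:.2f}  estimated centre: {center_est:.2f}")
```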

  14. Correction Method for Line Extraction in Vision Measurement

    PubMed Central

    Shao, Mingwei; Wei, Zhenzhong; Hu, Mengjie; Zhang, Guangjun

    2015-01-01

    Over-exposure and perspective distortion are two of the main factors underlying inaccurate feature extraction. First, based on Steger’s method, we propose a method for correcting curvilinear structures (lines) extracted from over-exposed images. A new line model based on the Gaussian line profile is developed, and its description in the scale space is provided. The line position is analytically determined by the zero crossing of its first-order derivative, and the bias due to convolution with the normal Gaussian kernel function is eliminated on the basis of the related description. The model considers over-exposure features and is capable of detecting the line position in an over-exposed image. Simulations and experiments show that the proposed method is not significantly affected by the exposure level and is suitable for correcting lines extracted from an over-exposed image. In our experiments, the corrected result is found to be more precise than the uncorrected result by around 45.5%. Second, we analyze perspective distortion, which is inevitable during line extraction owing to the projective camera model. The perspective distortion can be rectified on the basis of the bias introduced as a function of related parameters. The properties of the proposed model and its application to vision measurement are discussed. In practice, the proposed model can be adopted to correct line extraction according to specific requirements by employing suitable parameters. PMID:25984762

  15. An Accurate Temperature Correction Model for Thermocouple Hygrometers

    PubMed Central

    Savage, Michael J.; Cass, Alfred; de Jager, James M.

    1982-01-01

    Numerous water relation studies have used thermocouple hygrometers routinely. However, the accurate temperature correction of hygrometer calibration curve slopes seems to have been largely neglected in both psychrometric and dewpoint techniques. In the case of thermocouple psychrometers, two temperature correction models are proposed, each based on measurement of the thermojunction radius and calculation of the theoretical voltage sensitivity to changes in water potential. The first model relies on calibration at a single temperature and the second at two temperatures. Both these models were more accurate than the temperature correction models currently in use for four psychrometers calibrated over a range of temperatures (15-38°C). The model based on calibration at two temperatures is superior to that based on only one calibration. The model proposed for dewpoint hygrometers is similar to that for psychrometers. It is based on the theoretical voltage sensitivity to changes in water potential. Comparison with empirical data from three dewpoint hygrometers calibrated at four different temperatures indicates that these instruments need only be calibrated at, e.g. 25°C, if the calibration slopes are corrected for temperature. PMID:16662241
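    As a hedged illustration of the calibrate-at-two-temperatures idea only (the paper derives its correction from the theoretical voltage sensitivity of the thermojunction, which is not reproduced here), a toy slope interpolation might look like the following; all numbers and the sign convention are invented.

```python
# Hedged toy, not the paper's model: it only shows the "calibrate at two
# temperatures, interpolate the slope" idea for a thermocouple psychrometer.
def slope_at(temp_c, cal1=(15.0, 0.47), cal2=(35.0, 0.62)):
    """Calibration slope (uV per MPa) linearly interpolated between two
    calibration temperatures (deg C); values are invented."""
    (t1, s1), (t2, s2) = cal1, cal2
    return s1 + (s2 - s1) * (temp_c - t1) / (t2 - t1)

def water_potential(output_uv, temp_c):
    """Water potential (MPa, negative by convention) from a psychrometer output."""
    return -output_uv / slope_at(temp_c)

for t in (20.0, 30.0):   # the same 5 uV reading interpreted at two sample temperatures
    print(f"{t:.0f} C: slope {slope_at(t):.3f} uV/MPa, psi = {water_potential(5.0, t):.2f} MPa")
```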

  16. An accurate temperature correction model for thermocouple hygrometers.

    PubMed

    Savage, M J; Cass, A; de Jager, J M

    1982-02-01

    Numerous water relation studies have used thermocouple hygrometers routinely. However, the accurate temperature correction of hygrometer calibration curve slopes seems to have been largely neglected in both psychrometric and dewpoint techniques. In the case of thermocouple psychrometers, two temperature correction models are proposed, each based on measurement of the thermojunction radius and calculation of the theoretical voltage sensitivity to changes in water potential. The first model relies on calibration at a single temperature and the second at two temperatures. Both these models were more accurate than the temperature correction models currently in use for four psychrometers calibrated over a range of temperatures (15-38 degrees C). The model based on calibration at two temperatures is superior to that based on only one calibration. The model proposed for dewpoint hygrometers is similar to that for psychrometers. It is based on the theoretical voltage sensitivity to changes in water potential. Comparison with empirical data from three dewpoint hygrometers calibrated at four different temperatures indicates that these instruments need only be calibrated at, e.g. 25 degrees C, if the calibration slopes are corrected for temperature.

  17. Correction method for line extraction in vision measurement.

    PubMed

    Shao, Mingwei; Wei, Zhenzhong; Hu, Mengjie; Zhang, Guangjun

    2015-01-01

    Over-exposure and perspective distortion are two of the main factors underlying inaccurate feature extraction. First, based on Steger's method, we propose a method for correcting curvilinear structures (lines) extracted from over-exposed images. A new line model based on the Gaussian line profile is developed, and its description in the scale space is provided. The line position is analytically determined by the zero crossing of its first-order derivative, and the bias due to convolution with the normal Gaussian kernel function is eliminated on the basis of the related description. The model considers over-exposure features and is capable of detecting the line position in an over-exposed image. Simulations and experiments show that the proposed method is not significantly affected by the exposure level and is suitable for correcting lines extracted from an over-exposed image. In our experiments, the corrected result is found to be more precise than the uncorrected result by around 45.5%. Second, we analyze perspective distortion, which is inevitable during line extraction owing to the projective camera model. The perspective distortion can be rectified on the basis of the bias introduced as a function of related parameters. The properties of the proposed model and its application to vision measurement are discussed. In practice, the proposed model can be adopted to correct line extraction according to specific requirements by employing suitable parameters. PMID:25984762

  18. An accurate temperature correction model for thermocouple hygrometers.

    PubMed

    Savage, M J; Cass, A; de Jager, J M

    1982-02-01

    Numerous water relation studies have used thermocouple hygrometers routinely. However, the accurate temperature correction of hygrometer calibration curve slopes seems to have been largely neglected in both psychrometric and dewpoint techniques. In the case of thermocouple psychrometers, two temperature correction models are proposed, each based on measurement of the thermojunction radius and calculation of the theoretical voltage sensitivity to changes in water potential. The first model relies on calibration at a single temperature and the second at two temperatures. Both these models were more accurate than the temperature correction models currently in use for four psychrometers calibrated over a range of temperatures (15-38 degrees C). The model based on calibration at two temperatures is superior to that based on only one calibration. The model proposed for dewpoint hygrometers is similar to that for psychrometers. It is based on the theoretical voltage sensitivity to changes in water potential. Comparison with empirical data from three dewpoint hygrometers calibrated at four different temperatures indicates that these instruments need only be calibrated at, e.g. 25 degrees C, if the calibration slopes are corrected for temperature. PMID:16662241

  19. 21 CFR 890.5975 - Therapeutic vibrator.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    Title 21 (Food and Drugs), Medical Devices, Physical Medicine Devices, Physical Medicine Therapeutic Devices, § 890.5975 Therapeutic vibrator. (a) Identification. A therapeutic vibrator is an electrically powered device intended for...

  20. Process proximity correction using an automated software tool

    NASA Astrophysics Data System (ADS)

    Maurer, Wilhelm; Dolainsky, Christoph; Thiele, Joerg; Friedrich, Christoph M.; Karakatsanis, Paul

    1998-06-01

    The pattern transfer process from the chip layout data to the structures on the finished wafer consists of many process steps. Although desired, none of these steps is linear in all aspects of the pattern transfer. Approaching the process limits due to the ever-shrinking linewidth, the non-linearities of the pattern transfer clearly show up. This means that one cannot continue the practice of summarizing all process influences into one bias between the data used for mask making and the final chip structure. The correction of process non-linearities is a necessity. This correction is usually called optical proximity correction (OPC), although not all effects intended for correction are of optical origin and/or not all these are effects of the neighborhood. We therefore propose to use the term PPC (process proximity correction). This paper reports our experiences with the application of OPTISSIMO, a software tool developed to automatically perform OPC/PPC for full-chip designs. First, we provide a definition of PPC, which in our view has to correct all non-linearities of the pattern transfer process from layout data to the final electrically measured structures. Then, the strategy of the OPC/PPC tool OPTISSIMO, a software package to perform PPC based on process simulation, is discussed. We focus on the data handling strategy and on the process modeling of the tool under evaluation. It is shown that full-chip OPC/PPC is practicable using a well-designed hierarchy management system combined with a pattern library. Finally, it is demonstrated that a model-based OPC/PPC tool is by definition a process simulation tool that is able to perform all simulation tasks (such as defect printability) with reasonable accuracy.
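    The model-based correction principle, iteratively biasing the layout until a process simulation of the biased data matches the design target, can be illustrated with a deliberately crude one-dimensional toy; the 'process model' and all coefficients below are invented and have nothing to do with OPTISSIMO.

```python
# Deliberately crude 1-D illustration of model-based correction (not OPTISSIMO):
# a made-up "process model" prints each feature narrower when its neighbouring
# space is small; the loop iteratively biases the mask until the simulated printed
# widths hit the design targets. All coefficients are invented.
def printed_widths(mask_widths, pitch=400.0):
    """Fake process simulation: printed width = mask width minus a proximity loss."""
    printed = []
    for w in mask_widths:
        space = pitch - w                    # space to the fixed-pitch neighbour, nm
        printed.append(w - (20.0 + 4000.0 / space))
    return printed

targets = [120.0, 180.0, 250.0]              # design widths, nm
mask = list(targets)                         # start from the drawn layout
for _ in range(20):                          # fixed-point correction iterations
    printed = printed_widths(mask)
    mask = [m + (t - p) for m, t, p in zip(mask, targets, printed)]

for t, m, p in zip(targets, mask, printed_widths(mask)):
    print(f"target {t:6.1f} nm   corrected mask {m:6.1f} nm   printed {p:6.1f} nm")
```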

  1. Novel therapeutic approaches for haemophilia.

    PubMed

    Shetty, S; Ghosh, K

    2015-03-01

    The major therapy for haemophilia is plasma-derived or recombinant clotting factors, which are evolving steadily to increase potency, stability and half-life. Research in the area of haemophilia therapeutics, however, is not restricted only to modifications in the recombinant products, but alternate therapeutic strategies are being developed which are in different phases of experimental and clinical trials. This chapter reviews the diverse molecular innovations which are being developed for alternate therapeutic approaches in haemophilia. The data are mainly extracted from the literature and conference abstracts. Some of the novel therapeutic approaches include inhibition of anticoagulant pathway factors (activated protein C, antithrombin, tissue factor pathway inhibitor) by monoclonal antibodies, peptide inhibitors, DNA or RNA aptamers, use of variant coagulation factors (factor Xa, factor Va) which are more resistant to inactivation or enzymatically more active, and antibody-mediated therapy including a humanized anti-factor IXa/X bispecific antibody mimicking factor VIII. Other approaches include nonsense mutation suppression, induction of prothrombotic microparticles by P-selectin-immunoglobulin chimeras, and suppression of fibrinolytic potential either by antifibrinolytics or by the use of mutant molecules of fibrinolytic inhibitors. A few products are proposed as 'stand-alone' treatments for haemophilia, while others can be used as adjuvant therapies to recombinant factors with the aim of reducing the amount of factor intake. All efforts are underway to produce an alternate, novel drug for haemophilia which will have an increased half-life and be subcutaneously injectable, non-immunogenic and effective both in the presence and absence of inhibitors.

  2. The Diversity of Therapeutic Change.

    ERIC Educational Resources Information Center

    Tyler, Forrest B.; Gatz, Margaret

    A study of high school group counseling examined diversity of therapeutic outcome in relation to (a) subject characteristics (race, sex, and exemplary-marginal school status) and (b) counselor training. Counseling produced differential changes in self-efficacy, trust, and coping approach, for different subject groups, with trained counselors…

  3. Overview of Therapeutic Drug Monitoring

    PubMed Central

    Lee, Min-Ho

    2009-01-01

    Therapeutic drug monitoring (TDM) is the clinical practice of measuring specific drugs at designated intervals to maintain a constant concentration in a patient's bloodstream, thereby optimizing individual dosage regimens. It is unnecessary to employ TDM for the majority of medications, and it is used mainly for monitoring drugs with narrow therapeutic ranges, drugs with marked pharmacokinetic variability, medications for which target concentrations are difficult to monitor, and drugs known to cause therapeutic and adverse effects. The process of TDM is predicated on the assumption that there is a definable relationship between dose and plasma or blood drug concentration, and between concentration and therapeutic effects. TDM begins when the drug is first prescribed, and involves determining an initial dosage regimen appropriate for the clinical condition and such patient characteristics as age, weight, organ function, and concomitant drug therapy. When interpreting concentration measurements, factors that need to be considered include the sampling time in relation to drug dose, dosage history, patient response, and the desired medicinal targets. The goal of TDM is to use appropriate concentrations of difficult-to-manage medications to optimize clinical outcomes in patients in various clinical situations. PMID:19270474

  4. Therapeutic role of dietary fibre.

    PubMed Central

    Hunt, R.; Fedorak, R.; Frohlich, J.; McLennan, C.; Pavilanis, A.

    1993-01-01

    The current status of dietary fibre and fibre supplements in health and disease is reported, and the components of dietary fibre and its respective mechanical and metabolic effects with emphasis on its therapeutic potential are reviewed. Practical management guidelines are provided to help physicians encourage patients identified as having fibre deficiency to increase dietary fibre intake to the recommended level. PMID:8388284

  5. [Therapeutic education, approaches in psychiatry].

    PubMed

    Jouet, Emmanuelle

    2011-01-01

    Therapeutic patient education offers people suffering from chronic illnesses new therapies as well as an appropriation of knowledge of the disease. It has a special place in psychiatric nursing care provision. However, programmes offered by nursing teams or pharmaceutical laboratories are struggling to define themselves.

  6. Scenario Writing: A Therapeutic Application.

    ERIC Educational Resources Information Center

    Haddock, Billy D.

    1989-01-01

    Introduces scenario writing as useful therapeutic technique. Presents case study of woman in midst of divorce and custody fight to illustrate context in which technique was applied. Suggests additional applications. Concludes that good response is more likely for clients who possess good writing skills although other clients may use their own…

  7. A social discounting model based on Tsallis’ statistics

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2010-09-01

    Social decision making (e.g. social discounting and social preferences) has been attracting attention in economics, econophysics, social physics, behavioral psychology, and neuroeconomics. This paper proposes a novel social discounting model based on the deformed algebra developed in the Tsallis’ non-extensive thermostatistics. Furthermore, it is suggested that this model can be utilized to quantify the degree of consistency in social discounting in humans and analyze the relationships between behavioral tendencies in social discounting and other-regarding economic decision making under game-theoretic conditions. Future directions in the application of the model to studies in econophysics, neuroeconomics, and social physics, as well as real-world problems such as the supply of live organ donations, are discussed.
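    A q-exponential discount function of the kind used in such Tsallis-statistics-based models can be sketched as follows; the parameter values are illustrative and are not taken from the paper.

```python
# Sketch of a q-exponential (Tsallis-type) discount function of the form
# V(N) = V0 / exp_q(k N), where exp_q is the deformed exponential and N is the
# social distance. Parameter values below are illustrative assumptions.
import numpy as np

def q_exponential(x, q):
    """Tsallis deformed exponential exp_q(x); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return np.exp(x)
    return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

def social_discount(v0, k, n, q):
    """Discounted value of an outcome of size v0 at social distance n."""
    return v0 / q_exponential(k * n, q)

n = np.arange(0, 101, 20)
for q in (0.0, 0.5, 1.0):   # q = 0 gives hyperbolic, q = 1 exponential discounting
    values = social_discount(100.0, 0.05, n, q)
    print(f"q={q:.1f}:", np.round(values, 1))
```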

  8. Model-based parameterisation of a hydrocyclone air-core

    PubMed

    Podd; Schlaberg; Hoyle

    2000-03-01

    An important metric for the accurate control of a hydrocyclone is the diameter of its air-core. Ultrasonic data from a 16-transducer, 1.5 MHz pulse-echo tomographic system are analysed to determine the variation of the air-core diameter with various operating conditions. The back-projection image reconstruction method is not accurate enough for this task. Sub-millimetre accuracy is obtained, however, by applying a combination of signal processing and model-based reconstruction, using the fact that there is a small variation in the air-core boundary position. The findings correspond well to the results obtained from X-ray and electrical resistance modalities.
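    A rough sketch of how pulse-echo times from a ring of transducers can be turned into an air-core diameter estimate is given below; the geometry, speed of sound, noise level and the algebraic circle fit are all assumptions for illustration, not the signal processing or model-based reconstruction actually used in the study.

```python
# Sketch (assumptions throughout): 16 wall-mounted transducers fire radially inward;
# each pulse-echo time gives the distance to the air-core boundary along that radius.
# The boundary points are then fit with an algebraic least-squares circle to recover
# the air-core diameter. Geometry, speed of sound and noise level are invented.
import numpy as np

rng = np.random.default_rng(4)
R_wall = 0.050                                   # hydrocyclone radius, m
r_core, cx, cy = 0.008, 0.0012, -0.0008          # "true" air core (slightly off-centre)
c_sound = 1480.0                                 # assumed speed of sound in the slurry, m/s

theta = np.linspace(0, 2 * np.pi, 16, endpoint=False)
ux, uy = np.cos(theta), np.sin(theta)            # inward beam directions (towards the axis)

# Distance from the wall to the core boundary along each radial beam: solve
# |wall_point - t*u - centre| = r_core for the smaller root t.
px, py = R_wall * ux - cx, R_wall * uy - cy
b = px * ux + py * uy
dist = b - np.sqrt(b**2 - (px**2 + py**2 - r_core**2))
tof = 2 * dist / c_sound + 0.05e-6 * rng.normal(size=dist.size)   # two-way time + jitter

# Convert echo times back to boundary points and fit x^2 + y^2 + Dx + Ey + F = 0.
d_est = 0.5 * tof * c_sound
bx, by = (R_wall - d_est) * ux, (R_wall - d_est) * uy
A = np.column_stack([bx, by, np.ones_like(bx)])
D, E, F = np.linalg.lstsq(A, -(bx**2 + by**2), rcond=None)[0]
r_est = np.sqrt(D**2 / 4 + E**2 / 4 - F)
print(f"true diameter {2*r_core*1e3:.2f} mm, estimated {2*r_est*1e3:.2f} mm")
```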

  9. On the Performance of Stochastic Model-Based Image Segmentation

    NASA Astrophysics Data System (ADS)

    Lei, Tianhu; Sewchand, Wilfred

    1989-11-01

    A new stochastic model-based image segmentation technique for X-ray CT images has been developed and extended to more general nondiffraction CT images, which include MRI, SPECT, and certain types of ultrasound images [1,2]. The nondiffraction CT image is modeled by a Finite Normal Mixture. The technique utilizes the information theoretic criterion to detect the number of region images, uses the Expectation-Maximization algorithm to estimate the parameters of the image, and uses the Bayesian classifier to segment the observed image. How does this technique over- or under-estimate the number of region images? What is the probability of error in the segmentation it produces? This paper addresses these two problems and is a continuation of [1,2].
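    The pipeline (finite normal mixture, information-theoretic model selection, EM estimation, Bayesian classification) can be sketched on synthetic intensities; here scikit-learn's GaussianMixture and BIC stand in for the paper's estimator and criterion, and the three-region intensities are invented.

```python
# Sketch of the "finite normal mixture + information criterion + EM + Bayes
# classification" pipeline on synthetic intensities (not the paper's data).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
# Synthetic "CT" intensities drawn from three regions (means/stds are invented)
intens = np.concatenate([rng.normal(40, 6, 3000),
                         rng.normal(90, 8, 4000),
                         rng.normal(150, 10, 3000)]).reshape(-1, 1)

# Detect the number of region images with an information criterion (BIC here)
fits = {k: GaussianMixture(n_components=k, random_state=0).fit(intens) for k in range(1, 6)}
k_best = min(fits, key=lambda k: fits[k].bic(intens))
print("selected number of regions:", k_best)            # expected: 3

# Bayesian (maximum posterior) segmentation of each voxel intensity
labels = fits[k_best].predict(intens)
for k in range(k_best):
    m = fits[k_best].means_[k, 0]
    print(f"region {k}: mean intensity {m:6.1f}, voxels {np.sum(labels == k)}")
```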

  10. Model-based approach for elevator performance estimation

    NASA Astrophysics Data System (ADS)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Finally, five elevator key performance indicators are calculated based on the estimated car acceleration. The proposed procedure is experimentally evaluated, by comparing the key performance indicators calculated based on the estimated car acceleration and the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with the sliding mode observer.
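    A minimal sketch of the observer idea is given below: a linear Kalman filter with a constant-acceleration state model estimates car acceleration from a noisy position signal standing in for the encoder. The machine-control signature used in the paper is omitted, and the motion profile and noise levels are invented.

```python
# Minimal observer sketch: Kalman filter with a constant-acceleration model
# estimating acceleration from noisy position measurements. Not the paper's
# elevator model; all noise levels and the motion profile are assumptions.
import numpy as np

dt = 0.01
t = np.arange(0, 6, dt)
acc_true = np.where(t < 2, 0.6, np.where(t < 4, 0.0, -0.6))       # simple up-run profile
vel_true = np.cumsum(acc_true) * dt
pos_true = np.cumsum(vel_true) * dt
pos_meas = pos_true + 1e-4 * np.random.default_rng(6).normal(size=t.size)

F = np.array([[1, dt, 0.5 * dt**2],
              [0, 1, dt],
              [0, 0, 1]])                  # state: position, velocity, acceleration
H = np.array([[1.0, 0.0, 0.0]])            # only position (encoder) is measured
Q = np.diag([1e-10, 1e-8, 1e-4])           # process noise (assumed)
R = np.array([[1e-8]])                     # encoder noise (assumed)

x = np.zeros(3)
P = np.eye(3)
acc_est = []
for z in pos_meas:
    x = F @ x                              # predict
    P = F @ P @ F.T + Q
    y = z - H @ x                          # update with the new measurement
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(3) - K @ H) @ P
    acc_est.append(x[2])

print("peak estimated acceleration:", round(max(np.abs(acc_est)), 3), "m/s^2 (true 0.6)")
```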

  11. The ubiquity of model-based reinforcement learning.

    PubMed

    Doll, Bradley B; Simon, Dylan A; Daw, Nathaniel D

    2012-12-01

    The reward prediction error (RPE) theory of dopamine (DA) function has enjoyed great success in the neuroscience of learning and decision-making. This theory is derived from model-free reinforcement learning (RL), in which choices are made simply on the basis of previously realized rewards. Recently, attention has turned to correlates of more flexible, albeit computationally complex, model-based methods in the brain. These methods are distinguished from model-free learning by their evaluation of candidate actions using expected future outcomes according to a world model. Puzzlingly, signatures from these computations seem to be pervasive in the very same regions previously thought to support model-free learning. Here, we review recent behavioral and neural evidence about these two systems, in an attempt to reconcile their enigmatic cohabitation in the brain.
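    The distinction can be made concrete with a toy: model-based evaluation plans with a known world model (value iteration), while model-free learning (Q-learning) caches values from sampled rewards; both converge to similar action values on the invented two-state MDP below.

```python
# Toy contrast between the two systems discussed: model-based evaluation plans with
# a known world model, model-free learning caches values from experience. The
# 2-state, 2-action MDP and all parameters are invented for illustration.
import numpy as np

gamma = 0.9
# transitions[(state, action)] = (next_state, reward)
transitions = {(0, 0): (0, 0.0), (0, 1): (1, 0.0),
               (1, 0): (0, 0.0), (1, 1): (1, 1.0)}

# Model-based: value iteration over the known model
Q_mb = np.zeros((2, 2))
for _ in range(200):
    for (s, a), (s2, r) in transitions.items():
        Q_mb[s, a] = r + gamma * Q_mb[s2].max()

# Model-free: Q-learning from sampled experience, with no later access to the model
rng = np.random.default_rng(7)
Q_mf = np.zeros((2, 2))
s = 0
for _ in range(20000):
    a = rng.integers(2) if rng.random() < 0.2 else int(Q_mf[s].argmax())  # epsilon-greedy
    s2, r = transitions[(s, a)]
    Q_mf[s, a] += 0.1 * (r + gamma * Q_mf[s2].max() - Q_mf[s, a])
    s = s2

print("model-based Q:\n", Q_mb.round(2))
print("model-free  Q:\n", Q_mf.round(2))
```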

  12. Performability modeling based on real data: A case study

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.

    1988-01-01

    Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.
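    The semi-Markov reward idea can be sketched in a few lines: given the embedded transition probabilities, the mean holding time of each state (which need not be exponentially distributed), and a per-state reward rate, the long-run reward rate is the holding-time-weighted average of rewards over the stationary distribution of the embedded chain. The three-state model and all numbers below are invented, not the measured multiprocessor data of the paper.

```python
# Back-of-envelope sketch of a semi-Markov reward model: long-run reward rate is
# sum_i pi_i * E[H_i] * r_i / sum_i pi_i * E[H_i], with pi the stationary
# distribution of the embedded chain. States and numbers are invented.
import numpy as np

# Embedded transition probabilities between states: normal, error, recovery
P = np.array([[0.00, 1.00, 0.00],
              [0.60, 0.00, 0.40],
              [1.00, 0.00, 0.00]])
mean_hold = np.array([3600.0, 5.0, 120.0])     # mean holding times, s (any distribution)
reward = np.array([1.0, 0.2, 0.0])             # service-rate-style reward per state

# Stationary distribution of the embedded chain: solve pi P = pi with sum(pi) = 1
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]

weights = pi * mean_hold
performability = (weights * reward).sum() / weights.sum()
print("fraction of time per state:", np.round(weights / weights.sum(), 4))
print("long-run reward rate:", round(performability, 4))
```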

  13. Performability modeling based on real data: A case study

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.

    1987-01-01

    Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.

  14. CDMBE: A Case Description Model Based on Evidence.

    PubMed

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

    By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suited to the continental law system, is proposed to describe criminal cases. The logic of the model adopts credibility-based logical reasoning and quantifies evidence-based reasoning from the available evidence. To be consistent with practical inference rules, five types of relationships and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a figure, and the results calculated from CDMBE are in line with those from the Bayesian model. PMID:26421006

  15. The algorithmic anatomy of model-based evaluation

    PubMed Central

    Daw, Nathaniel D.; Dayan, Peter

    2014-01-01

    Despite many debates in the first half of the twentieth century, it is now largely a truism that humans and other animals build models of their environments and use them for prediction and control. However, model-based (MB) reasoning presents severe computational challenges. Alternative, computationally simpler, model-free (MF) schemes have been suggested in the reinforcement learning literature, and have afforded influential accounts of behavioural and neural data. Here, we study the realization of MB calculations, and the ways that this might be woven together with MF values and evaluation methods. There are as yet mostly only hints in the literature as to the resulting tapestry, so we offer more preview than review. PMID:25267820

  16. Model-based estimation for dynamic cardiac studies using ECT

    SciTech Connect

    Chiao, P.C.; Rogers, W.L.; Clinthorne, N.H.; Fessler, J.A.; Hero, A.O. . Div. of Nuclear Medicine)

    1994-06-01

    In this paper, the authors develop a strategy for joint estimation of physiological parameters and myocardial boundaries using ECT (Emission Computed Tomography). The authors construct an observation model to relate parameters of interest to the projection data and to account for limited ECT system resolution and measurement noise. The authors then use a maximum likelihood (ML) estimator to jointly estimate all the parameters directly from the projection data without reconstruction of intermediate images. The authors also simulate myocardial perfusion studies based on a simplified heart model to evaluate the performance of the model-based joint ML estimator and compare this performance to the Cramer-Rao lower bound. Finally, model assumptions and potential uses of the joint estimation strategy are discussed.

  17. Model-based advanced process control of coagulation.

    PubMed

    Baxter, C W; Shariff, R; Stanley, S J; Smith, D W; Zhang, Q; Saumer, E D

    2002-01-01

    The drinking water treatment industry has seen a recent increase in the use of artificial neural networks (ANNs) for process modelling and offline process control tools and applications. While conceptual frameworks for integrating the ANN technology into the real-time control of complex treatment processes have been proposed, actual working systems have yet to be developed. This paper presents development and application of an ANN model-based advanced process control system for the coagulation process at a pilot-scale water treatment facility in Edmonton, Alberta, Canada. The system was successfully used to maintain a user-defined set point for effluent quality, by automatically varying operating conditions in response to changes in influent water quality. This new technology has the potential to realize significant operational cost saving for utilities when applied in full-scale applications.

  18. Qualitative model-based diagnostics for rocket systems

    NASA Technical Reports Server (NTRS)

    Maul, William; Meyer, Claudia; Jankovsky, Amy; Fulton, Christopher

    1993-01-01

    A diagnostic software package is currently being developed at NASA LeRC that utilizes qualitative model-based reasoning techniques. These techniques can provide diagnostic information about the operational condition of the modeled rocket engine system or subsystem. The diagnostic package combines a qualitative model solver with a constraint suspension algorithm. The constraint suspension algorithm directs the solver's operation to provide valuable fault isolation information about the modeled system. A qualitative model of the Space Shuttle Main Engine's oxidizer supply components was generated. A diagnostic application based on this qualitative model was constructed to process four test cases: three numerical simulations and one actual test firing. The diagnostic tool's fault isolation output compared favorably with the input fault condition.
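    Constraint suspension itself can be illustrated on the classic textbook 'polybox' circuit rather than the engine models used here: a component is a fault candidate if suspending its constraint allows the remaining constraints to be satisfied for some value of its output. The inputs, observations and component names below are the standard toy example, not the Space Shuttle Main Engine oxidizer-supply model.

```python
# Toy constraint-suspension diagnosis on the classic "polybox" circuit, not the
# SSME oxidizer-supply model: multipliers M1..M3 feed adders A1 and A2. A component
# is a fault candidate if suspending its constraint lets the remaining constraints
# be satisfied for some value of its output. All values are the textbook toy example.
from itertools import product

a, b, c, d, e = 3.0, 2.0, 2.0, 3.0, 3.0      # observed inputs
f_obs, g_obs = 10.0, 12.0                    # observed outputs (f should be 12, so a fault exists)

def consistent(suspended):
    """True if all constraints except the suspended one can hold simultaneously."""
    # Candidate values for the intermediate signals x, y, z. If a multiplier is
    # suspended, its output is inferred from the adder observations instead.
    x_vals = [f_obs - b * d] if suspended == "M1" else [a * c]
    y_vals = [f_obs - a * c, g_obs - c * e] if suspended == "M2" else [b * d]
    z_vals = [g_obs - b * d] if suspended == "M3" else [c * e]
    for x, y, z in product(x_vals, y_vals, z_vals):
        ok_a1 = suspended == "A1" or abs((x + y) - f_obs) < 1e-9
        ok_a2 = suspended == "A2" or abs((y + z) - g_obs) < 1e-9
        if ok_a1 and ok_a2:
            return True
    return False

candidates = [comp for comp in ["M1", "M2", "M3", "A1", "A2"] if consistent(comp)]
print("fault candidates:", candidates)        # expected: ['M1', 'A1']
```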

  19. [Model-based biofuels system analysis: a review].

    PubMed

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  20. Model-Based Systems Engineering Pilot Program at NASA Langley

    NASA Technical Reports Server (NTRS)

    Vipavetz, Kevin G.; Murphy, Douglas G.; Infeld, Samatha I.

    2012-01-01

    NASA Langley Research Center conducted a pilot program to evaluate the benefits of using a Model-Based Systems Engineering (MBSE) approach during the early phase of the Materials International Space Station Experiment-X (MISSE-X) project. The goal of the pilot was to leverage MBSE tools and methods, including the Systems Modeling Language (SysML), to understand the net gain of utilizing this approach on a moderate size flight project. The System Requirements Review (SRR) success criteria were used to guide the work products desired from the pilot. This paper discusses the pilot project implementation, provides SysML model examples, identifies lessons learned, and describes plans for further use on MBSE on MISSE-X.

  1. Automated Decomposition of Model-based Learning Problems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  2. CDMBE: A Case Description Model Based on Evidence

    PubMed Central

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

    By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suited to the continental law system, is proposed to describe criminal cases. The logic of the model adopts credibility-based logical reasoning and quantifies evidence-based reasoning from the available evidence. To be consistent with practical inference rules, five types of relationships and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a figure, and the results calculated from CDMBE are in line with those from the Bayesian model. PMID:26421006

  3. Correctness issues in workflow management

    NASA Astrophysics Data System (ADS)

    Kamath, Mohan; Ramamritham, Krithi

    1996-12-01

    Workflow management is a technique to integrate and automate the execution of steps that comprise a complex process, e.g., a business process. Workflow management systems (WFMSs) primarily evolved from industry to cater to the growing demand for office automation tools among businesses. Coincidentally, database researchers developed several extended transaction models to handle similar applications. Although the goals of both the communities were the same, the issues they focused on were different. The workflow community primarily focused on modelling aspects to accurately capture the data and control flow requirements between the steps that comprise a workflow, while the database community focused on correctness aspects to ensure data consistency of sub-transactions that comprise a transaction. However, we now see a confluence of some of the ideas, with additional features being gradually offered by WFMSs. This paper provides an overview of correctness in workflow management. Correctness is an important aspect of WFMSs and a proper understanding of the available concepts and techniques by WFMS developers and workflow designers will help in building workflows that are flexible enough to capture the requirements of real world applications and robust enough to provide the necessary correctness and reliability properties. We first enumerate the correctness issues that have to be considered to ensure data consistency. Then we survey techniques that have been proposed or are being used in WFMSs for ensuring correctness of workflows. These techniques emerge from the areas of workflow management, extended transaction models, multidatabases and transactional workflows. Finally, we present some open issues related to correctness of workflows in the presence of concurrency and failures.

  4. Toward a model-based cognitive neuroscience of mind wandering.

    PubMed

    Hawkins, G E; Mittner, M; Boekel, W; Heathcote, A; Forstmann, B U

    2015-12-01

    People often "mind wander" during everyday tasks, temporarily losing track of time, place, or current task goals. In laboratory-based tasks, mind wandering is often associated with performance decrements in behavioral variables and changes in neural recordings. Such empirical associations provide descriptive accounts of mind wandering - how it affects ongoing task performance - but fail to provide true explanatory accounts - why it affects task performance. In this perspectives paper, we consider mind wandering as a neural state or process that affects the parameters of quantitative cognitive process models, which in turn affect observed behavioral performance. Our approach thus uses cognitive process models to bridge the explanatory divide between neural and behavioral data. We provide an overview of two general frameworks for developing a model-based cognitive neuroscience of mind wandering. The first approach uses neural data to segment observed performance into a discrete mixture of latent task-related and task-unrelated states, and the second regresses single-trial measures of neural activity onto structured trial-by-trial variation in the parameters of cognitive process models. We discuss the relative merits of the two approaches, and the research questions they can answer, and highlight that both approaches allow neural data to provide additional constraint on the parameters of cognitive models, which will lead to a more precise account of the effect of mind wandering on brain and behavior. We conclude by summarizing prospects for mind wandering as conceived within a model-based cognitive neuroscience framework, highlighting the opportunities for its continued study and the benefits that arise from using well-developed quantitative techniques to study abstract theoretical constructs.

  5. Neural mass model-based tracking of anesthetic brain states.

    PubMed

    Kuhlmann, Levin; Freestone, Dean R; Manton, Jonathan H; Heyse, Bjorn; Vereecke, Hugo E M; Lipping, Tarmo; Struys, Michel M R F; Liley, David T J

    2016-06-01

    Neural mass model-based tracking of brain states from electroencephalographic signals holds the promise of simultaneously tracking brain states while inferring underlying physiological changes in various neuroscientific and clinical applications. Here, neural mass model-based tracking of brain states using the unscented Kalman filter applied to estimate parameters of the Jansen-Rit cortical population model is evaluated through the application of propofol-based anesthetic state monitoring. In particular, 15 subjects underwent propofol anesthesia induction from awake to anesthetised while behavioral responsiveness was monitored and frontal electroencephalographic signals were recorded. The unscented Kalman filter Jansen-Rit model approach applied to frontal electroencephalography achieved reasonable testing performance for classification of the anesthetic brain state (sensitivity: 0.51; chance sensitivity: 0.17; nearest neighbor sensitivity 0.75) when compared to approaches based on linear (autoregressive moving average) modeling (sensitivity 0.58; nearest neighbor sensitivity: 0.91) and a high performing standard depth of anesthesia monitoring measure, Higuchi Fractal Dimension (sensitivity: 0.50; nearest neighbor sensitivity: 0.88). Moreover, it was found that the unscented Kalman filter based parameter estimates of the inhibitory postsynaptic potential amplitude varied in the physiologically expected direction with increases in propofol concentration, while the estimates of the inhibitory postsynaptic potential rate constant did not. These results combined with analysis of monotonicity of parameter estimates, error analysis of parameter estimates, and observability analysis of the Jansen-Rit model, along with considerations of extensions of the Jansen-Rit model, suggests that the Jansen-Rit model combined with unscented Kalman filtering provides a valuable reference point for future real-time brain state tracking studies. This is especially true for studies of

  6. Model-Based Engineering and Manufacturing CAD/CAM Benchmark.

    SciTech Connect

    Domm, T.C.; Underwood, R.S.

    1999-10-13

    The Benchmark Project was created from a desire to identify best practices and improve the overall efficiency and performance of the Y-12 Plant's systems and personnel supporting the manufacturing mission. The mission of the benchmark team was to search out industry leaders in manufacturing and evaluate their engineering practices and processes to determine direction and focus for Y-12 modernization efforts. The companies visited included several large established companies and a new, small, high-tech machining firm. As a result of this effort, changes are recommended that will enable Y-12 to become a more modern, responsive, cost-effective manufacturing facility capable of supporting the needs of the Nuclear Weapons Complex (NWC) into the 21st century. The benchmark team identified key areas of interest, both focused and general. The focus areas included Human Resources, Information Management, Manufacturing Software Tools, and Standards/Policies and Practices. Areas of general interest included Infrastructure, Computer Platforms and Networking, and Organizational Structure. The results of this benchmark showed that all companies are moving in the direction of model-based engineering and manufacturing. There was evidence that many companies are trying to grasp how to manage current and legacy data. In terms of engineering design software tools, the companies contacted were somewhere between 3-D solid modeling and surfaced wire-frame models. The manufacturing computer tools were varied, with most companies using more than one software product to generate machining data and none currently performing model-based manufacturing (MBM) from a common model. The majority of companies were closer to identifying or using a single computer-aided design (CAD) system than a single computer-aided manufacturing (CAM) system. The Internet was a technology that all companies were looking to either transport information more easily throughout the corporation or as a conduit for

  7. Level 2 Therapeutic Model site.

    PubMed

    Spears, Brad; Sanchez, David; Bishop, Jane; Rogers, Sharon; DeJong, Judith A

    2006-01-01

    L2, one of the original sites first funded under the Therapeutic Residential Model Initiative in 2001-2002, is operated as a peripheral dormitory. This dormitory cares for 185 boys and girls in grades 1-12 who attend local public schools. L2 presented an outstanding proposal which identified gaps in services and presented a reasonable budget to address those gaps by adding additional mental health services and increasing the number of residential and recreation staff. With only minor modifications to this budget, the site efficiently and effectively implemented the strategies it had proposed and utilized evaluation feedback to fine-tune systems and maximize positive outcomes. The Therapeutic Residential Model funds enabled the site to move from a functional dormitory to a therapeutic residential situation where the needs of students are assessed and addressed. Outcome indicators in spring 2002, 2003, 2004, and 2005 showed impacts in a number of areas when compared with the baseline year of 2000-2001: Retention of students steadily increased going from 40.7% in 2000-2001 to 68.4% in 2004-2005; 75 students graduated from high school during the four Therapeutic Residential Model years, compared with 41 in the preceding four years; Academic Proficiency and ACT scores improved significantly; Thirty-day cigarette use dropped from 62% in spring 2001 to 38% in spring 2005 among 7th and 8th graders, from 58% to 33% among 9th and 10th graders, and from 72% to 29% among 11th and 12th graders; Alienation indices showed an increase in feelings of inclusion and a decrease in lack of meaning. This site is an outstanding example of what can be done with a well-designed and responsibly implemented Therapeutic Model Program, and the measurable impacts which can result from such strategic use of resources. PMID:17602401

  8. New Therapeutic Approaches to Modulate and Correct Cystic Fibrosis Transmembrane Conductance Regulator.

    PubMed

    Ong, Thida; Ramsey, Bonnie W

    2016-08-01

    Cystic fibrosis transmembrane conductance regulator (CFTR) modulators are clinically available personalized medicines approved for some individuals with cystic fibrosis (CF) to target the underlying defect of disease. This review summarizes strategies used to develop CFTR modulators as therapies that improve function and availability of CFTR protein. Lessons learned from dissemination of ivacaftor across the CF population responsive to this therapy and future approaches to predict and monitor treatment response of CFTR modulators are discussed. The goal remains to expand patient-centered and personalized therapy to all patients with CF, ultimately improving life expectancy and quality of life for this disease. PMID:27469186

  9. Toward a Model-Based Approach to Flight System Fault Protection

    NASA Technical Reports Server (NTRS)

    Day, John; Murray, Alex; Meakin, Peter

    2012-01-01

    Fault Protection (FP) is a distinct and separate systems engineering sub-discipline that is concerned with the off-nominal behavior of a system. Flight system fault protection is an important part of the overall flight system systems engineering effort, with its own products and processes. As with other aspects of systems engineering, the FP domain is highly amenable to expression and management in models. However, while there are standards and guidelines for performing FP-related analyses, there are no standards or guidelines for formally relating the FP analyses to each other or to the system hardware and software design. As a result, the material generated for these analyses effectively creates separate models that are only loosely related to the system being designed. Development of approaches that enable modeling of FP concerns in the same model as the system hardware and software design enables the establishment of formal relationships, which has great potential for improving the efficiency, correctness, and verification of the implementation of flight system FP. This paper begins with an overview of the FP domain, and then continues with a presentation of a SysML/UML model of the FP domain and the particular analyses that it contains, by way of showing a potential model-based approach to flight system fault protection, and an exposition of the use of the FP models in FSW engineering. The analyses are small examples, inspired by current real-project examples of FP analyses.
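
    As a rough illustration of the idea described above (keeping FP concerns and design elements in one model with explicit, queryable relationships), the sketch below links hypothetical failure modes, monitors, and responses to the components they concern. The class names and example entries are invented for illustration; this is not the SysML/UML profile presented in the paper.

    from dataclasses import dataclass
    from typing import List

    # Illustrative only: a single in-memory "model" in which FP analysis elements
    # (failure modes, monitors, responses) hold explicit references to the design
    # components they concern, so the relationships can be checked mechanically.

    @dataclass
    class Component:
        name: str

    @dataclass
    class FailureMode:
        name: str
        component: Component            # formal link from FP analysis to design element

    @dataclass
    class Monitor:
        name: str
        detects: List[FailureMode]

    @dataclass
    class Response:
        name: str
        triggered_by: Monitor

    # Hypothetical example entries.
    imu = Component("inertial_measurement_unit")
    imu_dropout = FailureMode("telemetry_dropout", component=imu)
    watchdog = Monitor("imu_heartbeat_monitor", detects=[imu_dropout])
    safing = Response("enter_safe_mode", triggered_by=watchdog)

    # A simple completeness check: every declared failure mode is covered by some monitor.
    monitors = [watchdog]
    covered = {fm.name for m in monitors for fm in m.detects}
    assert imu_dropout.name in covered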

  10. Enforcement of Mask Rule Compliance in Model-Based OPC'ed Layouts during Data Preparation

    NASA Astrophysics Data System (ADS)

    Meyer, Dirk H.; Vuletic, Radovan; Seidl, Alexander

    2002-12-01

    Currently available commercial model-based OPC tools do not always generate layouts which are mask rule compliant. Additional processing is required to remove mask rule violations, which are often too numerous for manual patching. Although physical verification tools can be used to remove simple mask rule violations, the results are often unsatisfactory for more complicated geometrical configurations. The subject of this paper is the development and application of a geometrical processing engine that automatically enforces mask rule compliance of the OPC'ed layout. It is designed as an add-on to a physical verification tool. The engine constructs patches, which remove mask rule violations such as notches or width violations. By employing a Mixed Integer Programming (MIP) optimization method, the edges of each patch are placed in a way that avoids secondary violations while modifying the OPC'ed layout as little as possible. A sequence of enforcement steps is applied to the layout to remove all types of mask rule violations. This approach of locally confined, minimal layout modifications preserves the OPC corrections to the greatest extent possible. This method has been used successfully in production on a variety of DRAM designs for the non-array regions.
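
    To make the optimization step concrete, here is a minimal one-dimensional sketch of placing two patch edges so that a minimum-width rule is satisfied while total edge movement is minimized. It is cast as a small linear program (the method above uses Mixed Integer Programming on real layout geometry); all dimensions are hypothetical and scipy is used only for illustration.

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical 1-D example: two patch edges at x1, x2 violate a minimum-width rule.
    x1, x2 = 10.00, 10.04          # current edge positions (um); width = 0.04 um
    min_width = 0.06               # assumed mask rule minimum width (um)

    # Variables: [d1, d2, t1, t2], where d_i are edge displacements and t_i >= |d_i|
    # linearize the absolute values so the objective t1 + t2 is the total movement.
    c = np.array([0.0, 0.0, 1.0, 1.0])

    A_ub = np.array([
        [ 1.0, -1.0,  0.0,  0.0],  # (x2 + d2) - (x1 + d1) >= min_width
        [ 1.0,  0.0, -1.0,  0.0],  #  d1 <= t1
        [-1.0,  0.0, -1.0,  0.0],  # -d1 <= t1
        [ 0.0,  1.0,  0.0, -1.0],  #  d2 <= t2
        [ 0.0, -1.0,  0.0, -1.0],  # -d2 <= t2
    ])
    b_ub = np.array([(x2 - x1) - min_width, 0.0, 0.0, 0.0, 0.0])

    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None), (None, None), (0, None), (0, None)])
    d1, d2 = res.x[:2]
    print(f"move left edge by {d1:+.3f} um and right edge by {d2:+.3f} um")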

  11. A Gaussian Mixture Model-Based Continuous Boundary Detection for 3D Sensor Networks

    PubMed Central

    Chen, Jiehui; Salim, Mariam B.; Matsumoto, Mitsuji

    2010-01-01

    This paper proposes a novel, high-precision Gaussian Mixture Model-based Boundary Detection 3D (BD3D) scheme with reasonable implementation cost for 3D cases, which selects a minimum number of Boundary sensor Nodes (BNs) for continuously moving objects. It shows clear advantages in that the two classes of boundary and non-boundary sensor nodes can be efficiently classified using model selection techniques for finite mixture models; furthermore, the set of sensor readings within each sensor node’s spatial neighborhood is formulated using a Gaussian Mixture Model; unlike DECOMO [1] and COBOM [2], we also format a BN array that includes each node’s own sensor reading to aid in selecting Event BNs (EBNs) and non-EBNs from the observations of BNs. In particular, we propose a Thick Section Model (TSM) to solve the problem of transition between 2D and 3D. It is verified by simulations that the BD3D 2D model outperforms DECOMO and COBOM in terms of average residual energy and the number of BNs selected, while the BD3D 3D model demonstrates sound performance even for sensor networks with low densities, especially when the value of the sensor transmission range (r) is larger than the value of the Section Thickness (d) in TSM. We have also rigorously proved its correctness for continuous geometric domains and full robustness for sensor networks over 3D terrains. PMID:22163619
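
    A toy sketch of the core classification step is shown below: fit a two-component Gaussian mixture to per-node neighborhood statistics and label each node as boundary or non-boundary. The features, distributions, and parameters are invented for illustration and do not reproduce the BD3D BN-array formulation.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    # Hypothetical features per sensor node: (mean, spread) of readings in its spatial
    # neighborhood. Non-boundary nodes see homogeneous neighborhoods (low spread), while
    # boundary nodes mix readings from inside and outside the tracked object (high spread).
    non_boundary = np.column_stack([rng.normal(20.0, 1.0, 300), rng.gamma(2.0, 0.2, 300)])
    boundary     = np.column_stack([rng.normal(15.0, 3.0, 60),  rng.gamma(2.0, 1.5, 60)])
    features = np.vstack([non_boundary, boundary])

    # Fit a two-component Gaussian mixture and label each node with its most likely component.
    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(features)
    labels = gmm.predict(features)

    # Take the component with the larger mean neighborhood spread as the "boundary" class.
    boundary_comp = int(np.argmax(gmm.means_[:, 1]))
    print("nodes flagged as boundary:", int(np.sum(labels == boundary_comp)))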

  12. Model-Based Diagnosis and Prognosis of a Water Recycling System

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Hafiychuk, Vasyl; Goebel, Kai Frank

    2013-01-01

    A water recycling system (WRS) deployed at NASA Ames Research Center's Sustainability Base (an energy-efficient office building that integrates some novel technologies developed for space applications) will serve as a testbed for long duration testing of next generation spacecraft water recycling systems for future human spaceflight missions. This system cleans graywater (waste water collected from sinks and showers) and recycles it into clean water. Like all engineered systems, the WRS is prone to standard degradation due to regular use, as well as other faults. Diagnostic and prognostic applications will be deployed on the WRS to ensure its safe, efficient, and correct operation. The diagnostic and prognostic results can be used to enable condition-based maintenance to avoid unplanned outages, and perhaps extend the useful life of the WRS. Diagnosis involves detecting when a fault occurs, isolating the root cause of the fault, and identifying the extent of damage. Prognosis involves predicting when the system will reach its end of life irrespective of whether an abnormal condition is present or not. In this paper, first, we develop a physics model of both nominal and faulty system behavior of the WRS. Then, we apply an integrated model-based diagnosis and prognosis framework to the simulation model of the WRS for several different fault scenarios to detect, isolate, and identify faults, and predict the end of life in each fault scenario, and present the experimental results.
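
    The detect/isolate/predict pattern described above can be illustrated with a small residual-based sketch: compare observations against a nominal model, flag a fault when the residual exceeds a threshold, and extrapolate the degradation trend to an end-of-life threshold. The "plant", noise level, thresholds, and degradation law below are hypothetical placeholders, not the WRS physics model.

    import numpy as np

    def nominal_flow(t):
        """Nominal model prediction of, e.g., recycled-water flow rate (arbitrary units)."""
        return 5.0

    def observed_flow(t, fault_onset=40.0, drift=-0.02):
        """Simulated measurement: nominal flow plus noise, with a slow drift after a fault."""
        fault = drift * max(0.0, t - fault_onset)
        return nominal_flow(t) + fault + np.random.normal(0.0, 0.05)

    threshold = 0.3        # residual magnitude that triggers fault detection
    eol_flow = 3.0         # flow below which the system is considered at end of life

    times = np.arange(0.0, 80.0, 1.0)
    residuals = np.array([observed_flow(t) - nominal_flow(t) for t in times])

    detected = times[np.abs(residuals) > threshold]
    if detected.size:
        t_detect = detected[0]
        # Prognosis: fit a linear degradation trend to post-detection residuals and
        # extrapolate it to the end-of-life threshold.
        mask = times >= t_detect
        slope, intercept = np.polyfit(times[mask], residuals[mask], 1)
        t_eol = (eol_flow - nominal_flow(0.0) - intercept) / slope
        print(f"fault detected at t={t_detect:.0f}, predicted end of life at t={t_eol:.1f}")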

  13. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

    Automated and cost-effective building detection at ultra-high spatial resolution is of major importance for various engineering and smart city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery acquired with low-cost imaging sensors. In particular, the developed approach computes, through advanced structure from motion, bundle adjustment and dense image matching, a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by significant geometric distortions and a fish-eye effect. An unsupervised multi-region graph-cut segmentation and a rule-based classification are responsible for delivering the initial multi-class classification map. The DTM is then calculated based on an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction, and the scene buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating object-level detection rates of 88% for correctness and above 75% for completeness.
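
    One step in the pipeline, deriving a DTM from the DSM with mathematical morphology so that above-ground objects can be separated, can be sketched as follows. The synthetic DSM, the structuring-element size, and the height threshold are illustrative assumptions, not the parameters used by the authors.

    import numpy as np
    from scipy import ndimage

    dsm = np.full((200, 200), 100.0)                              # flat terrain at 100 m
    dsm[80:120, 90:140] += 12.0                                   # a "building" 12 m tall
    dsm += np.random.default_rng(1).normal(0.0, 0.1, dsm.shape)   # sensor noise

    # Grey-scale opening with a window larger than the building footprint removes
    # above-ground objects while keeping the underlying terrain surface.
    dtm = ndimage.grey_opening(dsm, size=(61, 61))

    ndsm = dsm - dtm                       # normalized DSM: above-ground heights
    building_mask = ndsm > 2.5             # simple height threshold (assumed)
    print("building pixels detected:", int(building_mask.sum()))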

  14. Therapeutic activity of modified U1 core spliceosomal particles

    PubMed Central

    Rogalska, Malgorzata Ewa; Tajnik, Mojca; Licastro, Danilo; Bussani, Erica; Camparini, Luca; Mattioli, Chiara; Pagani, Franco

    2016-01-01

    Modified U1 snRNAs bound to intronic sequences downstream of the 5′ splice site correct exon skipping caused by different types of mutations. Here we evaluate the therapeutic activity and structural requirements of these exon-specific U1 snRNA (ExSpeU1) particles. In a severe spinal muscular atrophy, mouse model, ExSpeU1, introduced by germline transgenesis, increases SMN2 exon 7 inclusion, SMN protein production and extends life span. In vitro, RNA mutant analysis and silencing experiments show that while U1A protein is dispensable, the 70K and stem loop IV elements mediate most of the splicing rescue activity through improvement of exon and intron definition. Our findings indicate that precise engineering of the U1 core spliceosomal RNA particle has therapeutic potential in pathologies associated with exon-skipping mutations. PMID:27041075

  15. Therapeutic activity of modified U1 core spliceosomal particles.

    PubMed

    Rogalska, Malgorzata Ewa; Tajnik, Mojca; Licastro, Danilo; Bussani, Erica; Camparini, Luca; Mattioli, Chiara; Pagani, Franco

    2016-01-01

    Modified U1 snRNAs bound to intronic sequences downstream of the 5' splice site correct exon skipping caused by different types of mutations. Here we evaluate the therapeutic activity and structural requirements of these exon-specific U1 snRNA (ExSpeU1) particles. In a severe spinal muscular atrophy, mouse model, ExSpeU1, introduced by germline transgenesis, increases SMN2 exon 7 inclusion, SMN protein production and extends life span. In vitro, RNA mutant analysis and silencing experiments show that while U1A protein is dispensable, the 70K and stem loop IV elements mediate most of the splicing rescue activity through improvement of exon and intron definition. Our findings indicate that precise engineering of the U1 core spliceosomal RNA particle has therapeutic potential in pathologies associated with exon-skipping mutations. PMID:27041075

  16. An approach to model validation and model-based prediction -- polyurethane foam case study.

    SciTech Connect

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    hypothesis tests as a part of the validation step to provide feedback to analysts and modelers. Decisions on how to proceed in making model-based predictions are made based on these analyses together with the application requirements. Updating, modifying, and understanding the boundaries associated with the model are also assisted through this feedback. (4) We include a ''model supplement term'' when model problems are indicated. This term provides a (bias) correction to the model so that it will better match the experimental results and more accurately account for uncertainty. Presumably, as the models continue to develop and are used for future applications, the causes for these apparent biases will be identified and the need for this supplementary modeling will diminish. (5) We use a response-modeling approach for our predictions that allows for general types of prediction and for assessment of prediction uncertainty. This approach is demonstrated through a case study supporting the assessment of a weapon's response when subjected to a hydrocarbon fuel fire. The foam decomposition model provides an important element of the response of a weapon system in this abnormal thermal environment. Rigid foam is used to encapsulate critical components in the weapon system providing the needed mechanical support as well as thermal isolation. Because the foam begins to decompose at temperatures above 250 °C, modeling the decomposition is critical to assessing a weapon's response. In the validation analysis it is indicated that the model tends to ''exaggerate'' the effect of temperature changes when compared to the experimental results. The data, however, are too few and too restricted in terms of experimental design to make confident statements regarding modeling problems. For illustration, we assume these indications are correct and compensate for this apparent bias by constructing a model supplement term for use in the model-based predictions. Several hypothetical prediction problems are
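
    A minimal sketch of the ''model supplement term'' idea is given below: fit the model-to-experiment discrepancy as a simple function of temperature and add it to subsequent model-based predictions. The data and the linear form of the discrepancy are synthetic placeholders, not the foam decomposition study's values, and the sketch ignores the accompanying uncertainty assessment.

    import numpy as np

    temps_exp = np.array([250.0, 300.0, 350.0, 400.0])   # experiment temperatures (C)
    y_exp     = np.array([0.05, 0.18, 0.35, 0.50])       # measured decomposition fraction
    y_model   = np.array([0.04, 0.22, 0.45, 0.68])       # model "exaggerates" the temperature effect

    # Fit a low-order discrepancy delta(T) = y_exp - y_model.
    delta_coeffs = np.polyfit(temps_exp, y_exp - y_model, deg=1)

    def corrected_prediction(model_value, temp):
        """Model prediction plus the fitted supplement term at temperature temp."""
        return model_value + np.polyval(delta_coeffs, temp)

    print(corrected_prediction(0.55, 375.0))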

  17. Delegation in Correctional Nursing Practice.

    PubMed

    Tompkins, Frances

    2016-07-01

    Correctional nurses face daily challenges as a result of their work environment. Common challenges include availability of resources for appropriate care delivery, negotiating with custody staff for access to patients, adherence to scope of practice standards, and working with a varied staffing mix. Professional correctional nurses must consider the educational backgrounds and competency of other nurses and assistive personnel in planning for care delivery. Budgetary constraints and varied staff preparation can be a challenge for the professional nurse. Adequate care planning requires understanding the educational level and competency of licensed and unlicensed staff. Delegation is the process of assessing patient needs and transferring responsibility for care to appropriately educated and competent staff. Correctional nurses can benefit from increased knowledge about delegation. PMID:27302707

  18. String-Corrected Black Holes

    SciTech Connect

    Hubeny, Veronika; Maloney, Alexander; Rangamani, Mukund

    2005-02-07

    We investigate the geometry of four dimensional black hole solutions in the presence of stringy higher curvature corrections to the low energy effective action. For certain supersymmetric two charge black holes these corrections drastically alter the causal structure of the solution, converting seemingly pathological null singularities into timelike singularities hidden behind a finite area horizon. We establish, analytically and numerically, that the string-corrected two-charge black hole metric has the same Penrose diagram as the extremal four-charge black hole. The higher derivative terms lead to another dramatic effect -- the gravitational force exerted by a black hole on an inertial observer is no longer purely attractive! The magnitude of this effect is related to the size of the compactification manifold.

  19. Error Field Correction in ITER

    SciTech Connect

    Park, Jong-kyu; Boozer, Allen H.; Menard, Jonathan E.; Schaffer, Michael J.

    2008-05-22

    A new method for correcting magnetic field errors in the ITER tokamak is developed using the Ideal Perturbed Equilibrium Code (IPEC). The dominant external magnetic field for driving islands is shown to be localized to the outboard midplane for three ITER equilibria that represent the projected range of operational scenarios. The coupling matrices between the poloidal harmonics of the external magnetic perturbations and the resonant fields on the rational surfaces that drive islands are combined for different equilibria and used to determine an ordered list of the dominant errors in the external magnetic field. It is found that efficient and robust error field correction is possible with a fixed setting of the correction currents relative to the currents in the main coils across the range of ITER operating scenarios that was considered.

  20. Universality of quantum gravity corrections.

    PubMed

    Das, Saurya; Vagenas, Elias C

    2008-11-28

    We show that the existence of a minimum measurable length and the related generalized uncertainty principle (GUP), predicted by theories of quantum gravity, influence all quantum Hamiltonians. Thus, they predict quantum gravity corrections to various quantum phenomena. We compute such corrections to the Lamb shift, the Landau levels, and the tunneling current in a scanning tunneling microscope. We show that these corrections can be interpreted in two ways: (a) either that they are exceedingly small, beyond the reach of current experiments, or (b) that they predict upper bounds on the quantum gravity parameter in the GUP, compatible with experiments at the electroweak scale. Thus, more accurate measurements in the future should either be able to test these predictions, or further tighten the above bounds and predict an intermediate length scale between the electroweak and the Planck scale.
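
    For orientation, one commonly used quadratic form of the GUP (conventions for the deformation parameter vary, and the exact form adopted in the paper may differ) is

        \Delta x \, \Delta p \ge \frac{\hbar}{2}\left[ 1 + \beta (\Delta p)^2 \right], \qquad \beta = \frac{\beta_0}{(M_\mathrm{Pl} c)^2},

    where \beta_0 is the dimensionless quantum gravity parameter on which the experiments mentioned above place upper bounds.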

  1. Diagnostic and therapeutic management of hepatocellular carcinoma

    PubMed Central

    Bellissimo, Francesco; Pinzone, Marilia Rita; Cacopardo, Bruno; Nunnari, Giuseppe

    2015-01-01

    Hepatocellular carcinoma (HCC) is an increasing health problem, representing the second leading cause of cancer-related mortality worldwide. The major risk factor for HCC is cirrhosis. In developing countries, viral hepatitis represents the major risk factor, whereas in developed countries, the epidemic of obesity, diabetes and nonalcoholic steatohepatitis contributes to the observed increase in HCC incidence. Cirrhotic patients are recommended to undergo HCC surveillance by abdominal ultrasound at 6-month intervals. The current diagnostic algorithms for HCC rely on typical radiological hallmarks in dynamic contrast-enhanced imaging, while the use of α-fetoprotein as an independent tool for HCC surveillance is not recommended by current guidelines due to its low sensitivity and specificity. Early diagnosis is crucial for curative treatments. Surgical resection, radiofrequency ablation and liver transplantation are considered the cornerstones of curative therapy, while for patients with more advanced HCC recommended options include sorafenib and trans-arterial chemo-embolization. A multidisciplinary team, consisting of hepatologists, surgeons, radiologists, oncologists and pathologists, is fundamental for correct management. In this paper, we review the diagnostic and therapeutic management of HCC, with a focus on the most recent evidence and recommendations from guidelines. PMID:26576088

  2. When correction turns positive: processing corrective prosody in Dutch.

    PubMed

    Dimitrova, Diana V; Stowe, Laurie A; Hoeks, John C J

    2015-01-01

    Current research on spoken language does not provide a consistent picture as to whether prosody, the melody and rhythm of speech, conveys a specific meaning. Perception studies show that English listeners assign meaning to prosodic patterns, and, for instance, associate some accents with contrast, whereas Dutch listeners behave more controversially. In two ERP studies we tested how Dutch listeners process words carrying two types of accents, which either provided new information (new information accents) or corrected information (corrective accents), both in single sentences (experiment 1) and after corrective and new information questions (experiment 2). In both experiments corrective accents elicited a sustained positivity as compared to new information accents, which started earlier in context than in single sentences. The positivity was not modulated by the nature of the preceding question, suggesting that the underlying neural mechanism likely reflects the construction of an interpretation to the accented word, either by identifying an alternative in context or by inferring it when no context is present. Our experimental results provide strong evidence for inferential processes related to prosodic contours in Dutch.

  3. When Correction Turns Positive: Processing Corrective Prosody in Dutch

    PubMed Central

    Dimitrova, Diana V.; Stowe, Laurie A.; Hoeks, John C. J.

    2015-01-01

    Current research on spoken language does not provide a consistent picture as to whether prosody, the melody and rhythm of speech, conveys a specific meaning. Perception studies show that English listeners assign meaning to prosodic patterns, and, for instance, associate some accents with contrast, whereas Dutch listeners behave more controversially. In two ERP studies we tested how Dutch listeners process words carrying two types of accents, which either provided new information (new information accents) or corrected information (corrective accents), both in single sentences (experiment 1) and after corrective and new information questions (experiment 2). In both experiments corrective accents elicited a sustained positivity as compared to new information accents, which started earlier in context than in single sentences. The positivity was not modulated by the nature of the preceding question, suggesting that the underlying neural mechanism likely reflects the construction of an interpretation to the accented word, either by identifying an alternative in context or by inferring it when no context is present. Our experimental results provide strong evidence for inferential processes related to prosodic contours in Dutch. PMID:25973607

  4. Correction.

    PubMed

    1992-12-11

    Last month, the U.S. Postal Service (USPS) prompted a 13 November Random Sample naming a group of scientists whose faces were appearing, USPS said, on stamps belonging to its Black Heritage Series. Among them: chemist Percy Lavon Julian; George Washington Carver; physician Charles R. Drew; astronomer and mathematician Benjamin Banneker; and inventor Jan Matzeliger. Science readers knew better. Two of the quintet appeared years ago: a stamp bearing Carver's picture was issued in 1948, and Drew appeared in the Great Americans Series in 1981. PMID:17831650

  5. Correction.

    PubMed

    2015-03-01

    In the January 2015 issue of Cyberpsychology, Behavior, and Social Networking (vol. 18, no. 1, pp. 3–7), the article "Individual Differences in Cyber Security Behaviors: An Examination of Who Is Sharing Passwords." by Prof. Monica Whitty et al., has an error in wording in the abstract. The sentence in question was originally printed as: Contrary to our hypotheses, we found older people and individuals who score high on self-monitoring were more likely to share passwords. It should read: Contrary to our hypotheses, we found younger people and individuals who score high on self-monitoring were more likely to share passwords. The authors wish to apologize for the error. PMID:25751054

  6. Correction.

    PubMed

    1992-12-11

    Last month, the U.S. Postal Service (USPS) prompted a 13 November Random Sample naming a group of scientists whose faces were appearing, USPS said, on stamps belonging to its Black Heritage Series. Among them: chemist Percy Lavon Julian; George Washington Carver; physician Charles R. Drew; astronomer and mathematician Benjamin Banneker; and inventor Jan Matzeliger. Science readers knew better. Two of the quintet appeared years ago: a stamp bearing Carver's picture was issued in 1948, and Drew appeared in the Great Americans Series in 1981.

  7. Correction

    NASA Astrophysics Data System (ADS)

    2009-12-01

    Due to an error in converting energy data from "quads" (one quadrillion, or 10^15, British thermal units) to watt-hours, the opening paragraph of Grant's article contained several incorrect values for world energy consumption.

  8. Correction.

    PubMed

    1991-05-01

    Contrary to what we reported, the horned dinosaur Chasmosaurus (Science, 12 April, p. 207) did not have the largest skull of any land animal. Paleontologist Paul Sereno of the University of Chicago says that honor belongs to Triceratops, another member of the family Ceratopsidae.

  9. Correction.

    PubMed

    1991-11-29

    Because of a production error, the photographs of pierre Chambon and Harald zur Hausen, which appeared on pages 1116 and 1117 of last week's issue (22 November), were transposed. Here's what you should have seen: Chambon is on the left, zur Hausen on the right.

  10. Correction

    NASA Astrophysics Data System (ADS)

    2016-09-01

    The feature article “Neutrons for new drugs” (August pp26–29) stated that neutron crystallography was used to determine the structures of “wellknown complex biological molecules such as lysine, insulin and trypsin”.

  11. Corrections

    NASA Astrophysics Data System (ADS)

    2004-05-01

    1. The first photograph on p12 of News in Physics Education January 2004 is of Prof. Paul Black and not Prof. Jonathan Osborne, as stated. 2. The review of Flowlog on p209 of the March 2004 issue wrongly gives the maximum sampling rate of the analogue inputs as 25 kHz (40 ms) instead of 25 kHz (40 µs) and the digital inputs as 100 kHz (10 ms) instead of 100 kHz (10 µs). 3. The letter entitled 'A trial of two energies' by Eric McIldowie on pp212-4 of the March 2004 issue was edited to fit the space available. We regret that a few small errors were made in doing this. Rather than detail these, the interested reader can access the whole of the original letter as a Word file from the link below.

  12. Correction.

    PubMed

    2015-03-01

    In the January 2015 issue of Cyberpsychology, Behavior, and Social Networking (vol. 18, no. 1, pp. 3–7), the article "Individual Differences in Cyber Security Behaviors: An Examination of Who Is Sharing Passwords." by Prof. Monica Whitty et al., has an error in wording in the abstract. The sentence in question was originally printed as: Contrary to our hypotheses, we found older people and individuals who score high on self-monitoring were more likely to share passwords. It should read: Contrary to our hypotheses, we found younger people and individuals who score high on self-monitoring were more likely to share passwords. The authors wish to apologize for the error.

  13. Correction

    NASA Astrophysics Data System (ADS)

    2013-08-01

    In the 9 July issue of Eos, the feature "Peak Oil and Energy Independence: Myth and Reality"(Eos, 94(28), 245-246, doi:10.1002/2013EO280001) gave the price of natural gas in terms of dollars per Mcf and defined Mcf to be million cubic feet. However, Mcf means thousand cubic feet—the M comes from the Latin mille (thousand).

  14. Correction.

    PubMed

    1992-05-15

    In the 24 April "Inside AAAS" article "AAAS organizes more meetings of the mind" (p. 548), it is stated incorrectly that Paul Berg of Stanford University will be giving the keynote address and that Helen Donis-Keller of Washington University will be presenting a paper at the Science Innovation '92 meeting in San Francisco (21 to 25 July 1992). The Science Innovation '92 program was tentative at the time the article was written. Joseph Martin of the University of California, San Francisco, will deliver the keynote address on one of the major themes of the meeting, "Mapping the Human Brain." Helen Donis-Keller and Paul Berg were invited to speak but will not be on the program this year.

  15. Correction

    NASA Astrophysics Data System (ADS)

    1999-11-01

    Synsedimentary deformation in the Jurassic of southeastern Utah—A case of impact shaking? COMMENT Geology, v. 27, p. 661 (July 1999) The sentence on p. 661, first column, second paragraph, line one, should read: The 1600 m of Pennsylvanian Paradox Formation is 75-90% salt in Arches National Park. The sentence on p. 661, second column, third paragraph, line seven, should read: This high-pressure hydrothermal solution created the clastic dikes, chert nodules from reprecipitated siliceous cement that have been called “siliceous impactites” (Kriens et al., 1997), and much of the present structure at Upheaval Dome by further faulting.

  16. Atmospheric Corrections in Coastal Altimetry

    NASA Astrophysics Data System (ADS)

    Antonita, Maria; Kumar, Raj

    2012-07-01

    The range measurements from the altimeter are associated with a large number of geophysical corrections, which need special attention near coasts and in shallow water regions. The corrections due to the ionosphere, the dry and wet troposphere, and the sea state are of primary importance in altimetry. Water vapor dominates the wet tropospheric correction, which is more complex and has higher spatio-temporal variability, and thus needs careful attention near coasts. In addition, rain is one of the major atmospheric phenomena that attenuate the altimeter backscatter measurements, which in turn affects the altimeter-derived wind and wave measurements. Thus, during rain events, utmost care should be taken while deriving the altimeter wind speeds and wave heights. The first objective of the present study involves the comparison of the water vapor corrections estimated from radiosonde measurements near the coastal regions with the model-estimated corrections applied in the altimeter range measurements. Analysis has been performed for the Coastal Altimeter products provided by PISTACH to observe these corrections. The second objective is to estimate the rain rate using altimeter backscatter measurements. The differential attenuation of the Ku band relative to the C band due to rain has been utilized to identify rain events and to estimate the amount of rainfall. JASON-2 altimeter data during two tropical cyclonic events over the Bay of Bengal have been used for this purpose. An attempt is made to compare the rain rate estimated from altimeter measurements with other available collocated satellite observations such as KALPANA and TRMM-TMI. The results are encouraging and can be used to provide valid rain flags in the altimeter products in addition to the radiometer rain flags.
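
    The rain-rate retrieval idea, inverting the Ku-minus-C differential attenuation through a power-law attenuation-rain relation, can be sketched as follows. The coefficients, exponent, and path length are hypothetical placeholders and simplify the problem (a single shared exponent is assumed); they are not the relations actually applied to the JASON-2 data.

    def rain_rate_from_differential_attenuation(delta_db, path_km=5.0,
                                                k_ku=0.02, k_c=0.003, alpha=1.1):
        """Invert delta_db = (k_ku - k_c) * R**alpha * path_km for rain rate R (mm/h)."""
        specific = delta_db / ((k_ku - k_c) * path_km)
        return specific ** (1.0 / alpha)

    print(rain_rate_from_differential_attenuation(delta_db=1.5))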

  17. DARHT Radiographic Grid Scale Correction

    SciTech Connect

    Warthen, Barry J.

    2015-02-13

    Recently it became apparent that the radiographic grid that has been used to calibrate the dimensional scale of DARHT radiographs was not centered at the location where the objects have been centered. This offset produced an error of 0.188% in the dimensional scaling of the radiographic images processed using the assumption that the grid and objects had the same center. This paper will show the derivation of the scaling correction, explain how new radiographs are being processed to account for the difference in location, and provide the details of how to correct radiographic images processed with the erroneous scale factor.
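
    The geometric origin of such a scale error can be illustrated with point-projection magnification: the grid and the objects sit at different distances from the source, so they are imaged at different magnifications. All distances below are invented for illustration and are not the DARHT geometry.

    # In point-projection radiography, magnification M = (source-to-detector) / (source-to-plane),
    # so a calibration grid offset from the object plane carries a slightly different scale factor.
    source_to_detector = 10.0       # m, assumed
    source_to_object   = 5.0        # m, assumed plane where the objects are centered
    grid_offset        = 0.01       # m, assumed offset of the calibration grid from that plane

    m_object = source_to_detector / source_to_object
    m_grid   = source_to_detector / (source_to_object + grid_offset)

    scale_correction = m_object / m_grid    # factor to rescale images calibrated with the grid
    print(f"scale correction: {scale_correction:.5f} ({(scale_correction - 1) * 100:.3f}%)")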

  18. Anterior endoscopic correction of scoliosis.

    PubMed

    Picetti, George D; Ertl, Janos P; Bueff, H Ulrich

    2002-04-01

    Our technique of anterior endoscopic scoliosis correction demonstrates the ability to perform an anterior approach through a minimally invasive technique with minimal disruption of the local biology. The initial results show curve correction and fusion rates that appear to equal those of a formal open anterior approach. Additional benefits are: 1) shortened operative time, 2) lower blood loss, 3) shortened rehabilitation time, 4) less pain, and 5) shortened hospital stays. The endoscopic technique shows great promise in the management of scoliosis curves; however, this is a technically demanding procedure that requires cross-training in endoscopic discectomy and scoliosis management as well as familiarity with the anterior approach anatomy. PMID:12389288

  19. Yoga school of thought and psychiatry: Therapeutic potential.

    PubMed

    Rao, Naren P; Varambally, Shivarama; Gangadhar, Bangalore N

    2013-01-01

    Yoga is a traditional life-style practice used for spiritual reasons. However, the physical components, such as the asanas and pranayamas, have demonstrated physiological and therapeutic effects. There is evidence for yoga being a potent antidepressant whose effects match those of drugs. In depressive disorder, yoga 'corrects' an underlying cognitive physiology. In schizophrenia patients, yoga has benefits as an add-on intervention in pharmacologically stabilized subjects. The effects are particularly notable on negative symptoms. Yoga also helps to correct social cognition. Yoga can be introduced early in the treatment of psychosis with some benefits. Elevation of oxytocin may be a mechanism of yoga effects in schizophrenia. Certain components of yoga have demonstrated neurobiological effects similar to those of vagal stimulation, indicating this (indirect or autogenous vagal stimulation) as a possible mechanism of its action. It is time psychiatrists exploited the benefits of yoga for comprehensive care of their patients. PMID:23858245

  20. [Therapeutic strategy in cancer pain].

    PubMed

    Pagni, C A; Franzini, A

    1981-01-14

    Surgical and pharmacological management of cancer pain is described and discussed according to the physiopathological mechanisms underlying this complex syndrome. The therapeutic approach is planned in three major phases, which may be employed alone or in combination, following an accurate evaluation of the pathophysiology and the clinical pattern in each patient. The first phase includes multifocal pharmacological therapy with nonnarcotic drugs in order to affect the physiopathological mechanisms of cancer pain at different levels. The second phase is indicated when nonnarcotic drugs cannot achieve complete pain relief; neurosurgical procedures (nerve blocks, rhizotomies, cordotomies, etc.) are employed in this phase. The pharmacological treatment must be continued and combined with surgery. The third phase includes hypophysectomy, deep brain stimulation, psychosurgery and/or narcotic drug therapy, which are the last step in management of terminal cancer pain when all treatments have been ineffective. The results of this therapeutic program in 188 patients affected by pain of malignant origin are reported and discussed.

  1. Sinigrin and Its Therapeutic Benefits.

    PubMed

    Mazumder, Anisha; Dwivedi, Anupma; du Plessis, Jeanetta

    2016-01-01

    Sinigrin (allyl-glucosinolate or 2-propenyl-glucosinolate) is a natural aliphatic glucosinolate present in plants of the Brassicaceae family, such as broccoli and Brussels sprouts, and in the seeds of Brassica nigra (mustard seeds), which contain high amounts of sinigrin. Since ancient times, mustard has been used by mankind for its culinary, as well as medicinal, properties. It has been systematically described and evaluated in the classical Ayurvedic texts. Studies conducted on the pharmacological activities of sinigrin have revealed anti-cancer, antibacterial, antifungal, antioxidant, anti-inflammatory, wound-healing, and biofumigation properties. This review brings together concise information about the known therapeutic activities of sinigrin. However, the information on known biological activities is very limited and, hence, further studies still need to be conducted and its molecular mechanisms also need to be explored. This review of the therapeutic benefits of sinigrin summarizes current knowledge about this unique phytocompound. PMID:27043505

  2. Therapeutic approaches for celiac disease

    PubMed Central

    Plugis, Nicholas M.; Khosla, Chaitan

    2015-01-01

    Celiac disease is a common, lifelong autoimmune disorder for which dietary control is the only accepted form of therapy. A strict gluten-free diet is burdensome to patients and can be limited in efficacy, indicating there is an unmet need for novel therapeutic approaches to supplement or supplant dietary therapy. Many molecular events required for disease pathogenesis have been recently characterized and inspire most current and emerging drug-discovery efforts. Genome-wide association studies (GWAS) confirm the importance of human leukocyte antigen genes in our pathogenic model and identify a number of new risk loci in this complex disease. Here, we review the status of both emerging and potential therapeutic strategies in the context of disease pathophysiology. We conclude with a discussion of how genes identified during GWAS and follow-up studies that enhance susceptibility may offer insight into developing novel therapies. PMID:26060114

  3. [Concept of the therapeutic community].

    PubMed

    Eichhorn, H

    1983-08-01

    The historic development of therapeutic communities is discussed, and it is shown that the term has been neither conceptualized nor operationalized. Their unclear aims are considered to be utopian, and the author stresses that previous studies on such communities have been too superficial. The following problems have not hitherto received attention: 1. micro- and macrosocial relationships, 2. the role of the supervisor (authority problems), 3. norms and valuation systems, 4. discipline and sanctions, 5. the problem of roles, 6. questions of indicants and efficacy. The introduction of therapeutic communities is superfluous as a means of improving the socialist health services: it is sufficient to implement the principles of socialist democracy by means of appropriate training programmes. PMID:6635034

  4. [Therapeutic use of cannabis derivatives].

    PubMed

    Benyamina, Amine; Reynaud, Michel

    2014-02-01

    The therapeutic use of cannabis has generated a lot of interest in the past years, leading to a better understanding of its mechanisms of action. Countries like the United States and Canada have modified their laws in order to make cannabinoid use legal in the medical context. It's also the case in France now, where a recent decree was issued, authorizing the prescription of medication containing "therapeutic cannabis" (decree no. 2013-473, June 5, 2013). Cannabinoids such as dronabinol, Sativex and nabilone have been tested for the treatment of acute and chronic pain. These agents are most promising to relieve chronic pain associated with cancer, with human immunodeficiency virus infection and with multiple sclerosis. However, longer-term studies are required to determine potential long-term adverse effects and risks of misuse and addiction. PMID:24701869

  5. Translating connexin biology into therapeutics.

    PubMed

    Becker, David L; Phillips, Anthony R; Duft, Bradford J; Kim, Yeri; Green, Colin R

    2016-02-01

    It is 45 years since gap junctions were first described. Universities face increasing commercial pressures and declining federal funding, with governments and funding foundations showing greater interest in gaining return on their investments. This review outlines approaches taken to translate gap junction research to clinical application and the challenges faced. The need for commercialisation is discussed and key concepts behind research patenting briefly described. Connexin channel roles in disease and injury are also discussed, as is identification of the connexin hemichannel as a therapeutic target which appears to play a role in both the start and perpetuation of the inflammasome pathway. Furthermore connexin hemichannel opening results in vascular dieback in acute injury and chronic disease. Translation to human indications is illustrated from the perspective of one connexin biotechnology company, CoDa Therapeutics, Inc.

  6. Biased Randomized Algorithm for Fast Model-Based Diagnosis

    NASA Technical Reports Server (NTRS)

    Williams, Colin; Vartan, Farrokh

    2005-01-01

    A biased randomized algorithm has been developed to enable the rapid computational solution of a propositional- satisfiability (SAT) problem equivalent to a diagnosis problem. The closest competing methods of automated diagnosis are described in the preceding article "Fast Algorithms for Model-Based Diagnosis" and "Two Methods of Efficient Solution of the Hitting-Set Problem" (NPO-30584), which appears elsewhere in this issue. It is necessary to recapitulate some of the information from the cited articles as a prerequisite to a description of the present method. As used here, "diagnosis" signifies, more precisely, a type of model-based diagnosis in which one explores any logical inconsistencies between the observed and expected behaviors of an engineering system. The function of each component and the interconnections among all the components of the engineering system are represented as a logical system. Hence, the expected behavior of the engineering system is represented as a set of logical consequences. Faulty components lead to inconsistency between the observed and expected behaviors of the system, represented by logical inconsistencies. Diagnosis - the task of finding the faulty components - reduces to finding the components, the abnormalities of which could explain all the logical inconsistencies. One seeks a minimal set of faulty components (denoted a minimal diagnosis), because the trivial solution, in which all components are deemed to be faulty, always explains all inconsistencies. In the methods of the cited articles, the minimal-diagnosis problem is treated as equivalent to a minimal-hitting-set problem, which is translated from a combinatorial to a computational problem by mapping it onto the Boolean-satisfiability and integer-programming problems. The integer-programming approach taken in one of the prior methods is complete (in the sense that it is guaranteed to find a solution if one exists) and slow and yields a lower bound on the size of the
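
    The hitting-set view mentioned above can be illustrated with a tiny exhaustive search: each conflict is a set of components that cannot all be healthy, and a minimal diagnosis is a smallest set of components that intersects every conflict. This brute-force sketch is for illustration only (the component names are invented) and is not the biased randomized SAT algorithm of the article.

    from itertools import combinations

    conflicts = [
        {"valve_A", "pump_1", "sensor_3"},
        {"pump_1", "controller"},
        {"valve_A", "sensor_3", "controller"},
    ]
    components = sorted(set().union(*conflicts))

    def minimum_diagnosis(conflicts, components):
        """Return a smallest set of components that hits (intersects) every conflict."""
        for size in range(1, len(components) + 1):
            for candidate in combinations(components, size):
                if all(set(candidate) & conflict for conflict in conflicts):
                    return set(candidate)
        return set(components)

    print(minimum_diagnosis(conflicts, components))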

  7. Model-based cartilage thickness measurement in the submillimeter range

    SciTech Connect

    Streekstra, G. J.; Strackee, S. D.; Maas, M.; Wee, R. ter; Venema, H. W.

    2007-09-15

    Current methods of image-based thickness measurement in thin sheet structures utilize second derivative zero crossings to locate the layer boundaries. It is generally acknowledged that the nonzero width of the point spread function (PSF) limits the accuracy of this measurement procedure. We propose a model-based method that strongly reduces PSF-induced bias by incorporating the PSF into the thickness estimation method. We estimated the bias in thickness measurements in simulated thin sheet images as obtained from second derivative zero crossings. To gain insight into the range of sheet thickness where our method is expected to yield improved results, sheet thickness was varied between 0.15 and 1.2 mm with an assumed PSF as present in the high-resolution modes of current computed tomography (CT) scanners [full width at half maximum (FWHM) 0.5-0.8 mm]. Our model-based method was evaluated in practice by measuring layer thickness from CT images of a phantom mimicking two parallel cartilage layers in an arthrography procedure. CT arthrography images of cadaver wrists were also evaluated, and thickness estimates were compared to those obtained from high-resolution anatomical sections that served as a reference. The thickness estimates from the simulated images reveal that the method based on second derivative zero crossings shows considerable bias for layers in the submillimeter range. This bias is negligible for sheet thickness larger than 1 mm, where the size of the sheet is more than twice the FWHM of the PSF but can be as large as 0.2 mm for a 0.5 mm sheet. The results of the phantom experiments show that the bias is effectively reduced by our method. The deviations from the true thickness, due to random fluctuations induced by quantum noise in the CT images, are of the order of 3% for a standard wrist imaging protocol. In the wrist the submillimeter thickness estimates from the CT arthrography images correspond within 10% to those estimated from the anatomical
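
    The conventional estimator discussed above can be sketched in one dimension: blur an ideal sheet profile with a Gaussian PSF and take the spacing of the second-derivative zero crossings as the thickness estimate. With the FWHM comparable to the sheet thickness, the estimate comes out biased, which is the effect the model-based method is designed to remove. All numbers below are illustrative.

    import numpy as np
    from scipy import ndimage

    dx = 0.005                                   # mm per sample
    x = np.arange(0.0, 5.0, dx)
    true_thickness = 0.5                         # mm
    profile = ((x > 2.0) & (x < 2.0 + true_thickness)).astype(float)

    fwhm = 0.6                                   # mm, PSF full width at half maximum
    profile = ndimage.gaussian_filter1d(profile, sigma=fwhm / 2.355 / dx)

    second = np.gradient(np.gradient(profile, dx), dx)

    # Look for sign changes of the second derivative only where the signal is
    # non-negligible, to avoid spurious crossings in the flat tails.
    core = profile > 0.1 * profile.max()
    idx = np.where(core[:-1] & (np.diff(np.sign(second)) != 0))[0]
    estimate = x[idx[-1]] - x[idx[0]]
    print(f"true thickness {true_thickness} mm, zero-crossing estimate {estimate:.2f} mm")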

  8. Bioengineering Beige Adipose Tissue Therapeutics.

    PubMed

    Tharp, Kevin M; Stahl, Andreas

    2015-01-01

    Unlocking the therapeutic potential of brown/beige adipose tissue requires technological advancements that enable the controlled expansion of this uniquely thermogenic tissue. Transplantation of brown fat in small animal model systems has confirmed the expectation that brown fat expansion could possibly provide a novel therapeutic to combat obesity and related disorders. Expansion and/or stimulation of uncoupling protein-1 (UCP1)-positive adipose tissues have repeatedly demonstrated physiologically beneficial reductions in circulating glucose and lipids. The recent discovery that brown adipose tissue (BAT)-derived secreted factors positively alter whole body metabolism further expands potential benefits of brown or beige/brite adipose expansion. Unfortunately, there are no sources of transplantable BATs for human therapeutic purposes at this time. Recent developments in bioengineering, including novel hyaluronic acid-based hydrogels, have enabled non-immunogenic, functional tissue allografts that can be used to generate large quantities of UCP1-positive adipose tissue. These sophisticated tissue-engineering systems have provided the methodology to develop metabolically active brown or beige/brite adipose tissue implants with the potential to be used as a metabolic therapy. Unlike the pharmacological browning of white adipose depots, implantation of bioengineered UCP1-positive adipose tissues offers a spatially controlled therapeutic. Moving forward, new insights into the mechanisms by which extracellular cues govern stem-cell differentiation and progenitor cell recruitment may enable cell-free matrix implant approaches, which generate a niche sufficient to recruit white adipose tissue-derived stem cells and support their differentiation into functional beige/brite adipose tissues. This review summarizes clinically relevant discoveries in tissue-engineering and biology leading toward the recent development of biomaterial supported beige adipose tissue implants and

  9. Bioengineering Beige Adipose Tissue Therapeutics.

    PubMed

    Tharp, Kevin M; Stahl, Andreas

    2015-01-01

    Unlocking the therapeutic potential of brown/beige adipose tissue requires technological advancements that enable the controlled expansion of this uniquely thermogenic tissue. Transplantation of brown fat in small animal model systems has confirmed the expectation that brown fat expansion could possibly provide a novel therapeutic to combat obesity and related disorders. Expansion and/or stimulation of uncoupling protein-1 (UCP1)-positive adipose tissues have repeatedly demonstrated physiologically beneficial reductions in circulating glucose and lipids. The recent discovery that brown adipose tissue (BAT)-derived secreted factors positively alter whole body metabolism further expands potential benefits of brown or beige/brite adipose expansion. Unfortunately, there are no sources of transplantable BATs for human therapeutic purposes at this time. Recent developments in bioengineering, including novel hyaluronic acid-based hydrogels, have enabled non-immunogenic, functional tissue allografts that can be used to generate large quantities of UCP1-positive adipose tissue. These sophisticated tissue-engineering systems have provided the methodology to develop metabolically active brown or beige/brite adipose tissue implants with the potential to be used as a metabolic therapy. Unlike the pharmacological browning of white adipose depots, implantation of bioengineered UCP1-positive adipose tissues offers a spatially controlled therapeutic. Moving forward, new insights into the mechanisms by which extracellular cues govern stem-cell differentiation and progenitor cell recruitment may enable cell-free matrix implant approaches, which generate a niche sufficient to recruit white adipose tissue-derived stem cells and support their differentiation into functional beige/brite adipose tissues. This review summarizes clinically relevant discoveries in tissue-engineering and biology leading toward the recent development of biomaterial supported beige adipose tissue implants and

  10. Bioengineering Beige Adipose Tissue Therapeutics

    PubMed Central

    Tharp, Kevin M.; Stahl, Andreas

    2015-01-01

    Unlocking the therapeutic potential of brown/beige adipose tissue requires technological advancements that enable the controlled expansion of this uniquely thermogenic tissue. Transplantation of brown fat in small animal model systems has confirmed the expectation that brown fat expansion could possibly provide a novel therapeutic to combat obesity and related disorders. Expansion and/or stimulation of uncoupling protein-1 (UCP1)-positive adipose tissues have repeatedly demonstrated physiologically beneficial reductions in circulating glucose and lipids. The recent discovery that brown adipose tissue (BAT)-derived secreted factors positively alter whole body metabolism further expands potential benefits of brown or beige/brite adipose expansion. Unfortunately, there are no sources of transplantable BATs for human therapeutic purposes at this time. Recent developments in bioengineering, including novel hyaluronic acid-based hydrogels, have enabled non-immunogenic, functional tissue allografts that can be used to generate large quantities of UCP1-positive adipose tissue. These sophisticated tissue-engineering systems have provided the methodology to develop metabolically active brown or beige/brite adipose tissue implants with the potential to be used as a metabolic therapy. Unlike the pharmacological browning of white adipose depots, implantation of bioengineered UCP1-positive adipose tissues offers a spatially controlled therapeutic. Moving forward, new insights into the mechanisms by which extracellular cues govern stem-cell differentiation and progenitor cell recruitment may enable cell-free matrix implant approaches, which generate a niche sufficient to recruit white adipose tissue-derived stem cells and support their differentiation into functional beige/brite adipose tissues. This review summarizes clinically relevant discoveries in tissue-engineering and biology leading toward the recent development of biomaterial supported beige adipose tissue implants and

  11. Antibody Engineering and Therapeutics Conference

    PubMed Central

    Almagro, Juan Carlos; Gilliland, Gary L; Scott, Jamie; Larrick, James W; Plückthun, Andreas; Veldman, Trudi; Adams, Gregory P; Parren, Paul WHI; Chester, Kerry A; Bradbury, Andrew; Reichert, Janice M; Huston, James S

    2013-01-01

    The Antibody Engineering and Therapeutics conference, which serves as the annual meeting of The Antibody Society, will be held in Huntington Beach, CA from Sunday December 8 through Thursday December 12, 2013. The scientific program will cover the full spectrum of challenges in antibody research and development, and provide updates on recent progress in areas from basic science through approval of antibody therapeutics. Keynote presentations will be given by Leroy Hood (Institute of System Biology), who will discuss a systems approach for studying disease that is enabled by emerging technology; Douglas Lauffenburger (Massachusetts Institute of Technology), who will discuss systems analysis of cell communication network dynamics for therapeutic biologics design; David Baker (University of Washington), who will describe computer-based design of smart protein therapeutics; and William Schief (The Scripps Research Institute), who will discuss epitope-focused immunogen design.   In this preview of the conference, the workshop and session chairs share their thoughts on what conference participants may learn in sessions on: (1) three-dimensional structure antibody modeling; (2) identifying clonal lineages from next-generation data sets of expressed VH gene sequences; (3) antibodies in cardiometabolic medicine; (4) the effects of antibody gene variation and usage on the antibody response; (5) directed evolution; (6) antibody pharmacokinetics, distribution and off-target toxicity; (7) use of knowledge-based design to guide development of complementarity-determining regions and epitopes to engineer or elicit the desired antibody; (8) optimizing antibody formats for immunotherapy; (9) antibodies in a complex environment; (10) polyclonal, oligoclonal and bispecific antibodies; (11) antibodies to watch in 2014; and (12) polyreactive antibodies and polyspecificity.

  12. Yessotoxin, a Promising Therapeutic Tool

    PubMed Central

    Alfonso, Amparo; Vieytes, Mercedes R.; Botana, Luis M.

    2016-01-01

    Yessotoxin (YTX) is a polyether compound produced by dinoflagellates and accumulated in filter feeding shellfish. No records about human intoxications induced by this compound have been published, however it is considered a toxin. Modifications in second messenger levels, protein levels, immune cells, cytoskeleton or activation of different cellular death types have been published as consequence of YTX exposure. This review summarizes the main intracellular pathways modulated by YTX and their pharmacological and therapeutic implications. PMID:26828502

  13. Therapeutic perspectives in atopic dermatitis.

    PubMed

    Misery, Laurent

    2011-12-01

    Therapy of atopic dermatitis should comprise emollients, topical glucocorticosteroids, or calcineurin inhibitors, phototherapies, immunosuppressants like cyclosporin A, and other treatments. All these treatments should be improved, thanks to research. But new therapeutic perspectives should be given by topical anti-inflammatory substances, selective glucocorticoid receptor agonists, probiotics, interferon γ, TNFα inhibitors, inhibition of T cells or B cells, inhibition of IgE binding, and many other possibilities.

  14. Therapeutic apheresis in autoimmune diseases

    PubMed Central

    Bambauer, Rolf; Latza, Reinhard; Bambauer, Carolin; Burgard, Daniel; Schiel, Ralf

    2013-01-01

    Systemic autoimmune diseases based on an immune pathogenesis produce autoantibodies and circulating immune complexes, which cause inflammation in the tissues of various organs. In most cases, these diseases have a bad prognosis without treatment. Therapeutic apheresis in combination with immunosuppressive therapies has led to a steady increase in survival rates over the last 35 years. Here we provide an overview of the most important pathogenic aspects indicating that therapeutic apheresis can be a supportive therapy in some systemic autoimmune diseases, such as systemic lupus erythematosus, antiphospholipid syndrome, rheumatoid arthritis, and inflammatory eye disease. With the introduction of novel and effective biologic agents, therapeutic apheresis is indicated only in severe cases, such as in rapid progression despite immunosuppressive therapy and/or biologic agents, and in patients with renal involvement, acute generalized vasculitis, thrombocytopenia, leucopenia, pulmonary, cardiac, or cerebral involvement. In mild forms of autoimmune disease, treatment with immunosuppressive therapies and/or biologic agents seems to be sufficient. The prognosis of autoimmune diseases with varying organ manifestations has improved considerably in recent years, due in part to very aggressive therapy schemes.

  15. Copper complexes as therapeutic agents.

    PubMed

    Duncan, Clare; White, Anthony R

    2012-02-01

    The importance of transition metals in biological processes has been well established. Copper (Cu) is a transition metal that can exist in oxidised and reduced states. This allows it to participate in redox and catalytic chemistry, making it a suitable cofactor for a diverse range of enzymes and molecules. Cu deficiency or toxicity is implicated in a variety of pathological conditions; therefore inorganic complexes of Cu have been investigated for their therapeutic and diagnostic potential. These Cu complexes have been shown to be effective in cancer treatment due to their cytotoxic action on tumour cells. Alternatively, Cu complexes can also modulate Cu homeostasis in the brain, resulting in protective effects in several models of neurodegeneration. In other diseases such as coronary heart disease and skin disease, the success of Cu complexes as potential therapeutics will most likely be due to their ability to increase SOD activity, leading to relief of oxidative stress. This review seeks to provide a broad insight into some of the diverse actions of Cu complexes and demonstrate the strong future for these compounds as potential therapeutic agents.

  16. Therapeutic Applications of Carbon Monoxide

    PubMed Central

    Knauert, Melissa; Vangala, Sandeep; Haslip, Maria; Lee, Patty J.

    2013-01-01

    Heme oxygenase-1 (HO-1) is a regulated enzyme induced in multiple stress states. Carbon monoxide (CO) is a product of HO catalysis of heme. In many circumstances, CO appears to functionally replace HO-1, and CO is known to have endogenous anti-inflammatory, anti-apoptotic, and antiproliferative effects. CO is well studied in anoxia-reoxygenation and ischemia-reperfusion models and has advanced to phase II trials for treatment of several clinical entities. In alternative injury models, laboratories have used sepsis, acute lung injury, and systemic inflammatory challenges to assess the ability of CO to rescue cells, organs, and organisms. Hopefully, the research supporting the protective effects of CO in animal models will translate into therapeutic benefits for patients. Preclinical studies of CO are now moving towards more complex damage models that reflect polymicrobial sepsis or two-step injuries, such as sepsis complicated by acute respiratory distress syndrome. Furthermore, co-treatment and post-treatment with CO are being explored in which the insult occurs before there is an opportunity to intervene therapeutically. The aim of this review is to discuss the potential therapeutic implications of CO with a focus on lung injury and sepsis-related models. PMID:24648866

  17. Avian Diagnostic and Therapeutic Antibodies

    SciTech Connect

    Bradley, David Sherman

    2012-12-31

    A number of infectious agents have the potential to cause significant clinical symptoms and even death, but despite this, their incidence remains below the level that supports producing a vaccine. Therapeutic antibodies provide a viable treatment option for many of these diseases. We proposed that antibodies derived from West Nile Virus (WNV)-immunized geese would be able to treat WNV infection in mammals and potentially humans. We demonstrated that WNV-specific goose antibodies are indeed successful in treating WNV infection both prophylactically and therapeutically in a golden hamster model. We demonstrated that the goose-derived antibodies are non-reactogenic, i.e. they do not cause an inflammatory response with multiple exposures in mammals. We also developed both a specific-pathogen-free facility to house the geese during the antibody production phase and a patent-pending purification process to purify the antibodies to greater than 99% purity. Therefore, the success of these studies will allow a cost-effective, rapidly producible therapeutic to move toward clinical testing, with the necessary infrastructure and processes developed and in place.

  18. DNA as Therapeutics; an Update

    PubMed Central

    Saraswat, P.; Soni, R. R.; Bhandari, A.; Nagori, B. P.

    2009-01-01

    Human gene therapy is the introduction of new genetic material into the cells of an individual with the intention of producing a therapeutic benefit for the patient. Deoxyribonucleic acid and ribonucleic acid are used in gene therapy. Over time and with proper oversight, human gene therapy might become an effective weapon in modern medicine's arsenal to help fight diseases such as cancer, acquired immunodeficiency syndrome, diabetes, high blood pressure, coronary heart disease, peripheral vascular disease, neurodegenerative diseases, cystic fibrosis, hemophilia and other genetic disorders. Gene therapy trials in humans are of two types, somatic and germ line gene therapy. There are many ethical, social, and commercial issues raised by the prospects of treating patients whose consent is impossible to obtain. This review summarizes deoxyribonucleic acid-based therapeutics and gene transfer technologies for the diseases that are known to be genetic in origin. Deoxyribonucleic acid-based therapeutics include plasmids, oligonucleotides for antisense and antigene applications, DNA aptamers, and DNAzymes. This review also covers the current status of gene therapy and recent developments in gene therapy research. PMID:20502565

  19. Tempest in a Therapeutic Community: Implementation and Evaluation Issues for Faith-Based Programming

    ERIC Educational Resources Information Center

    Scott, Diane L.; Crow, Matthew S.; Thompson, Carla J.

    2010-01-01

    The therapeutic community (TC) is an increasingly utilized intervention model in corrections settings. Rarely do these TCs include a faith-based curriculum beyond that included in Alcoholics Anonymous or Narcotics Anonymous programs, as does the faith-based TC that serves as the basis for this article. Borrowing from the successful TC model, the…

  20. 77 FR 65002 - Best Pharmaceuticals for Children Act (BPCA) Priority List of Needs in Pediatric Therapeutics

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... multiple gaps in knowledge regarding the use of therapeutics in children, including the correct dose..., lack of knowledge related to the ethical conduct of clinical trials in children, the absence of... various non-profit and commercial organizations have taken steps to address the knowledge gaps that...

  1. ADMINISTRATIVE GUIDE IN SPEECH CORRECTION.

    ERIC Educational Resources Information Center

    Healey, William C.

    Written primarily for school superintendents, principals, speech clinicians, and supervisors, this guide outlines the mechanics of organizing and conducting speech correction activities in the public schools. It includes the requirements for certification of a speech clinician in Missouri and describes essential steps for the development of a…

  2. Teaching Politically without Political Correctness.

    ERIC Educational Resources Information Center

    Graff, Gerald

    2000-01-01

    Discusses how to bring political issues into the classroom, highlighting the influence of local context and noting conservative and liberal criticisms of political correctness. Suggests the need for a different idea of how to teach politically from the advocacy pedagogy advanced by recent critical educators, explaining that bringing students into…

  3. The Politics of Political Correctness.

    ERIC Educational Resources Information Center

    Minsky, Leonard

    1992-01-01

    This article reacts to President Bush's entry into the dispute over "political correctness" on college campuses. The paper summarizes discussions of students, faculty, and others in the Washington, D.C. area which concluded that this seeming defense of free speech is actually an attack on affirmative action and multiculturalism stemming from the…

  4. Political Correctness and American Academe.

    ERIC Educational Resources Information Center

    Drucker, Peter F.

    1994-01-01

    Argues that today's political correctness atmosphere is a throwback to attempts made by the Nazis and Stalinists to force society into conformity. Academia, it is claimed, is being forced to conform to gain control of the institution of higher education. It is predicted that this effort will fail. (GR)

  5. Special Language and Political Correctness.

    ERIC Educational Resources Information Center

    Corbett, Jenny

    1994-01-01

    This article looks at the way in which the language used in relation to special education needs has changed and evolved since the 1960s, based on articles published in the British special education literature. Vocabulary, images, and attitudes are discussed in the context of political correctness and its impact on behavior. (DB)

  6. Terrain Corrections for Gravity Gradiometry

    NASA Astrophysics Data System (ADS)

    Huang, Ou

    This study developed a geostatistical method to determine the required extent of terrain corrections for gravity gradients under criteria appropriate to different applications. We present different methods to compute the terrain corrections for gravity gradients for the cases of ground and airborne gravity gradiometry. In order to verify our geostatistical method and to study the required extent for different types of terrain, we also developed a method to simulate topography based on the covariance model. The required extents were determined from the variance of the truncation error for one point, or further from the variance of the truncation-error difference for a pair of points, and these variances were verified against those from the deterministic method. The extent of the terrain correction was determined for ground gradiometry based on simulated, ultra-high-resolution topography for very local applications, and also based on mountainous topography over large areas. For airborne gradiometry, we computed the terrain corrections and the required extent based on Air-FTG observations at Vinton Dome, LA, and in the Parkfield, CA, area, and verified them against the results of Bell Geospace. Finally, from mostly flat, medium-rough, and mountainous areas, an empirical relationship was developed: the required extent differs by a factor of about 4 between mountainous and mostly flat areas while the PSD amplitude differs by a factor of about 100, and the relationship can be interpolated for other types of topography from their geostatistics.
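
    The covariance-based topography simulation mentioned above can be illustrated with a short sketch. The Python snippet below generates a correlated random surface by spectral synthesis from an assumed isotropic power-law power spectral density; the grid size, spectral exponent, and roughness are illustrative assumptions, not parameters taken from the study.

      import numpy as np

      def simulate_topography(n=256, dx=100.0, beta=3.0, roughness=50.0, seed=0):
          """Simulate an n x n topography grid (metres) by spectral synthesis
          from an assumed isotropic power-law PSD P(k) ~ k**(-beta).
          All parameter values here are illustrative."""
          rng = np.random.default_rng(seed)
          kx = np.fft.fftfreq(n, d=dx)
          ky = np.fft.fftfreq(n, d=dx)
          k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
          k[0, 0] = np.inf                       # suppress the DC component
          amplitude = k ** (-beta / 2.0)         # amplitude spectrum ~ sqrt(PSD)
          phase = np.exp(2j * np.pi * rng.random((n, n)))
          surface = np.fft.ifft2(amplitude * phase).real
          return surface * roughness / surface.std()   # rescale to chosen roughness

      topo = simulate_topography()
      print(topo.shape, round(float(topo.std()), 1))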

  7. Correcting the AGS depolarizing resonances

    SciTech Connect

    Ratner, L.G.

    1986-01-01

    For the 1986 AGS run, the technique of correcting an imperfection resonance using a beat harmonic instead of the direct harmonic was applied and found to be useful in achieving a 22 GeV/c polarized beam. Both conventional and modified techniques are explained. (LEW)

  8. The correct "ball bearings" data.

    PubMed

    Caroni, C

    2002-12-01

    The famous data on fatigue failure times of ball bearings have been quoted incorrectly from Lieblein and Zelen's original paper. The correct data include censored values, as well as non-fatigue failures that must be handled appropriately. They could be described by a mixture of Weibull distributions, corresponding to different modes of failure.
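
    As a rough illustration of describing failure times with a mixture of Weibull distributions, the Python sketch below fits a two-component mixture to synthetic data by direct maximum likelihood. It ignores the censoring mentioned above, and the data and starting values are stand-ins rather than the Lieblein and Zelen measurements.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import logsumexp
      from scipy.stats import weibull_min

      # Synthetic stand-in data (millions of revolutions); not the original values.
      rng = np.random.default_rng(1)
      data = np.concatenate([50.0 * rng.weibull(1.5, 60),
                             120.0 * rng.weibull(3.0, 40)])

      def neg_log_lik(params, x):
          """Negative log-likelihood of a two-component Weibull mixture
          (censoring is ignored in this simplified sketch)."""
          logit_w, log_c1, log_s1, log_c2, log_s2 = params
          w = 1.0 / (1.0 + np.exp(-logit_w))              # mixing weight in (0, 1)
          lp1 = np.log(w) + weibull_min.logpdf(x, np.exp(log_c1), scale=np.exp(log_s1))
          lp2 = np.log(1 - w) + weibull_min.logpdf(x, np.exp(log_c2), scale=np.exp(log_s2))
          return -logsumexp(np.stack([lp1, lp2]), axis=0).sum()

      start = [0.0, np.log(1.0), np.log(40.0), np.log(2.0), np.log(100.0)]
      fit = minimize(neg_log_lik, start, args=(data,), method="Nelder-Mead",
                     options={"maxiter": 5000})
      print("converged:", fit.success, " minimised neg. log-likelihood:", round(fit.fun, 2))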

  9. Model-based redesign of global transcription regulation

    PubMed Central

    Carrera, Javier; Rodrigo, Guillermo; Jaramillo, Alfonso

    2009-01-01

    Synthetic biology aims at the design or redesign of biological systems. In particular, one possible goal is the rewiring of the transcription regulation network by exchanging the endogenous promoters. To achieve this objective, we have adapted current methods to infer a model based on ordinary differential equations that is able to predict the network response after a major change in its topology. Our procedure uses microarray data for training. We have experimentally validated our inferred global regulatory model in Escherichia coli by predicting transcriptomic profiles under new perturbations. We have also tested our methodology in silico by providing accurate predictions of the underlying networks from expression data generated with artificial genomes. In addition, we have shown the predictive power of our methodology by obtaining the gene expression profile in experimental redesigns of the E. coli genome, in which the transcriptional network was rewired by knocking out master regulators or by upregulating transcription factors controlled by different promoters. Our approach is compatible with most network inference methods, allowing computational exploration of future genome-wide redesign experiments in synthetic biology. PMID:19188257
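
    To make the idea of an ODE-based transcription model concrete, the following Python sketch simulates a toy two-gene cascade with Hill-type activation, adds noise to mimic expression measurements, and re-estimates the kinetic parameters by least squares. The network, parameter values, and noise level are illustrative assumptions and are unrelated to the inferred E. coli model.

      import numpy as np
      from scipy.integrate import odeint
      from scipy.optimize import least_squares

      def two_gene_ode(x, t, k1, k2, K, n, d1, d2):
          """Toy network: gene 1 is constitutive, gene 2 is activated by
          gene 1 through a Hill function; all rates are illustrative."""
          x1, x2 = x
          dx1 = k1 - d1 * x1
          dx2 = k2 * x1 ** n / (K ** n + x1 ** n) - d2 * x2
          return [dx1, dx2]

      t = np.linspace(0.0, 10.0, 20)
      true_params = (1.0, 2.0, 0.5, 2.0, 0.5, 0.8)
      data = odeint(two_gene_ode, [0.0, 0.0], t, args=true_params)
      data = data + 0.02 * np.random.default_rng(0).normal(size=data.shape)  # mock noise

      def residuals(p):
          sim = odeint(two_gene_ode, [0.0, 0.0], t, args=tuple(p))
          return (sim - data).ravel()

      fit = least_squares(residuals, x0=[0.5, 1.0, 1.0, 1.5, 0.3, 0.5],
                          bounds=(1e-3, 10.0))
      print("estimated parameters:", np.round(fit.x, 2))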

  10. Model-based tomographic reconstruction of objects containing known components.

    PubMed

    Stayman, J Webster; Otake, Yoshito; Prince, Jerry L; Khanna, A Jay; Siewerdsen, Jeffrey H

    2012-10-01

    The likelihood of finding manufactured components (surgical tools, implants, etc.) within a tomographic field-of-view has been steadily increasing. One reason is the aging population and proliferation of prosthetic devices, such that more people undergoing diagnostic imaging have existing implants, particularly hip and knee implants. Another reason is that use of intraoperative imaging (e.g., cone-beam CT) for surgical guidance is increasing, wherein surgical tools and devices such as screws and plates are placed within or near to the target anatomy. When these components contain metal, the reconstructed volumes are likely to contain severe artifacts that adversely affect the image quality in tissues both near and far from the component. Because physical models of such components exist, there is a unique opportunity to integrate this knowledge into the reconstruction algorithm to reduce these artifacts. We present a model-based penalized-likelihood estimation approach that explicitly incorporates known information about component geometry and composition. The approach uses an alternating maximization method that jointly estimates the anatomy and the position and pose of each of the known components. We demonstrate that the proposed method can produce nearly artifact-free images even near the boundary of a metal implant in simulated vertebral pedicle screw reconstructions and even under conditions of substantial photon starvation. The simultaneous estimation of device pose also provides quantitative information on device placement that could be valuable to quality assurance and verification of treatment delivery.
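
    A much-simplified one-dimensional analogue may help illustrate the alternating estimation of the object and the pose of a known component. In the Python sketch below, the measurement is modelled as an unknown smooth background plus a known template at an unknown shift, and the two are updated alternately; the data model, the crude smoothing step, and all parameter values are illustrative assumptions rather than the paper's penalized-likelihood formulation.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      template = np.zeros(n)
      template[:20] = 1.0                                   # known component profile
      true_shift = 90
      background = np.sin(np.linspace(0.0, np.pi, n))       # unknown smooth object
      signal = background + np.roll(template, true_shift) + 0.05 * rng.normal(size=n)

      bg = np.zeros(n)                                      # current background estimate
      for _ in range(5):
          # (1) component pose update: exhaustive search over integer shifts
          errs = [np.sum((signal - bg - np.roll(template, s)) ** 2) for s in range(n)]
          shift = int(np.argmin(errs))
          # (2) object update: smooth the residual (stand-in for a regularised fit)
          bg = np.convolve(signal - np.roll(template, shift), np.ones(9) / 9.0, mode="same")

      print("estimated shift:", shift, " true shift:", true_shift)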

  11. PARALLELISATION OF THE MODEL-BASED ITERATIVE RECONSTRUCTION ALGORITHM DIRA.

    PubMed

    Örtenberg, A; Magnusson, M; Sandborg, M; Alm Carlsson, G; Malusek, A

    2016-06-01

    New paradigms for parallel programming have been devised to simplify software development on multi-core processors and many-core graphical processing units (GPU). Despite their obvious benefits, the parallelisation of existing computer programs is not an easy task. In this work, the use of the Open Multiprocessing (OpenMP) and Open Computing Language (OpenCL) frameworks is considered for the parallelisation of the model-based iterative reconstruction algorithm DIRA with the aim to significantly shorten the code's execution time. Selected routines were parallelised using OpenMP and OpenCL libraries; some routines were converted from MATLAB to C and optimised. Parallelisation of the code with OpenMP was easy and resulted in an overall speedup of 15 on a 16-core computer. Parallelisation with OpenCL was more difficult owing to differences between the central processing unit and GPU architectures. The resulting speedup was substantially lower than the theoretical peak performance of the GPU; the cause was explained.
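
    The work described here uses OpenMP and OpenCL in C; purely as a conceptual analogue, the Python sketch below distributes independent per-projection work items across worker processes. It is not OpenMP and not DIRA code; the work function, item count, and process count are placeholders.

      import multiprocessing as mp
      import numpy as np

      def process_projection(idx):
          """Stand-in for one independent per-projection work item
          (e.g. filtering or backprojecting a single view)."""
          rng = np.random.default_rng(idx)
          view = rng.normal(size=100_000)
          return float(np.sum(view * view))      # dummy per-view reduction

      if __name__ == "__main__":
          with mp.Pool(processes=4) as pool:
              results = pool.map(process_projection, range(64))
          print("processed", len(results), "projections")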

  12. Model-based optimization of tapered free-electron lasers

    NASA Astrophysics Data System (ADS)

    Mak, Alan; Curbis, Francesca; Werin, Sverker

    2015-04-01

    The energy extraction efficiency is a figure of merit for a free-electron laser (FEL). It can be enhanced by the technique of undulator tapering, which enables the sustained growth of radiation power beyond the initial saturation point. In the development of a single-pass x-ray FEL, it is important to exploit the full potential of this technique and optimize the taper profile aw(z). Our approach to the optimization is based on the theoretical model by Kroll, Morton, and Rosenbluth, whereby the taper profile aw(z) is not a predetermined function (such as linear or exponential) but is determined by the physics of a resonant particle. For further enhancement of the energy extraction efficiency, we propose a modification to the model, which involves manipulations of the resonant particle's phase. Using the numerical simulation code GENESIS, we apply our model-based optimization methods to a case of the future FEL at the MAX IV Laboratory (Lund, Sweden), as well as a case of the LCLS-II facility (Stanford, USA).
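
    As a schematic illustration only, the Python sketch below parameterizes a post-saturation taper profile and optimizes its coefficients against a placeholder objective. The objective is a stand-in so that the optimisation loop runs; it is not an FEL model and does not reproduce the Kroll-Morton-Rosenbluth physics or a GENESIS run, and every parameter value is an assumption.

      import numpy as np
      from scipy.optimize import minimize

      z = np.linspace(0.0, 100.0, 200)            # undulator coordinate (m), illustrative

      def taper_profile(params, z, aw0=2.4, z0=20.0):
          """Quadratic taper aw(z) = aw0*(1 - c1*(z-z0) - c2*(z-z0)**2) after z0."""
          c1, c2 = params
          dz = np.clip(z - z0, 0.0, None)
          return aw0 * (1.0 - c1 * dz - c2 * dz ** 2)

      def surrogate_objective(params):
          """Placeholder standing in for a full FEL simulation: it rewards a
          gentle monotonic taper, penalises strong curvature, and caps the
          total taper depth.  It is NOT a physical model of the FEL."""
          aw = taper_profile(params, z)
          depth = (aw[0] - aw[-1]) / aw[0]             # total relative taper depth
          curvature = np.mean(np.diff(aw, 2) ** 2)     # discourage rough profiles
          penalty = 1e3 * max(0.0, depth - 0.15) ** 2  # cap the depth at ~15 %
          return -(depth - 1e4 * curvature - penalty)

      fit = minimize(surrogate_objective, x0=[1e-4, 1e-7], method="Nelder-Mead")
      print("optimised taper coefficients:", fit.x)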

  13. Measuring neuronal branching patterns using model-based approach.

    PubMed

    Luczak, Artur

    2010-01-01

    Neurons have complex branching systems which allow them to communicate with thousands of other neurons. Thus understanding neuronal geometry is clearly important for determining connectivity within the network and how this shapes neuronal function. One of the difficulties in uncovering relationships between neuronal shape and its function is the problem of quantifying complex neuronal geometry. Even by using multiple measures, such as dendritic length, distribution of segments, and direction of branches, a description of three-dimensional neuronal embedding remains incomplete. To help alleviate this problem, here we propose a new measure, a shape diffusiveness index (SDI), to quantify spatial relations between branches at the local and global scale. It has been shown that the growth of neuronal trees can be modeled using a diffusion limited aggregation (DLA) process. By measuring "how easy" it is to reproduce the analyzed shape with the DLA algorithm, one can measure how "diffusive" that shape is. Intuitively, "diffusiveness" measures how tree-like a given shape is; for example, shapes like an oak tree will have high values of SDI. This measure captures an important feature of dendritic tree geometry which is difficult to assess with other measures. This approach also presents a paradigm shift from well-defined deterministic measures to model-based measures, which estimate how well a model with specific properties can account for features of the analyzed shape. PMID:21079752
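
    Since the SDI is built on diffusion limited aggregation, a minimal sketch of the underlying DLA process may be useful. The Python snippet below grows a small on-lattice DLA cluster; the grid size, particle count, and sticking rule (8-neighbour contact) are illustrative choices, not the parameters used in the paper.

      import numpy as np

      def dla_cluster(n_particles=200, size=81, seed=0):
          """Grow a 2-D diffusion-limited aggregation (DLA) cluster on a grid.
          Walkers are released just outside the current cluster radius and
          random-walk until they stick to the cluster; sizes are illustrative."""
          rng = np.random.default_rng(seed)
          grid = np.zeros((size, size), dtype=bool)
          c = size // 2
          grid[c, c] = True                       # seed site
          r_max = 1                               # current cluster radius
          moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]

          def release():
              a = rng.uniform(0.0, 2.0 * np.pi)
              r = min(r_max + 5, c - 2)
              return int(c + r * np.cos(a)), int(c + r * np.sin(a))

          for _ in range(n_particles):
              x, y = release()
              while True:
                  dx, dy = moves[rng.integers(4)]
                  x, y = x + dx, y + dy
                  if np.hypot(x - c, y - c) > min(r_max + 20, c - 1):
                      x, y = release()            # wandered too far: re-release
                      continue
                  if grid[x-1:x+2, y-1:y+2].any():
                      grid[x, y] = True           # stick next to the cluster
                      r_max = max(r_max, int(np.hypot(x - c, y - c)) + 1)
                      break
          return grid

      cluster = dla_cluster()
      print("occupied sites:", int(cluster.sum()))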

  14. Model-Based Systems Engineering in Concurrent Engineering Centers

    NASA Technical Reports Server (NTRS)

    Iwata, Curtis; Infeld, Samantha; Bracken, Jennifer Medlin; McGuire, Melissa; McQuirk, Christina; Kisdi, Aron; Murphy, Jonathan; Cole, Bjorn; Zarifian, Pezhman

    2015-01-01

    Concurrent Engineering Centers (CECs) are specialized facilities with a goal of generating and maturing engineering designs by enabling rapid design iterations. This is accomplished by co-locating a team of experts (either physically or virtually) in a room with a narrow design goal and a limited timeline of a week or less. The systems engineer uses a model of the system to capture the relevant interfaces and manage the overall architecture. A single model that integrates other design information and modeling allows the entire team to visualize the concurrent activity and identify conflicts more efficiently, potentially resulting in a systems model that will continue to be used throughout the project lifecycle. Performing systems engineering using such a system model is the definition of model-based systems engineering (MBSE); therefore, CECs evolving their approach to incorporate advances in MBSE are more successful in reducing time and cost needed to meet study goals. This paper surveys space mission CECs that are in the middle of this evolution, and the authors share their experiences in order to promote discussion within the community.

  15. Model-Based Systems Engineering in Concurrent Engineering Centers

    NASA Technical Reports Server (NTRS)

    Iwata, Curtis; Infeld, Samantha; Bracken, Jennifer Medlin; McGuire, Melissa; McQuirk, Christina; Kisdi, Aron; Murphy, Jonathan; Cole, Bjorn; Zarifian, Pezhman

    2015-01-01

    Concurrent Engineering Centers (CECs) are specialized facilities with a goal of generating and maturing engineering designs by enabling rapid design iterations. This is accomplished by co-locating a team of experts (either physically or virtually) in a room with a focused design goal and a limited timeline of a week or less. The systems engineer uses a model of the system to capture the relevant interfaces and manage the overall architecture. A single model that integrates other design information and modeling allows the entire team to visualize the concurrent activity and identify conflicts more efficiently, potentially resulting in a systems model that will continue to be used throughout the project lifecycle. Performing systems engineering using such a system model is the definition of model-based systems engineering (MBSE); therefore, CECs evolving their approach to incorporate advances in MBSE are more successful in reducing time and cost needed to meet study goals. This paper surveys space mission CECs that are in the middle of this evolution, and the authors share their experiences in order to promote discussion within the community.

  16. Statistical shape model-based femur kinematics from biplane fluoroscopy.

    PubMed

    Baka, N; de Bruijne, M; van Walsum, T; Kaptein, B L; Giphart, J E; Schaap, M; Niessen, W J; Lelieveldt, B P F

    2012-08-01

    Studying joint kinematics is of interest to improve prosthesis design and to characterize postoperative motion. State-of-the-art techniques register bones segmented from prior computed tomography or magnetic resonance scans with X-ray fluoroscopic sequences. Elimination of the prior 3D acquisition could potentially lower costs and radiation dose. Therefore, we propose to substitute the segmented bone surface with a statistical shape model-based estimate. A dedicated dynamic reconstruction and tracking algorithm was developed, estimating the shape based on all frames and the pose per frame. The algorithm minimizes the difference between the projected bone contour and image edges. To increase robustness, we employ a dynamic prior, image features, and prior knowledge about bone edge appearances. This enables tracking and reconstruction from a single initial pose per sequence. We evaluated our method on the distal femur using eight biplane fluoroscopic drop-landing sequences. The proposed dynamic prior and features increased the convergence rate of the reconstruction from 71% to 91%, using a convergence limit of 3 mm. The achieved root mean square point-to-surface accuracy at the converged frames was 1.48 ± 0.41 mm. The resulting tracking precision was 1-1.5 mm, with the largest errors occurring in the rotation around the femoral shaft (about 2.5° precision).
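
    A statistical shape model of the kind substituted for the segmented surface is typically built by principal component analysis of aligned training shapes. The Python sketch below builds such a point-based model from mock 2-D contours and reconstructs a new shape from a few mode weights; the training data, dimensions, and weights are illustrative assumptions.

      import numpy as np

      # Mock training set: 2-D contours (50 points each) around a mean shape.
      rng = np.random.default_rng(0)
      n_shapes, n_points = 20, 50
      angles = np.linspace(0.0, 2.0 * np.pi, n_points)
      mean_shape = np.column_stack([np.cos(angles), np.sin(angles)])
      training = np.stack([mean_shape * (1.0 + 0.1 * rng.normal())
                           + 0.02 * rng.normal(size=mean_shape.shape)
                           for _ in range(n_shapes)])

      # Point distribution model: PCA on the flattened, pre-aligned shapes.
      X = training.reshape(n_shapes, -1)            # one row per shape
      mu = X.mean(axis=0)
      U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
      modes = Vt[:3]                                # first three variation modes
      var = s ** 2 / (n_shapes - 1)

      # Reconstruct a new shape instance from mode weights b (the SSM estimate).
      b = np.array([1.0, -0.5, 0.2]) * np.sqrt(var[:3])
      estimate = (mu + b @ modes).reshape(n_points, 2)
      print("variance explained by 3 modes: %.1f%%" % (100.0 * var[:3].sum() / var.sum()))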

  17. Model-Based Reasoning in Upper-division Lab Courses

    NASA Astrophysics Data System (ADS)

    Lewandowski, Heather

    2015-05-01

    Modeling, which includes developing, testing, and refining models, is a central activity in physics. Well-known examples from AMO physics include everything from the Bohr model of the hydrogen atom to the Bose-Hubbard model of interacting bosons in a lattice. Modeling, while typically considered a theoretical activity, is most fully represented in the laboratory, where measurements of real phenomena intersect with theoretical models, leading to refinement of models and experimental apparatus. However, experimental physicists use models in complex ways and the process is often not made explicit in physics laboratory courses. We have developed a framework to describe the modeling process in physics laboratory activities. The framework attempts to abstract and simplify the complex modeling process undertaken by expert experimentalists. The framework can be applied to understand typical processes such as the modeling of measurement tools, the modeling of "black boxes," and signal processing. We demonstrate that the framework captures several important features of model-based reasoning in a way that can reveal common student difficulties in the lab and guide the development of curricula that emphasize modeling in the laboratory. We also use the framework to examine troubleshooting in the lab and guide students to effective methods and strategies.

  18. A Model Based on Crowdsourcing for Detecting Natural Hazards

    NASA Astrophysics Data System (ADS)

    Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.

    2015-12-01

    Remote sensing technology provides new methods for the detection, early warning, mitigation, and relief of natural hazards. Given the suddenness of natural hazards, the unpredictability of their location, and the practical demands of hazard response work, this article proposes an evaluation model for the remote sensing detection of natural hazards based on crowdsourcing. First, using a crowdsourcing model and drawing on the Internet and the power of hundreds of millions of Internet users, the model obtains visual interpretations of high-resolution remote sensing images of the hazard area and collects a large volume of valuable disaster data. Second, the model adopts a dynamic voting consistency strategy to evaluate the disaster data provided by the crowdsourcing workers. Third, it pre-estimates the disaster severity with a pre-evaluation model based on regional buffers. Lastly, it triggers the corresponding expert system according to the forecast results. This model breaks down the boundary between geographic information professionals and the public, enables genuine public participation and citizen science, and improves the accuracy and timeliness of hazard assessment results.
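
    The voting-consistency step can be illustrated by a simple iterative scheme in which worker weights are updated from their agreement with the current consensus. The Python sketch below applies such a scheme to mock binary labels; the data, label meanings, weighting rule, and thresholds are illustrative assumptions, not the strategy defined in the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      n_workers, n_items = 15, 40
      truth = rng.integers(0, 2, n_items)                   # hidden "damaged / not damaged"
      accuracy = rng.uniform(0.55, 0.95, n_workers)         # workers of varying reliability
      labels = np.array([[t if rng.random() < acc else 1 - t for t in truth]
                         for acc in accuracy])

      weights = np.ones(n_workers)
      for _ in range(10):
          # weighted vote per item
          score = weights @ labels / weights.sum()
          consensus = (score > 0.5).astype(int)
          # re-weight workers by their agreement with the consensus
          agreement = (labels == consensus).mean(axis=1)
          weights = np.clip(agreement - 0.5, 0.01, None)    # down-weight near-random workers

      print("consensus accuracy vs hidden truth:", (consensus == truth).mean())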

  19. An Opinion Interactive Model Based on Individual Persuasiveness.

    PubMed

    Zhou, Xin; Chen, Bin; Liu, Liang; Ma, Liang; Qiu, Xiaogang

    2015-01-01

    In order to study the formation of group opinion in real life, we put forward a new opinion interaction model based on the Deffuant model and its improved variants, because current models of opinion dynamics do not consider individual persuasiveness. Our model has the following advantages: first, persuasiveness is added to each individual's attributes, reflecting its importance and making every individual different from the others; second, probability is introduced into the interaction to capture its uncertainty. In Monte Carlo simulation experiments, a sensitivity analysis covering the influence of randomness, the initial persuasiveness distribution, and the number of individuals is performed first; next, the range of the common opinion is predicted from the initial persuasiveness distribution. Simulation results show that, when the initial values of the agents are fixed, the common opinion converges to the same point no matter how many times the experiment is independently replicated, although the number of iterations is not always the same; the range of the common opinion can be predicted when the initial distributions of opinion and persuasiveness are given. As a result, this model can reflect and interpret some phenomena of opinion interaction in real society.
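
    For concreteness, the Python sketch below implements a Deffuant-style bounded-confidence update in which the step size is weighted by each agent's persuasiveness, so that the more persuasive agent pulls its partner further toward its own opinion. The confidence bound, weighting rule, and parameter values are illustrative assumptions rather than the exact update rule of the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      n_agents, n_steps, epsilon = 100, 20000, 0.3
      opinion = rng.uniform(0.0, 1.0, n_agents)
      persuasiveness = rng.uniform(0.0, 1.0, n_agents)

      for _ in range(n_steps):
          i, j = rng.integers(n_agents, size=2)
          if i == j or abs(opinion[i] - opinion[j]) > epsilon:
              continue                       # interact only within the confidence bound
          # the more persuasive agent pulls the other further toward itself
          wi = persuasiveness[i] / (persuasiveness[i] + persuasiveness[j])
          oi, oj = opinion[i], opinion[j]
          opinion[i] += (1.0 - wi) * 0.5 * (oj - oi)
          opinion[j] += wi * 0.5 * (oi - oj)

      print("opinion clusters (rounded):", np.unique(np.round(opinion, 1)))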

  20. PARALLELISATION OF THE MODEL-BASED ITERATIVE RECONSTRUCTION ALGORITHM DIRA.

    PubMed

    Örtenberg, A; Magnusson, M; Sandborg, M; Alm Carlsson, G; Malusek, A

    2016-06-01

    New paradigms for parallel programming have been devised to simplify software development on multi-core processors and many-core graphical processing units (GPU). Despite their obvious benefits, the parallelisation of existing computer programs is not an easy task. In this work, the use of the Open Multiprocessing (OpenMP) and Open Computing Language (OpenCL) frameworks is considered for the parallelisation of the model-based iterative reconstruction algorithm DIRA with the aim to significantly shorten the code's execution time. Selected routines were parallelised using OpenMP and OpenCL libraries; some routines were converted from MATLAB to C and optimised. Parallelisation of the code with OpenMP was easy and resulted in an overall speedup of 15 on a 16-core computer. Parallelisation with OpenCL was more difficult owing to differences between the central processing unit and GPU architectures. The resulting speedup was substantially lower than the theoretical peak performance of the GPU; the cause was explained. PMID:26454270