Science.gov

Sample records for model-based therapeutic correction

  1. Model-Based Therapeutic Correction of Hypothalamic-Pituitary-Adrenal Axis Dysfunction

    PubMed Central

    Ben-Zvi, Amos; Vernon, Suzanne D.; Broderick, Gordon

    2009-01-01

    The hypothalamic-pituitary-adrenal (HPA) axis is a major system maintaining body homeostasis by regulating the neuroendocrine and sympathetic nervous systems as well as modulating immune function. Recent work has shown that the complex dynamics of this system accommodate several stable steady states, one of which corresponds to the hypocortisol state observed in patients with chronic fatigue syndrome (CFS). At present these dynamics are not formally considered in the development of treatment strategies. Here we use model-based predictive control (MPC) methodology to estimate robust treatment courses for displacing the HPA axis from an abnormal hypocortisol steady state back to a healthy cortisol level. This approach was applied to a recent model of HPA axis dynamics incorporating glucocorticoid receptor kinetics. A candidate treatment that displays robust properties in the face of significant biological variability and measurement uncertainty requires that cortisol be further suppressed for a short period until adrenocorticotropic hormone levels exceed 30% of baseline. Treatment may then be discontinued, and the HPA axis will naturally progress to a stable attractor defined by normal hormone levels. Suppression of biologically available cortisol may be achieved through the use of binding proteins such as CBG and certain metabolizing enzymes, thus offering possible avenues for deployment in a clinical setting. Treatment strategies can therefore be designed that maximally exploit system dynamics to provide a robust response to treatment and ensure a positive outcome over a wide range of conditions. Perhaps most importantly, a treatment course involving further reduction in cortisol, even transient, is quite counterintuitive and challenges the conventional strategy of supplementing cortisol levels, an approach based on steady-state reasoning. PMID:19165314
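
    As an illustration of the control methodology named here, the sketch below runs a receding-horizon (model-predictive) controller on a toy bistable system with two stable steady states, standing in for the idea of pushing the system from an abnormal attractor to a healthy one. The dynamics, bounds, and cost are hypothetical stand-ins, not the published HPA model with glucocorticoid receptor kinetics.

        # Receding-horizon (model-predictive) control on a toy bistable system
        import numpy as np
        from scipy.optimize import minimize_scalar

        def step(x, u, dt=0.1):
            """One Euler step of the toy dynamics x' = x - x**3 + u (hypothetical)."""
            return x + dt * (x - x ** 3 + u)

        def simulate(x0, u_seq):
            x = x0
            for u in u_seq:
                x = step(x, u)
            return x

        def mpc_control(x, horizon=20, target=1.0):
            """Pick a constant input over the horizon minimizing terminal error plus effort."""
            cost = lambda u: (simulate(x, [u] * horizon) - target) ** 2 + 0.01 * u ** 2
            return minimize_scalar(cost, bounds=(-1.0, 1.0), method="bounded").x

        x = -1.0                                  # start in the "abnormal" basin of attraction
        for _ in range(100):
            u = mpc_control(x)                    # re-plan at every step (receding horizon)
            x = step(x, u)
        print(f"final state: {x:.3f}")            # settles near the healthy attractor at +1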

  2. The application criterion of model-based optical proximity correction in a low k1 process

    NASA Astrophysics Data System (ADS)

    Lee, Doo-Youl; Kim, In-Sung; Jung, Sung-Gon; Jung, Myoung-Ho; Park, Joo-On; Oh, Seok-Hwan; Woo, Sang-Gyun; Cho, Han-Ku; Moon, Joo-Tae

    2005-05-01

    As the k1 factor approaches the theoretical limit, the optical proximity correction (OPC) treatments needed to maintain dimensional tolerances involve increasingly complex correction shapes, which translates into more detailed, and therefore larger, mask pattern databases. Moreover, the development of exposure tools lags behind device shrinkage, which may result in a dwindling process margin in the lithographic process despite the use of all available resolution enhancement techniques (RETs). Model-based OPC is recognized as a robust tool for coping with the diversity of layouts, but it loses effectiveness as the photolithographic process margin narrows, and many obstacles must be overcome to enhance its usefulness. The original layout is assumed to be designed in a lithography-friendly manner so that the process margin can be enhanced with aggressive RETs, and is then amended by model-based OPC to suppress proximity effects. Some constraints, however, are encountered during the OPC procedure. Ultimately, unless the original lithography-friendly layout (LFL) is corrected in terms of pitches and shapes, the lithography process falls outside the process window and pattern fidelity suffers. This paper emphasizes that the application of model-based OPC requires a particular and unique layout configuration to preserve the process margin in a low k1 process.

  3. Quantitative fully 3D PET via model-based scatter correction

    SciTech Connect

    Ollinger, J.M.

    1994-05-01

    We have investigated the quantitative accuracy of fully 3D PET using model-based scatter correction by measuring the half-life of Ga-68 in the presence of scatter from F-18. The inner chamber of a Data Spectrum cardiac phantom was filled with 18.5 MBq of Ga-68. The outer chamber was filled with an equivalent amount of F-18. The cardiac phantom was placed in a 22 x 30.5 cm elliptical phantom containing anthropomorphic lung inserts filled with a water-Styrofoam mixture. Ten frames of dynamic data were collected over 13.6 hours on a Siemens-CTI 953B scanner with the septa retracted. The data were corrected using model-based scatter correction, which uses the emission images, transmission images, and an accurate physical model to directly calculate the scatter distribution. Both uncorrected and corrected data were reconstructed using the Promis algorithm. The scatter correction required 4.3% of the total reconstruction time. The scatter fraction in a small volume of interest in the center of the inner chamber of the cardiac insert rose from 4.0% in the first interval to 46.4% in the last interval as the ratio of F-18 activity to Ga-68 activity rose from 1:1 to 33:1. Fitting a single exponential to the last three data points yields estimates of the half-life of Ga-68 of 77.01 minutes and 68.79 minutes for uncorrected and corrected data, respectively. Thus, scatter correction reduces the error from 13.3% to 1.2%. This suggests that model-based scatter correction is accurate in the heterogeneous attenuating medium found in the chest, making possible quantitative, fully 3D PET in the body.
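
    The half-life estimate described here amounts to fitting a single exponential A(t) = A0*exp(-lambda*t) to the late time points. A minimal sketch with synthetic values (the frame times and activities below are assumed, not the study's data):

        # Fit a single exponential to late time points and convert to a half-life
        import numpy as np

        half_life = 68.0                          # minutes, value used only to synthesize the data
        lam = np.log(2) / half_life
        t = np.array([600.0, 700.0, 800.0])       # assumed mid-times of the last three frames (min)
        A = 1000.0 * np.exp(-lam * t)             # synthetic scatter-corrected activities

        slope, intercept = np.polyfit(t, np.log(A), 1)                 # linear fit in log space
        print(f"estimated half-life: {np.log(2) / -slope:.2f} min")    # ~68.0 for noiseless data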

  4. Reduction of large set data transmission using algorithmically corrected model-based techniques for bandwidth efficiency

    NASA Astrophysics Data System (ADS)

    Khair, Joseph Daniel

    Communication requirements and demands on deployed systems are increasing daily. This increase is due to the desire for more capability, but also to the changing landscape of threats to remote vehicles. As such, it is important that we continue to find new and innovative ways to transmit data to and from these remote systems, consistent with this changing landscape. Specifically, this research shows that data can be transmitted to a remote system effectively and efficiently with a model-based approach using real-time updates, called the Algorithmically Corrected Model-based Technique (ACMBT), resulting in substantial savings in communications overhead. To demonstrate this model-based data transmission technique, a hardware-based test fixture was designed and built. Execution and analysis software was created to perform a series of characterizations demonstrating the effectiveness of the new transmission method. The new approach was compared to a traditional transmission approach in the same environment, and the results were analyzed and presented. A Figure of Merit (FOM) was devised and presented to allow standardized comparison of traditional and proposed data transmission methodologies alongside bandwidth utilization metrics. The results of this research have successfully shown the model-based technique to be feasible. Additionally, this research has opened the trade space for future discussion and implementation of this technique.

  5. Autoregressive model based algorithm for correcting motion and serially correlated errors in fNIRS

    PubMed Central

    Barker, Jeffrey W.; Aarabi, Ardalan; Huppert, Theodore J.

    2013-01-01

    Systemic physiology and motion-induced artifacts represent two major sources of confounding noise in functional near infrared spectroscopy (fNIRS) imaging that can reduce the performance of analyses and inflate false positive rates (i.e., type I errors) of detecting evoked hemodynamic responses. In this work, we demonstrated a general algorithm for solving the general linear model (GLM) for both deconvolution (finite impulse response) and canonical regression models based on designing optimal pre-whitening filters using autoregressive models and employing iteratively reweighted least squares. We evaluated the performance of the new method by performing receiver operating characteristic (ROC) analyses using synthetic data, in which serial correlations, motion artifacts, and evoked responses were controlled via simulations, as well as using experimental data from children (3–5 years old) as a source of baseline physiological noise and motion artifacts. The new method outperformed ordinary least squares (OLS) with no motion correction, wavelet-based motion correction, or spline-interpolation-based motion correction in the presence of physiological and motion-related noise. In the experimental data, false positive rates were as high as 37% when the estimated p-value was 0.05 for the OLS methods. The false positive rate was reduced to 5–9% with the proposed method. Overall, the method improves control of type I errors and increases performance when motion artifacts are present. PMID:24009999
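
    A simplified sketch of the pre-whitening-plus-reweighting idea the abstract describes, using only an AR(1) noise model and a Huber-type weight on synthetic data; the published AR-IRLS method fits higher-order AR models and full fNIRS designs, so this is a structural illustration rather than that implementation.

        # AR(1) pre-whitening + iteratively reweighted least squares for a toy GLM
        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        X = np.column_stack([np.ones(n), np.sin(np.linspace(0, 20, n))])   # toy design matrix
        beta_true = np.array([1.0, 0.5])
        e = np.zeros(n)
        for i in range(1, n):                      # AR(1) noise mimicking serial correlation
            e[i] = 0.8 * e[i - 1] + rng.normal(scale=0.2)
        y = X @ beta_true + e

        beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS start
        for _ in range(10):
            rho = np.corrcoef((y - X @ beta)[:-1], (y - X @ beta)[1:])[0, 1]   # AR(1) coefficient
            yw, Xw = y[1:] - rho * y[:-1], X[1:] - rho * X[:-1]                # pre-whitened data
            rw = yw - Xw @ beta
            s = 1.4826 * np.median(np.abs(rw)) + 1e-12                         # robust scale
            w = np.minimum(1.0, 1.345 * s / (np.abs(rw) + 1e-12))              # Huber-type weights
            beta = np.linalg.lstsq(Xw * np.sqrt(w)[:, None], yw * np.sqrt(w), rcond=None)[0]
        print(beta)                                # approaches [1.0, 0.5]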

  6. Automated model-based bias field correction of MR images of the brain.

    PubMed

    Van Leemput, K; Maes, F; Vandermeulen, D; Suetens, P

    1999-10-01

    We propose a model-based method for fully automated bias field correction of MR brain images. The MR signal is modeled as a realization of a random process with a parametric probability distribution that is corrupted by a smooth polynomial inhomogeneity or bias field. The method we propose applies an iterative expectation-maximization (EM) strategy that interleaves pixel classification with estimation of class distribution and bias field parameters, improving the likelihood of the model parameters at each iteration. The algorithm, which can handle multichannel data and slice-by-slice constant intensity offsets, is initialized with information from a digital brain atlas about the a priori expected location of tissue classes. This allows full automation of the method without need for user interaction, yielding more objective and reproducible results. We have validated the bias correction algorithm on simulated data and we illustrate its performance on various MR images with important field inhomogeneities. We also relate the proposed algorithm to other bias correction algorithms. PMID:10628948
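
    A one-dimensional toy of the EM strategy the abstract outlines: alternate (E) soft classification of intensities into Gaussian tissue classes with (M) re-estimation of class parameters and of a smooth polynomial bias field. Real MR data are 3-D and atlas-initialized; all values below are synthetic, and a global scale ambiguity between the bias constant and the class means is left unresolved.

        # Toy EM interleaving of tissue classification and polynomial bias estimation
        import numpy as np

        rng = np.random.default_rng(1)
        n = 2000
        x = np.linspace(-1, 1, n)
        labels = rng.integers(0, 2, size=n)                    # two tissue classes
        signal = np.array([100.0, 160.0])[labels] + rng.normal(scale=5.0, size=n)
        y = signal * (1.0 + 0.15 * x + 0.10 * x ** 2)          # multiplicative bias field

        log_y = np.log(y)                                      # bias becomes additive in log space
        mu = np.log(np.array([90.0, 170.0]))                   # initial class means
        sigma = np.array([0.1, 0.1])                           # initial class std devs
        P = np.vander(x, 3, increasing=True)                   # polynomial basis [1, x, x^2]
        c = np.zeros(3)                                        # bias coefficients

        for _ in range(30):
            resid = log_y - P @ c
            # E-step: class responsibilities under the current parameters
            ll = -0.5 * ((resid[:, None] - mu) / sigma) ** 2 - np.log(sigma)
            w = np.exp(ll - ll.max(axis=1, keepdims=True))
            w /= w.sum(axis=1, keepdims=True)
            # M-step: class means and variances
            for k in range(2):
                mu[k] = np.sum(w[:, k] * resid) / np.sum(w[:, k])
                sigma[k] = np.sqrt(np.sum(w[:, k] * (resid - mu[k]) ** 2) / np.sum(w[:, k])) + 1e-6
            # M-step: bias field = least-squares fit of what the classes cannot explain
            c, *_ = np.linalg.lstsq(P, log_y - w @ mu, rcond=None)

        corrected = y / np.exp(P @ c)                          # bias-corrected intensities
        print(np.exp(mu))                                      # recovered class means (up to scale)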

  7. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics.

    PubMed

    Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin

    2016-01-01

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberration varying with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by some low-order Lukosz modes. To optimize the dynamic correction, we can correct only the dominant Lukosz modes, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window at a scanning rate of 10 degrees per second. A 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10⁻⁵ in the optimized correction and 1.427 × 10⁻⁵ in the un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method. PMID:27598161
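
    The metric function named here, the low spatial frequency content of the image spectral density, can be computed directly from an image FFT. A small sketch follows; the cutoff radius is an assumed parameter, and the mode-by-mode correction loop of the WSLAO scheme is omitted.

        # Metric: fraction of image spectral density at low spatial frequencies
        import numpy as np

        def low_freq_metric(img, radius=0.1):
            """Share of |FFT|^2 inside a low-frequency disc (DC excluded); radius in cycles/sample."""
            F = np.fft.fftshift(np.fft.fft2(img))
            fy, fx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(img.shape[0])),
                                 np.fft.fftshift(np.fft.fftfreq(img.shape[1])), indexing="ij")
            r = np.hypot(fx, fy)
            mask = (r > 0) & (r < radius)
            power = np.abs(F) ** 2
            return power[mask].sum() / power.sum()

        img = np.random.default_rng(2).random((128, 128))     # stand-in for a 128 x 128 frame
        print(low_freq_metric(img))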

  8. Sandmeier model based topographic correction to lunar spectral profiler (SP) data from KAGUYA satellite.

    PubMed

    Chen, Sheng-Bo; Wang, Jing-Ran; Guo, Peng-Ju; Wang, Ming-Chang

    2014-09-01

    The Moon may be considered the frontier base for deep space exploration. Spectral analysis is one of the key techniques for determining lunar surface rock and mineral compositions, but lunar topographic relief is much more pronounced than that of the Earth, so topographic correction of lunar spectral data is necessary before they are used to retrieve compositions. In the present paper, a lunar Sandmeier model is proposed that accounts for the radiance effects of macro and ambient topographic relief, and a reflectance correction model is derived from it. Spectral Profiler (SP) data from the KAGUYA satellite over the Sinus Iridum quadrangle were taken as an example, and digital elevation data from the Lunar Orbiter Laser Altimeter were used to calculate the slope, aspect, incidence and emergence angles, and the terrain-viewing factor for the topographic correction. The lunar surface reflectance from the SP data was then corrected by the proposed model after the direct component of irradiance on a horizontal surface was derived. As a result, the high reflectance of slopes facing the sun is decreased and the low reflectance of slopes facing away from the sun is compensated. The histogram of corrected-reflectance pixel counts follows a Gaussian distribution, so the model is robust for correcting the lunar topographic effect and estimating lunar surface reflectance. PMID:25532366
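
    A sketch of the geometric core of a Sandmeier-type correction: the direct-beam term is rescaled by the ratio of the cosine of the solar zenith angle to the cosine of the local incidence angle computed from slope and aspect. The full model in the paper also treats diffuse sky and terrain-reflected irradiance; all angles and the reflectance value below are illustrative.

        # Geometric core of a Sandmeier-type (cosine) topographic correction
        import numpy as np

        def cos_incidence(slope, aspect, sun_zenith, sun_azimuth):
            """Cosine of the local solar incidence angle; all angles in radians."""
            return (np.cos(sun_zenith) * np.cos(slope) +
                    np.sin(sun_zenith) * np.sin(slope) * np.cos(sun_azimuth - aspect))

        def correct(reflectance, slope, aspect, sun_zenith, sun_azimuth):
            cos_i = cos_incidence(slope, aspect, sun_zenith, sun_azimuth)
            return reflectance * np.cos(sun_zenith) / np.clip(cos_i, 0.05, None)

        r = correct(0.12, slope=np.radians(20), aspect=np.radians(135),
                    sun_zenith=np.radians(30), sun_azimuth=np.radians(150))
        print(round(r, 4))    # sun-facing slope: corrected reflectance is lower than measured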

  9. Correcting encoder interpolation error on the Green Bank Telescope using an iterative model based identification algorithm

    NASA Astrophysics Data System (ADS)

    Franke, Timothy; Weadon, Tim; Ford, John; Garcia-Sanz, Mario

    2015-10-01

    Various forms of measurement error limit telescope tracking performance in practice. A new method for identifying the correction coefficients for encoder interpolation error is developed. The algorithm corrects the encoder measurement by identifying a harmonic model of the system and using that model to compute the necessary correction parameters. The approach improves upon others by explicitly modeling the unknown dynamics of the structure and controller and by not requiring a separate system identification to be performed. Experience gained from pinpointing the source of encoder error on the Green Bank Radio Telescope (GBT) is presented, and several tell-tale indicators of encoder error are discussed. Experimental data from the telescope, tested with two different encoders, are presented. Demonstration of the identification methodology on the GBT, as well as details of its implementation, is discussed. The root mean square tracking error was reduced from 0.68 to 0.21 arcsec by changing encoders and further reduced to 0.10 arcsec with the calibration algorithm. In particular, the ubiquity of this error source is shown, as is how careful correction makes it possible to go beyond the advertised accuracy of an encoder.
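
    One way to picture the harmonic model mentioned here: the interpolation error repeats with the encoder line period, so it can be expanded in sinusoids of the fractional line position and its coefficients fitted by least squares. The sketch below uses a synthetic reference angle, whereas the paper identifies the coefficients through the closed-loop dynamics without a separate reference.

        # Harmonic model of encoder interpolation error fitted by least squares
        import numpy as np

        rng = np.random.default_rng(3)
        lines_per_rev = 1000
        theta = np.sort(rng.uniform(0, 0.02, 4000))               # reference angle, radians
        frac = (theta * lines_per_rev / (2 * np.pi)) % 1.0         # position within one encoder line
        err = 2e-6 * np.sin(2 * np.pi * frac) + 1e-6 * np.cos(4 * np.pi * frac)
        measured = theta + err                                     # reading with interpolation error

        H = np.column_stack([f(2 * np.pi * k * frac) for k in (1, 2) for f in (np.sin, np.cos)])
        coef, *_ = np.linalg.lstsq(H, measured - theta, rcond=None)
        corrected = measured - H @ coef
        print(np.std(measured - theta), np.std(corrected - theta))   # rms error before / after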

  10. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror.

    PubMed

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-01-01

    In this paper, the actuation behavior of a two-axis electrothermal MEMS (Microelectromechanical Systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that an actuation characteristics difference of as much as 4.0% exists among the four actuators due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror solely depends on the model prediction and does not need the real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system. PMID:26690432
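
    A minimal sketch of model-based open-loop actuation: with a calibrated map from actuator inputs to mirror pose that captures per-actuator differences, commands are obtained by inverting the model rather than from position feedback. The matrices below are hypothetical and linear, not the paper's four-input, four-output thermal-mechanical model.

        # Open-loop actuation by inverting a calibrated input-to-pose model
        import numpy as np

        S = np.diag([1.00, 1.04, 0.98, 1.02])          # per-actuator gains (um/V), ~4% mismatch
        G = np.array([[0.25, 0.25, 0.25, 0.25],        # actuator displacements -> piston
                      [0.50, -0.50, 0.00, 0.00],       # -> tip
                      [0.00, 0.00, 0.50, -0.50],       # -> tilt
                      [0.50, 0.50, -0.50, -0.50]])     # -> twist

        def voltages_for(pose):
            """Invert the combined model to get the open-loop voltage command."""
            return np.linalg.solve(G @ S, pose)

        pose = np.array([2.0, 0.3, -0.2, 0.0])         # desired [piston, tip, tilt, twist]
        V = voltages_for(pose)
        print(V, G @ S @ V)                            # achieved pose matches the target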

  11. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror

    PubMed Central

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-01-01

    In this paper, the actuation behavior of a two-axis electrothermal MEMS (Microelectromechanical Systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that an actuation characteristics difference of as much as 4.0% exists among the four actuators due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror solely depends on the model prediction and does not need the real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system. PMID:26690432

  12. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking.

    PubMed

    Lee, Min Su; Ju, Hojin; Song, Jin Woo; Park, Chan Gook

    2015-01-01

    In this paper, we present a method for finding the enhanced heading and position of pedestrians by fusing the Zero velocity UPdaTe (ZUPT)-based pedestrian dead reckoning (PDR) and the kinematic constraints of the lower human body. ZUPT is a well known algorithm for PDR, and provides a sufficiently accurate position solution for short term periods, but it cannot guarantee a stable and reliable heading because it suffers from magnetic disturbance in determining heading angles, which degrades the overall position accuracy as time passes. The basic idea of the proposed algorithm is integrating the left and right foot positions obtained by ZUPTs with the heading and position information from an IMU mounted on the waist. To integrate this information, a kinematic model of the lower human body, which is calculated by using orientation sensors mounted on both thighs and calves, is adopted. We note that the position of the left and right feet cannot be apart because of the kinematic constraints of the body, so the kinematic model generates new measurements for the waist position. The Extended Kalman Filter (EKF) on the waist data that estimates and corrects error states uses these measurements and magnetic heading measurements, which enhances the heading accuracy. The updated position information is fed into the foot mounted sensors, and reupdate processes are performed to correct the position error of each foot. The proposed update-reupdate technique consequently ensures improved observability of error states and position accuracy. Moreover, the proposed method provides all the information about the lower human body, so that it can be applied more effectively to motion tracking. The effectiveness of the proposed algorithm is verified via experimental results, which show that a 1.25% Return Position Error (RPE) with respect to walking distance is achieved. PMID:26561814
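
    A small sketch of the kinematic-constraint idea: thigh and calf orientations plus segment lengths give the waist position implied by each foot, and the average of the two implied positions can serve as a pseudo-measurement for the waist filter. Segment lengths and angles below are hypothetical, and the model is reduced to a 2-D sagittal plane; the paper's EKF and magnetic-heading fusion are not reproduced.

        # Waist position implied by each foot via leg-segment forward kinematics (2-D)
        import numpy as np

        def waist_from_foot(foot_xz, calf_angle, thigh_angle, l_calf=0.45, l_thigh=0.50):
            """Angles from vertical in radians; returns the implied waist position (x, z)."""
            calf = l_calf * np.array([np.sin(calf_angle), np.cos(calf_angle)])
            thigh = l_thigh * np.array([np.sin(thigh_angle), np.cos(thigh_angle)])
            return np.asarray(foot_xz) + calf + thigh

        left = waist_from_foot([0.30, 0.0], np.radians(10), np.radians(-15))
        right = waist_from_foot([0.05, 0.0], np.radians(-5), np.radians(20))
        waist_pseudo_measurement = 0.5 * (left + right)   # fed to the waist filter as a measurement
        print(waist_pseudo_measurement)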

  13. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking

    PubMed Central

    Lee, Min Su; Ju, Hojin; Song, Jin Woo; Park, Chan Gook

    2015-01-01

    In this paper, we present a method for finding the enhanced heading and position of pedestrians by fusing the Zero velocity UPdaTe (ZUPT)-based pedestrian dead reckoning (PDR) and the kinematic constraints of the lower human body. ZUPT is a well known algorithm for PDR, and provides a sufficiently accurate position solution for short term periods, but it cannot guarantee a stable and reliable heading because it suffers from magnetic disturbance in determining heading angles, which degrades the overall position accuracy as time passes. The basic idea of the proposed algorithm is integrating the left and right foot positions obtained by ZUPTs with the heading and position information from an IMU mounted on the waist. To integrate this information, a kinematic model of the lower human body, which is calculated by using orientation sensors mounted on both thighs and calves, is adopted. We note that the position of the left and right feet cannot be apart because of the kinematic constraints of the body, so the kinematic model generates new measurements for the waist position. The Extended Kalman Filter (EKF) on the waist data that estimates and corrects error states uses these measurements and magnetic heading measurements, which enhances the heading accuracy. The updated position information is fed into the foot mounted sensors, and reupdate processes are performed to correct the position error of each foot. The proposed update-reupdate technique consequently ensures improved observability of error states and position accuracy. Moreover, the proposed method provides all the information about the lower human body, so that it can be applied more effectively to motion tracking. The effectiveness of the proposed algorithm is verified via experimental results, which show that a 1.25% Return Position Error (RPE) with respect to walking distance is achieved. PMID:26561814

  14. Dixon sequence with superimposed model-based bone compartment provides highly accurate PET/MR attenuation correction of the brain

    PubMed Central

    Koesters, Thomas; Friedman, Kent P.; Fenchel, Matthias; Zhan, Yiqiang; Hermosillo, Gerardo; Babb, James; Jelescu, Ileana O.; Faul, David; Boada, Fernando E.; Shepherd, Timothy M.

    2016-01-01

    Simultaneous PET/MR of the brain is a promising new technology for characterizing patients with suspected cognitive impairment or epilepsy. Unlike CT, though, MR signal intensities do not provide a direct correlate to PET photon attenuation correction (AC), and inaccurate radiotracer standard uptake value (SUV) estimation could limit future PET/MR clinical applications. We tested a novel AC method that supplements standard Dixon-based tissue segmentation with a superimposed model-based bone compartment. Methods: We directly compared SUV estimation for MR-based AC methods to reference CT AC in 16 patients undergoing same-day, single 18FDG dose PET/CT and PET/MR for suspected neurodegeneration. Three Dixon-based MR AC methods were compared to CT – standard Dixon 4-compartment segmentation alone, Dixon with a superimposed model-based bone compartment, and Dixon with a superimposed bone compartment and linear attenuation correction optimized specifically for brain tissue. The brain was segmented using a 3D T1-weighted volumetric MR sequence and SUV estimations compared to CT AC for whole-image, whole-brain and 91 FreeSurfer-based regions-of-interest. Results: Modifying the linear AC value specifically for brain and superimposing a model-based bone compartment reduced whole-brain SUV estimation bias of Dixon-based PET/MR AC by 95% compared to reference CT AC (P < 0.05) – this resulted in a residual −0.3% whole-brain mean SUV bias. Further, brain regional analysis demonstrated only 3 frontal lobe regions with SUV estimation bias of 5% or greater (P < 0.05). These biases appeared to correlate with high individual variability in the frontal bone thickness and pneumatization. Conclusion: Bone compartment and linear AC modifications result in a highly accurate MR AC method in subjects with suspected neurodegeneration. This prototype MR AC solution appears equivalent to other recently proposed solutions, and does not require additional MR sequences and scan time. These

  15. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities

    ERIC Educational Resources Information Center

    Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining high quality social climates in prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms, and peer correction of behavior contrary to TC norms, will lead to…

  16. Efficient model-based dummy-fill OPC correction flow for deep sub-micron technology nodes

    NASA Astrophysics Data System (ADS)

    Hamouda, Ayman; Salama, Mohamed

    2014-09-01

    Dummy fill insertion is a necessary step in modern semiconductor technologies to achieve homogeneous pattern density per layer. This benefits several fabrication process steps, including but not limited to chemical mechanical polishing (CMP), etching, and packaging. As the technology keeps shrinking, fill shapes become more challenging to pattern and require aggressive model-based optical proximity correction (MBOPC) to achieve better design fidelity. MBOPC on fill strains mask data prep runtime and final mask shot count, which affect the total turnaround time (TAT) and mask cost. In our work, we introduce a novel flow that achieves a robust and computationally efficient fill-handling methodology during mask data prep, keeping both the runtime and the shot count within acceptable levels. In this flow, fill shapes undergo a smart MBOPC step which improves the final wafer printing quality and topography uniformity without degrading the final shot count or the OPC cycle runtime. This flow is tested on both front end of line (FEOL) and back end of line (BEOL) layers, and results in improved final printing of the fill patterns while consuming less than 2% of the full MBOPC flow runtime.

  17. [Therapeutic correction of mild cognitive impairment in patients with chronic cerebral ischemia].

    PubMed

    Odinak, M M; Kashin, A V; Ememlin, A Iu; Lupanov, I A

    2013-01-01

    Neurodegenerative and cerebrovascular diseases are among the most significant causes of cognitive impairment in the elderly. Vascular cognitive impairment is not limited to dementia; it represents a heterogeneous group in both pathogenic and clinical terms. The article discusses new principles for the classification of vascular cognitive impairment and reviews the options for its therapeutic correction. The article includes the results of a 12-week open, randomized, controlled study of the efficacy and safety of Vitrum Memory in patients with mild vascular cognitive impairment. The therapy significantly improved neurodynamic and regulatory functions in patients with stage I-II dyscirculatory encephalopathy. PMID:23739499

  18. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities

    PubMed Central

    Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining a high-quality social climate on prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms and peer correction of behavior contrary to TC norms will lead to increased resident prosocial behavior. Laboratory experiments have demonstrated that such peer monitoring can lead to cooperation, but there has been no quantitative test of this hypothesis in an actual TC. In this article we test this assumption by using the affirmations that residents of three different TCs send as a measure of prosocial behavior following the reception of peer affirmations and corrections. At all three facilities residents send more affirmations following the reception of both affirmations and corrections, with this relationship being stronger and longer lasting after receiving affirmations. No other variable consistently predicts the number of affirmations that residents send to peers. These findings imply that mutual monitoring among TC residents can lead to increased levels of prosocial behavior within the facility, and that prosocial behavior in response to peer affirmations plays a key role. PMID:23935258

  19. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities.

    PubMed

    Warren, Keith L; Doogan, Nathan; De Leon, George; Phillips, Gary S; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining a high-quality social climate on prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms and peer correction of behavior contrary to TC norms will lead to increased resident prosocial behavior. Laboratory experiments have demonstrated that such peer monitoring can lead to cooperation, but there has been no quantitative test of this hypothesis in an actual TC. In this article we test this assumption by using the affirmations that residents of three different TCs send as a measure of prosocial behavior following the reception of peer affirmations and corrections. At all three facilities residents send more affirmations following the reception of both affirmations and corrections, with this relationship being stronger and longer lasting after receiving affirmations. No other variable consistently predicts the number of affirmations that residents send to peers. These findings imply that mutual monitoring among TC residents can lead to increased levels of prosocial behavior within the facility, and that prosocial behavior in response to peer affirmations plays a key role. PMID:23935258

  20. A three-dimensional model-based partial volume correction strategy for gated cardiac mouse PET imaging

    NASA Astrophysics Data System (ADS)

    Dumouchel, Tyler; Thorn, Stephanie; Kordos, Myra; DaSilva, Jean; Beanlands, Rob S. B.; deKemp, Robert A.

    2012-07-01

    Quantification in cardiac mouse positron emission tomography (PET) imaging is limited by the imaging spatial resolution. Spillover of left ventricle (LV) myocardial activity into adjacent organs results in partial volume (PV) losses leading to underestimation of myocardial activity. A PV correction method was developed to restore accuracy of the activity distribution for FDG mouse imaging. The PV correction model was based on convolving an LV image estimate with a 3D point spread function. The LV model was described regionally by a five-parameter profile including myocardial, background and blood activities which were separated into three compartments by the endocardial radius and myocardium wall thickness. The PV correction was tested with digital simulations and a physical 3D mouse LV phantom. In vivo cardiac FDG mouse PET imaging was also performed. Following imaging, the mice were sacrificed and the tracer biodistribution in the LV and liver tissue was measured using a gamma-counter. The PV correction algorithm improved recovery from 50% to within 5% of the truth for the simulated and measured phantom data and image uniformity by 5-13%. The PV correction algorithm improved the mean myocardial LV recovery from 0.56 (0.54) to 1.13 (1.10) without (with) scatter and attenuation corrections. The mean image uniformity was improved from 26% (26%) to 17% (16%) without (with) scatter and attenuation corrections applied. Scatter and attenuation corrections were not observed to significantly impact PV-corrected myocardial recovery or image uniformity. Image-based PV correction algorithm can increase the accuracy of PET image activity and improve the uniformity of the activity distribution in normal mice. The algorithm may be applied using different tracers, in transgenic models that affect myocardial uptake, or in different species provided there is sufficient image quality and similar contrast between the myocardium and surrounding structures.
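
    The forward model at the heart of this correction is a parametric activity profile (blood pool, myocardial wall, background, separated by endocardial radius and wall thickness) blurred by the scanner point spread function. Below is a 1-D sketch in which the blurred edges are written analytically and the parameters are recovered by least-squares fitting; all numbers, including the PSF width, are assumed for illustration.

        # Fit a blurred three-compartment radial profile to recover the true myocardial activity
        import numpy as np
        from scipy.special import erf
        from scipy.optimize import curve_fit

        r = np.linspace(0, 20, 400)                      # radial position (mm)
        sigma = 1.8 / 2.355                              # PSF with assumed 1.8 mm FWHM

        def edge(x):                                     # Gaussian-blurred unit step
            return 0.5 * (1 + erf(x / (sigma * np.sqrt(2))))

        def profile(r, blood, myo, bkg, r_endo, wall):
            return (blood + (myo - blood) * edge(r - r_endo)
                          + (bkg - myo) * edge(r - (r_endo + wall)))

        truth = (0.2, 1.0, 0.1, 6.0, 1.2)                # blood, myo, bkg, r_endo (mm), wall (mm)
        measured = profile(r, *truth) + np.random.default_rng(4).normal(0, 0.01, r.size)

        popt, _ = curve_fit(profile, r, measured, p0=[0.3, 0.8, 0.1, 5.0, 1.5])
        print(np.round(popt, 3))                         # recovers the unblurred parameters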

  1. Model-Based Assessment of Plasma Citrate Flux Into the Liver: Implications for NaCT as a Therapeutic Target.

    PubMed

    Li, Z; Erion, D M; Maurer, T S

    2016-03-01

    Cytoplasmic citrate serves as an important regulator of gluconeogenesis and carbon source for de novo lipogenesis in the liver. For this reason, the sodium-coupled citrate transporter (NaCT), a plasma membrane transporter that governs hepatic influx of plasma citrate in human, is being explored as a potential therapeutic target for metabolic disorders. As cytoplasmic citrate also originates from intracellular mitochondria, the relative contribution of these two pathways represents critical information necessary to underwrite confidence in this target. In this work, hepatic influx of plasma citrate was quantified via pharmacokinetic modeling of published clinical data. The influx was then compared to independent literature estimates of intracellular citrate flux in human liver. The results indicate that, under normal conditions, <10% of hepatic citrate originates from plasma. Similar estimates were determined experimentally in mice and rats. This suggests that NaCT inhibition will have a limited impact on hepatic citrate concentrations across species. PMID:27069776

  2. Evaluation of model-based deformation correction in image-guided liver surgery via tracked intraoperative ultrasound.

    PubMed

    Clements, Logan W; Collins, Jarrod A; Weis, Jared A; Simpson, Amber L; Adams, Lauryn B; Jarnagin, William R; Miga, Michael I

    2016-01-01

    Soft-tissue deformation represents a significant error source in current surgical navigation systems used for open hepatic procedures. While numerous algorithms have been proposed to rectify the tissue deformation that is encountered during open liver surgery, clinical validation of the proposed methods has been limited to surface-based metrics, and subsurface validation has largely been performed via phantom experiments. The proposed method involves the analysis of two deformation-correction algorithms for open hepatic image-guided surgery systems via subsurface targets digitized with tracked intraoperative ultrasound (iUS). Intraoperative surface digitizations were acquired via a laser range scanner and an optically tracked stylus for the purposes of computing the physical-to-image space registration and for use in retrospective deformation-correction algorithms. Upon completion of surface digitization, the organ was interrogated with a tracked iUS transducer where the iUS images and corresponding tracked locations were recorded. Mean closest-point distances between the feature contours delineated in the iUS images and corresponding three-dimensional anatomical model generated from preoperative tomograms were computed to quantify the extent to which the deformation-correction algorithms improved registration accuracy. The results for six patients, including eight anatomical targets, indicate that deformation correction can facilitate reduction in target error of [Formula: see text]. PMID:27081664
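
    The validation metrics used in this and the compression-correction records below are point-set distances: the mean closest-point distance between two contours or surfaces, and the modified Hausdorff distance, which is the larger of the two directed means. A small sketch with synthetic stand-ins for contour and surface points:

        # Mean closest-point distance and modified Hausdorff distance between point sets
        import numpy as np
        from scipy.spatial import cKDTree

        def mean_closest_point(A, B):
            """Mean distance from each point of A to its nearest neighbour in B."""
            return cKDTree(B).query(A)[0].mean()

        def modified_hausdorff(A, B):
            return max(mean_closest_point(A, B), mean_closest_point(B, A))

        rng = np.random.default_rng(5)
        contour = rng.random((200, 3)) * 50.0                     # stand-in for iUS feature points (mm)
        surface = contour + rng.normal(0, 2.0, contour.shape)     # stand-in for model surface points
        print(round(modified_hausdorff(contour, surface), 2))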

  3. Mapping hydrological environments in central Amazonia: ground validation and surface model based on SRTM DEM data corrected for deforestation

    NASA Astrophysics Data System (ADS)

    Moulatlet, G. M.; Rennó, C. D.; Costa, F. R. C.; Emilio, T.; Schietti, J.

    2015-03-01

    One of the most important freely available digital elevation models (DEMs) for Amazonia is the one obtained by the Shuttle Radar Topography Mission (SRTM). However, since SRTM tends to represent the vegetation surface instead of the ground surface, the broad use of SRTM DEM as a framework for terrain description in Amazonia is hampered by the presence of deforested areas. We present here two data sets: (1) a deforestation-corrected SRTM DEM for the interfluve between the Purus and Madeira rivers, in central Amazonia, which underwent careful identification of different environments and had deforestation features corrected by a new method of increasing pixel values of the DEM (Rennó, 2009); and (2) a set of 18 hydrological-topographic descriptors based on the corrected SRTM DEM. Deforestation features are related to the opening of an 800 km road in the central part of the interfluve and occupancy of its vicinity. We used topographic profiles from the pristine forest to the deforested feature to evaluate the recovery of the original canopy coverage by minimizing canopy height variation (corrections ranged from 1 to 38 m). The hydrological-topographic description was obtained by the Height Above the Nearest Drainage (HAND) algorithm, which normalizes the terrain elevation (above sea level) by the elevation of the nearest hydrologically connected drainage. The validation of the HAND data set was done by in situ hydrological description of 110 km of walking trails also available in this data set. The new SRTM DEM expands the applicability of SRTM data for landscape modelling; the data sets of hydrological features based on topographic modelling are undoubtedly appropriate for ecological modelling and an important contribution to environmental mapping of Amazonia. The deforestation-corrected SRTM DEM is available at http://ppbio.inpa.gov.br/knb/metacat/naman.318.3/ppbio; the
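
    The HAND normalization described here can be sketched on a toy grid: each cell's flow direction is followed downstream until a drainage cell is reached, and that cell's elevation is subtracted from the cell's own. The grid, flow directions, and drainage mask below are hypothetical, not derived from the SRTM data set.

        # Height Above the Nearest Drainage on a toy 3 x 3 grid
        import numpy as np

        elev = np.array([[9., 8., 7.],
                         [8., 5., 4.],
                         [7., 4., 2.]])
        flow = {(0, 0): (1, 1), (0, 1): (1, 1), (0, 2): (1, 2),    # steepest-descent neighbour
                (1, 0): (2, 1), (1, 1): (2, 2), (1, 2): (2, 2),
                (2, 0): (2, 1), (2, 1): (2, 2), (2, 2): None}      # outlet
        drainage = {(2, 1), (2, 2)}                                # stream (drainage) cells

        def hand(cell):
            c = cell
            while c not in drainage:
                c = flow[c]                                        # walk downstream
            return elev[cell] - elev[c]

        print(np.array([[hand((i, j)) for j in range(3)] for i in range(3)]))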

  4. Model-based correction of tissue compression for tracked ultrasound in soft tissue image-guided surgery.

    PubMed

    Pheiffer, Thomas S; Thompson, Reid C; Rucker, Daniel C; Simpson, Amber L; Miga, Michael I

    2014-04-01

    Acquisition of ultrasound data negatively affects image registration accuracy during image-guided therapy because of tissue compression by the probe. We present a novel compression correction method that models sub-surface tissue displacement resulting from application of a tracked probe to the tissue surface. Patient landmarks are first used to register the probe pose to pre-operative imaging. The ultrasound probe geometry is used to provide boundary conditions to a biomechanical model of the tissue. The deformation field solution of the model is inverted to non-rigidly transform the ultrasound images to an estimation of the tissue geometry before compression. Experimental results with gel phantoms indicated that the proposed method reduced the tumor margin modified Hausdorff distance (MHD) from 5.0 ± 1.6 to 1.9 ± 0.6 mm, and reduced tumor centroid alignment error from 7.6 ± 2.6 to 2.0 ± 0.9 mm. The method was applied to a clinical case and reduced the tumor margin MHD error from 5.4 ± 0.1 to 2.6 ± 0.1 mm and the centroid alignment error from 7.2 ± 0.2 to 3.5 ± 0.4 mm. PMID:24412172

  5. MODEL-BASED CORRECTION OF TISSUE COMPRESSION FOR TRACKED ULTRASOUND IN SOFT TISSUE IMAGE-GUIDED SURGERY

    PubMed Central

    Pheiffer, Thomas S.; Thompson, Reid C.; Rucker, Daniel C.; Simpson, Amber L.; Miga, Michael I.

    2014-01-01

    Acquisition of ultrasound data negatively affects image registration accuracy during image-guided therapy because of tissue compression by the probe. We present a novel compression correction method that models sub-surface tissue displacement resulting from application of a tracked probe to the tissue surface. Patient landmarks are first used to register the probe pose to pre-operative imaging. The ultrasound probe geometry is used to provide boundary conditions to a biomechanical model of the tissue. The deformation field solution of the model is inverted to non-rigidly transform the ultrasound images to an estimation of the tissue geometry before compression. Experimental results with gel phantoms indicated that the proposed method reduced the tumor margin modified Hausdorff distance (MHD) from 5.0 ± 1.6 to 1.9 ± 0.6 mm, and reduced tumor centroid alignment error from 7.6 ± 2.6 to 2.0 ± 0.9 mm. The method was applied to a clinical case and reduced the tumor margin MHD error from 5.4 ± 0.1 to 2.6 ± 0.1 mm and the centroid alignment error from 7.2 ± 0.2 to 3.5 ± 0.4 mm. PMID:24412172

  6. A Correction for the IRI Topside Electron Density Model Based on Alouette/ISIS Topside Sounder Data

    NASA Technical Reports Server (NTRS)

    Bilitza, D.

    2004-01-01

    The topside segment of the International Reference Ionosphere (IRI) electron density model (and also of the Bent model) is based on the limited amount of topside data available at the time (40,000 Alouette 1 profiles). Having been established from such a small database, it is not surprising that the models have well-known shortcomings, for example, at high solar activities. Meanwhile, a large database of close to 200,000 topside profiles from Alouette 1, 2 and ISIS 1, 2 has become available online. A program of automated scaling and inversion of a large volume of digitized ionograms adds continuously to this data pool. We have used the currently available ISIS/Alouette topside profiles to evaluate the IRI topside model and to investigate ways of improving the model. The IRI model performs generally well at middle latitudes and shows discrepancies at low and high latitudes, and these discrepancies are largest during high solar activity. In the upper topside, IRI consistently overestimates the measurements. Based on averages of the data-model ratios, we have established correction factors for the IRI model. These factors vary with altitude, modified dip latitude, and local time.

  7. Therapeutic NOTCH3 cysteine correction in CADASIL using exon skipping: in vitro proof of concept.

    PubMed

    Rutten, Julie W; Dauwerse, Hans G; Peters, Dorien J M; Goldfarb, Andrew; Venselaar, Hanka; Haffner, Christof; van Ommen, Gert-Jan B; Aartsma-Rus, Annemieke M; Lesnik Oberstein, Saskia A J

    2016-04-01

    Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy, or CADASIL, is a hereditary cerebral small vessel disease caused by characteristic cysteine-altering missense mutations in the NOTCH3 gene. NOTCH3 mutations in CADASIL result in an uneven number of cysteine residues in one of the 34 epidermal growth factor-like repeat (EGFr) domains of the NOTCH3 protein. The consequence of an unpaired cysteine residue in an EGFr domain is an increased multimerization tendency of mutant NOTCH3, leading to toxic accumulation of the protein in the (cerebro)vasculature, and ultimately reduced cerebral blood flow, recurrent stroke and vascular dementia. There is no therapy to delay or alleviate symptoms in CADASIL. We hypothesized that exclusion of the mutant EGFr domain from NOTCH3 would abolish the detrimental effect of the unpaired cysteine and thus prevent toxic NOTCH3 accumulation and the negative cascade of events leading to CADASIL. To accomplish this NOTCH3 cysteine correction by EGFr domain exclusion, we used pre-mRNA antisense-mediated skipping of specific NOTCH3 exons. Selection of these exons was achieved using in silico studies and based on the criterion that skipping of a particular exon or exon pair would modulate the protein in such a way that the mutant EGFr domain is eliminated, without otherwise corrupting NOTCH3 structure and function. Remarkably, we found that this strategy closely mimics evolutionary events, where the elimination and fusion of NOTCH EGFr domains led to the generation of four functional NOTCH homologues. We modelled a selection of exon skip strategies using cDNA constructs and show that the skip proteins retain normal protein processing, can bind ligand and be activated by ligand. We then determined the technical feasibility of targeted NOTCH3 exon skipping, by designing antisense oligonucleotides targeting exons 2-3, 4-5 and 6, which together harbour the majority of distinct CADASIL-causing mutations. Transfection of

  8. Optimal Model-Based Fault Estimation and Correction for Particle Accelerators and Industrial Plants Using Combined Support Vector Machines and First Principles Models

    SciTech Connect

    Sayyar-Rodsari, Bijan; Schweiger, Carl; /SLAC /Pavilion Technologies, Inc., Austin, TX

    2010-08-25

    Timely estimation of deviations from optimal performance in complex systems and the ability to identify corrective measures in response to the estimated parameter deviations has been the subject of extensive research over the past four decades. The implications in terms of lost revenue from costly industrial processes, operation of large-scale public works projects and the volume of the published literature on this topic clearly indicate the significance of the problem. Applications range from manufacturing industries (integrated circuits, automotive, etc.), to large-scale chemical plants, pharmaceutical production, power distribution grids, and avionics. In this project we investigated a new framework for building parsimonious models that are suited for diagnosis and fault estimation of complex technical systems. We used Support Vector Machines (SVMs) to model potentially time-varying parameters of a First-Principles (FP) description of the process. The combined SVM & FP model was built (i.e. model parameters were trained) using constrained optimization techniques. We used the trained models to estimate faults affecting simulated beam lifetime. In the case where a large number of process inputs are required for model-based fault estimation, the proposed framework performs an optimal nonlinear principal component analysis of the large-scale input space, and creates a lower dimension feature space in which fault estimation results can be effectively presented to the operation personnel. To fulfill the main technical objectives of the Phase I research, our Phase I efforts have focused on: (1) SVM Training in a Combined Model Structure - We developed the software for the constrained training of the SVMs in a combined model structure, and successfully modeled the parameters of a first-principles model for beam lifetime with support vectors. (2) Higher-order Fidelity of the Combined Model - We used constrained training to ensure that the output of the SVM (i.e. the
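
    A compact sketch of the combined-model idea: a first-principles relation carries the known physics, and a support vector regressor (scikit-learn's SVR here, standing in for the constrained SVM training described) absorbs the residual, i.e. the part of the behavior the physics model misses. The process relation and data below are synthetic.

        # First-principles model plus an SVM regressor trained on its residual
        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(6)
        X = rng.uniform(0, 10, (300, 2))                          # process inputs

        def first_principles(X, k=1.0):
            return k * X[:, 0] * np.exp(-0.2 * X[:, 1])           # hypothetical physics relation

        # "Plant" data generated with a slowly varying parameter the physics model misses
        y = first_principles(X, k=1.0 + 0.1 * np.sin(X[:, 0])) + rng.normal(0, 0.02, 300)

        resid = y - first_principles(X)                           # what the physics cannot explain
        svm = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, resid)
        y_hat = first_principles(X) + svm.predict(X)              # combined SVM + FP prediction
        print(round(float(np.sqrt(np.mean((y - y_hat) ** 2))), 4))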

  9. Whole-Body PET/MR Imaging: Quantitative Evaluation of a Novel Model-Based MR Attenuation Correction Method Including Bone

    PubMed Central

    Paulus, Daniel H.; Quick, Harald H.; Geppert, Christian; Fenchel, Matthias; Zhan, Yiqiang; Hermosillo, Gerardo; Faul, David; Boada, Fernando; Friedman, Kent P.; Koesters, Thomas

    2016-01-01

    In routine whole-body PET/MR hybrid imaging, attenuation correction (AC) is usually performed by segmentation methods based on a Dixon MR sequence providing up to 4 different tissue classes. Because of the lack of bone information with the Dixon-based MR sequence, bone is currently considered as soft tissue. Thus, the aim of this study was to evaluate a novel model-based AC method that considers bone in whole-body PET/MR imaging. Methods: The new method (“Model”) is based on a regular 4-compartment segmentation from a Dixon sequence (“Dixon”). Bone information is added using a model-based bone segmentation algorithm, which includes a set of prealigned MR image and bone mask pairs for each major body bone individually. Model was quantitatively evaluated on 20 patients who underwent whole-body PET/MR imaging. As a standard of reference, CT-based μ-maps were generated for each patient individually by nonrigid registration to the MR images based on PET/CT data. This step allowed for a quantitative comparison of all μ-maps based on a single PET emission raw dataset of the PET/MR system. Volumes of interest were drawn on normal tissue, soft-tissue lesions, and bone lesions; standardized uptake values were quantitatively compared. Results: In soft-tissue regions with background uptake, the average bias of SUVs in background volumes of interest was 2.4% ± 2.5% and 2.7% ± 2.7% for Dixon and Model, respectively, compared with CT-based AC. For bony tissue, the −25.5% ± 7.9% underestimation observed with Dixon was reduced to −4.9% ± 6.7% with Model. In bone lesions, the average underestimation was −7.4% ± 5.3% and −2.9% ± 5.8% for Dixon and Model, respectively. For soft-tissue lesions, the biases were 5.1% ± 5.1% for Dixon and 5.2% ± 5.2% for Model. Conclusion: The novel MR-based AC method for whole-body PET/MR imaging, combining Dixon-based soft-tissue segmentation and model-based bone estimation, improves PET quantification in whole-body hybrid PET

  10. Correction.

    PubMed

    2015-11-01

    In the article by Heuslein et al, which published online ahead of print on September 3, 2015 (DOI: 10.1161/ATVBAHA.115.305775), a correction was needed. Brett R. Blackman was added as the penultimate author of the article. The article has been corrected for publication in the November 2015 issue. PMID:26490278

  11. Model-based correction for scatter and tailing effects in simultaneous 99mTc and 123I imaging for a CdZnTe cardiac SPECT camera

    NASA Astrophysics Data System (ADS)

    Holstensson, M.; Erlandsson, K.; Poludniowski, G.; Ben-Haim, S.; Hutton, B. F.

    2015-04-01

    An advantage of semiconductor-based dedicated cardiac single photon emission computed tomography (SPECT) cameras when compared to conventional Anger cameras is superior energy resolution. This provides the potential for improved separation of the photopeaks in dual radionuclide imaging, such as combined use of 99mTc and 123I . There is, however, the added complexity of tailing effects in the detectors that must be accounted for. In this paper we present a model-based correction algorithm which extracts the useful primary counts of 99mTc and 123I from projection data. Equations describing the in-patient scatter and tailing effects in the detectors are iteratively solved for both radionuclides simultaneously using a maximum a posteriori probability algorithm with one-step-late evaluation. Energy window-dependent parameters for the equations describing in-patient scatter are estimated using Monte Carlo simulations. Parameters for the equations describing tailing effects are estimated using virtually scatter-free experimental measurements on a dedicated cardiac SPECT camera with CdZnTe-detectors. When applied to a phantom study with both 99mTc and 123I, results show that the estimated spatial distribution of events from 99mTc in the 99mTc photopeak energy window is very similar to that measured in a single 99mTc phantom study. The extracted images of primary events display increased cold lesion contrasts for both 99mTc and 123I.
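
    A greatly simplified, per-pixel view of the dual-radionuclide problem: counts in the two photopeak windows are a mixture of the two primaries through a cross-talk matrix (tailing plus scatter fractions), which can be inverted. The fractions and counts below are hypothetical; the paper instead solves a full spatial model iteratively with a maximum a posteriori one-step-late algorithm.

        # Per-pixel window cross-talk model inverted to recover primary counts
        import numpy as np

        # Rows: measured windows (Tc, I); columns: true primaries (Tc, I); fractions are hypothetical
        crosstalk = np.array([[0.95, 0.20],    # 20% of 123I events spill into the 99mTc window
                              [0.03, 0.90]])   # 3% of 99mTc events spill into the 123I window

        measured = np.array([1200.0, 800.0])   # counts in the two windows for one pixel
        primaries = np.linalg.solve(crosstalk, measured)
        print(np.round(primaries, 1))          # estimated primary 99mTc and 123I counts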

  12. Correction.

    PubMed

    2015-12-01

    In the article by Narayan et al (Narayan O, Davies JE, Hughes AD, Dart AM, Parker KH, Reid C, Cameron JD. Central aortic reservoir-wave analysis improves prediction of cardiovascular events in elderly hypertensives. Hypertension. 2015;65:629–635. doi: 10.1161/HYPERTENSIONAHA.114.04824), which published online ahead of print December 22, 2014, and appeared in the March 2015 issue of the journal, some corrections were needed. On page 632, Figure, panel A, the label PRI has been corrected to read RPI. In panel B, the text by the upward arrow, "10% increase in kd," has been corrected to read, "10% decrease in kd." The corrected figure is shown below. The authors apologize for these errors. PMID:26558821

  13. Correction

    NASA Astrophysics Data System (ADS)

    1995-04-01

    Seismic images of the Brooks Range, Arctic Alaska, reveal crustal-scale duplexing: Correction. Geology, v. 23, p. 65–68 (January 1995). The correct Figure 4A, for the loose insert, is given here. See Figure 4A below. Corrected inserts will be available to those requesting copies of the article from the senior author, Gary S. Fuis, U.S. Geological Survey, 345 Middlefield Road, Menlo Park, CA 94025. Figure 4A. P-wave velocity model of Brooks Range region (thin gray contours) with migrated wide-angle reflections (heavy red lines) and migrated vertical-incidence reflections (short black lines) superimposed. Velocity contour interval is 0.25 km/s; 4, 5, and 6 km/s contours are labeled. Estimated error in velocities is one contour interval. Symbols on faults shown at top are as in Figure 2 caption.

  14. Model-based correction for scatter and tailing effects in simultaneous 99mTc and 123I imaging for a CdZnTe cardiac SPECT camera.

    PubMed

    Holstensson, M; Erlandsson, K; Poludniowski, G; Ben-Haim, S; Hutton, B F

    2015-04-21

    An advantage of semiconductor-based dedicated cardiac single photon emission computed tomography (SPECT) cameras when compared to conventional Anger cameras is superior energy resolution. This provides the potential for improved separation of the photopeaks in dual radionuclide imaging, such as combined use of (99m)Tc and (123)I . There is, however, the added complexity of tailing effects in the detectors that must be accounted for. In this paper we present a model-based correction algorithm which extracts the useful primary counts of (99m)Tc and (123)I from projection data. Equations describing the in-patient scatter and tailing effects in the detectors are iteratively solved for both radionuclides simultaneously using a maximum a posteriori probability algorithm with one-step-late evaluation. Energy window-dependent parameters for the equations describing in-patient scatter are estimated using Monte Carlo simulations. Parameters for the equations describing tailing effects are estimated using virtually scatter-free experimental measurements on a dedicated cardiac SPECT camera with CdZnTe-detectors. When applied to a phantom study with both (99m)Tc and (123)I, results show that the estimated spatial distribution of events from (99m)Tc in the (99m)Tc photopeak energy window is very similar to that measured in a single (99m)Tc phantom study. The extracted images of primary events display increased cold lesion contrasts for both (99m)Tc and (123)I. PMID:25803643

  15. Correction.

    PubMed

    2016-02-01

    Neogi T, Jansen TLTA, Dalbeth N, et al. 2015 Gout classification criteria: an American College of Rheumatology/European League Against Rheumatism collaborative initiative. Ann Rheum Dis 2015;74:1789–98. The name of the 20th author was misspelled. The correct spelling is Janitzia Vazquez-Mellado. We regret the error. PMID:26881284

  16. Correction.

    PubMed

    2016-02-01

    In the article by Guessous et al (Guessous I, Pruijm M, Ponte B, Ackermann D, Ehret G, Ansermot N, Vuistiner P, Staessen J, Gu Y, Paccaud F, Mohaupt M, Vogt B, Pechère-Bertschi A, Martin PY, Burnier M, Eap CB, Bochud M. Associations of ambulatory blood pressure with urinary caffeine and caffeine metabolite excretions. Hypertension. 2015;65:691–696. doi: 10.1161/HYPERTENSIONAHA.114.04512), which published online ahead of print December 8, 2014, and appeared in the March 2015 issue of the journal, a correction was needed. One of the author surnames was misspelled. Antoinette Pechère-Berstchi has been corrected to read Antoinette Pechère-Bertschi. The authors apologize for this error. PMID:26763012

  17. Gene transfer corrects acute GM2 gangliosidosis--potential therapeutic contribution of perivascular enzyme flow.

    PubMed

    Cachón-González, M Begoña; Wang, Susan Z; McNair, Rosamund; Bradley, Josephine; Lunn, David; Ziegler, Robin; Cheng, Seng H; Cox, Timothy M

    2012-08-01

    The GM2 gangliosidoses are fatal lysosomal storage diseases principally affecting the brain. Absence of β-hexosaminidase A and B activities in the Sandhoff mouse causes neurological dysfunction and recapitulates the acute Tay-Sachs (TSD) and Sandhoff diseases (SD) in infants. Intracranial coinjection of recombinant adeno-associated viral vectors (rAAV), serotype 2/1, expressing human β-hexosaminidase α (HEXA) and β (HEXB) subunits into 1-month-old Sandhoff mice gave unprecedented survival to 2 years and prevented disease throughout the brain and spinal cord. Classical manifestations of disease, including spasticity-as opposed to tremor-ataxia-were resolved by localized gene transfer to the striatum or cerebellum, respectively. Abundant biosynthesis of β-hexosaminidase isozymes and their global distribution via axonal, perivascular, and cerebrospinal fluid (CSF) spaces, as well as diffusion, account for the sustained phenotypic rescue-long-term protein expression by transduced brain parenchyma, choroid plexus epithelium, and dorsal root ganglia neurons supplies the corrective enzyme. Prolonged survival permitted expression of cryptic disease in organs not accessed by intracranial vector delivery. We contend that infusion of rAAV into CSF space and intraparenchymal administration by convection-enhanced delivery at a few strategic sites will optimally treat neurodegeneration in many diseases affecting the nervous system. PMID:22453766

  18. Gene Transfer Corrects Acute GM2 Gangliosidosis—Potential Therapeutic Contribution of Perivascular Enzyme Flow

    PubMed Central

    Cachón-González, M Begoña; Wang, Susan Z; McNair, Rosamund; Bradley, Josephine; Lunn, David; Ziegler, Robin; Cheng, Seng H; Cox, Timothy M

    2012-01-01

    The GM2 gangliosidoses are fatal lysosomal storage diseases principally affecting the brain. Absence of β-hexosaminidase A and B activities in the Sandhoff mouse causes neurological dysfunction and recapitulates the acute Tay–Sachs (TSD) and Sandhoff diseases (SD) in infants. Intracranial coinjection of recombinant adeno-associated viral vectors (rAAV), serotype 2/1, expressing human β-hexosaminidase α (HEXA) and β (HEXB) subunits into 1-month-old Sandhoff mice gave unprecedented survival to 2 years and prevented disease throughout the brain and spinal cord. Classical manifestations of disease, including spasticity—as opposed to tremor-ataxia—were resolved by localized gene transfer to the striatum or cerebellum, respectively. Abundant biosynthesis of β-hexosaminidase isozymes and their global distribution via axonal, perivascular, and cerebrospinal fluid (CSF) spaces, as well as diffusion, account for the sustained phenotypic rescue—long-term protein expression by transduced brain parenchyma, choroid plexus epithelium, and dorsal root ganglia neurons supplies the corrective enzyme. Prolonged survival permitted expression of cryptic disease in organs not accessed by intracranial vector delivery. We contend that infusion of rAAV into CSF space and intraparenchymal administration by convection-enhanced delivery at a few strategic sites will optimally treat neurodegeneration in many diseases affecting the nervous system. PMID:22453766

  19. Correction.

    PubMed

    2015-05-22

    The Circulation Research article by Keith and Bolli (“String Theory” of c-kitpos Cardiac Cells: A New Paradigm Regarding the Nature of These Cells That May Reconcile Apparently Discrepant Results. Circ Res. 2015;116:1216-1230. doi: 10.1161/CIRCRESAHA.116.305557) states that van Berlo et al (2014) observed that large numbers of fibroblasts and adventitial cells, some smooth muscle and endothelial cells, and rare cardiomyocytes originated from c-kit positive progenitors. However, van Berlo et al reported that only occasional fibroblasts and adventitial cells derived from c-kit positive progenitors in their studies. Accordingly, the review has been corrected to indicate that van Berlo et al (2014) observed that large numbers of endothelial cells, with some smooth muscle cells and fibroblasts, and more rarely cardiomyocytes, originated from c-kit positive progenitors in their murine model. The authors apologize for this error, and the error has been noted and corrected in the online version of the article, which is available at http://circres.ahajournals.org/content/116/7/1216.full. PMID:25999426

  20. Correction

    NASA Astrophysics Data System (ADS)

    1998-12-01

    Alleged mosasaur bite marks on Late Cretaceous ammonites are limpet (patellogastropod) home scars, Geology, v. 26, p. 947-950 (October 1998). This article had the following printing errors: p. 947, Abstract, line 11, “sepia” should be “septa”; p. 947, 1st paragraph under Introduction, line 2, “creep” should be “deep”; p. 948, column 1, 2nd paragraph, line 7, “creep” should be “deep”; p. 949, column 1, 1st paragraph, line 1, “creep” should be “deep”; p. 949, column 1, 1st paragraph, line 5, “19774” should be “1977)”; p. 949, column 1, 4th paragraph, line 7, “in particular” should be “In particular”. CORRECTION: Mammalian community response to the latest Paleocene thermal maximum: An isotaphonomic study in the northern Bighorn Basin, Wyoming, Geology, v. 26, p. 1011-1014 (November 1998). An error appeared in the References Cited. The correct reference appears below: Fricke, H. C., Clyde, W. C., O'Neil, J. R., and Gingerich, P. D., 1998, Evidence for rapid climate change in North America during the latest Paleocene thermal maximum: Oxygen isotope compositions of biogenic phosphate from the Bighorn Basin (Wyoming): Earth and Planetary Science Letters, v. 160, p. 193-208.

  1. Therapeutic correction of ApoER2 splicing in Alzheimer's disease mice using antisense oligonucleotides.

    PubMed

    Hinrich, Anthony J; Jodelka, Francine M; Chang, Jennifer L; Brutman, Daniella; Bruno, Angela M; Briggs, Clark A; James, Bryan D; Stutzmann, Grace E; Bennett, David A; Miller, Steven A; Rigo, Frank; Marr, Robert A; Hastings, Michelle L

    2016-01-01

    Apolipoprotein E receptor 2 (ApoER2) is an apolipoprotein E receptor involved in long-term potentiation, learning, and memory. Given its role in cognition and its association with the Alzheimer's disease (AD) risk gene, apoE, ApoER2 has been proposed to be involved in AD, though a role for the receptor in the disease is not clear. ApoER2 signaling requires amino acids encoded by alternatively spliced exon 19. Here, we report that the balance of ApoER2 exon 19 splicing is deregulated in postmortem brain tissue from AD patients and in a transgenic mouse model of AD. To test the role of deregulated ApoER2 splicing in AD, we designed an antisense oligonucleotide (ASO) that increases exon 19 splicing. Treatment of AD mice with a single dose of ASO corrected ApoER2 splicing for up to 6 months and improved synaptic function and learning and memory. These results reveal an association between ApoER2 isoform expression and AD, and provide preclinical evidence for the utility of ASOs as a therapeutic approach to mitigate Alzheimer's disease symptoms by improving ApoER2 exon 19 splicing. PMID:26902204

  2. Travel cost demand model based river recreation benefit estimates with on-site and household surveys: Comparative results and a correction procedure

    NASA Astrophysics Data System (ADS)

    Loomis, John

    2003-04-01

    Past recreation studies have noted that on-site or visitor intercept surveys are subject to over-sampling of avid users (i.e., endogenous stratification) and have offered econometric solutions to correct for this. However, past papers do not estimate the empirical magnitude of the bias in benefit estimates with a real data set, nor do they compare the corrected estimates to benefit estimates derived from a population sample. This paper empirically examines the magnitude of the recreation benefits per trip bias by comparing estimates from an on-site river visitor intercept survey to a household survey. The difference in average benefits is quite large, with the on-site visitor survey yielding $24 per day trip, while the household survey yields $9.67 per day trip. A simple econometric correction for endogenous stratification in our count data model lowers the benefit estimate to $9.60 per day trip, a mean value nearly identical and not statistically different from the household survey estimate.
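
    For a Poisson count-data travel cost model, the econometric correction for on-site (endogenously stratified) samples referred to above is commonly implemented by modelling trips minus one. A minimal sketch with synthetic data follows; all numbers are hypothetical, and consumer surplus per trip is computed as -1 divided by the travel cost coefficient.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic on-site (visitor-intercept) sample; all parameter values are hypothetical.
rng = np.random.default_rng(0)
n = 500
travel_cost = rng.uniform(5.0, 60.0, n)
lam = np.exp(2.0 - 0.05 * travel_cost)
trips = rng.poisson(lam) + 1          # on-site sampling: every respondent took at least one trip

X = sm.add_constant(travel_cost)

# Naive model ignores endogenous stratification (avid users are over-sampled).
naive = sm.GLM(trips, X, family=sm.families.Poisson()).fit()

# Standard correction for an on-site Poisson sample: model (trips - 1) instead.
corrected = sm.GLM(trips - 1, X, family=sm.families.Poisson()).fit()

# In the travel cost model, consumer surplus per trip is -1 / (travel cost coefficient).
print("naive CS per trip:    ", -1.0 / naive.params[1])
print("corrected CS per trip:", -1.0 / corrected.params[1])
```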

  3. Fiducial marker-based correction for involuntary motion in weight-bearing C-arm CT scanning of knees. Part I. Numerical model-based optimization

    PubMed Central

    Choi, Jang-Hwan; Fahrig, Rebecca; Keil, Andreas; Besier, Thor F.; Pal, Saikat; McWalter, Emily J.; Beaupré, Gary S.; Maier, Andreas

    2013-01-01

    Purpose: Human subjects in standing positions are apt to show much more involuntary motion than in supine positions. The authors aimed to simulate a complicated realistic lower body movement using the four-dimensional (4D) digital extended cardiac-torso (XCAT) phantom. The authors also investigated fiducial marker-based motion compensation methods in two-dimensional (2D) and three-dimensional (3D) space. The level of involuntary movement-induced artifacts and image quality improvement were investigated after applying each method. Methods: An optical tracking system with eight cameras and seven retroreflective markers enabled us to track involuntary motion of the lower body of nine healthy subjects holding a squat position at 60° of flexion. The XCAT-based knee model was developed using the 4D XCAT phantom and the optical tracking data acquired at 120 Hz. The authors divided the lower body in the XCAT into six parts and applied unique affine transforms to each so that the motion (6 degrees of freedom) could be synchronized with the optical markers’ location at each time frame. The control points of the XCAT were tessellated into triangles and 248 projection images were created based on intersections of each ray and monochromatic absorption. The tracking data sets with the largest motion (Subject 2) and the smallest motion (Subject 5) among the nine data sets were used to animate the XCAT knee model. The authors defined eight skin control points well distributed around the knees as pseudo-fiducial markers which functioned as a reference in motion correction. Motion compensation was done in the following ways: (1) simple projection shifting in 2D, (2) deformable projection warping in 2D, and (3) rigid body warping in 3D. Graphics hardware accelerated filtered backprojection was implemented and combined with the three correction methods in order to speed up the simulation process. Correction fidelity was evaluated as a function of number of markers used (4–12) and
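
    Of the three compensation strategies listed, the simplest one (2D projection shifting driven by tracked marker positions) might look like the sketch below. The array shapes and the use of the mean marker displacement as a per-frame rigid shift are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import shift

def shift_correct(projections, markers, ref_frame=0):
    """Simple 2D motion compensation by projection shifting.

    projections : (n_frames, ny, nx) array of projection images
    markers     : (n_frames, n_markers, 2) array of tracked marker (row, col) positions
    The per-frame shift is the mean marker displacement relative to the reference frame.
    """
    ref = markers[ref_frame].mean(axis=0)
    corrected = np.empty_like(projections)
    for k, proj in enumerate(projections):
        d = markers[k].mean(axis=0) - ref        # (drow, dcol) rigid displacement estimate
        corrected[k] = shift(proj, -d, order=1)  # translate projection back toward reference
    return corrected
```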

  4. Influence of the partial volume correction method on 18F-fluorodeoxyglucose brain kinetic modelling from dynamic PET images reconstructed with resolution model based OSEM

    NASA Astrophysics Data System (ADS)

    Bowen, Spencer L.; Byars, Larry G.; Michel, Christian J.; Chonde, Daniel B.; Catana, Ciprian

    2013-10-01

    Kinetic parameters estimated from dynamic 18F-fluorodeoxyglucose (18F-FDG) PET acquisitions have been used frequently to assess brain function in humans. Neglecting partial volume correction (PVC) for a dynamic series has been shown to produce significant bias in model estimates. Accurate PVC requires a space-variant model describing the reconstructed image spatial point spread function (PSF) that accounts for resolution limitations, including non-uniformities across the field of view due to the parallax effect. For ordered subsets expectation maximization (OSEM), image resolution convergence is local and influenced significantly by the number of iterations, the count density, and background-to-target ratio. As both count density and background-to-target values for a brain structure can change during a dynamic scan, the local image resolution may also concurrently vary. When PVC is applied post-reconstruction the kinetic parameter estimates may be biased when neglecting the frame-dependent resolution. We explored the influence of the PVC method and implementation on kinetic parameters estimated by fitting 18F-FDG dynamic data acquired on a dedicated brain PET scanner and reconstructed with and without PSF modelling in the OSEM algorithm. The performance of several PVC algorithms was quantified with a phantom experiment, an anthropomorphic Monte Carlo simulation, and a patient scan. Using the last frame reconstructed image only for regional spread function (RSF) generation, as opposed to computing RSFs for each frame independently, and applying perturbation geometric transfer matrix PVC with PSF based OSEM produced the lowest magnitude bias kinetic parameter estimates in most instances, although at the cost of increased noise compared to the PVC methods utilizing conventional OSEM. Use of the last frame RSFs for PVC with no PSF modelling in the OSEM algorithm produced the lowest bias in cerebral metabolic rate of glucose estimates, although by less than 5% in most
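
    As context for the geometric transfer matrix (GTM) family of partial volume corrections discussed above, a minimal stationary-PSF sketch is given below: regional spread functions are formed by smoothing each region mask with an assumed Gaussian PSF, and the observed regional means are corrected by inverting the resulting transfer matrix. The frame-dependent, space-variant resolution effects that the study investigates are deliberately not captured here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gtm_pvc(image, region_masks, psf_sigma):
    """Simplified geometric transfer matrix partial volume correction.

    image        : reconstructed PET frame (ndarray)
    region_masks : list of boolean masks, one per region of interest
    psf_sigma    : Gaussian sigma (voxels) approximating the reconstructed resolution
    """
    # Regional spread functions: each region mask smoothed by the assumed PSF.
    rsfs = [gaussian_filter(mask.astype(float), psf_sigma) for mask in region_masks]
    n = len(region_masks)
    W = np.array([[rsfs[j][region_masks[i]].mean() for j in range(n)] for i in range(n)])
    observed = np.array([image[mask].mean() for mask in region_masks])
    return np.linalg.solve(W, observed)   # PV-corrected regional mean activities
```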

  5. The impact of dosimetric optimization using respiratory gating and inhomogeneity corrections on potential therapeutic gain in patients with lung cancer

    NASA Astrophysics Data System (ADS)

    de La Fuente Herman, Tania

    Early-stage lung cancer is found with increasing frequency by screening high-risk patients. Recently, the use of Stereotactic Body Radiation Therapy (SBRT) has been found to be highly successful. The hypothesis being tested here is that the use of respiratory gating and tissue heterogeneity corrections is necessary to optimize tumor and normal tissue dose distributions for SBRT.

  6. Concurrent progress of reprogramming and gene correction to overcome therapeutic limitation of mutant ALK2-iPSC.

    PubMed

    Kim, Bu-Yeo; Jeong, SangKyun; Lee, Seo-Young; Lee, So Min; Gweon, Eun Jeong; Ahn, Hyunjun; Kim, Janghwan; Chung, Sun-Ku

    2016-01-01

    Fibrodysplasia ossificans progressiva (FOP) syndrome is caused by mutation of the gene ACVR1, which encodes a constitutively active bone morphogenetic protein type I receptor (also called ALK2) that induces heterotopic ossification in the patient. To genetically correct it, we attempted to generate mutant ALK2-iPSCs (mALK2-iPSCs) from FOP-human dermal fibroblasts. However, mALK2 impairs pluripotency maintenance and clonogenic potential after single-cell dissociation, a step that is unavoidable when applying gene-correction tools to induced pluripotent stem cells (iPSCs). The current iPSC-based gene therapy approach is therefore not readily applicable to iPSCs carrying the ALK2 mutation. Here we developed a simplified one-step procedure that simultaneously introduces reprogramming and gene-editing components into human fibroblasts derived from a patient with FOP syndrome, correcting the gene during reprogramming. The mixture of reprogramming and gene-editing components comprises reprogramming episomal vectors, CRISPR/Cas9-expressing vectors, and a single-stranded oligodeoxynucleotide carrying the normal base to correct ALK2 c.617G>A. The one-step-derived ALK2 gene-corrected iPSCs restored the global gene expression pattern, as well as mineralization, to the extent of normal iPSCs. This procedure not only saves time, labor, and costs but also opens up a new paradigm beyond the current application of gene-editing methodologies, which is hampered by pluripotency-maintenance requirements and the vulnerability of single-cell-dissociated iPSCs. PMID:27256111

  7. Concurrent progress of reprogramming and gene correction to overcome therapeutic limitation of mutant ALK2-iPSC

    PubMed Central

    Kim, Bu-Yeo; Jeong, SangKyun; Lee, Seo-Young; Lee, So Min; Gweon, Eun Jeong; Ahn, Hyunjun; Kim, Janghwan; Chung, Sun-Ku

    2016-01-01

    Fibrodysplasia ossificans progressiva (FOP) syndrome is caused by mutation of the gene ACVR1, which encodes a constitutively active bone morphogenetic protein type I receptor (also called ALK2) that induces heterotopic ossification in the patient. To genetically correct it, we attempted to generate mutant ALK2-iPSCs (mALK2-iPSCs) from FOP-human dermal fibroblasts. However, mALK2 impairs pluripotency maintenance and clonogenic potential after single-cell dissociation, a step that is unavoidable when applying gene-correction tools to induced pluripotent stem cells (iPSCs). The current iPSC-based gene therapy approach is therefore not readily applicable to iPSCs carrying the ALK2 mutation. Here we developed a simplified one-step procedure that simultaneously introduces reprogramming and gene-editing components into human fibroblasts derived from a patient with FOP syndrome, correcting the gene during reprogramming. The mixture of reprogramming and gene-editing components comprises reprogramming episomal vectors, CRISPR/Cas9-expressing vectors, and a single-stranded oligodeoxynucleotide carrying the normal base to correct ALK2 c.617G>A. The one-step-derived ALK2 gene-corrected iPSCs restored the global gene expression pattern, as well as mineralization, to the extent of normal iPSCs. This procedure not only saves time, labor, and costs but also opens up a new paradigm beyond the current application of gene-editing methodologies, which is hampered by pluripotency-maintenance requirements and the vulnerability of single-cell-dissociated iPSCs. PMID:27256111

  8. A Budget Impact Analysis of Newly Available Hepatitis C Therapeutics and the Financial Burden on a State Correctional System.

    PubMed

    Nguyen, John T; Rich, Josiah D; Brockmann, Bradley W; Vohr, Fred; Spaulding, Anne; Montague, Brian T

    2015-08-01

    Hepatitis C virus (HCV) infection continues to disproportionately affect incarcerated populations. New HCV drugs present opportunities and challenges to address HCV in corrections. The goal of this study was to evaluate the impact of the treatment costs for HCV infection in a state correctional population through a budget impact analysis comparing differing treatment strategies. Electronic and paper medical records were reviewed to estimate the prevalence of hepatitis C within the Rhode Island Department of Corrections. Three treatment strategies were evaluated as follows: (1) treating all chronically infected persons, (2) treating only patients with demonstrated fibrosis, and (3) treating only patients with advanced fibrosis. Budget impact was computed as the percentage of pharmacy and overall healthcare expenditures accrued by total drug costs assuming entirely interferon-free therapy. Sensitivity analyses assessed potential variance in costs related to variability in HCV prevalence, genotype, estimated variation in market pricing, length of stay for the sentenced population, and uptake of newly available regimens. Chronic HCV prevalence was estimated at 17% of the total population. Treating all sentenced inmates with at least 6 months remaining of their sentence would cost about $34 million, 13 times the pharmacy budget and almost twice the overall healthcare budget. Treating inmates with advanced fibrosis would cost about $15 million. A hypothetical 50% reduction in total drug costs for future therapies could cost $17 million to treat all eligible inmates. With immense costs projected with new treatment, it is unlikely that correctional facilities will have the capacity to treat all those afflicted with HCV. Alternative payment strategies in collaboration with outside programs may be necessary to curb this epidemic. In order to improve care and treatment delivery, drug costs also need to be seriously reevaluated to be more accessible and equitable now that HCV
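
    The budget impact comparison reduces to arithmetic over treatment-eligibility scenarios. The sketch below illustrates the calculation with entirely hypothetical prevalence, fibrosis-stage shares, and regimen price; the study's actual inputs are not reproduced.

```python
# Hypothetical figures for illustration only; the study's actual population,
# fibrosis distribution, and regimen prices are not reproduced here.
population = 3000
prevalence = 0.17
chronic = int(population * prevalence)

regimen_cost = 65_000            # hypothetical all-oral regimen cost per course
share_with_fibrosis = 0.45       # hypothetical
share_advanced_fibrosis = 0.20   # hypothetical

scenarios = {
    "treat all chronic": chronic,
    "treat fibrosis only": int(chronic * share_with_fibrosis),
    "treat advanced fibrosis only": int(chronic * share_advanced_fibrosis),
}
for name, n in scenarios.items():
    print(f"{name:30s} n={n:4d}  drug cost = ${n * regimen_cost:,.0f}")

# Simple price sensitivity: a 50% reduction in regimen price halves every scenario's drug cost.
```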

  9. Key factors which concur to the correct therapeutic evaluation of herbal products in free radical-induced diseases.

    PubMed

    Mancuso, Cesare

    2015-01-01

    For many years the world's scientific literature has been filled with articles on the therapeutic potential of natural products, the vast majority of which are of herbal origin, as in the case of free radical-induced diseases. What is often overlooked is the effort of researchers who undertake the preclinical and clinical evaluation of these herbal products in order to demonstrate their therapeutic efficacy and safety. The first critical issue to be addressed in the early stages of preclinical studies is the sometimes unfavorable pharmacokinetics of some of these products, which limits their bioavailability after oral intake. In this regard, it is worth underlining that it is often unethical to claim therapeutic efficacy for a compound on the basis of preclinical results obtained at concentrations far higher than those that could realistically be achieved in the organs and tissues of subjects taking these products by mouth. The most widely used approach to overcome low bioavailability involves complexing the active ingredients of herbal products with non-toxic carriers that facilitate absorption and distribution. The induction or inhibition of drug-metabolizing enzymes by herbal products, and the consequent variations in plasma concentrations of co-administered drugs, are also phenomena to be carefully evaluated, as they can give rise to side effects. This risk is all the greater because people lack the perception of risk arising from overuse of herbal products which, by their very nature, are considered risk-free. PMID:25954201

  10. Key factors which concur to the correct therapeutic evaluation of herbal products in free radical-induced diseases

    PubMed Central

    Mancuso, Cesare

    2015-01-01

    For many years the world's scientific literature has been filled with articles on the therapeutic potential of natural products, the vast majority of which are of herbal origin, as in the case of free radical-induced diseases. What is often overlooked is the effort of researchers who undertake the preclinical and clinical evaluation of these herbal products in order to demonstrate their therapeutic efficacy and safety. The first critical issue to be addressed in the early stages of preclinical studies is the sometimes unfavorable pharmacokinetics of some of these products, which limits their bioavailability after oral intake. In this regard, it is worth underlining that it is often unethical to claim therapeutic efficacy for a compound on the basis of preclinical results obtained at concentrations far higher than those that could realistically be achieved in the organs and tissues of subjects taking these products by mouth. The most widely used approach to overcome low bioavailability involves complexing the active ingredients of herbal products with non-toxic carriers that facilitate absorption and distribution. The induction or inhibition of drug-metabolizing enzymes by herbal products, and the consequent variations in plasma concentrations of co-administered drugs, are also phenomena to be carefully evaluated, as they can give rise to side effects. This risk is all the greater because people lack the perception of risk arising from overuse of herbal products which, by their very nature, are considered risk-free. PMID:25954201

  11. Lack of Correlation between Outcomes of Membrane Repair Assay and Correction of Dystrophic Changes in Experimental Therapeutic Strategy in Dysferlinopathy

    PubMed Central

    Krahn, Martin; Pryadkina, Marina; Borel, Perrine; Suel, Laurence; Roche, Joseph A.; Stockholm, Daniel; Bloch, Robert J.; Levy, Nicolas; Bashir, Rumaisa; Richard, Isabelle

    2012-01-01

    Mutations in the dysferlin gene are the cause of Limb-girdle Muscular Dystrophy type 2B and Miyoshi Myopathy. The dysferlin protein has been implicated in sarcolemmal resealing, leading to the idea that the pathophysiology of dysferlin deficiencies is due to a deficit in membrane repair. Here, we show, using two different approaches, that restoring membrane repair as assayed by laser wounding is not sufficient to alleviate the dysferlin-deficient pathology. First, we generated a transgenic mouse overexpressing myoferlin to test the hypothesis that myoferlin, which is homologous to dysferlin, can compensate for the absence of dysferlin. The myoferlin overexpressors show no skeletal muscle abnormalities, and crossing them with a dysferlin-deficient model rescues the membrane fusion defect present in dysferlin-deficient mice in vitro. However, myoferlin overexpression does not correct muscle histology in vivo. Second, we report that AAV-mediated transfer of a minidysferlin, previously shown to correct the membrane repair deficit in vitro, also fails to improve muscle histology. Furthermore, neither myoferlin nor the minidysferlin prevented myofiber degeneration following eccentric exercise. Our data suggest that the pathogenicity of dysferlin deficiency is not solely related to impairment in sarcolemmal repair and highlight the care needed in selecting assays to assess potential therapies for dysferlinopathies. PMID:22666441

  12. Lack of correlation between outcomes of membrane repair assay and correction of dystrophic changes in experimental therapeutic strategy in dysferlinopathy.

    PubMed

    Lostal, William; Bartoli, Marc; Roudaut, Carinne; Bourg, Nathalie; Krahn, Martin; Pryadkina, Marina; Borel, Perrine; Suel, Laurence; Roche, Joseph A; Stockholm, Daniel; Bloch, Robert J; Levy, Nicolas; Bashir, Rumaisa; Richard, Isabelle

    2012-01-01

    Mutations in the dysferlin gene are the cause of Limb-girdle Muscular Dystrophy type 2B and Miyoshi Myopathy. The dysferlin protein has been implicated in sarcolemmal resealing, leading to the idea that the pathophysiology of dysferlin deficiencies is due to a deficit in membrane repair. Here, we show, using two different approaches, that restoring membrane repair as assayed by laser wounding is not sufficient to alleviate the dysferlin-deficient pathology. First, we generated a transgenic mouse overexpressing myoferlin to test the hypothesis that myoferlin, which is homologous to dysferlin, can compensate for the absence of dysferlin. The myoferlin overexpressors show no skeletal muscle abnormalities, and crossing them with a dysferlin-deficient model rescues the membrane fusion defect present in dysferlin-deficient mice in vitro. However, myoferlin overexpression does not correct muscle histology in vivo. Second, we report that AAV-mediated transfer of a minidysferlin, previously shown to correct the membrane repair deficit in vitro, also fails to improve muscle histology. Furthermore, neither myoferlin nor the minidysferlin prevented myofiber degeneration following eccentric exercise. Our data suggest that the pathogenicity of dysferlin deficiency is not solely related to impairment in sarcolemmal repair and highlight the care needed in selecting assays to assess potential therapies for dysferlinopathies. PMID:22666441

  13. Model based manipulator control

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1989-01-01

    The feasibility of using model based control (MBC) for robotic manipulators was investigated. A double inverted pendulum system was constructed as the experimental system for a general study of dynamically stable manipulation. The original interest in dynamically stable systems was driven by the objective of high vertical reach (balancing), and the planning of inertially favorable trajectories for force and payload demands. The model-based control approach is described and the results of experimental tests are summarized. Results directly demonstrate that MBC can provide stable control at all speeds of operation and support operations requiring dynamic stability such as balancing. The application of MBC to systems with flexible links is also discussed.

  14. Model-Based Systems

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    2007-01-01

    Engineers who design systems using text specification documents focus their work upon the completed system to meet performance, time, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system, and more difficult to maintain as several systems are combined into higher-level systems, are maintained over decades, and evolve technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in time and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data. They interrelate all of the data to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. The consistency and integrity of the model is assured; therefore, the consistency and integrity of the various specification documents is ensured. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed and how needs are being addressed by international standards writing teams.

  15. Model Based Definition

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  16. Principles of models based engineering

    SciTech Connect

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  17. Are therapeutic communities therapeutic for women?

    PubMed Central

    Eliason, Michele J

    2006-01-01

    This paper addresses the growing phenomena of therapeutic community (TC) treatment approaches for women in correctional settings. Although rapidly increasing in number across the country, there is very little empirical research to support the effectiveness of TC treatment for women. Therefore, the literature on the efficacy and effectiveness of TC treatment for women is reviewed in relation to the literature on women's treatment issues. The literature review highlights the gaps where TC treatment ignores or exacerbates issues that are common to addicted women, or uses methods that may be contradictory to women's recovery. PMID:16722560

  18. Model-Based Fault Tolerant Control

    NASA Technical Reports Server (NTRS)

    Kumar, Aditya; Viassolo, Daniel

    2008-01-01

    The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. The identification of the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in the presence of these faults. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
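
    Both detection approaches evaluated in the task operate on Extended Kalman Filter residuals. A generic residual-thresholding check of the kind that might sit underneath either approach is sketched below; the constant innovation covariance and the chi-square threshold are illustrative assumptions, not the MBFTC algorithms themselves.

```python
import numpy as np

def detect_fault(residuals, cov, threshold=9.21):
    """Flag a fault when the normalized innovation squared exceeds a chi-square bound.

    residuals : (n, m) innovations from a (e.g. extended Kalman) filter
    cov       : (m, m) innovation covariance, assumed constant here for simplicity
    threshold : chi-square cutoff; 9.21 is the 99% point for 2 degrees of freedom
    """
    cov_inv = np.linalg.inv(cov)
    nis = np.einsum('ni,ij,nj->n', residuals, cov_inv, residuals)
    return nis > threshold   # boolean flag per time step
```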

  19. Model-based tomographic reconstruction

    DOEpatents

    Chambers, David H.; Lehman, Sean K.; Goodman, Dennis M.

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
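
    The core estimation step, minimizing the mean-square error between a predicted return and the data within one time-gated section, can be sketched as a small nonlinear least-squares fit. The single-Gaussian echo model, the propagation speed, and the three-parameter wall description below are simplifying assumptions, not the layered-media model of the patent.

```python
import numpy as np
from scipy.optimize import least_squares

C = 0.3   # propagation speed in m/ns (hypothetical free-space value)

def predicted(params, t):
    # One time-gated section modeled as a single wall echo: amplitude A,
    # range d (m), and pulse width w (ns). A gross simplification for illustration.
    A, d, w = params
    return A * np.exp(-((t - 2.0 * d / C) / w) ** 2)

def fit_wall(t, data, guess=(1.0, 3.0, 1.0)):
    # Minimize the mean-square error between the model prediction and the data.
    res = least_squares(lambda p: predicted(p, t) - data, x0=guess)
    return res.x   # estimated amplitude, wall range, pulse width
```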

  20. Model-based machine learning

    PubMed Central

    Bishop, Christopher M.

    2013-01-01

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications. PMID:23277612

  1. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  2. Model-based machine learning.

    PubMed

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications. PMID:23277612

  3. MACROMOLECULAR THERAPEUTICS

    PubMed Central

    Yang, Jiyuan; Kopeček, Jindřich

    2014-01-01

    This review covers water-soluble polymer-drug conjugates and macromolecules that possess biological activity without attached low molecular weight drugs. The main design principles of traditional and backbone degradable polymer-drug conjugates as well as the development of a new paradigm in nanomedicines – (low molecular weight) drug-free macromolecular therapeutics are discussed. To address the biological features of cancer, macromolecular therapeutics directed to stem/progenitor cells and the tumor microenvironment are deliberated. Finally, the future perspectives of the field are briefly debated. PMID:24747162

  4. Pigeon therapeutics.

    PubMed

    Harlin, R W

    2000-01-01

    This article examines therapeutics for pigeons, discussing their physiology and reproduction, housing, and nutrition. The author also looks at ways to prevent infection, while discussing treatments for various viral diseases, such as paramyxovirus and pigeon herpesvirus, bacterial infections, such as paratyphoid, and parasitic diseases. Drug dosages are listed for antibiotics, antifungals, antiparasitics, and vaccines. PMID:11228828

  5. Feedlot therapeutics.

    PubMed

    Apley, M D; Fajt, V R

    1998-07-01

    This article discusses therapeutic approaches to conditions commonly encountered in feedlots. Challenges discussed include bovine respiratory complex, tracheal edema, atypical interstitial pneumonia, footrot, toe abscesses, mycoplasma arthritis, cardiovascular disease, lactic acidosis, bloat, coccidiosis, central nervous system diseases, abscesses and cellulitis, pregnancy management and abortion, and ocular disease. PMID:9704416

  6. Therapeutic Nanodevices

    NASA Astrophysics Data System (ADS)

    Lee, Stephen; Ruegsegger, Mark; Barnes, Philip; Smith, Bryan; Ferrari, Mauro

    Therapeutic nanotechnology offers minimally invasive therapies with high densities of function concentrated in small volumes, features that may reduce patient morbidity and mortality. Unlike other areas of nanotechnology, novel physical properties associated with nanoscale dimensionality are not the raison d'être of therapeutic nanotechnology, whereas the aggregation of multiple biochemical (or comparably precise) functions into controlled nanoarchitectures is. Multifunctionality is a hallmark of emerging nanotherapeutic devices, and multifunctionality can allow nanotherapeutic devices to perform multistep work processes, with each functional component contributing to one or more nanodevice subroutine such that, in aggregate, subroutines sum to a cogent work process. Canonical nanotherapeutic subroutines include tethering (targeting) to sites of disease, dispensing measured doses of drug (or bioactive compound), detection of residual disease after therapy and communication with an external clinician/operator. Emerging nanotherapeutics thus blur the boundaries between medical devices and traditional pharmaceuticals. Assembly of therapeutic nanodevices generally exploits either (bio)material self-assembly properties or chemoselective bioconjugation techniques, or both. Given the complexity, composition, and the necessity for their tight chemical and structural definition inherent in the nature of nanotherapeutics, their cost of goods (COGs) might exceed that of (already expensive) biologics. Early therapeutic nanodevices will likely be applied to disease states which exhibit significant unmet patient need (cancer and cardiovascular disease), while application to other disease states well-served by conventional therapy may await perfection of nanotherapeutic design and assembly protocols.

  7. Therapeutic Nanodevices

    NASA Astrophysics Data System (ADS)

    Lee, Stephen C.; Ruegsegger, Mark; Barnes, Philip D.; Smith, Bryan R.; Ferrari, Mauro

    Therapeutic nanotechnology offers minimally invasive therapies with high densities of function concentrated in small volumes, features that may reduce patient morbidity and mortality. Unlike other areas of nanotechnology, novel physical properties associated with nanoscale dimensionality are not the raison d'être of therapeutic nanotechnology, whereas the aggregation of multiple biochemical (or comparably precise) functions into controlled nanoarchitectures is. Multifunctionality is a hallmark of emerging nanotherapeutic devices, and multifunctionality can allow nanotherapeutic devices to perform multi-step work processes, with each functional component contributing to one or more nanodevice subroutine such that, in aggregate, subroutines sum to a cogent work process. Canonical nanotherapeutic subroutines include tethering (targeting) to sites of disease, dispensing measured doses of drug (or bioactive compound), detection of residual disease after therapy and communication with an external clinician/operator. Emerging nanotherapeutics thus blur the boundaries between medical devices and traditional pharmaceuticals. Assembly of therapeutic nanodevices generally exploits either (bio)material self assembly properties or chemoselective bioconjugation techniques, or both. Given the complexity, composition, and the necessity for their tight chemical and structural definition inherent in the nature of nanotherapeutics, their cost of goods (COGs) might exceed that of (already expensive) biologics. Early therapeutic nanodevices will likely be applied to disease states which exhibit significant unmet patient need (cancer and cardiovascular disease), while application to other disease states well-served by conventional therapy may await perfection of nanotherapeutic design and assembly protocols.

  8. Model-based reasoning: Troubleshooting

    NASA Astrophysics Data System (ADS)

    Davis, Randall; Hamscher, Walter C.

    1988-07-01

    To determine why something has stopped working, it's useful to know how it was supposed to work in the first place. That simple observation underlies some of the considerable interest generated in recent years on the topic of model-based reasoning, particularly its application to diagnosis and troubleshooting. This paper surveys the current state of the art, reviewing areas that are well understood and exploring areas that present challenging research topics. It views the fundamental paradigm as the interaction of prediction and observation, and explores it by examining three fundamental subproblems: generating hypotheses by reasoning from a symptom to a collection of components whose misbehavior may plausibly have caused that symptom; testing each hypothesis to see whether it can account for all available observations of device behavior; then discriminating among the ones that survive testing. We analyze each of these independently at the knowledge level, i.e., attempting to understand what reasoning capabilities arise from the different varieties of knowledge available to the program. We find that while a wide range of apparently diverse model-based systems have been built for diagnosis and troubleshooting, they are for the most part variations on the central theme outlined here. Their diversity lies primarily in the varying amounts and kinds of knowledge they bring to bear at each stage of the process; the underlying paradigm is fundamentally the same.
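
    A toy version of the generate-and-test cycle described above, using the 'polybox' circuit that is standard in this literature: each single-fault hypothesis drops one component's model and survives only if the remaining models can still account for the observations. This is an illustrative sketch, not the knowledge-level analysis the survey develops.

```python
import sympy as sp

# Classic "polybox" circuit: three multipliers feeding two adders.
a, b, c, d, e = 3, 2, 2, 3, 3                    # observed inputs
obs_f, obs_g = 10, 12                            # observed outputs (f is wrong; 12 was expected)

x, y, z, f, g = sp.symbols('x y z f g')
component_eqs = {
    'M1': sp.Eq(x, a * c),
    'M2': sp.Eq(y, b * d),
    'M3': sp.Eq(z, c * e),
    'A1': sp.Eq(f, x + y),
    'A2': sp.Eq(g, y + z),
}
observations = [sp.Eq(f, obs_f), sp.Eq(g, obs_g)]

# A single-fault hypothesis suspends one component's model (its output becomes
# unconstrained); the hypothesis survives if the remaining equations plus the
# observations are simultaneously satisfiable.
candidates = []
for suspect in component_eqs:
    eqs = [eq for name, eq in component_eqs.items() if name != suspect] + observations
    if sp.solve(eqs, [x, y, z, f, g]):
        candidates.append(suspect)

print("single-fault candidates:", candidates)   # expected: ['M1', 'A1']
```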

  9. Platelet-delivered therapeutics.

    PubMed

    Lyde, R; Sabatino, D; Sullivan, S K; Poncz, M

    2015-06-01

    We have proposed that modified platelets could potentially be used to correct intrinsic platelet defects as well as for targeted delivery of therapeutic molecules to sites of vascular injury. Ectopic expression of proteins within α-granules prior to platelet activation has been achieved for several proteins, including urokinase, factor (F) VIII, and partially for FIX. Potential uses of platelet-directed therapeutics will be discussed, focusing on targeted delivery of urokinase as a thromboprophylactic agent and FVIII for the treatment of hemophilia A patients with intractable inhibitors. This presentation will discuss new strategies that may be useful in the care of patients with vascular injury as well as remaining challenges and limitations of these approaches. PMID:26149015

  10. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  11. Therapeutic perspectives

    PubMed Central

    Fiore, Carmelo E.; Pennisi, Pietra; Tinè, Marianna

    2008-01-01

    Osteoporosis and atherosclerosis are linked by biological association. This encourages the search for therapeutic strategies having both cardiovascular and skeletal beneficial effects. Among drugs that may concordantly enhance bone density and reduce the progression of atherosclerosis we can include bisphosphonates (BP), statins, β-blockers, and possibly anti-RANKL antibodies. Available data come from experimental animals and human studies. All these treatments however lack controlled clinical studies designed to demonstrate dual-action effects. PMID:22460845

  12. Model-based phase-shifting interferometer

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A model-based phase-shifting interferometer (MPI) is developed, in which a novel calculation technique is proposed in place of the traditional complicated system structure, to achieve versatile, high-precision, quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With a set of alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing construction (ROR) method is employed for the retrace error correction of the non-null test, as well as figure error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical surfaces are presented to validate the flexibility and accuracy. The test results are compared with those of a Zygo interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI holds considerable potential for modern optical shop testing.
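
    The MPI's contribution is the model-based retrace-error correction (the ROR step); the conventional phase-shifting front end it builds on is the standard four-step algorithm, sketched below for phase shifts of 0, π/2, π, and 3π/2.

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Standard four-step phase-shifting retrieval (shifts 0, pi/2, pi, 3*pi/2).

    Returns the wrapped test-wavefront phase; the retrace-error correction
    (the ROR step described in the abstract) would be applied on top of this
    and is not shown here.
    """
    return np.arctan2(i3 - i1, i0 - i2)
```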

  13. Therapeutic alliance.

    PubMed

    Fox, Valerie

    2002-01-01

    I have been very fortunate in my journey of mental illness. I respond well to medication, but I don't think that is the complete answer to living successfully with serious, persistent mental illness. I believe a person's environment is also of utmost importance, enabling the person suffering with mental illness to continually grow in life. I found early in my struggle with mental illness a psychiatrist with whom I have always had a very good rapport. Until recently I didn't know that what I have with this psychiatrist is professionally known as a therapeutic alliance. Over the years, when I need someone to talk over anything that is troubling to me, I seek my psychiatrist. A therapeutic alliance is non-judgmental; it is nourishing; and finally it is a relationship of complete trust. Perhaps persons reading this article who have never experienced this alliance will seek it. I believe it can make an insecure person secure; a frightened person less frightened; and allow a person to continue the journey of mental health with a sense of belief in oneself. PMID:12433224

  14. Model-based Utility Functions

    NASA Astrophysics Data System (ADS)

    Hibbard, Bill

    2012-05-01

    Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that if provided with the possibility to modify their utility functions agents will not choose to do so, under some usual assumptions.

  15. Speech Correction in the Schools.

    ERIC Educational Resources Information Center

    Eisenson, Jon; Ogilvie, Mardel

    An introduction to the problems and therapeutic needs of school age children whose speech requires remedial attention, the text is intended for both the classroom teacher and the speech correctionist. General considerations include classification and incidence of speech defects, speech correction services, the teacher as a speaker, the mechanism…

  16. Jitter Correction

    NASA Technical Reports Server (NTRS)

    Waegell, Mordecai J.; Palacios, David M.

    2011-01-01

    Jitter_Correct.m is a MATLAB function that automatically measures and corrects inter-frame jitter in an image sequence to a user-specified precision. In addition, the algorithm dynamically adjusts the image sample size to increase the accuracy of the measurement. The Jitter_Correct.m function takes an image sequence with unknown frame-to-frame jitter and computes the translations of each frame (column and row, in pixels) relative to a chosen reference frame with sub-pixel accuracy. The translations are measured using a cross-correlation Fourier transformation method in which the relative phase of the two transformed images is fit to a plane. The measured translations are then used to correct the inter-frame jitter of the image sequence. The function also dynamically expands the image sample size over which the cross-correlation is measured to increase the accuracy of the measurement. This increases the robustness of the measurement to variable magnitudes of inter-frame jitter.
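
    A whole-pixel version of the cross-correlation measurement described above can be sketched with NumPy FFTs. The sub-pixel step (fitting a plane to the relative phase) and the dynamic sample-size adjustment of Jitter_Correct.m are omitted; the function below is an illustration, not the MATLAB code itself.

```python
import numpy as np

def measure_shift(ref, frame):
    """Whole-pixel estimate of the translation between `frame` and `ref` using
    FFT-based cross correlation. The returned (row, col) offset can be passed to
    e.g. scipy.ndimage.shift to register `frame` onto `ref`.
    """
    cross_power = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
    corr = np.fft.ifft2(cross_power)
    peak = np.array(np.unravel_index(np.argmax(np.abs(corr)), corr.shape), dtype=float)
    shape = np.array(corr.shape, dtype=float)
    peak[peak > shape / 2] -= shape[peak > shape / 2]   # wrap large offsets to negative shifts
    return peak
```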

  17. Hot blast stove process model and model-based controller

    SciTech Connect

    Muske, K.R.; Howse, J.W.; Hansen, G.A.; Cagliostro, D.J.; Chaubal, P.C.

    1998-12-31

    This paper describes the process model and model-based control techniques implemented on the hot blast stoves for the No. 7 Blast Furnace at the Inland Steel facility in East Chicago, Indiana. A detailed heat transfer model of the stoves is developed and verified using plant data. This model is used as part of a predictive control scheme to determine the minimum amount of fuel necessary to achieve the blast air requirements. The model is also used to predict maximum and minimum temperature constraint violations within the stove so that the controller can take corrective actions while still achieving the required stove performance.
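
    The controller's task, finding the minimum fuel that still meets the blast requirement without violating stove temperature limits, can be illustrated with a toy first-order thermal model and a linear program over the firing horizon. All model parameters below are hypothetical stand-ins for the detailed heat-transfer model in the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stand-in for the stove heat-transfer model: first-order discrete-time
# response of dome temperature T to fuel rate u. All numbers are hypothetical.
a, b = 0.95, 20.0                                # model parameters
T0, T_target, T_max = 900.0, 1250.0, 1350.0      # initial, required, and limit temperatures
N, u_max = 12, 6.0                               # horizon length and fuel-rate bound

def influence(k):
    # Coefficients of u[0..N-1] in T[k] = a**k * T0 + sum_j a**(k-1-j) * b * u[j]
    return np.array([a ** (k - 1 - j) * b if j < k else 0.0 for j in range(N)])

# Inequalities: T[k] <= T_max at every step, and T[N] >= T_target at the end of the horizon.
A_ub = [influence(k) for k in range(1, N + 1)] + [-influence(N)]
b_ub = [T_max - a ** k * T0 for k in range(1, N + 1)] + [-(T_target - a ** N * T0)]

res = linprog(c=np.ones(N), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0.0, u_max)] * N)
print("minimum-fuel firing schedule:", np.round(res.x, 2))
```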

  18. Therapeutic Drug Monitoring

    MedlinePlus

    Therapeutic drug monitoring is the measurement ...

  19. Kitaev models based on unitary quantum groupoids

    SciTech Connect

    Chang, Liang

    2014-04-15

    We establish a generalization of Kitaev models based on unitary quantum groupoids. In particular, when inputting a Kitaev-Kong quantum groupoid H_C, we show that the ground state manifold of the generalized model is canonically isomorphic to that of the Levin-Wen model based on a unitary fusion category C. Therefore, the generalized Kitaev models provide realizations of the target space of the Turaev-Viro topological quantum field theory based on C.

  20. A CORRECTION.

    PubMed

    Johnson, D

    1940-03-22

    In a recently published volume on "The Origin of Submarine Canyons" the writer inadvertently credited to A. C. Veatch an excerpt from a submarine chart actually contoured by P. A. Smith, of the U. S. Coast and Geodetic Survey. The chart in question is Chart IVB of Special Paper No. 7 of the Geological Society of America entitled "Atlantic Submarine Valleys of the United States and the Congo Submarine Valley, by A. C. Veatch and P. A. Smith," and the excerpt appears as Plate III of the volume first cited above. In view of the heavy labor involved in contouring the charts accompanying the paper by Veatch and Smith and the beauty of the finished product, it would be unfair to Mr. Smith to permit the error to go uncorrected. Excerpts from two other charts are correctly ascribed to Dr. Veatch. PMID:17839404

  1. Model-based satellite acquisition and tracking

    NASA Technical Reports Server (NTRS)

    Casasent, David; Lee, Andrew J.

    1988-01-01

    A model-based optical processor is introduced for the acquisition and tracking of a satellite in close proximity to an imaging sensor of a space robot. The type of satellite is known in advance, and a model of the satellite (which exists from its design) is used in this task. The model base is used to generate multiple smart filters of the various parts of the satellite, which are used in a symbolic multi-filter optical correlator. The output from the correlator is then treated as a symbolic description of the object, which is operated upon by an optical inference processor to determine the position and orientation of the satellite and to track it as a function of time. The knowledge and model base also serves to generate the rules used by the inference machine. The inference machine allows for feedback to optical correlators or feature extractors to locate the individual parts of the satellite and their orientations.

  2. Model-Based Prognostics of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhury, Indranil; Bregon, Anibal

    2015-01-01

    Model-based prognostics has become a popular approach to solving the prognostics problem. However, almost all work has focused on prognostics of systems with continuous dynamics. In this paper, we extend the model-based prognostics framework to hybrid systems models that combine both continuous and discrete dynamics. In general, most systems are hybrid in nature, including those that combine physical processes with software. We generalize the model-based prognostics formulation to hybrid systems, and describe the challenges involved. We present a general approach for modeling hybrid systems, and overview methods for solving estimation and prediction in hybrid systems. As a case study, we consider the problem of conflict (i.e., loss of separation) prediction in the National Airspace System, in which the aircraft models are hybrid dynamical systems.
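
    The prognostics formulation above requires a system model with both continuous dynamics and discrete modes, propagated forward until a failure threshold is crossed. A minimal Python sketch under that assumption, using a hypothetical leaking-tank model with an OPEN/CLOSED valve mode (all dynamics, rates, and thresholds are illustrative, not from the paper):

        import numpy as np

        DT = 1.0
        MODES = {"OPEN": 2.0, "CLOSED": 0.0}   # inflow rate per discrete mode (illustrative)

        def step(level, leak_area, mode):
            """One Euler step of the continuous dynamics in the current discrete mode."""
            outflow = leak_area * np.sqrt(max(level, 0.0))   # Torricelli-like leak
            level += DT * (MODES[mode] - outflow)
            leak_area += DT * 1e-3                           # slow continuous degradation
            return level, leak_area

        def predict_end_of_life(level, leak_area, mode_schedule, threshold=1.0):
            """First predicted time at which the tank level falls below the threshold."""
            for k, mode in enumerate(mode_schedule):
                level, leak_area = step(level, leak_area, mode)
                if level < threshold:
                    return k * DT
            return None   # no failure predicted within the simulated schedule

        # Alternate the valve every 50 steps and predict the remaining useful life.
        schedule = (["OPEN"] * 50 + ["CLOSED"] * 50) * 20
        print(predict_end_of_life(level=5.0, leak_area=0.1, mode_schedule=schedule))

    The mode schedule plays the role of the discrete-state trajectory that a hybrid estimator would track alongside the continuous state.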

  3. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  4. Multimode model based defect characterization in composites

    NASA Astrophysics Data System (ADS)

    Roberts, R.; Holland, S.; Gregory, E.

    2016-02-01

    A newly-initiated research program for model-based defect characterization in CFRP composites is summarized. The work utilizes computational models of the interaction of NDE probing energy fields (ultrasound and thermography), to determine 1) the measured signal dependence on material and defect properties (forward problem), and 2) an assessment of performance-critical defect properties from analysis of measured NDE signals (inverse problem). Work is reported on model implementation for inspection of CFRP laminates containing delamination and porosity. Forward predictions of measurement response are presented, as well as examples of model-based inversion of measured data for the estimation of defect parameters.

  5. Model-based internal wave processing

    SciTech Connect

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  6. What's Missing in Model-Based Teaching

    ERIC Educational Resources Information Center

    Khan, Samia

    2011-01-01

    In this study, the author investigated how four science teachers employed model-based teaching (MBT) over a 1-year period. The purpose of the research was to develop a baseline of the fundamental and specific dimensions of MBT that are present and absent in science teaching. Teacher interviews, classroom observations, and pre and post-student…

  7. Model-Based Inquiries in Chemistry

    ERIC Educational Resources Information Center

    Khan, Samia

    2007-01-01

    In this paper, instructional strategies for sustaining model-based inquiry in an undergraduate chemistry class were analyzed through data collected from classroom observations, a student survey, and in-depth problem-solving sessions with the instructor and students. Analysis of teacher-student interactions revealed a cyclical pattern in which…

  8. Sandboxes for Model-Based Inquiry

    ERIC Educational Resources Information Center

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-01-01

    In this article, we introduce a class of constructionist learning environments that we call "Emergent Systems Sandboxes" ("ESSs"), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual…

  9. Model-Based Systems Engineering Approach to Managing Mass Margin

    NASA Technical Reports Server (NTRS)

    Chung, Seung H.; Bayer, Todd J.; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Christopher; Lam, Doris

    2012-01-01

    When designing a flight system from concept through implementation, one of the fundamental systems engineering tasks is managing the mass margin and a mass equipment list (MEL) of the flight system. While generating a MEL and computing a mass margin is conceptually a trivial task, maintaining consistent and correct MELs and mass margins can be challenging due to the current practices of maintaining duplicate information in various forms, such as diagrams and tables, and in various media, such as files and emails. We have overcome this challenge through a model-based systems engineering (MBSE) approach within which we allow only a single source of truth. In this paper we describe the modeling patterns used to capture the single source of truth and the views that have been developed for the Europa Habitability Mission (EHM) project, a mission concept study, at the Jet Propulsion Laboratory (JPL).
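
    The mass margin computation itself is simple once the MEL is kept as a single source of truth. A minimal Python sketch of one common convention (the field names, contingency values, and margin-relative-to-allocation convention are illustrative assumptions):

        from dataclasses import dataclass

        # Single source of truth: each subsystem appears exactly once in the MEL.
        @dataclass
        class MelEntry:
            name: str
            cbe_kg: float           # current best estimate of mass
            contingency: float      # growth allowance, e.g. 0.30 = 30 %

        MEL = [
            MelEntry("Avionics", 42.0, 0.15),
            MelEntry("Propulsion", 120.0, 0.25),
            MelEntry("Structure", 210.0, 0.30),
        ]

        def mass_margin(mel, allocation_kg):
            """MEV = sum of CBE * (1 + contingency); margin taken relative to the allocation."""
            mev = sum(e.cbe_kg * (1.0 + e.contingency) for e in mel)
            return mev, (allocation_kg - mev) / allocation_kg

        mev, margin = mass_margin(MEL, allocation_kg=520.0)
        print(f"MEV = {mev:.1f} kg, margin = {margin:.1%}")

    Because every table or diagram view is generated from the same list, the margin cannot silently diverge between documents.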

  10. Prediction model based on decision tree analysis for laccase mediators.

    PubMed

    Medina, Fabiola; Aguila, Sergio; Baratto, Maria Camilla; Martorana, Andrea; Basosi, Riccardo; Alderete, Joel B; Vazquez-Duhalt, Rafael

    2013-01-10

    A Structure Activity Relationship (SAR) study for laccase mediator systems was performed in order to correctly classify different natural phenolic mediators. Decision tree (DT) classification models with a set of five quantum-chemical calculated molecular descriptors were used. These descriptors included redox potential (ɛ°), ionization energy (E(i)), pK(a), enthalpy of formation of radical (Δ(f)H), and OH bond dissociation energy (D(O-H)). The rationale for selecting these descriptors is derived from the laccase-mediator mechanism. To validate the DT predictions, the kinetic constants of different compounds as laccase substrates, their ability for pesticide transformation as laccase-mediators, and radical stability were experimentally determined using Coriolopsis gallica laccase and the pesticide dichlorophen. The prediction capability of the DT model based on three proposed descriptors showed a complete agreement with the obtained experimental results. PMID:23199741
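
    A decision-tree classifier over the five descriptors listed above is straightforward to reproduce in outline. A minimal scikit-learn sketch (the descriptor values and mediator/non-mediator labels below are fabricated placeholders, not data from the study):

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # Columns: redox potential, ionization energy, pKa, radical formation enthalpy,
        # O-H bond dissociation energy.  Values are placeholders, not paper data.
        X = np.array([
            [0.79, 8.1, 10.0, 35.0, 87.0],
            [0.55, 7.6,  9.3, 28.0, 82.0],
            [1.10, 8.9,  7.5, 45.0, 95.0],
            [0.60, 7.8,  9.8, 30.0, 83.0],
            [1.05, 9.0,  7.9, 44.0, 93.0],
            [0.50, 7.4,  9.1, 27.0, 80.0],
        ])
        y = np.array([1, 1, 0, 1, 0, 1])   # 1 = effective mediator, 0 = not

        clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print(clf.predict([[0.70, 7.9, 9.5, 31.0, 84.0]]))   # classify a new candidate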

  11. Reducing Centroid Error Through Model-Based Noise Reduction

    NASA Technical Reports Server (NTRS)

    Lee, Shinhak

    2006-01-01

    A method of processing the digitized output of a charge-coupled device (CCD) image detector has been devised to enable reduction of the error in computed centroid of the image of a point source of light. The method involves model-based estimation of, and correction for, the contributions of bias and noise to the image data. The method could be used to advantage in any of a variety of applications in which there are requirements for measuring precise locations of, and/or precisely aiming optical instruments toward, point light sources. In the present method, prior to normal operations of the CCD, one measures the point-spread function (PSF) of the telescope or other optical system used to project images on the CCD. The PSF is used to construct a database of spot models representing the nominal CCD pixel outputs for a point light source projected onto the CCD at various positions incremented by small fractions of a pixel.
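
    The key step is subtracting the model-based estimates of bias and noise before the centroid is computed. A minimal Python sketch of that step (the uniform bias frame, clipping rule, and toy spot are illustrative assumptions; the full method also matches PSF-based spot models at sub-pixel offsets):

        import numpy as np

        def centroid_with_bias_correction(frame, bias_estimate):
            """Subtract an estimated bias frame, clip residual noise, then compute
            the intensity-weighted centroid of the spot (row, col in pixels)."""
            img = frame.astype(float) - bias_estimate        # model-based bias removal
            img[img < 0.0] = 0.0                             # suppress negative residuals
            total = img.sum()
            if total == 0.0:
                raise ValueError("no signal above the bias estimate")
            rows, cols = np.indices(img.shape)
            return (rows * img).sum() / total, (cols * img).sum() / total

        # Toy 5x5 frame with a uniform bias of 10 counts and a point source near (2, 3).
        spot = np.full((5, 5), 10.0)
        spot[2, 3] += 100.0
        print(centroid_with_bias_correction(spot, bias_estimate=10.0))   # -> (2.0, 3.0)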

  12. A Model-Based System For Force Structure Analysis

    NASA Astrophysics Data System (ADS)

    Levitt, Tod S.; Kirby, Robert L.; Muller, Hans E.

    1985-04-01

    Given a set of image-derived vehicle detections and/or recognized military vehicles, SIGINT cues and a priori analysis of terrain, the force structure analysis (FSA) problem is to utilize knowledge of tactical doctrine and spatial deployment information to infer the existence of military forces such as batteries, companies, battalions, regiments, divisions, etc. A model-based system for FSA has been developed. It performs symbolic reasoning about force structures represented as geometric models. The FSA system is a stand-alone module which has also been developed as part of a larger system, the Advanced Digital Radar Image Exploitation System (ADRIES) for automated SAR image exploitation. The models recursively encode the component military units of a force structure, their expected spatial deployment, search priorities for model components, prior match probabilities, and type hierarchies for uncertain recognition. Partial and uncertain matching of models against data is the basic tool for building up hypotheses of the existence of force structures. Hypothesis management includes the functions of matching models against data, predicting the existence and location of unobserved force components, localization of search areas and resolution of conflicts between competing hypotheses. A subjective Bayesian inference calculus is used to accrue certainty of force structure hypotheses and resolve conflicts. Reasoning from uncertain vehicle level data, the system has successfully inferred the correct locations and components of force structures up to the battalion level. Key words: Force structure analysis, SAR, model-based reasoning, hypothesis management, search, matching, conflict resolution, Bayesian inference, uncertainty.

  13. Model-based Processing of Microcantilever Sensor Arrays

    SciTech Connect

    Tringe, J W; Clague, D S; Candy, J V; Sinensky, A K; Lee, C L; Rudd, R E; Burnham, A K

    2005-04-27

    We have developed a model-based processor (MBP) for a microcantilever-array sensor to detect target species in solution. We perform a proof-of-concept experiment, fit model parameters to the measured data and use them to develop a Gauss-Markov simulation. We then investigate two cases of interest: averaged deflection data and multi-channel data. For this evaluation we extract model parameters via a model-based estimation, perform a Gauss-Markov simulation, design the optimal MBP and apply it to measured experimental data. The performance of the MBP in the multi-channel case is evaluated by comparison to a "smoother" (averager) typically used for microcantilever signal analysis. It is shown that the MBP not only provides a significant gain (~80 dB) in signal-to-noise ratio (SNR), but also consistently outperforms the smoother by 40-60 dB. Finally, we apply the processor to the smoothed experimental data and demonstrate its capability for chemical detection. The MBP performs quite well, apart from a correctable systematic bias error.

  14. Therapeutic drug levels

    MedlinePlus

    ... medlineplus.gov/ency/article/003430.htm Therapeutic drug levels are lab tests to look for the presence ...

  15. Systems Engineering Interfaces: A Model Based Approach

    NASA Technical Reports Server (NTRS)

    Fosse, Elyse; Delp, Christopher

    2013-01-01

    Currently: Ops Rev has developed and maintains a framework that includes interface-specific language, patterns, and Viewpoints. Ops Rev implements the framework to design MOS 2.0 and its 5 Mission Services. Implementation de-couples interfaces and instances of interaction. Future: A Mission MOSE implements the approach and uses the model based artifacts for reviews. The framework extends further into the ground data layers and provides a unified methodology.

  16. Efficient Model-Based Diagnosis Engine

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Vatan, Farrokh; Barrett, Anthony; James, Mark; Mackey, Ryan; Williams, Colin

    2009-01-01

    An efficient diagnosis engine - a combination of mathematical models and algorithms - has been developed for identifying faulty components in a possibly complex engineering system. This model-based diagnosis engine embodies a twofold approach to reducing, relative to prior model-based diagnosis engines, the amount of computation needed to perform a thorough, accurate diagnosis. The first part of the approach involves a reconstruction of the general diagnostic engine to reduce the complexity of the mathematical-model calculations and of the software needed to perform them. The second part of the approach involves algorithms for computing a minimal diagnosis (the term "minimal diagnosis" is defined below). A somewhat lengthy background discussion is prerequisite to a meaningful summary of the innovative aspects of the present efficient model-based diagnosis engine. In model-based diagnosis, the function of each component and the relationships among all the components of the engineering system to be diagnosed are represented as a logical system denoted the system description (SD). Hence, the expected normal behavior of the engineering system is the set of logical consequences of the SD. Faulty components lead to inconsistencies between the observed behaviors of the system and the SD (see figure). Diagnosis - the task of finding faulty components - is reduced to finding those components, the abnormalities of which could explain all the inconsistencies. The solution of the diagnosis problem should be a minimal diagnosis, which is a minimal set of faulty components. A minimal diagnosis stands in contradistinction to the trivial solution, in which all components are deemed to be faulty, and which, therefore, always explains all inconsistencies.
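
    A minimal diagnosis is equivalently a minimal hitting set of the conflict sets derived from the system description. A brute-force Python sketch of that search (the component names and conflict sets are illustrative; a real engine derives conflicts from the SD and the observations):

        from itertools import combinations

        def minimal_diagnoses(components, conflicts):
            """Smallest sets of components intersecting every conflict set
            (i.e. minimal hitting sets, which are the minimal diagnoses)."""
            for size in range(1, len(components) + 1):
                hits = [set(c) for c in combinations(components, size)
                        if all(set(c) & conflict for conflict in conflicts)]
                if hits:
                    return hits
            return []

        # Illustrative conflicts: each set of components cannot all be healthy at once.
        components = ["adder1", "adder2", "mult1", "mult2"]
        conflicts = [{"adder1", "mult1"}, {"adder1", "mult2"}]
        print(minimal_diagnoses(components, conflicts))   # -> [{'adder1'}]

    The brute-force enumeration is only for illustration; the point of the engine described above is to avoid exactly this exhaustive search.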

  17. Enzyme therapeutics for systemic detoxification.

    PubMed

    Liu, Yang; Li, Jie; Lu, Yunfeng

    2015-08-01

    Life relies on numerous biochemical processes working synergistically and correctly. Certain substances disrupt these processes, inducing a living organism into an abnormal state termed intoxication. Managing intoxication usually requires intervention, which is referred to as detoxification. Decades of development in detoxification reveal the potential of enzymes as ideal therapeutics and antidotes, because their high substrate specificity and catalytic efficiency are essential for clearing intoxicating substances without adverse effects. However, intrinsic shortcomings of enzymes, including low stability and high immunogenicity, are major hurdles, which could be overcome by delivering enzymes with specially designed nanocarriers. Extensive investigations on protein delivery indicate three types of enzyme-nanocarrier architectures that show more promise than others for systemic detoxification: liposome-wrapped enzymes, polymer-enzyme conjugates, and polymer-encapsulated enzymes. This review highlights recent advances in these nano-architectures and discusses their applications in systemic detoxification. The therapeutic potential of various enzymes as well as associated challenges in achieving effective delivery of therapeutic enzymes will also be discussed. PMID:25980935

  18. Radiometric terrain correction of SPOT5 image

    NASA Astrophysics Data System (ADS)

    Feng, Xiuli; Zhang, Feng; Wang, Ke

    2007-06-01

    A terrain correction model based on the rationale of moment matching is more effective at reducing the shade effect than the traditional C correction approach, especially in mountainous areas with complex relief and extensive shading. In other words, the traditional C correction approach gives better results in plain areas with little shading. Besides, the accuracy of the DEM data and the registration accuracy between the image and the DEM data also influence the final correction accuracy. In order to achieve a higher-quality radiometric terrain correction, high spatial resolution DEM data are preferred.
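
    Moment matching rescales shaded pixels so that their mean and standard deviation match those of sunlit pixels. A minimal Python sketch of that correction, assuming a shade mask already derived from the DEM and solar geometry (the toy band values are illustrative):

        import numpy as np

        def moment_matching_correction(band, shade_mask):
            """Rescale shaded pixels so their first two moments match the sunlit pixels."""
            sun = band[~shade_mask]
            shade = band[shade_mask]
            gain = sun.std() / shade.std()
            offset = sun.mean() - gain * shade.mean()
            corrected = band.astype(float).copy()
            corrected[shade_mask] = gain * shade + offset
            return corrected

        # Toy band: the third column lies in terrain shadow and is darker and flatter.
        band = np.array([[80, 82, 30], [78, 81, 28], [79, 83, 31]], dtype=float)
        shade_mask = np.zeros_like(band, dtype=bool)
        shade_mask[:, 2] = True
        print(moment_matching_correction(band, shade_mask))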

  19. Model-based pulse shape correction for CdTe detectors

    NASA Astrophysics Data System (ADS)

    Bargholtz, Chr.; Fumero, E.; Mårtensson, L.

    1999-02-01

    We present a systematic method to improve the energy resolution of CdTe-detector systems with full control of the efficiency. Sampled pulses and multiple amplifier data are fitted by a model of the pulse shape including the deposited energy and the interaction point within the detector as parameters. We show the decisive improvements in spectral resolution and photo-peak efficiency that are obtained without distortion of the spectral shape. The information concerning the interaction depth of individual events can be used to discriminate between beta particles and gamma quanta.
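
    The essential step is a per-event fit of the sampled pulse with a shape model parameterized by deposited energy and interaction depth. A minimal SciPy sketch of such a fit (the single-exponential pulse shape and its depth dependence are illustrative assumptions, not the authors' detector model):

        import numpy as np
        from scipy.optimize import curve_fit

        def pulse_model(t, energy, depth):
            """Illustrative pulse shape: amplitude scales with energy, rise time with depth."""
            rise = 0.1 + 0.9 * depth                 # deeper interactions rise more slowly
            return energy * (1.0 - np.exp(-t / rise)) * np.exp(-t / 10.0)

        t = np.linspace(0.0, 10.0, 200)
        true = pulse_model(t, energy=122.0, depth=0.4)
        noisy = true + np.random.default_rng(0).normal(0.0, 1.0, t.size)

        # Fit deposited energy and interaction depth from the sampled pulse.
        popt, _ = curve_fit(pulse_model, t, noisy, p0=[100.0, 0.5],
                            bounds=([0.0, 0.0], [1e4, 1.0]))
        print(f"energy ~ {popt[0]:.1f}, depth ~ {popt[1]:.2f}")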

  20. Model-based multiple patterning layout decomposition

    NASA Astrophysics Data System (ADS)

    Guo, Daifeng; Tian, Haitong; Du, Yuelin; Wong, Martin D. F.

    2015-10-01

    As one of the most promising next generation lithography technologies, multiple patterning lithography (MPL) plays an important role in the attempts to keep pace with the 10 nm technology node and beyond. As feature sizes keep shrinking, it has become impossible to print dense layouts within one single exposure. As a result, MPL such as double patterning lithography (DPL) and triple patterning lithography (TPL) has been widely adopted. There is a large volume of literature on DPL/TPL layout decomposition, and the current approach is to formulate the problem as a classical graph-coloring problem: layout features (polygons) are represented by vertices in a graph G and there is an edge between two vertices if and only if the distance between the two corresponding features is less than a minimum distance threshold value dmin. The problem is to color the vertices of G using k colors (k = 2 for DPL, k = 3 for TPL) such that no two vertices connected by an edge are given the same color. This is a rule-based approach, which imposes a geometric distance as a minimum constraint and simply decomposes polygons within that distance into different masks. It is not desirable in practice because this criterion cannot completely capture the behavior of the optics. For example, it lacks sufficient information such as the optical source characteristics and the effects between polygons outside the minimum distance. To remedy the deficiency, a model-based layout decomposition approach that bases the decomposition criteria on simulation results was first introduced at SPIE 2013.1 However, the algorithm1 is based on simplified assumptions about the optical simulation model and therefore its usage on real layouts is limited. Recently AMSL2 also proposed a model-based approach to layout decomposition by iteratively simulating the layout, which requires excessive computational resources and may lead to sub-optimal solutions. The approach2 also potentially generates too many stitches. In this
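
    The rule-based baseline that the paper contrasts with is the graph-coloring formulation. A minimal Python sketch of that baseline with a greedy k-coloring (the coordinates, spacing rule, and greedy ordering are illustrative; the model-based criterion replaces the distance rule with lithography simulation results):

        import itertools
        import math

        def build_conflict_graph(polygons, d_min):
            """Edge between two features whose (centroid) distance is below d_min."""
            edges = set()
            for (i, p), (j, q) in itertools.combinations(enumerate(polygons), 2):
                if math.dist(p, q) < d_min:
                    edges.add((i, j))
            return edges

        def greedy_k_coloring(n, edges, k):
            """Assign each feature one of k masks; None if a feature cannot be colored."""
            adj = {i: set() for i in range(n)}
            for i, j in edges:
                adj[i].add(j)
                adj[j].add(i)
            colors = {}
            for v in range(n):                 # simple greedy order; real tools do better
                used = {colors[u] for u in adj[v] if u in colors}
                free = [c for c in range(k) if c not in used]
                if not free:
                    return None                # conflict -> needs a stitch or re-layout
                colors[v] = free[0]
            return colors

        features = [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0), (0.5, 0.4)]   # centroids, arbitrary units
        edges = build_conflict_graph(features, d_min=0.6)
        print(greedy_k_coloring(len(features), edges, k=3))            # triple patterning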

  1. MODEL-BASED IMAGE RECONSTRUCTION FOR MRI

    PubMed Central

    Fessler, Jeffrey A.

    2010-01-01

    Magnetic resonance imaging (MRI) is a sophisticated and versatile medical imaging modality. Traditionally, MR images are reconstructed from the raw measurements by a simple inverse 2D or 3D fast Fourier transform (FFT). However, there are a growing number of MRI applications where a simple inverse FFT is inadequate, e.g., due to non-Cartesian sampling patterns, non-Fourier physical effects, nonlinear magnetic fields, or deliberate under-sampling to reduce scan times. Such considerations have led to increasing interest in methods for model-based image reconstruction in MRI. PMID:21135916
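
    In outline, model-based reconstruction replaces the inverse FFT with minimization of a data-fit (plus regularization) cost under a forward model of the acquisition. A minimal Python sketch for a randomly undersampled Cartesian FFT forward model with plain gradient descent (the sampling mask, step size, and regularization weight are illustrative assumptions, not a specific algorithm from the review):

        import numpy as np

        def forward(x, mask):
            """Forward model A: 2D DFT of the image, observed where mask is True."""
            return np.fft.fft2(x) * mask

        def adjoint(y, mask):
            """Adjoint A^H: zero-fill unsampled locations, then scaled inverse DFT."""
            return np.fft.ifft2(y * mask) * mask.size   # factor makes this the true adjoint

        def reconstruct(kspace, mask, lam=0.01, step=1e-3, iters=200):
            """Minimize ||A x - y||^2 + lam ||x||^2 by gradient descent."""
            x = np.zeros(mask.shape, dtype=complex)
            for _ in range(iters):
                grad = adjoint(forward(x, mask) - kspace, mask) + lam * x
                x = x - step * grad
            return x

        rng = np.random.default_rng(0)
        truth = np.zeros((32, 32))
        truth[12:20, 12:20] = 1.0                      # simple square phantom
        mask = rng.random((32, 32)) < 0.4              # 40 % random k-space sampling
        recon = reconstruct(forward(truth, mask), mask)
        print(float(np.abs(recon).max()))

    Non-Cartesian trajectories, non-Fourier physics, or field inhomogeneity enter by changing the forward model, not the optimization loop.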

  2. Model-based Tomographic Reconstruction Literature Search

    SciTech Connect

    Chambers, D H; Lehman, S K

    2005-11-30

    In the process of preparing a proposal for internal research funding, a literature search was conducted on the subject of model-based tomographic reconstruction (MBTR). The purpose of the search was to ensure that the proposed research would not replicate any previous work. We found that the overwhelming majority of work on MBTR which used parameterized models of the object was theoretical in nature. Only three researchers had applied the technique to actual data. In this note, we summarize the findings of the literature search.

  3. Model-based vision using geometric hashing

    NASA Astrophysics Data System (ADS)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

    The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed upon the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm -- invariant under translation, scale, and 3D rotations of the target -- hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture, and thus is potentially implementable in real time.

  4. Vector space model based on semantic relatedness

    NASA Astrophysics Data System (ADS)

    Bondarchuk, Dmitry; Timofeeva, Galina

    2015-11-01

    Most data-mining methods are based on the vector space model of knowledge representation. The vector space model uses the frequency of a term in order to determine its relevance in a document. Terms can be semantically similar but lexicographically different, so classification based on term frequency does not give the desired results in some subject areas, such as vacancy selection. A modified vector space model based on semantic relatedness is suggested for data mining in this area. Evaluation results show that the proposed algorithm is better than the one based on the standard vector space model.
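
    The modification can be pictured as weighting plain term-frequency vectors by a term-term semantic relatedness matrix before taking cosine similarity. A minimal Python sketch of that comparison (the vocabulary, relatedness values, and weighting scheme are illustrative assumptions, not the authors' formulation):

        import numpy as np

        vocab = ["java", "programming", "developer", "barista"]

        def tf_vector(text):
            """Plain term-frequency vector over the fixed vocabulary."""
            words = text.lower().split()
            return np.array([words.count(t) for t in vocab], dtype=float)

        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        # Hypothetical term-term semantic relatedness (1.0 on the diagonal).
        S = np.array([
            [1.0, 0.7, 0.6, 0.1],
            [0.7, 1.0, 0.8, 0.0],
            [0.6, 0.8, 1.0, 0.0],
            [0.1, 0.0, 0.0, 1.0],
        ])

        vacancy = tf_vector("java developer")
        cv = tf_vector("programming programming java")

        print(cosine(vacancy, cv))          # standard vector space model
        print(cosine(S @ vacancy, S @ cv))  # relatedness-weighted variant scores higher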

  5. Student Modeling Based on Problem Solving Times

    ERIC Educational Resources Information Center

    Pelánek, Radek; Jarušek, Petr

    2015-01-01

    Student modeling in intelligent tutoring systems is mostly concerned with modeling correctness of students' answers. As interactive problem solving activities become increasingly common in educational systems, it is useful to focus also on timing information associated with problem solving. We argue that the focus on timing is natural for certain…

  6. Sandboxes for Model-Based Inquiry

    NASA Astrophysics Data System (ADS)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-04-01

    In this article, we introduce a class of constructionist learning environments that we call Emergent Systems Sandboxes (ESSs), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual construction environment that supports students in creating, exploring, and sharing computational models of dynamic systems that exhibit emergent phenomena. They provide learners with "entity"-level construction primitives that reflect an underlying scientific model. These primitives can be directly "painted" into a sandbox space, where they can then be combined, arranged, and manipulated to construct complex systems and explore the emergent properties of those systems. We argue that ESSs offer a means of addressing some of the key barriers to adopting rich, constructionist model-based inquiry approaches in science classrooms at scale. Situating the ESS in a large-scale science modeling curriculum we are implementing across the USA, we describe how the unique "entity-level" primitive design of an ESS facilitates knowledge system refinement at both an individual and social level, we describe how it supports flexible modeling practices by providing both continuous and discrete modes of executability, and we illustrate how it offers students a variety of opportunities for validating their qualitative understandings of emergent systems as they develop.

  7. Model based 3D segmentation and OCT image undistortion of percutaneous implants.

    PubMed

    Müller, Oliver; Donner, Sabine; Klinder, Tobias; Dragon, Ralf; Bartsch, Ivonne; Witte, Frank; Krüger, Alexander; Heisterkamp, Alexander; Rosenhahn, Bodo

    2011-01-01

    Optical Coherence Tomography (OCT) is a noninvasive imaging technique which is used here for in vivo biocompatibility studies of percutaneous implants. A prerequisite for a morphometric analysis of the OCT images is the correction of optical distortions caused by the index of refraction in the tissue. We propose a fully automatic approach for 3D segmentation of percutaneous implants using Markov random fields. Refraction correction is done by using the subcutaneous implant base as a prior for model based estimation of the refractive index using a generalized Hough transform. Experiments show the competitiveness of our algorithm towards manual segmentations done by experts. PMID:22003731

  8. Model-based Processing of Micro-cantilever Sensor Arrays

    SciTech Connect

    Tringe, J W; Clague, D S; Candy, J V; Lee, C L; Rudd, R E; Burnham, A K

    2004-11-17

    We develop a model-based processor (MBP) for a micro-cantilever array sensor to detect target species in solution. After discussing the generalized framework for this problem, we develop the specific model used in this study. We perform a proof-of-concept experiment, fit the model parameters to the measured data and use them to develop a Gauss-Markov simulation. We then investigate two cases of interest: (1) averaged deflection data, and (2) multi-channel data. In both cases the evaluation proceeds by first performing a model-based parameter estimation to extract the model parameters, next performing a Gauss-Markov simulation, designing the optimal MBP and finally applying it to measured experimental data. The simulation is used to evaluate the performance of the MBP in the multi-channel case and compare it to a "smoother" ("averager") typically used in this application. It was shown that the MBP not only provides a significant gain (~80 dB) in signal-to-noise ratio (SNR), but also consistently outperforms the smoother by 40-60 dB. Finally, we apply the processor to the smoothed experimental data and demonstrate its capability for chemical detection. The MBP performs quite well, though it includes a correctable systematic bias error. The project's primary accomplishment was the successful application of model-based processing to signals from micro-cantilever arrays: 40-60 dB improvement vs. the smoother algorithm was demonstrated. This result was achieved through the development of appropriate mathematical descriptions for the chemical and mechanical phenomena, and incorporation of these descriptions directly into the model-based signal processor. A significant challenge was the development of the framework which would maximize the usefulness of the signal processing algorithms while ensuring the accuracy of the mathematical description of the chemical-mechanical signal. Experimentally, the difficulty was to identify and characterize the non

  9. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  10. Therapeutic Recreation Practicum Manual.

    ERIC Educational Resources Information Center

    Schneegas, Kay

    This manual provides information on the practicum program offered by Moraine Valley Community College (MVCC) for students in its therapeutic recreation program. Sections I and II outline the rationale and goals for providing practical, on-the-job work experiences for therapeutic recreation students. Section III specifies MVCC's responsibilities…

  11. Chicanoizing the Therapeutic Community

    ERIC Educational Resources Information Center

    Aron, William S.; And Others

    1974-01-01

    Focusing on the drug addiction problem and its antecedent conditions in a Chicano population, the article examines several therapeutic interventions suggested by these conditions and indicates how they might be incorporated into a drug addiction Therapeutic Community treatment program designed to meet the needs of Chicano drug addicts. (Author/NQ)

  12. Rx for Pedagogical Correctness: Professional Correctness.

    ERIC Educational Resources Information Center

    Lasley, Thomas J.

    1993-01-01

    Describes the difficulties caused by educators holding to a view of teaching that assumes that there is one "pedagogically correct" way of running a classroom. Provides three examples of harmful pedagogical correctness ("untracked" classes, cooperative learning, and testing and test-wiseness). Argues that such dogmatic views of education limit…

  13. 77 FR 72199 - Technical Corrections; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    ...) is correcting a final rule that was published in the Federal Register on July 6, 2012 (77 FR 39899... . SUPPLEMENTARY INFORMATION: On July 6, 2012 (77 FR 39899), the NRC published a final rule in the Federal Register... typographical and spelling errors, and making other edits and conforming changes. This correcting amendment...

  14. Model-based vision for space applications

    NASA Technical Reports Server (NTRS)

    Chaconas, Karen; Nashman, Marilyn; Lumia, Ronald

    1992-01-01

    This paper describes a method for tracking moving image features by combining spatial and temporal edge information with model based feature information. The algorithm updates the two-dimensional position of object features by correlating predicted model features with current image data. The results of the correlation process are used to compute an updated model. The algorithm makes use of a high temporal sampling rate with respect to spatial changes of the image features and operates in a real-time multiprocessing environment. Preliminary results demonstrate successful tracking for image feature velocities between 1.1 and 4.5 pixels every image frame. This work has applications for docking, assembly, retrieval of floating objects and a host of other space-related tasks.

  15. Model Based Reconstruction of UT Array Data

    NASA Astrophysics Data System (ADS)

    Calmon, P.; Iakovleva, E.; Fidahoussen, A.; Ribay, G.; Chatillon, S.

    2008-02-01

    Beyond the detection of defects, their characterization (identification, positioning, sizing) is one goal of great importance often assigned to the analysis of NDT data. The first step of such analysis in the case of ultrasonic testing amounts to imaging the detected echoes within the part. This operation is in general achieved by considering times of flight and by applying simplified algorithms which are often valid only in canonical situations. In this communication we present an overview of different imaging techniques studied at CEA LIST and based on the exploitation of direct models, which make it possible to address complex configurations and are available in the CIVA software platform. We discuss in particular ray-model based algorithms, algorithms derived from classical synthetic focusing, and processing of the full inter-element matrix (MUSIC algorithm).
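
    The classical synthetic-focusing end of that spectrum is delay-and-sum over the full inter-element matrix: each image point is scored by summing every transmit-receive trace at the round-trip time a ray model predicts. A minimal Python sketch of that algorithm (the array geometry, wave speed, sampling rate, and placeholder data are illustrative assumptions):

        import numpy as np

        C = 5900.0            # assumed longitudinal wave speed in steel [m/s]
        FS = 50e6             # sampling frequency [Hz]
        elements_x = np.linspace(-0.008, 0.008, 16)   # 16-element array along x [m]

        def total_focusing(fmc, grid_x, grid_z):
            """Delay-and-sum the full matrix capture fmc[tx, rx, t] onto an image grid."""
            image = np.zeros((len(grid_z), len(grid_x)))
            for iz, z in enumerate(grid_z):
                for ix, x in enumerate(grid_x):
                    # Ray-model travel times element -> image point -> element.
                    t_to = np.sqrt((elements_x - x) ** 2 + z ** 2) / C
                    for tx in range(len(elements_x)):
                        samples = np.rint((t_to[tx] + t_to) * FS).astype(int)
                        valid = samples < fmc.shape[2]
                        image[iz, ix] += fmc[tx, np.arange(len(elements_x))[valid],
                                             samples[valid]].sum()
            return image

        # Placeholder full-matrix capture (noise only); real data would contain echoes.
        fmc = np.random.default_rng(1).normal(0.0, 0.01, (16, 16, 2048))
        img = total_focusing(fmc, grid_x=np.linspace(-0.005, 0.005, 21),
                             grid_z=np.linspace(0.005, 0.02, 31))
        print(img.shape)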

  16. Model-based reasoning in SSF ECLSS

    NASA Technical Reports Server (NTRS)

    Miller, J. K.; Williams, George P. W., Jr.

    1992-01-01

    The interacting processes and reconfigurable subsystems of the Space Station Freedom Environmental Control and Life Support System (ECLSS) present a tremendous technical challenge to Freedom's crew and ground support. ECLSS operation and problem analysis is time-consuming for crew members and difficult for current computerized control, monitoring, and diagnostic software. These challenges can be at least partially mitigated by the use of advanced techniques such as Model-Based Reasoning (MBR). This paper will provide an overview of MBR as it is being applied to Space Station Freedom ECLSS. It will report on work being done to produce intelligent systems to help design, control, monitor, and diagnose Freedom's ECLSS. Specifically, work on predictive monitoring, diagnosability, and diagnosis, with emphasis on the automated diagnosis of the regenerative water recovery and air revitalization processes will be discussed.

  17. Concept Modeling-based Drug Repositioning

    PubMed Central

    Patchala, Jagadeesh; Jegga, Anil G

    2015-01-01

    Our hypothesis is that drugs and diseases sharing similar biomedical and genomic concepts are likely to be related, and thus repositioning opportunities can be identified by ranking drugs based on the incidence of shared similar concepts with diseases and vice versa. To test this, we constructed a probabilistic topic model based on the Unified Medical Language System (UMLS) concepts that appear in the disease and drug related abstracts in MEDLINE. The resulting probabilistic topic associations were used to measure the similarity between disease and drugs. The success of the proposed model is evaluated using a set of repositioned drugs, and comparing a drug’s ranking based on its similarity to the original and new indication. We then applied the model to rare disorders and compared them to all approved drugs to facilitate “systematically serendipitous” discovery of relationships between rare diseases and existing drugs, some of which could be potential repositioning candidates. PMID:26306277

  18. Model-based reconfiguration: Diagnosis and recovery

    NASA Technical Reports Server (NTRS)

    Crow, Judy; Rushby, John

    1994-01-01

    We extend Reiter's general theory of model-based diagnosis to a theory of fault detection, identification, and reconfiguration (FDIR). The generality of Reiter's theory readily supports an extension in which the problem of reconfiguration is viewed as a close analog of the problem of diagnosis. Using a reconfiguration predicate 'rcfg' analogous to the abnormality predicate 'ab,' we derive a strategy for reconfiguration by transforming the corresponding strategy for diagnosis. There are two obvious benefits of this approach: algorithms for diagnosis can be exploited as algorithms for reconfiguration and we have a theoretical framework for an integrated approach to FDIR. As a first step toward realizing these benefits we show that a class of diagnosis engines can be used for reconfiguration and we discuss algorithms for integrated FDIR. We argue that integrating recovery and diagnosis is an essential next step if this technology is to be useful for practical applications.

  19. Model-Based Vision Using Relational Summaries

    NASA Astrophysics Data System (ADS)

    Lu, Haiyuan; Shapiro, Linda G.

    1989-03-01

    A CAD-to-vision system is a computer system that inputs a CAD model of an object and outputs a vision model and matching procedure by which that object can be recognized and/or its position and orientation determined. CAD-model-based systems are extremely useful for industrial vision tasks where a number of different manufactured parts must be automatically manipulated and/or inspected. Another area where vision systems based on CAD models are becoming important is the United States space program. Since the space station and space vehicles are recent or even current designs, we can expect to have CAD models of these objects to work with. Vision tasks in space such as docking and tracking of vehicles, guided assembly tasks, and inspection of the space station itself for cracks and other problems can rely on model-directed vision techniques.

  20. Model-based ocean acoustic passive localization

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1994-01-24

    The detection, localization and classification of acoustic sources (targets) in a hostile ocean environment is a difficult problem -- especially in light of the improved design of modern submarines and the continual improvement in quieting technology. Further, the advent of more and more diesel-powered vessels makes the detection problem even more formidable than ever before. It has recently been recognized that the incorporation of a mathematical model that accurately represents the phenomenology under investigation can vastly improve the performance of any processor, assuming, of course, that the model is accurate. Therefore, it is necessary to incorporate more knowledge about the ocean environment into detection and localization algorithms in order to enhance the overall signal-to-noise ratios and improve performance. An alternative methodology to matched-field/matched-mode processing is the so-called model-based processor, which is based on a state-space representation of the normal-mode propagation model. If state-space solutions can be accomplished, then many of the current ocean acoustic processing problems can be analyzed and solved using this framework to analyze performance results based on firm statistical and system theoretic grounds. The model-based approach is (simply) "incorporating mathematical models of both physical phenomenology and the measurement processes including noise into the processor to extract the desired information." In this application, we seek techniques to incorporate the: (1) ocean acoustic propagation model; (2) sensor array measurement model; and (3) noise models (ambient, shipping, surface and measurement) into a processor to solve the associated localization/detection problems.

  1. Fast Algorithms for Model-Based Diagnosis

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Barrett, Anthony; Vatan, Farrokh; Mackey, Ryan

    2005-01-01

    Two improved methods for automated diagnosis of complex engineering systems involve the use of novel algorithms that are more efficient than prior algorithms used for the same purpose. Both the recently developed algorithms and the prior algorithms in question are instances of model-based diagnosis, which is based on exploring the logical inconsistency between an observation and a description of a system to be diagnosed. As engineering systems grow more complex and increasingly autonomous in their functions, the need for automated diagnosis increases concomitantly. In model-based diagnosis, the function of each component and the interconnections among all the components of the system to be diagnosed (for example, see figure) are represented as a logical system, called the system description (SD). Hence, the expected behavior of the system is the set of logical consequences of the SD. Faulty components lead to inconsistency between the observed behaviors of the system and the SD. The task of finding the faulty components (diagnosis) reduces to finding the components, the abnormalities of which could explain all the inconsistencies. Of course, the meaningful solution should be a minimal set of faulty components (called a minimal diagnosis), because the trivial solution, in which all components are assumed to be faulty, always explains all inconsistencies. Although the prior algorithms in question implement powerful methods of diagnosis, they are not practical because they essentially require exhaustive searches among all possible combinations of faulty components and therefore entail amounts of computation that grow exponentially with the number of components of the system.

  2. Cytokines and therapeutic oligonucleotides.

    PubMed

    Hartmann, G; Bidlingmaier, M; Eigler, A; Hacker, U; Endres, S

    1997-12-01

    Therapeutic oligonucleotides - short strands of synthetic nucleic acids - encompass antisense and aptamer oligonucleotides. Antisense oligonucleotides are designed to bind to target RNA by complementary base pairing and to inhibit translation of the target protein. Antisense oligonucleotides enable specific inhibition of cytokine synthesis. In contrast, aptamer oligonucleotides are able to bind directly to specific proteins. This binding depends on the sequence of the oligonucleotide. Aptamer oligonucleotides with CpG motifs can exert strong immunostimulatory effects. Both kinds of therapeutic oligonucleotides - antisense and aptamer oligonucleotides - provide promising tools to modulate immunological functions. Recently, therapeutic oligonucleotides have moved towards clinical application. An antisense oligonucleotide directed against the proinflammatory intercellular adhesion molecule 1 (ICAM-1) is currently being tested in clinical trials for therapy of inflammatory disease. Immunostimulatory aptamer oligonucleotides are in preclinical development for immunotherapy. In the present review we summarize the application of therapeutic oligonucleotides to modulate immunological functions. We include technological aspects as well as current therapeutic concepts and clinical studies. PMID:9740353

  3. Reporting therapeutic discourse in a therapeutic community.

    PubMed

    Chapman, G E

    1988-03-01

    Research in nurses' communications has concentrated on nurse to patient interactions. Those few studies which focus on nurse to nurse communications seem to be generated by a pragmatic and normative concern with effective information sharing. In this paper, which describes one aspect of a larger case study of a hospital-based therapeutic community, the description and analysis of nurses' reports flows not from a normative model of professional practice, but rather an exploration of how professional practice is articulated as discourse in nurses' written accounts. Foucault's ideas about therapeutic discourse inform the theoretical framework of the research. Ethnomethodological concerns with the importance of documentary analysis provide the methodological rationale for examining nurses' 24-hour report documents, as official discourse, reflecting therapeutic practice in this setting. A content analysis of nurses' reports, collected over a period of 4 months, demonstrated the importance of domesticity and ordinary everyday activities in nurses' accounts of hospital life. Disruption to the 'life as usual' domesticity in the community seemed to be associated with admission to and discharge from the hospital when interpersonal and interactional changes between patients occur. It is suggested that nurses in general hospital wards and more orthodox psychiatric settings might usefully consider the impact of admissions and discharges on the group of patients they manage, and make this a discursive focus of their work. PMID:3372900

  4. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.

  5. An application of model-based reasoning to accounting systems

    SciTech Connect

    Nado, R.; Chams, M.; Delisio, J.; Hamscher, W.

    1996-12-31

    An important problem faced by auditors is gauging how much reliance can be placed on the accounting systems that process millions of transactions to produce the numbers summarized in a company's financial statements. Accounting systems contain internal controls, procedures designed to detect and correct errors and irregularities that may occur in the processing of transactions. In a complex accounting system, it can be an extremely difficult task for the auditor to anticipate the possible errors that can occur and to evaluate the effectiveness of the controls at detecting them. An accurate analysis must take into account the unique features of each company's business processes. To cope with this complexity and variability, the Comet system applies a model-based reasoning approach to the analysis of accounting systems and their controls. An auditor uses Comet to create a hierarchical flowchart model that describes the intended processing of business transactions by an accounting system and the operation of its controls. Comet uses the constructed model to automatically analyze the effectiveness of the controls in detecting potential errors. Price Waterhouse auditors have used Comet on a variety of real audits in several countries around the world.

  6. [Therapeutic neuromodulation in primary headaches].

    PubMed

    May, A; Jürgens, T P

    2011-06-01

    Neuromodulatory techniques have developed rapidly in the therapeutic management of refractory headaches. Invasive procedures comprise peripheral nerve stimulation (particularly occipital nerve stimulation), vagus nerve stimulation, cervical spinal cord stimulation and hypothalamic deep brain stimulation. Transcutaneous electrical nerve stimulation, repetitive transcranial magnetic stimulation and transcranial direct current stimulation are noninvasive variants. Based on current neuroimaging, neurophysiological and clinical studies occipital nerve stimulation and hypothalamic deep brain stimulation are recommended for patients with chronic cluster headache. Less convincing evidence can be found for their use in other refractory headaches such as chronic migraine. No clear recommendation can be given for the other neuromodulatory techniques. The emerging concept of intermittent stimulation of the sphenopalatine ganglion is nonetheless promising. Robust randomized and sham-controlled multicenter studies are needed before these therapeutic approaches are widely implemented. Due to the experimental nature all patients should be treated in clinical studies. It is essential to confirm the correct headache diagnosis and the refractory nature before an invasive approach is considered. Patients should generally be referred to specialized interdisciplinary outpatient departments which closely collaborate with neurosurgeons who are experienced in the implantation of neuromodulatory devices. It is crucial to ensure a competent postoperative follow-up with optimization of stimulation parameters and adjustment of medication. PMID:20972665

  7. Biomimetic Particles as Therapeutics

    PubMed Central

    Green, Jordan J.

    2015-01-01

    In recent years, there have been major advances in the development of novel nanoparticle and microparticle-based therapeutics. An emerging paradigm is the incorporation of biomimetic features into these synthetic therapeutic constructs to enable them to better interface with biological systems. Through the control of size, shape, and material consistency, particle cores have been generated that better mimic natural cells and viruses. In addition, there have been significant advances in biomimetic surface functionalization of particles through the integration of bio-inspired artificial cell membranes and naturally derived cell membranes. Biomimetic technologies enable therapeutic particles to have increased potency to benefit human health. PMID:26277289

  8. An overview of correctional psychiatry.

    PubMed

    Metzner, Jeffrey; Dvoskin, Joel

    2006-09-01

    Supermax facilities may be an unfortunate and unpleasant necessity in modern corrections. Because of the serious dangers posed by prison gangs, they are unlikely to disappear completely from the correctional landscape any time soon. But such units should be carefully reserved for those inmates who pose the most serious danger to the prison environment. Further, the constitutional duty to provide medical and mental health care does not end at the supermax door. There is a great deal of common ground between the opponents of such environments and those who view them as a necessity. No one should want these expensive beds to be used for people who could be more therapeutically and safely managed in mental health treatment environments. No one should want people with serious mental illnesses to be punished for their symptoms. Finally, no one wants these units to make people more, instead of less, dangerous. It is in everyone's interests to learn as much as possible about the potential of these units for good and for harm. Corrections is a profession, and professions base their practices on data. If we are to avoid the most egregious and harmful effects of supermax confinement, we need to understand them far better than we currently do. Though there is a role for advocacy from those supporting or opposed to such environments, there is also a need for objective, scientifically rigorous study of these units and the people who live there. PMID:16904510

  9. Is the concept of corrective emotional experience still topical?

    PubMed

    Palvarini, Paolo

    2010-01-01

    This article gives a historical review of the literature concerned with the role of emotional factors in psychoanalysis. The author then focuses on Alexander's milestone contribution and, above all, on the concept he developed of corrective emotional experience. Alexander moves gradually over time from the classical position, which gives insight a place of choice, to a more radical view, in which the most effective therapeutic factor is represented by the emotional experience within the therapeutic relationship. The article includes a review of the literature relevant to the concept of corrective emotional experience. Finally, Experiential-Dynamic Psychotherapy, a therapeutic approach giving a prominent role to the therapeutic power of corrective emotional experience, is considered. Two vignettes from a psychotherapy carried out according to the principles of Experiential-Dynamic Psychotherapy provide examples of how this model is applied clinically. PMID:20617789

  10. Eyeglasses for Vision Correction

    MedlinePlus

    ... Eyeglasses for Vision Correction, Dec. 12, 2015: Wearing eyeglasses is an easy way to correct refractive errors. Improving your vision with eyeglasses offers the opportunity to select from ...

  11. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).
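
    Analytical redundant relations are model equations that evaluate to (near) zero when the sensors feeding them are healthy, so the pattern of violated relations points at the faulty sensor. A minimal Python sketch of that logic for a toy flow system under a single-fault assumption (the relations, tolerance, and readings are illustrative assumptions):

        # Toy system: flow conservation gives two analytical redundant relations (ARRs).
        # ARR1: flow_in - flow_out_a - flow_out_b = 0
        # ARR2: flow_in - flow_total              = 0
        TOL = 0.5

        def residuals(r):
            return {
                "ARR1": r["flow_in"] - r["flow_out_a"] - r["flow_out_b"],
                "ARR2": r["flow_in"] - r["flow_total"],
            }

        ARR_SUPPORT = {                      # which sensors each relation depends on
            "ARR1": {"flow_in", "flow_out_a", "flow_out_b"},
            "ARR2": {"flow_in", "flow_total"},
        }

        def suspect_sensors(readings):
            """Single-fault logic: a suspect must appear in every violated relation and
            in no satisfied relation (satisfied relations exonerate their sensors)."""
            res = residuals(readings)
            violated = [n for n, v in res.items() if abs(v) > TOL]
            satisfied = [n for n in res if n not in violated]
            if not violated:
                return set()
            suspects = set.intersection(*(ARR_SUPPORT[n] for n in violated))
            for n in satisfied:
                suspects -= ARR_SUPPORT[n]
            return suspects

        readings = {"flow_in": 10.0, "flow_out_a": 4.0, "flow_out_b": 6.1, "flow_total": 13.0}
        print(suspect_sensors(readings))     # only ARR2 fires -> {'flow_total'}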

  12. Enhancing model based forecasting of geomagnetic storms

    NASA Astrophysics Data System (ADS)

    Webb, Alla G.

    Modern society is increasingly dependent on the smooth operation of large scale technology supporting Earth based activities such as communication, electricity distribution, and navigation. This technology is potentially threatened by global geomagnetic storms, which are caused by the impact of plasma ejected from the Sun upon the protective magnetic field that surrounds the Earth. Forecasting the timing and magnitude of these geomagnetic storms is part of the emerging discipline of space weather. The most severe geomagnetic storms are caused by magnetic clouds, whose properties and characteristics are important variables in space weather forecasting systems. The methodology presented here is the development of a new statistical approach to characterize the physical properties (variables) of the magnetic clouds and to examine the extent to which theoretical models can be used in describing both of these physical properties, as well as their evolution in space and time. Since space weather forecasting is a complex system, a systems engineering approach is used to perform analysis, validation, and verification of the magnetic cloud models (subsystem of the forecasting system) using a model-based methodology. This research demonstrates that in order to validate magnetic cloud models, it is important to categorize the data by physical parameters such as velocity and distance travelled. This understanding will improve the modeling accuracy of magnetic clouds in space weather forecasting systems and hence increase forecasting accuracy of geomagnetic storms and their impact on earth systems.

  13. Model based systems engineering for astronomical projects

    NASA Astrophysics Data System (ADS)

    Karban, R.; Andolfato, L.; Bristow, P.; Chiozzi, G.; Esselborn, M.; Schilling, M.; Schmid, C.; Sommer, H.; Zamparelli, M.

    2014-08-01

    Model Based Systems Engineering (MBSE) is an emerging field of systems engineering for which the System Modeling Language (SysML) is a key enabler for descriptive, prescriptive and predictive models. This paper surveys some of the capabilities, expectations and peculiarities of tool-assisted MBSE experienced in real-life astronomical projects. The examples range in depth and scope across a wide spectrum of applications (for example documentation, requirements, analysis, trade studies) and purposes (addressing a particular development need, or accompanying a project throughout many - if not all - of its lifecycle phases, fostering reuse and minimizing ambiguity). From the beginnings of the Active Phasing Experiment, through VLT instrumentation, VLTI infrastructure, and the Telescope Control System for the E-ELT, to Wavefront Control for the E-ELT, we show how stepwise refinements of tools, processes and methods have provided tangible benefits to customary systems engineering activities like requirement flow-down, design trade studies, interface definition, and validation, by means of a variety of approaches (like Model Checking, Simulation, Model Transformation) and methodologies (like OOSEM, State Analysis).

  14. Model-based target and background characterization

    NASA Astrophysics Data System (ADS)

    Mueller, Markus; Krueger, Wolfgang; Heinze, Norbert

    2000-07-01

    Up to now, most approaches to target and background characterization (and exploitation) have concentrated solely on the information given by pixels. In many cases this is a complex and unprofitable task. During the development of automatic exploitation algorithms the main goal is the optimization of certain performance parameters. These parameters are measured during test runs while applying one algorithm with one parameter set to images that consist of image domains with very different domain characteristics (targets and various types of background clutter). Model-based geocoding and registration approaches provide means for utilizing the information stored in GIS (Geographical Information Systems). The geographical information stored in the various GIS layers can define ROE (Regions of Expectation) and may allow for dedicated algorithm parametrization and development. ROI (Region of Interest) detection algorithms (in most cases MMO (Man-Made Object) detection) use implicit target and/or background models. The detection algorithms for ROIs utilize gradient direction models that have to be matched with transformed image domain data. In most cases simple threshold calculations on the match results discriminate target object signatures from the background. The geocoding approaches extract line-like structures (street signatures) from the image domain and match the graph constellation against a vector model extracted from a GIS (Geographical Information System) database. Apart from geocoding, the algorithms can also be used for image-to-image registration (multi-sensor and data fusion) and may be used for creation and validation of geographical maps.

  15. [Fast spectral modeling based on Voigt peaks].

    PubMed

    Li, Jin-rong; Dai, Lian-kui

    2012-03-01

    Indirect hard modeling (IHM) is a recently introduced method for quantitative spectral analysis, which has been applied to the analysis of the nonlinear relation between mixture spectrum and component concentration. In addition, IHM is an effective approach for analyzing components of mixtures with molecular interactions and strongly overlapping bands. Before the regression model is established, IHM needs to model the measured spectrum as a sum of Voigt peaks. The precision of the spectral model has an immediate impact on the accuracy of the regression model. A spectrum often includes dozens or even hundreds of Voigt peaks, which means that spectral modeling is in fact a high-dimensional optimization problem. Consequently, a large computational overhead is required, and the solution may not be numerically unique owing to the ill-conditioning of the optimization problem. An improved spectral modeling method is presented in this paper; it reduces the dimensionality of the optimization problem by identifying the overlapping peaks in the spectrum. Experimental results show that spectral modeling based on the new method is more accurate and requires a much shorter running time than the conventional method. PMID:22582612
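
    A minimal sketch of the underlying modelling step, fitting a measured spectrum as a sum of peaks; a pseudo-Voigt approximation and hand-picked initial guesses are used here for brevity, whereas the paper works with exact Voigt peaks and addresses the dimensionality issue explicitly.

      # Sketch: model a spectrum as a sum of pseudo-Voigt peaks and fit by least squares.
      # Peak count, initial guesses and the synthetic data are assumptions for illustration.
      import numpy as np
      from scipy.optimize import least_squares

      def pseudo_voigt(x, amp, center, width, eta):
          """Pseudo-Voigt: eta*Lorentzian + (1-eta)*Gaussian, sharing one width parameter."""
          gauss = np.exp(-0.5 * ((x - center) / width) ** 2)
          lorentz = 1.0 / (1.0 + ((x - center) / width) ** 2)
          return amp * (eta * lorentz + (1.0 - eta) * gauss)

      def model(params, x, n_peaks):
          y = np.zeros_like(x)
          for i in range(n_peaks):
              amp, center, width, eta = params[4 * i: 4 * i + 4]
              y += pseudo_voigt(x, amp, center, width, eta)
          return y

      def fit_spectrum(x, y, initial_peaks):
          """initial_peaks: list of (amp, center, width, eta) guesses, one per peak."""
          p0 = np.concatenate(initial_peaks)
          n = len(initial_peaks)
          res = least_squares(lambda p: model(p, x, n) - y, p0,
                              bounds=(0, np.inf))  # keep all parameters non-negative
          return res.x.reshape(n, 4)

      # Synthetic two-peak spectrum with noise, then recover the peak parameters.
      x = np.linspace(0, 100, 500)
      true = [(1.0, 40.0, 3.0, 0.3), (0.6, 55.0, 5.0, 0.7)]
      y = model(np.concatenate(true), x, 2) + 0.01 * np.random.randn(x.size)
      print(fit_spectrum(x, y, [(0.8, 38.0, 4.0, 0.5), (0.5, 57.0, 4.0, 0.5)]))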

  16. Engineering antibody therapeutics.

    PubMed

    Chiu, Mark L; Gilliland, Gary L

    2016-06-01

    The successful introduction of antibody-based protein therapeutics into the arsenal of treatments for patients has, within a few decades, fostered intense innovation in the production and engineering of antibodies. Reviewed here are the methods currently used to produce antibodies, along with how our knowledge of the structural and functional characterization of immunoglobulins has resulted in the engineering of antibodies to produce protein therapeutics with unique biological and biophysical properties that are leading to novel therapeutic approaches. Antibody engineering includes the introduction of the antibody combining site (variable regions) into a host of architectures, including bi- and multi-specific formats, that further impact the therapeutic properties, leading to further advantages and successes in patient treatment. PMID:27525816

  17. Research in Correctional Rehabilitation.

    ERIC Educational Resources Information Center

    Rehabilitation Services Administration (DHEW), Washington, DC.

    Forty-three leaders in corrections and rehabilitation participated in the seminar planned to provide an indication of the status of research in correctional rehabilitation. Papers include: (1) "Program Trends in Correctional Rehabilitation" by John P. Conrad, (2) "Federal Offenders Rehabilitation Program" by Percy B. Bell and Merlyn Mathews, (3)…

  18. An operator model-based filtering scheme

    SciTech Connect

    Sawhney, R.S.; Dodds, H.L.; Schryer, J.C.

    1990-01-01

    This paper presents a diagnostic model developed at Oak Ridge National Laboratory (ORNL) for off-normal nuclear power plant events. The diagnostic model is intended to serve as an embedded module of a cognitive model of the human operator, one application of which could be to assist control room operators in correctly responding to off-normal events by providing a rapid and accurate assessment of alarm patterns and parameter trends. The sequential filter model comprises two distinct subsystems: an alarm analysis followed by an analysis of interpreted plant signals. During the alarm analysis phase, the alarm pattern is evaluated to generate hypotheses of possible initiating events in order of likelihood of occurrence. Each hypothesis is further evaluated through analysis of the current trends of state variables in order to validate/reject (in the form of an increased/decreased certainty factor) the given hypothesis. 7 refs., 4 figs.

  19. Corrective Action Glossary

    SciTech Connect

    Not Available

    1992-07-01

    The glossary of technical terms was prepared to facilitate the use of the Corrective Action Plan (CAP) issued by OSWER on November 14, 1986. The CAP presents model scopes of work for all phases of a corrective action program, including the RCRA Facility Investigation (RFI), Corrective Measures Study (CMS), Corrective Measures Implementation (CMI), and interim measures. The Corrective Action Glossary includes brief definitions of the technical terms used in the CAP and explains how they are used. In addition, expected ranges (where applicable) are provided. Parameters or terms not discussed in the CAP, but commonly associated with site investigations or remediations are also included.

  20. Model Based Autonomy for Robust Mars Operations

    NASA Technical Reports Server (NTRS)

    Kurien, James A.; Nayak, P. Pandurang; Williams, Brian C.; Lau, Sonie (Technical Monitor)

    1998-01-01

    Space missions have historically relied upon a large ground staff, numbering in the hundreds for complex missions, to maintain routine operations. When an anomaly occurs, this small army of engineers attempts to identify and work around the problem. A piloted Mars mission, with its multiyear duration, cost pressures, half-hour communication delays and two-week blackouts cannot be closely controlled by a battalion of engineers on Earth. Flight crew involvement in routine system operations must also be minimized to maximize science return. It also may be unrealistic to require that the crew have the expertise in each mission subsystem needed to diagnose a system failure and effect a timely repair, as engineers did for Apollo 13. Enter model-based autonomy, which allows complex systems to autonomously maintain operation despite failures or anomalous conditions, contributing to safe, robust, and minimally supervised operation of spacecraft, life support, In Situ Resource Utilization (ISRU) and power systems. Autonomous reasoning is central to the approach. A reasoning algorithm uses a logical or mathematical model of a system to infer how to operate the system, diagnose failures and generate appropriate behavior to repair or reconfigure the system in response. The 'plug and play' nature of the models enables low cost development of autonomy for multiple platforms. Declarative, reusable models capture relevant aspects of the behavior of simple devices (e.g. valves or thrusters). Reasoning algorithms combine device models to create a model of the system-wide interactions and behavior of a complex, unique artifact such as a spacecraft. Rather than requiring engineers to anticipate all possible interactions and failures at design time or to perform analysis during the mission, the reasoning engine generates the appropriate response to the current situation, taking into account its system-wide knowledge, the current state, and even sensor failures or unexpected behavior.
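
    A highly simplified sketch of the 'plug and play' idea: small declarative device models are combined into a system model, and a reasoning step searches the command space for a configuration that restores a goal. The two-valve thruster layout, mode names and search strategy are hypothetical, not the flight algorithms.

      # Illustrative sketch only: composing small declarative device models and letting a
      # reasoning step choose a reconfiguration that restores a goal. The spacecraft
      # layout (two redundant valves feeding one thruster) and mode names are hypothetical.
      from itertools import product

      # Declarative, reusable device model: a valve passes flow only if commanded open
      # and not failed; "stuck_closed" is a fault mode.
      def valve(cmd, mode):
          return cmd == "open" and mode == "ok"

      # System model built by combining device models: thrust needs flow from either branch.
      def thrust(cmds, modes):
          return valve(cmds["v1"], modes["v1"]) or valve(cmds["v2"], modes["v2"])

      def reconfigure(modes, goal=True):
          """Search the command space for a configuration whose predicted behavior meets the goal."""
          for c1, c2 in product(["open", "closed"], repeat=2):
              cmds = {"v1": c1, "v2": c2}
              if thrust(cmds, modes) == goal:
                  return cmds
          return None  # goal unreachable with the current fault modes

      # Valve v1 is diagnosed stuck closed; the engine derives that opening v2 restores thrust.
      print(reconfigure({"v1": "stuck_closed", "v2": "ok"}))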

  1. Development of explicit diffraction corrections for absolute measurements of acoustic nonlinearity parameters in the quasilinear regime.

    PubMed

    Jeong, Hyunjo; Zhang, Shuzeng; Cho, Sungjong; Li, Xiongbing

    2016-08-01

    In absolute measurements of acoustic nonlinearity parameters, amplitudes of harmonics must be corrected for diffraction effects. In this study, we develop explicit multi-Gaussian beam (MGB) model-based diffraction corrections for the first three harmonics in weakly nonlinear, axisymmetric sound beams. The effects of making diffraction corrections on nonlinearity parameter estimation are investigated by defining "total diffraction correction (TDC)". The results demonstrate that TDC cannot be neglected even for harmonic generation experiments in the nearfield region. PMID:27186964
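
    For orientation only, the sketch below applies the standard quasilinear plane-wave relation for the nonlinearity parameter and then repeats the estimate with per-harmonic diffraction corrections; the correction values and the way they are combined are placeholders, not the multi-Gaussian beam expressions derived in the paper.

      # Hedged illustration of why diffraction corrections matter when estimating the
      # acoustic nonlinearity parameter beta. The plane-wave quasilinear relation
      # beta = 8*A2 / (k^2 * z * A1^2) is standard; the diffraction-correction values
      # D1, D2 and their use below are placeholders, not the paper's MGB results.
      import math

      def beta_plane_wave(A1, A2, freq, z, c=1480.0):
          """Quasilinear plane-wave estimate from fundamental (A1) and second-harmonic (A2) amplitudes."""
          k = 2.0 * math.pi * freq / c          # wavenumber of the fundamental
          return 8.0 * A2 / (k ** 2 * z * A1 ** 2)

      def beta_corrected(A1, A2, freq, z, D1, D2, c=1480.0):
          """Apply per-harmonic diffraction corrections before forming beta.
          Assumption for illustration: measured amplitudes are divided by D1, D2."""
          return beta_plane_wave(A1 / D1, A2 / D2, freq, z, c)

      # Hypothetical numbers: ignoring diffraction biases the beta estimate noticeably.
      A1, A2 = 1.0e-9, 2.0e-12          # metres, from a calibrated receiver
      freq, z = 5.0e6, 0.05             # 5 MHz, 50 mm propagation distance
      print(beta_plane_wave(A1, A2, freq, z))
      print(beta_corrected(A1, A2, freq, z, D1=0.9, D2=0.8))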

  2. A model-based approach for making ecological inference from distance sampling data.

    PubMed

    Johnson, Devin S; Laake, Jeffrey L; Ver Hoef, Jay M

    2010-03-01

    We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method of making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework to assess the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and analysis of the Dubbo weed data set. In addition, a simple ad hoc method for handling overdispersion is also proposed. The simulation study showed that the model-based approach compared favorably to conventional distance sampling methods for abundance estimation. In addition, the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike's information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function. PMID:19459840
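
    The detection-as-thinning idea can be illustrated with a conventional line-transect fit: a half-normal detection function is estimated from perpendicular distances by maximum likelihood and converted into a density estimate. This is only a simplified stand-in for the spatial Poisson-process model of the paper, and the data, truncation distance and transect length below are invented.

      # Simplified conventional distance-sampling sketch (not the authors' spatial model):
      # half-normal detection function fitted by maximum likelihood, then a density estimate.
      import numpy as np
      from scipy.optimize import minimize_scalar
      from scipy.stats import norm

      distances = np.array([1.2, 0.4, 3.1, 2.2, 0.9, 4.5, 1.8, 0.2, 2.9, 1.1])  # metres (made up)
      W = 5.0  # truncation half-width of the surveyed strip

      def neg_log_lik(sigma):
          """Half-normal detection g(x)=exp(-x^2/(2 sigma^2)); the density of observed
          distances on [0, W] is g(x) divided by the integral of g over [0, W]."""
          g = np.exp(-distances ** 2 / (2.0 * sigma ** 2))
          mu = sigma * np.sqrt(2.0 * np.pi) * (norm.cdf(W / sigma) - 0.5)  # effective strip half-width
          return -np.sum(np.log(g / mu))

      sigma_hat = minimize_scalar(neg_log_lik, bounds=(0.1, 50.0), method="bounded").x
      mu_hat = sigma_hat * np.sqrt(2.0 * np.pi) * (norm.cdf(W / sigma_hat) - 0.5)
      L = 100.0  # total transect length surveyed (metres), assumed
      density_hat = len(distances) / (2.0 * L * mu_hat)   # animals per square metre
      print(sigma_hat, density_hat)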

  3. MACE: model based analysis of ChIP-exo.

    PubMed

    Wang, Liguo; Chen, Junsheng; Wang, Chen; Uusküla-Reimand, Liis; Chen, Kaifu; Medina-Rivera, Alejandra; Young, Edwin J; Zimmermann, Michael T; Yan, Huihuang; Sun, Zhifu; Zhang, Yuji; Wu, Stephen T; Huang, Haojie; Wilson, Michael D; Kocher, Jean-Pierre A; Li, Wei

    2014-11-10

    Understanding the role of a given transcription factor (TF) in regulating gene expression requires precise mapping of its binding sites in the genome. Chromatin immunoprecipitation-exo, an emerging technique using λ exonuclease to digest TF unbound DNA after ChIP, is designed to reveal transcription factor binding site (TFBS) boundaries with near-single nucleotide resolution. Although ChIP-exo promises deeper insights into transcription regulation, no dedicated bioinformatics tool exists to leverage its advantages. Most ChIP-seq and ChIP-chip analytic methods are not tailored for ChIP-exo, and thus cannot take full advantage of high-resolution ChIP-exo data. Here we describe a novel analysis framework, termed MACE (model-based analysis of ChIP-exo) dedicated to ChIP-exo data analysis. The MACE workflow consists of four steps: (i) sequencing data normalization and bias correction; (ii) signal consolidation and noise reduction; (iii) single-nucleotide resolution border peak detection using the Chebyshev Inequality and (iv) border matching using the Gale-Shapley stable matching algorithm. When applied to published human CTCF, yeast Reb1 and our own mouse ONECUT1/HNF6 ChIP-exo data, MACE is able to define TFBSs with high sensitivity, specificity and spatial resolution, as evidenced by multiple criteria including motif enrichment, sequence conservation, direct sequence pileup, nucleosome positioning and open chromatin states. In addition, we show that the fundamental advance of MACE is the identification of two boundaries of a TFBS with high resolution, whereas other methods only report a single location of the same event. The two boundaries help elucidate the in vivo binding structure of a given TF, e.g. whether the TF may bind as dimers or in a complex with other co-factors. PMID:25249628
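
    As a rough illustration of step (iii), the sketch below flags positions whose read pile-up exceeds the local background by k standard deviations, with k chosen from the distribution-free Chebyshev bound 1/k²; the coverage track, window size and alpha are made up, and the real MACE implementation is considerably more involved.

      # Sketch of Chebyshev-inequality border detection: without assuming a distribution,
      # a value more than k standard deviations above the local mean has tail probability
      # at most 1/k^2. Coverage values, window size and alpha are invented.
      import numpy as np

      def chebyshev_borders(coverage, alpha=0.01, window=50):
          """Flag positions exceeding the local background by k*sigma, with 1/k^2 = alpha."""
          k = np.sqrt(1.0 / alpha)
          coverage = np.asarray(coverage, dtype=float)
          borders = []
          for i in range(len(coverage)):
              lo, hi = max(0, i - window), min(len(coverage), i + window + 1)
              local = np.concatenate([coverage[lo:i], coverage[i + 1:hi]])  # background around i
              mu, sigma = local.mean(), local.std()
              if sigma > 0 and coverage[i] > mu + k * sigma:
                  borders.append(i)
          return borders

      # Flat background of ~5 reads with two sharp pile-ups, as ChIP-exo borders would appear.
      rng = np.random.default_rng(0)
      cov = rng.poisson(5, 400).astype(float)
      cov[120] += 80
      cov[180] += 90
      print(chebyshev_borders(cov))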

  5. Therapeutic Antioxidant Medical Gas

    PubMed Central

    Nakao, Atsunori; Sugimoto, Ryujiro; Billiar, Timothy R; McCurry, Kenneth R

    2009-01-01

    Medical gases are pharmaceutical gaseous molecules which offer solutions to medical needs and include traditional gases, such as oxygen and nitrous oxide, as well as gases with recently discovered roles as biological messenger molecules, such as carbon monoxide, nitric oxide and hydrogen sulphide. Medical gas therapy is a relatively unexplored field of medicine; however, a recent increase in the number of publications on medical gas therapies clearly indicates that there are significant opportunities for the use of gases as therapeutic tools for a variety of disease conditions. In this article, we review the recent advances in research on medical gases with antioxidant properties and discuss their clinical applications and therapeutic properties. PMID:19177183

  6. Therapeutics for cognitive aging

    PubMed Central

    Shineman, Diana W.; Salthouse, Timothy A.; Launer, Lenore J.; Hof, Patrick R.; Bartzokis, George; Kleiman, Robin; Luine, Victoria; Buccafusco, Jerry J.; Small, Gary W.; Aisen, Paul S.; Lowe, David A.; Fillit, Howard M.

    2011-01-01

    This review summarizes the scientific talks presented at the conference “Therapeutics for Cognitive Aging,” hosted by the New York Academy of Sciences and the Alzheimer’s Drug Discovery Foundation on May 15, 2009. Attended by scientists from industry and academia, as well as by a number of lay people—approximately 200 in all—the conference specifically tackled the many aspects of developing therapeutic interventions for cognitive impairment. Discussion also focused on how to define cognitive aging and whether it should be considered a treatable, tractable disease. PMID:20392284

  7. Advances in Therapeutic Cholangioscopy

    PubMed Central

    Moura, Renata Nobre; de Moura, Eduardo Guimarães Hourneaux

    2016-01-01

    Nowadays, cholangioscopy is an established modality in the diagnosis and treatment of pancreaticobiliary diseases. More widespread use and the recent development of new technologies and accessories have renewed interest in endoscopic visualization of the biliary tract, increasing the range of indications and therapeutic procedures, such as diagnosis of indeterminate biliary strictures, lithotripsy of difficult bile duct stones, ablative techniques for intraductal malignancies, removal of foreign bodies and gallbladder drainage. These endoscopic interventions will probably be the last frontier in the near future. This paper presents new advances in therapeutic cholangioscopy, focusing on current clinical applications and on research areas. PMID:27403156

  8. DELIVERY OF THERAPEUTIC PROTEINS

    PubMed Central

    Pisal, Dipak S.; Kosloski, Matthew P.; Balu-Iyer, Sathy V.

    2009-01-01

    The safety and efficacy of protein therapeutics are limited by three interrelated pharmaceutical issues: in vitro and in vivo instability, immunogenicity, and short half-lives. Novel drug modifications for overcoming these issues are under investigation and include covalent attachment of poly(ethylene glycol) (PEG), polysialic acid, or glycolic acid, as well as developing new formulations containing nanoparticulate or colloidal systems (e.g. liposomes, polymeric microspheres, polymeric nanoparticles). Such strategies have the potential to develop as next generation protein therapeutics. This review includes a general discussion on these delivery approaches. PMID:20049941

  9. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982
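
    A toy sketch of the design-time idea, assuming a hypothetical clinical scenario of a patient-controlled analgesia pump supervised by pulse oximetry: a patient model and a device model are composed and simulated to check a safety property with and without an interlock. The models, thresholds and property are invented; the approach in the paper uses formal models and verification rather than this simple simulation.

      # Minimal sketch, under assumed models: compose a crude patient model and a device
      # model, simulate them, and count violations of a safety property at design time.
      def patient_step(spo2, infusing):
          """Very crude patient model: opioid infusion depresses SpO2, otherwise it recovers."""
          return max(70.0, min(99.0, spo2 - 1.5 if infusing else spo2 + 0.5))

      def pump_step(spo2, interlock_enabled):
          """Device model: infuse unless the safety interlock sees low SpO2."""
          return not (interlock_enabled and spo2 < 92.0)

      def simulate(interlock_enabled, steps=60):
          spo2, violations = 98.0, 0
          for _ in range(steps):
              infusing = pump_step(spo2, interlock_enabled)
              spo2 = patient_step(spo2, infusing)
              if spo2 < 85.0:          # safety property: SpO2 must stay above 85%
                  violations += 1
          return violations

      # Design-time check: the composed model violates the property without the interlock
      # and satisfies it when the interlock is part of the device model.
      print("violations without interlock:", simulate(False))
      print("violations with interlock:   ", simulate(True))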

  11. 78 FR 75449 - Miscellaneous Corrections; Corrections

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... INFORMATION: The NRC published a final rule in the Federal Register on June 7, 2013 (78 FR 34245), to make.... The final rule contained minor errors in grammar, punctuation, and referencing. This document corrects... specifying metric units. The final rule inadvertently included additional errors in grammar and...

  12. On prismatic corrections

    NASA Astrophysics Data System (ADS)

    Bartkowski, Zygmunt; Bartkowska, Janina

    2006-02-01

    Prismatic corrections are described in terms of the differences between the nominal and interior prisms, or the tilts of the eye needed to fixate straight ahead (Augenausgleichbewegung). In astigmatic corrections, if the prism does not lie in the principal sections of the cylinder, the directions of the two effects are different. In corrections of horizontal strabismus, a vertical component of the interior prism appears. Approximate formulae describing these phenomena are presented. A suitable setting can improve the quality of vision in the direction most important for the patient.

  13. 75 FR 68407 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... 67013, the Presidential Determination number should read ``2010-12'' (Presidential Sig.) [FR Doc. C1... Migration Needs Resulting from Violence in Kyrgyzstan Correction In Presidential document...

  14. Antibody Therapeutics in Oncology

    PubMed Central

    Wold, Erik D; Smider, Vaughn V; Felding, Brunhilde H

    2016-01-01

    One of the newer classes of targeted cancer therapeutics is monoclonal antibodies. Monoclonal antibody therapeutics are a successful and rapidly expanding drug class due to their high specificity, activity, favourable pharmacokinetics, and standardized manufacturing processes. Antibodies are capable of recruiting the immune system to attack cancer cells through complement-dependent cytotoxicity or antibody-dependent cellular cytotoxicity. In an ideal scenario the initial tumor cell destruction induced by administration of a therapeutic antibody can result in uptake of tumor-associated antigens by antigen-presenting cells, establishing a prolonged memory effect. Mechanisms of direct tumor cell killing by antibodies include antibody recognition of cell surface bound enzymes to neutralize enzyme activity and signaling, or induction of receptor agonist or antagonist activity. Both approaches result in cellular apoptosis. In another and very direct approach, antibodies are used to deliver drugs to target cells and cause cell death. Such antibody drug conjugates (ADCs) direct cytotoxic compounds to tumor cells, after selective binding to cell surface antigens, internalization, and intracellular drug release. The efficacy and safety of ADCs for cancer therapy have recently been greatly advanced based on innovative approaches for site-specific drug conjugation to the antibody structure. This technology enabled rational optimization of function and pharmacokinetics of the resulting conjugates, and is now beginning to yield therapeutics with defined, uniform molecular characteristics, and unprecedented promise to advance cancer treatment. PMID:27081677

  15. Therapeutic Recombinant Monoclonal Antibodies

    ERIC Educational Resources Information Center

    Bakhtiar, Ray

    2012-01-01

    During the last two decades, the rapid growth of biotechnology-derived techniques has led to a myriad of therapeutic recombinant monoclonal antibodies with significant clinical benefits. Recombinant monoclonal antibodies can be obtained from a number of natural sources such as animal cell cultures using recombinant DNA engineering. In contrast to…

  16. Models-Based Practice: Great White Hope or White Elephant?

    ERIC Educational Resources Information Center

    Casey, Ashley

    2014-01-01

    Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…

  17. Model-Based Software Testing for Object-Oriented Software

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  18. Learning of Chemical Equilibrium through Modelling-Based Teaching

    ERIC Educational Resources Information Center

    Maia, Poliana Flavia; Justi, Rosaria

    2009-01-01

    This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students learning…

  19. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  20. Emerging Mitochondrial Therapeutic Targets in Optic Neuropathies.

    PubMed

    Lopez Sanchez, M I G; Crowston, J G; Mackey, D A; Trounce, I A

    2016-09-01

    Optic neuropathies are an important cause of blindness worldwide. The study of the most common inherited mitochondrial optic neuropathies, Leber hereditary optic neuropathy (LHON) and autosomal dominant optic atrophy (ADOA) has highlighted a fundamental role for mitochondrial function in the survival of the affected neuron-the retinal ganglion cell. A picture is now emerging that links mitochondrial dysfunction to optic nerve disease and other neurodegenerative processes. Insights gained from the peculiar susceptibility of retinal ganglion cells to mitochondrial dysfunction are likely to inform therapeutic development for glaucoma and other common neurodegenerative diseases of aging. Despite it being a fast-evolving field of research, a lack of access to human ocular tissues and limited animal models of mitochondrial disease have prevented direct retinal ganglion cell experimentation and delayed the development of efficient therapeutic strategies to prevent vision loss. Currently, there are no approved treatments for mitochondrial disease, including optic neuropathies caused by primary or secondary mitochondrial dysfunction. Recent advances in eye research have provided important insights into the molecular mechanisms that mediate pathogenesis, and new therapeutic strategies including gene correction approaches are currently being investigated. Here, we review the general principles of mitochondrial biology relevant to retinal ganglion cell function and provide an overview of the major optic neuropathies with mitochondrial involvement, LHON and ADOA, whilst highlighting the emerging link between mitochondrial dysfunction and glaucoma. The pharmacological strategies currently being trialed to improve mitochondrial dysfunction in these optic neuropathies are discussed in addition to emerging therapeutic approaches to preserve retinal ganglion cell function. PMID:27288727

  1. Therapeutic Hypothermia for Neuroprotection

    PubMed Central

    Karnatovskaia, Lioudmila V.; Wartenberg, Katja E.

    2014-01-01

    The earliest recorded application of therapeutic hypothermia in medicine spans about 5000 years; however, its use has become widespread since 2002, following the demonstration of both safety and efficacy of regimens requiring only a mild (32°C-35°C) degree of cooling after cardiac arrest. We review the mechanisms by which hypothermia confers neuroprotection as well as its physiological effects by body system and its associated risks. With regard to clinical applications, we present evidence on the role of hypothermia in traumatic brain injury, intracranial pressure elevation, stroke, subarachnoid hemorrhage, spinal cord injury, hepatic encephalopathy, and neonatal peripartum encephalopathy. Based on the current knowledge and areas undergoing or in need of further exploration, we feel that therapeutic hypothermia holds promise in the treatment of patients with various forms of neurologic injury; however, additional quality studies are needed before its true role is fully known. PMID:24982721

  2. Multistage vector (MSV) therapeutics.

    PubMed

    Wolfram, Joy; Shen, Haifa; Ferrari, Mauro

    2015-12-10

    One of the greatest challenges in the field of medicine is obtaining controlled distribution of systemically administered therapeutic agents within the body. Indeed, biological barriers such as physical compartmentalization, pressure gradients, and excretion pathways adversely affect localized delivery of drugs to pathological tissue. The diverse nature of these barriers requires the use of multifunctional drug delivery vehicles that can overcome a wide range of sequential obstacles. In this review, we explore the role of multifunctionality in nanomedicine by primarily focusing on multistage vectors (MSVs). The MSV is an example of a promising therapeutic platform that incorporates several components, including a microparticle, nanoparticles, and small molecules. In particular, these components are activated in a sequential manner in order to successively address transport barriers. PMID:26264836

  3. Therapeutic antibodies against cancer

    PubMed Central

    Adler, Mark J.; Dimitrov, Dimiter S.

    2012-01-01

    Antibody-based therapeutics against cancer are highly successful in the clinic and currently enjoy unprecedented recognition of their potential; 13 monoclonal antibodies (mAbs) have been approved for clinical use in the European Union and in the United States (one, Mylotarg, was withdrawn from the market in 2010). Three of the mAbs (bevacizumab, rituximab, trastuzumab) are among the top six best-selling protein therapeutics, with sales in 2010 of more than $5 bln each. Hundreds of mAbs, including bispecific mAbs and multispecific fusion proteins, mAbs conjugated with small molecule drugs and mAbs with optimized pharmacokinetics, are in clinical trials. However, challenges remain and it appears that deeper understanding of mechanisms is needed to overcome major problems including resistance to therapy, access to targets, complexity of biological systems and individual variations. PMID:22520975

  4. Strategies for therapeutic hypometabothermia

    PubMed Central

    Liu, Shimin; Chen, Jiang-Fan

    2013-01-01

    Although therapeutic hypothermia and metabolic suppression have shown robust neuroprotection in experimental brain ischemia, systemic complications have limited their use in treating acute stroke patients. The core temperature and basal metabolic rate are tightly regulated and maintained at a very stable level in mammals. Simply lowering body temperature or metabolic rate is actually a brutal therapy that may cause systemic as well as regional problems rather than providing protection. These problems are commonly seen in hypothermia and barbiturate coma. The main innovative concept of this review is to propose thermogenically optimal and synergistic reduction of core temperature and metabolic rate in therapeutic hypometabothermia using novel and clinically practical approaches. When metabolism and body temperature are reduced in a systematically synergistic manner, the outcome will be maximal protection and safe recovery, as happens in natural processes such as hibernation, daily torpor and estivation. PMID:24179563

  5. Polycyclic peptide therapeutics.

    PubMed

    Baeriswyl, Vanessa; Heinis, Christian

    2013-03-01

    Owing to their excellent binding properties, high stability, and low off-target toxicity, polycyclic peptides are an attractive molecule format for the development of therapeutics. Currently, only a handful of polycyclic peptides are used in the clinic; examples include the antibiotic vancomycin, the anticancer drugs actinomycin D and romidepsin, and the analgesic agent ziconotide. All clinically used polycyclic peptide drugs are derived from natural sources, such as soil bacteria in the case of vancomycin, actinomycin D and romidepsin, or the venom of a fish-hunting cone snail in the case of ziconotide. Unfortunately, nature provides peptide macrocyclic ligands for only a small fraction of therapeutic targets. For the generation of ligands of targets of choice, researchers have inserted artificial binding sites into natural polycyclic peptide scaffolds, such as cystine knot proteins, using rational design or directed evolution approaches. More recently, large combinatorial libraries of genetically encoded bicyclic peptides have been generated de novo and screened by phage display. In this Minireview, the properties of existing polycyclic peptide drugs are discussed and related to their interesting molecular architectures. Furthermore, technologies that allow the development of unnatural polycyclic peptide ligands are discussed. Recent application of these technologies has generated promising results, suggesting that polycyclic peptide therapeutics could potentially be developed for a broad range of diseases. PMID:23355488

  6. Global orbit corrections

    SciTech Connect

    Symon, K.

    1987-11-01

    There are various reasons for preferring local (e.g., three bump) orbit correction methods to global corrections. One is the difficulty of solving the mN equations for the required mN correcting bumps, where N is the number of superperiods and m is the number of bumps per superperiod. The latter is not a valid reason for avoiding global corrections, since we can take advantage of the superperiod symmetry to reduce the mN simultaneous equations to N separate problems, each involving only m simultaneous equations. Previously, I have shown how to solve the general problem when the machine contains unknown magnet errors of known probability distribution; we made measurements of known precision of the orbit displacements at a set of points, and we wish to apply correcting bumps to minimize the weighted rms orbit deviations. In this report, we will consider two simpler problems, using similar methods. We consider the case when we make M beam position measurements per superperiod, and we wish to apply an equal number M of orbit correcting bumps to reduce the measured position errors to zero. We also consider the problem when the number of correcting bumps is less than the number of measurements, and we wish to minimize the weighted rms position errors. We will see that the latter problem involves solving equations of a different form, but involving the same matrices as the former problem.
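
    The second problem reduces to a weighted least-squares solve; a minimal numerical sketch is given below, with an invented response matrix relating corrector kicks to beam-position readings and equal measurement weights assumed.

      # Hedged sketch of the second problem: with fewer correctors than beam-position
      # measurements, choose corrector strengths minimizing the weighted rms orbit error.
      # The response matrix R (orbit shift per unit corrector kick) and all numbers are invented.
      import numpy as np

      def solve_correctors(R, measured, weights):
          """Minimize || W^(1/2) (measured + R @ theta) ||_2 over corrector strengths theta."""
          W = np.sqrt(np.asarray(weights))
          theta, *_ = np.linalg.lstsq(W[:, None] * R, -W * measured, rcond=None)
          return theta

      rng = np.random.default_rng(1)
      R = rng.normal(size=(8, 3))          # 8 BPM readings per superperiod, 3 correcting bumps
      true_kicks = np.array([0.4, -0.2, 0.1])
      measured = R @ true_kicks + 0.01 * rng.normal(size=8)   # orbit produced by some error field
      weights = np.ones(8)                  # equal measurement precision assumed

      kicks = solve_correctors(R, measured, weights)
      print("corrector settings:", kicks)                       # approximately -true_kicks
      print("residual rms:", np.sqrt(np.mean((measured + R @ kicks) ** 2)))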

  7. Model-Based Reasoning in Humans Becomes Automatic with Training

    PubMed Central

    Lübbert, Annika; Guitart-Masip, Marc; Dolan, Raymond J.

    2015-01-01

    Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load—a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders. PMID:26379239

  8. Corrected Age for Preemies

    MedlinePlus

  9. 75 FR 68409 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ..., the Presidential Determination number should read ``2010-14'' (Presidential Sig.) [FR Doc. C1-2010... Migration Needs Resulting From Flooding In Pakistan Correction In Presidential document 2010-27673...

  10. Correcting Hubble Vision.

    ERIC Educational Resources Information Center

    Shaw, John M.; Sheahen, Thomas P.

    1994-01-01

    Describes the theory behind the workings of the Hubble Space Telescope, the spherical aberration in the primary mirror that caused a reduction in image quality, and the corrective device that compensated for the error. (JRH)

  11. Model-based HSF using by target point control function

    NASA Astrophysics Data System (ADS)

    Kim, Seongjin; Do, Munhoe; An, Yongbae; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu

    2015-03-01

    As the technology node shrinks, ArF immersion lithography approaches the limit of wafer patterning, and weak points are easily generated during mask processing. To obtain robust patterning results, the design house performs lithography rule checking (LRC). Despite LRC, weak points are still found at the verification stage of optical proximity correction (OPC); such a point is called a hot spot point (HSP). Many studies have been performed to fix HSPs. One of the most general hot spot fixing (HSF) methods is modification biasing, which consists of "Line-Resizing" and "Space-Resizing". In addition to rule-based biasing, resolution enhancement techniques (RET) such as inverse lithography technology (ILT) and model-based assist features (MBAF) have been adopted to remove hot spots and maximize the process window. If an HSP is found during the OPC verification stage, various HSF methods can be applied; however, an HSF step added to the regular OPC procedure increases OPC turn-around time (TAT). In this paper, we introduce a new HSF method that keeps OPC TAT shorter than the common HSF methods. The new method consists of two concepts. The first is that the OPC target point is moved to fix the HSP; the target point should be moved to the optimum position at which the edge placement error (EPE) can be made zero at critical points. Many factors, such as model accuracy or the OPC recipe, can cause a larger EPE. The second is controlling the model offset error through target point adjustment. Figure 1 shows a case in which the EPE is not zero, meaning that the simulation contour did not match the target after the OPC process. In contrast, Figure 2 shows the target point moved by -2.5 nm using the target point control function; as a result, the simulation contour matches the original layout. This function can be readily adopted in the OPC procedures of memory and logic devices.
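
    The retargeting idea can be caricatured as a small feedback loop: shift the OPC target point by the residual edge placement error until the simulated contour lands on the intended design edge. The linear "simulated edge" response and all numbers below are hypothetical stand-ins for a calibrated lithography model.

      # Illustrative sketch only: shifting an OPC target point by the residual EPE so the
      # simulated contour lands on the intended edge. The linear response is hypothetical;
      # a real flow would query a calibrated lithography model instead.
      def simulated_edge(target_nm):
          """Stand-in for a litho simulation: the printed edge lags the target with a gain
          and a fixed model offset (hypothetical response)."""
          return 0.9 * target_nm + 1.2

      def fix_hotspot(intended_edge_nm, iterations=5):
          """Move the OPC target point by the measured EPE until the contour hits the design edge."""
          target = intended_edge_nm
          for _ in range(iterations):
              epe = simulated_edge(target) - intended_edge_nm   # edge placement error
              target -= epe                                      # retarget by the residual error
          return target, simulated_edge(target) - intended_edge_nm

      target, residual_epe = fix_hotspot(intended_edge_nm=0.0)
      print(f"adjusted target point: {target:.2f} nm, remaining EPE: {residual_epe:.3f} nm")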

  12. Adaptable DC offset correction

    NASA Technical Reports Server (NTRS)

    Golusky, John M. (Inventor); Muldoon, Kelly P. (Inventor)

    2009-01-01

    Methods and systems for adaptable DC offset correction are provided. An exemplary adaptable DC offset correction system evaluates an incoming baseband signal to determine an appropriate DC offset removal scheme; removes a DC offset from the incoming baseband signal based on the appropriate DC offset scheme in response to the evaluated incoming baseband signal; and outputs a reduced DC baseband signal in response to the DC offset removed from the incoming baseband signal.
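
    A minimal sketch of such a scheme-selecting corrector, under assumed rules: if the DC estimate is essentially constant across the block, the mean is subtracted; otherwise a one-pole DC blocker tracks the drifting offset. The decision rule, threshold and filter coefficient are illustrative, not the patented method.

      # Sketch: evaluate the incoming baseband samples, pick a DC-removal scheme, and
      # output the reduced-DC signal. Decision rule, threshold and alpha are assumptions.
      import numpy as np

      def remove_dc(samples, drift_threshold=0.05):
          samples = np.asarray(samples, dtype=float)
          # Evaluate the signal: compare the DC estimate in the first and second halves.
          half = len(samples) // 2
          drift = abs(samples[:half].mean() - samples[half:].mean())
          if drift < drift_threshold:
              # Scheme 1: offset is essentially constant -- subtract the block mean.
              return samples - samples.mean()
          # Scheme 2: offset drifts -- track it with a one-pole high-pass (DC blocker).
          alpha, dc, out = 0.995, 0.0, np.empty_like(samples)
          for i, x in enumerate(samples):
              dc = alpha * dc + (1.0 - alpha) * x   # slowly adapting DC estimate
              out[i] = x - dc
          return out

      t = np.arange(2000) / 1000.0
      signal = np.sin(2 * np.pi * 5 * t)
      print(remove_dc(signal + 0.3).mean())            # constant offset removed
      print(remove_dc(signal + 0.3 + 0.2 * t).mean())  # drifting offset largely tracked out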

  13. Psychodynamic Perspective on Therapeutic Boundaries

    PubMed Central

    Bridges, Nancy A.

    1999-01-01

    Discussion of boundaries in therapeutic work most often focuses on boundary maintenance, risk management factors, and boundary violations. The psychodynamic meaning and clinical management of boundaries in therapeutic relationships remains a neglected area of discourse. Clinical vignettes will illustrate a psychodynamic, developmental-relational perspective using boundary dilemmas to deepen and advance the therapeutic process. This article contributes to the dialogue about the process of making meaning and constructing therapeutically useful and creative boundaries that further the psychotherapeutic process. PMID:10523432

  14. Cystic Fibrosis Therapeutics

    PubMed Central

    Ramsey, Bonnie W.

    2013-01-01

    A great deal of excitement and hope has followed the successful trials and US Food and Drug Administration approval of the drug ivacaftor (Kalydeco), the first therapy available that targets the underlying defect that causes cystic fibrosis (CF). Although this drug has currently demonstrated a clinical benefit for a small minority of the CF population, the developmental pathway established by ivacaftor paves the way for other CF transmembrane conductance regulator (CFTR) modulators that may benefit many more patients. In addition to investigating CFTR modulators, researchers are actively developing numerous other innovative CF therapies. In this review, we use the catalog of treatments currently under evaluation with the support of the Cystic Fibrosis Foundation, known as the Cystic Fibrosis Foundation Therapeutics Pipeline, as a platform to discuss the variety of candidate treatments for CF lung disease that promise to improve CF care. Many of these approaches target the individual components of the relentless cycle of airway obstruction, inflammation, and infection characteristic of lung disease in CF, whereas others are aimed directly at the gene defect, or the resulting dysfunctional protein, that instigates this cycle. We discuss how new findings from the laboratory have informed not only the development of novel therapeutics, but also the rationales for their use and the outcomes used to measure their effects. By reviewing the breadth of candidate treatments currently in development, as well as the recent progress in CF therapies reflected by the evolution of the therapeutics pipeline over the past few years, we hope to build upon the optimism and anticipation generated by the recent success of Kalydeco. PMID:23276843

  15. Frankincense--therapeutic properties.

    PubMed

    Al-Yasiry, Ali Ridha Mustafa; Kiczorowska, Bożena

    2016-01-01

    Recently, increasing interest in natural dietary and therapeutic preparations used as dietary supplements has been observed. One of them is frankincense. This traditional medicine of the East is believed to have anti-inflammatory, expectorant, antiseptic, and even anxiolytic and anti-neurotic effects. The present study aims to verify the reported therapeutic properties of Boswellia resin and describe its chemical composition based on available scientific studies. The main component of frankincense is oil (60%). It contains mono- (13%) and diterpenes (40%) as well as ethyl acetate (21.4%), octyl acetate (13.4%) and methylanisole (7.6%). The highest biological activity among terpenes is characteristic of 11-keto-ß-acetyl-beta-boswellic acid, acetyl-11-keto-ß-boswellic acid and acetyl-α-boswellic acid. Contemporary studies have shown that the resin indeed has analgesic, tranquilising and anti-bacterial effects. From the point of view of therapeutic properties, extracts from Boswellia serrata and Boswellia carterii are reported to be particularly useful. They reduce inflammatory conditions in the course of rheumatism by inhibiting leukocyte elastase and degrading glycosaminoglycans. Boswellia preparations inhibit 5-lipoxygenase and prevent the release of leukotrienes, thus having an anti-inflammatory effect in ulcerative colitis, irritable bowel syndrome, bronchitis and sinusitis. Inhalation and consumption of Boswellia olibanum reduces the risk of asthma. In addition, boswellic acids have an antiproliferative effect on tumours. They inhibit proliferation of tumour cells of the leukaemia and glioblastoma subset. They have an anti-tumour effect since they inhibit topoisomerase I and II-alpha and stimulate programmed cell death (apoptosis). PMID:27117114

  16. Revitalizing Psychiatric Therapeutics

    PubMed Central

    Hyman, Steven E

    2014-01-01

    Despite high prevalence and enormous unmet medical need, the pharmaceutical industry has recently de-emphasized neuropsychiatric disorders as ‘too difficult' a challenge to warrant major investment. Here I describe major obstacles to drug discovery and development including a lack of new molecular targets, shortcomings of current animal models, and the lack of biomarkers for clinical trials. My major focus, however, is on new technologies and scientific approaches to neuropsychiatric disorders that give promise for revitalizing therapeutics and may thus answer industry's concerns. PMID:24317307

  17. [Is therapeutic deadlock inevitable?].

    PubMed

    Vignat, Jean-Pierre

    2016-01-01

    Many long-term treatments appear to be an expression of therapeutic deadlock. This situation prompts a questioning of the concept of chronicity and an identification of the factors behind situations which appear blocked, in which the search for solutions takes a back seat to taking action. The interaction between the patient's mental apparatus and the care apparatus lies at the heart of the question, interpreted from institutional, collective and individual perspectives, supported by the clinical and psychopathological approach and a return to prioritising thought. PMID:27389427

  18. Telomerase and cancer therapeutics.

    PubMed

    Harley, Calvin B

    2008-03-01

    Telomerase is an attractive cancer target as it appears to be required in essentially all tumours for immortalization of a subset of cells, including cancer stem cells. Moreover, differences in telomerase expression, telomere length and cell kinetics between normal and tumour tissues suggest that targeting telomerase would be relatively safe. Clinical trials are ongoing with a potent and specific telomerase inhibitor, GRN163L, and with several versions of telomerase therapeutic vaccines. The prospect of adding telomerase-based therapies to the growing list of new anticancer products is promising, but what are the advantages and limitations of different approaches, and which patients are the most likely to respond? PMID:18256617

  19. [Achievement of therapeutic objectives].

    PubMed

    Mantilla, Teresa

    2014-07-01

    Therapeutic objectives for patients with atherogenic dyslipidemia are achieved by improving patient compliance and adherence. Clinical practice guidelines address the importance of treatment compliance for achieving objectives. The combination of a fixed dose of pravastatin and fenofibrate increases adherence by simplifying the drug regimen and reducing the number of daily doses. The good tolerance, the cost of the combination and the possibility of adjusting the administration to the patient's lifestyle help achieve the objectives for these patients with high cardiovascular risk. PMID:25043543

  20. Therapeutic Endoscopic Ultrasound

    PubMed Central

    Cheriyan, Danny

    2015-01-01

    Endoscopic ultrasound (EUS) technology has evolved dramatically over the past 20 years, from being a supplementary diagnostic aid available only in large medical centers to being a core diagnostic and therapeutic tool that is widely available. Although formal recommendations and practice guidelines have not been developed, there are considerable data supporting the use of EUS for its technical accuracy in diagnosing pancreaticobiliary and gastrointestinal pathology. Endosonography is now routine practice not only for pathologic diagnosis and tumor staging but also for drainage of cystic lesions and celiac plexus neurolysis. In this article, we cover the use of EUS in biliary and pancreatic intervention, ablative therapy, enterostomy, and vascular intervention. PMID:27118942

  1. The Therapeutic Roller Coaster

    PubMed Central

    CHU, JAMES A.

    1992-01-01

    Survivors of severe childhood abuse often encounter profound difficulties. In addition to posttraumatic and dissociative symptomatology, abuse survivors frequently have characterologic problems, particularly regarding self-care and maintaining relationships. Backgrounds of abuse, abandonment, and betrayal are often recapitulated and reenacted in therapy, making the therapeutic experience arduous and confusing for therapists and patients. Efforts must be directed at building an adequate psychotherapeutic foundation before undertaking exploration and abreaction of past traumatic experiences. This discussion sets out a model for treatment of childhood abuse survivors, describing stages of treatment and suggesting interventions. Common treatment dilemmas or "traps" are discussed, with recommendations for their resolution. PMID:22700116

  2. Distributed real-time model-based diagnosis

    NASA Technical Reports Server (NTRS)

    Barrett, A. C.; Chung, S. H.

    2003-01-01

    This paper presents an approach to onboard anomaly diagnosis that combines the simplicity and real-time guarantee of a rule-based diagnosis system with the specification ease and coverage guarantees of a model-based diagnosis system.

  3. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility theory in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  4. Geological Corrections in Gravimetry

    NASA Astrophysics Data System (ADS)

    Mikuška, J.; Marušiak, I.

    2015-12-01

    Applying corrections for the known geology to gravity data can be traced back to the first quarter of the 20th century. Later on, mostly in areas with sedimentary cover, at local and regional scales, the correction known as gravity stripping has been in use since the mid 1960s, provided that there was enough geological information. Stripping at regional to global scales became possible after the release of the CRUST 2.0 and, later, CRUST 1.0 models in the years 2000 and 2013, respectively. The latter model in particular provides quite a new view on the relevant geometries and on the topographic and crustal densities, as well as on the crust/mantle density contrast. Thus, the isostatic corrections, which have often been used in the past, can now be replaced by procedures working with independent information interpreted primarily from seismic studies. We have developed software for performing geological corrections in the space domain, based on a priori geometry and density grids which can be of either rectangular or spherical/ellipsoidal types with cells in the shape of rectangles, tesseroids or triangles. It enables us to calculate the required gravitational effects not only in the form of surface maps or profiles but, for instance, also along vertical lines, which can shed some additional light on the nature of the geological correction. The software can work at a variety of scales and considers the input information to an optional distance from the calculation point up to the antipodes. Our main objective is to treat geological correction as an alternative to accounting for the topography with varying densities, since the bottoms of the topographic masses, namely the geoid or ellipsoid, generally do not represent geological boundaries. We would also like to call attention to possible distortions of the corrected gravity anomalies. This work was supported by the Slovak Research and Development Agency under the contract APVV-0827-12.
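
    The stripping idea itself is simple to sketch: subtract from the observed anomaly the gravitational effect of the known geology, here crudely approximated by an infinite horizontal slab. Practical implementations sum the effects of prisms or tesseroids built from a crustal model; the density contrast and thickness below are invented.

      # Hedged sketch of gravity stripping: subtract the effect of a known geological unit,
      # approximated by an infinite slab (g = 2*pi*G*drho*t). Numbers are illustrative only.
      G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2

      def slab_effect_mgal(density_contrast, thickness_m):
          """Infinite-slab (Bouguer-type) approximation, returned in mGal (1 mGal = 1e-5 m/s^2)."""
          return 2.0 * 3.141592653589793 * G * density_contrast * thickness_m * 1.0e5

      def strip(observed_anomaly_mgal, layers):
          """layers: list of (density_contrast kg/m^3, thickness m) for the known geology."""
          correction = sum(slab_effect_mgal(drho, t) for drho, t in layers)
          return observed_anomaly_mgal - correction

      # Example: remove the effect of a 2 km sedimentary basin (-300 kg/m^3 contrast)
      # from a -20 mGal observed anomaly.
      print(strip(-20.0, [(-300.0, 2000.0)]))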

  5. Reduced model-based decision-making in schizophrenia.

    PubMed

    Culbreth, Adam J; Westbrook, Andrew; Daw, Nathaniel D; Botvinick, Matthew; Barch, Deanna M

    2016-08-01

    Individuals with schizophrenia have a diminished ability to use reward history to adaptively guide behavior. However, tasks traditionally used to assess such deficits often rely on multiple cognitive and neural processes, leaving etiology unresolved. In the current study, we adopted recent computational formalisms of reinforcement learning to distinguish between model-based and model-free decision-making in hopes of specifying mechanisms associated with reinforcement-learning dysfunction in schizophrenia. Under this framework, decision-making is model-free to the extent that it relies solely on prior reward history, and model-based if it relies on prospective information such as motivational state, future consequences, and the likelihood of obtaining various outcomes. Model-based and model-free decision-making was assessed in 33 schizophrenia patients and 30 controls using a 2-stage 2-alternative forced choice task previously demonstrated to discern individual differences in reliance on the 2 forms of reinforcement-learning. We show that, compared with controls, schizophrenia patients demonstrate decreased reliance on model-based decision-making. Further, parameter estimates of model-based behavior correlate positively with IQ and working memory measures, suggesting that model-based deficits seen in schizophrenia may be partially explained by higher-order cognitive deficits. These findings demonstrate specific reinforcement-learning and decision-making deficits and thereby provide valuable insights for understanding disordered behavior in schizophrenia. PMID:27175984
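
    The distinction being measured can be sketched with the standard hybrid valuation used for such tasks: a cached model-free value is blended with a prospective model-based value computed from known transition probabilities, with a weight w indexing reliance on model-based control (reduced w corresponding to the patients' pattern). The transition matrix, values and weighting below are illustrative, not the fitted parameters of the study.

      # Sketch: blend a cached model-free value with a prospective model-based value for a
      # two-step task. Transition matrix, values and the weight w are illustrative assumptions.
      import numpy as np

      TRANSITIONS = np.array([[0.7, 0.3],    # P(second-stage state | first-stage action 0)
                              [0.3, 0.7]])   # P(second-stage state | first-stage action 1)

      def hybrid_values(q_mf, q_stage2, w):
          """Blend model-free cached values with model-based prospective values."""
          q_mb = TRANSITIONS @ q_stage2                 # expected outcome via the world model
          return w * q_mb + (1.0 - w) * q_mf

      def greedy_choice(q):
          return int(np.argmax(q))

      q_mf = np.array([0.55, 0.45])        # cached values from reward history
      q_stage2 = np.array([0.2, 0.8])      # current values of the two second-stage states

      # A strongly model-based agent (w=0.9) prefers action 1, which tends to reach the good
      # state; a purely model-free agent (w=0.0) sticks with the previously rewarded action 0.
      for w in (0.0, 0.9):
          q = hybrid_values(q_mf, q_stage2, w)
          print(f"w={w}: blended values={q.round(3)}, greedy choice={greedy_choice(q)}")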

  6. Winnicott's therapeutic consultations revisited.

    PubMed

    Brafman, A H

    1997-08-01

    Winnicott described in his book 'Therapeutic Consultations' (1971) how a diagnostic assessment of a referred child developed into a fruitful therapeutic intervention when he was able to discover the unconscious fantasy that underlay the child's symptoms. Because these were children who were, essentially, developing normally, he used the word 'knot' to depict the obstacle the child had met. Any conflicts the parents might have were not explored in that context. This work presents cases in which child and parents are seen together for the diagnostic assessment. The child's feelings about his world and his difficulties are explored through a variety of techniques including drawings. In the same interview, an analytic enquiry into the parents' history and also their views of the child reveals how the child's fantasies and the parents' past experiences interact and create a mutually reinforcing vicious circle. In other words, the 'knot' involves all of them. If the child's unconscious fantasy can be verbalised and if the parents are able to approach the child in a manner that acknowledges the child's real needs, the 'knot' disappears and normal development can be resumed. PMID:9306188

  7. Engineering therapeutic protein disaggregases.

    PubMed

    Shorter, James

    2016-05-15

    Therapeutic agents are urgently required to cure several common and fatal neurodegenerative disorders caused by protein misfolding and aggregation, including amyotrophic lateral sclerosis (ALS), Parkinson's disease (PD), and Alzheimer's disease (AD). Protein disaggregases that reverse protein misfolding and restore proteins to native structure, function, and localization could mitigate neurodegeneration by simultaneously reversing 1) any toxic gain of function of the misfolded form and 2) any loss of function due to misfolding. Potentiated variants of Hsp104, a hexameric AAA+ ATPase and protein disaggregase from yeast, have been engineered to robustly disaggregate misfolded proteins connected with ALS (e.g., TDP-43 and FUS) and PD (e.g., α-synuclein). However, Hsp104 has no metazoan homologue. Metazoa possess protein disaggregase systems distinct from Hsp104, including Hsp110, Hsp70, and Hsp40, as well as HtrA1, which might be harnessed to reverse deleterious protein misfolding. Nevertheless, vicissitudes of aging, environment, or genetics conspire to negate these disaggregase systems in neurodegenerative disease. Thus, engineering potentiated human protein disaggregases or isolating small-molecule enhancers of their activity could yield transformative therapeutics for ALS, PD, and AD. PMID:27255695

  8. Mechanisms of Plasma Therapeutics

    NASA Astrophysics Data System (ADS)

    Graves, David

    2015-09-01

    In this talk, I address research directed towards biomedical applications of atmospheric pressure plasma such as sterilization, surgery, wound healing and anti-cancer therapy. The field has seen remarkable growth in the last 3-5 years, but the mechanisms responsible for the biomedical effects have remained mysterious. It is known that plasmas readily create reactive oxygen species (ROS) and reactive nitrogen species (RNS). ROS and RNS (or RONS), in addition to a suite of other radical and non-radical reactive species, are essential actors in an important sub-field of aerobic biology termed "redox" (or oxidation-reduction) biology. It is postulated that cold atmospheric plasma (CAP) can trigger a therapeutic shielding response in tissue in part by creating a time- and space-localized, burst-like form of oxy-nitrosative stress on near-surface exposed cells through the flux of plasma-generated RONS. RONS-exposed surface layers of cells communicate to the deeper levels of tissue via a form of the "bystander effect," similar to responses to other forms of cell stress. In this proposed model of CAP therapeutics, the plasma stimulates a cellular survival mechanism through which aerobic organisms shield themselves from infection and other challenges.

  9. Epigenomes as therapeutic targets.

    PubMed

    Hamm, Christopher A; Costa, Fabricio F

    2015-07-01

    Epigenetics is a molecular phenomenon that pertains to heritable changes in gene expression that do not involve changes in the DNA sequence. Epigenetic modifications in a whole genome, known as the epigenome, play an essential role in the regulation of gene expression in both normal development and disease. Traditional epigenetic changes include DNA methylation and histone modifications. Recent evidence reveals that other players, such as non-coding RNAs, may have an epigenetic regulatory role. Aberrant epigenetic signaling is increasingly recognized as a central component of human disease, and the reversible nature of the epigenetic modifications provides an exciting opportunity for the development of clinically relevant therapeutics. Current epigenetic therapies provide a clinical benefit through disrupting DNA methyltransferases or histone deacetylases. However, the emergence of next-generation epigenetic therapies provides an opportunity to more effectively disrupt epigenetic disease states. Novel epigenetic therapies may improve drug targeting and drug delivery, optimize dosing schedules, and improve the efficacy of preexisting treatment modalities (chemotherapy, radiation, and immunotherapy). This review discusses the epigenetic mechanisms that contribute to disease, available epigenetic therapies, epigenetic therapies currently in development, and the potential future use of epigenetic therapeutics in a clinical setting. PMID:25797698

  10. AMUM LECTURE: Therapeutic ultrasound

    NASA Astrophysics Data System (ADS)

    Crum, Lawrence A.

    2004-01-01

    The use of ultrasound in medicine is now quite commonplace, especially with the recent introduction of small, portable and relatively inexpensive, hand-held diagnostic imaging devices. Moreover, ultrasound has expanded beyond the imaging realm, with methods and applications extending to novel therapeutic and surgical uses. These applications broadly include: tissue ablation, acoustocautery, lipoplasty, site-specific and ultrasound mediated drug activity, extracorporeal lithotripsy, and the enhancement of natural physiological functions such as wound healing and tissue regeneration. A particularly attractive aspect of this technology is that diagnostic and therapeutic systems can be combined to produce totally non-invasive, image-guided therapy. This general lecture will review a number of these exciting new applications of ultrasound and address some of the basic scientific questions and future challenges in developing these methods and technologies for general use in our society. We shall particularly emphasize the use of High Intensity Focused Ultrasound (HIFU) in the treatment of benign and malignant tumors as well as the introduction of acoustic hemostasis, especially in organs which are difficult to treat using conventional medical and surgical techniques.

  11. Pharmacogenetics approach to therapeutics.

    PubMed

    Koo, Seok Hwee; Lee, Edmund Jon Deoon

    2006-01-01

    1. Pharmacogenetics refers to the study of genetically controlled variations in drug response. Functional variants caused by single nucleotide polymorphisms (SNPs) in genes encoding drug-metabolising enzymes, transporters, ion channels and drug receptors have been known to be associated with interindividual and interethnic variation in drug response. Genetic variations in these genes play a role in influencing the efficacy and toxicity of medications. 2. Rapid, precise and cost-effective high-throughput technological platforms are essential for performing large-scale mutational analysis of genetic markers involved in the aetiology of variable responses to drug therapy. 3. The application of a pharmacogenetics approach to therapeutics in general clinical practice is still far from being achieved today owing to various constraints, such as limited accessibility of technology, inadequate knowledge, ambiguity of the role of variants and ethical concerns. 4. Drug actions are determined by the interplay of several genes encoding different proteins involved in various biochemical pathways. With rapidly emerging SNP discovery technological platforms and widespread knowledge on the role of SNPs in disease susceptibility and variability in drug response, the pharmacogenetics approach to therapeutics is anticipated to take off in the not-too-distant future. This will present profound clinical, economic and social implications for health care. PMID:16700889

  12. Person-centered Therapeutics

    PubMed Central

    Cloninger, C. Robert; Cloninger, Kevin M.

    2015-01-01

    A clinician’s effectiveness in treatment depends substantially on his or her attitude toward -- and understanding of -- the patient as a person endowed with self-awareness and the will to direct his or her own future. The assessment of personality in the therapeutic encounter is a crucial foundation for forming an effective working alliance with shared goals. Helping a person to reflect on their personality provides a mirror image of their strengths and weaknesses in adapting to life’s many challenges. The Temperament and Character Inventory (TCI) provides an effective way to describe personality thoroughly and to predict both the positive and negative aspects of health. Strengths and weaknesses in TCI personality traits allow strong predictions of individual differences of all aspects of well-being. Diverse therapeutic techniques, such as diet, exercise, mood self-regulation, meditation, or acts of kindness, influence health and personality development in ways that are largely indistinguishable from one another or from effective allopathic treatments. Hence the development of well-being appears to be the result of activating a synergistic set of mechanisms of well-being, which are expressed as fuller functioning, plasticity, and virtue in adapting to life’s challenges. PMID:26052429

  13. Engineering therapeutic protein disaggregases

    PubMed Central

    Shorter, James

    2016-01-01

    Therapeutic agents are urgently required to cure several common and fatal neurodegenerative disorders caused by protein misfolding and aggregation, including amyotrophic lateral sclerosis (ALS), Parkinson’s disease (PD), and Alzheimer’s disease (AD). Protein disaggregases that reverse protein misfolding and restore proteins to native structure, function, and localization could mitigate neurodegeneration by simultaneously reversing 1) any toxic gain of function of the misfolded form and 2) any loss of function due to misfolding. Potentiated variants of Hsp104, a hexameric AAA+ ATPase and protein disaggregase from yeast, have been engineered to robustly disaggregate misfolded proteins connected with ALS (e.g., TDP-43 and FUS) and PD (e.g., α-synuclein). However, Hsp104 has no metazoan homologue. Metazoa possess protein disaggregase systems distinct from Hsp104, including Hsp110, Hsp70, and Hsp40, as well as HtrA1, which might be harnessed to reverse deleterious protein misfolding. Nevertheless, vicissitudes of aging, environment, or genetics conspire to negate these disaggregase systems in neurodegenerative disease. Thus, engineering potentiated human protein disaggregases or isolating small-molecule enhancers of their activity could yield transformative therapeutics for ALS, PD, and AD. PMID:27255695

  14. Therapeutic Community in a California Prison: Treatment Outcomes after 5 Years

    ERIC Educational Resources Information Center

    Zhang, Sheldon X.; Roberts, Robert E. L.; McCollister, Kathryn E.

    2011-01-01

    Therapeutic communities have become increasingly popular among correctional agencies with drug-involved offenders. This quasi-experimental study followed a group of inmates who participated in a prison-based therapeutic community in a California state prison, with a comparison group of matched offenders, for more than 5 years after their initial…

  15. Peteye detection and correction

    NASA Astrophysics Data System (ADS)

    Yen, Jonathan; Luo, Huitao; Tretter, Daniel

    2007-01-01

    Redeyes are caused by the camera flash light reflecting off the retina. Peteyes refer to similar artifacts in the eyes of other mammals caused by camera flash. In this paper we present a peteye removal algorithm for detecting and correcting peteye artifacts in digital images. Peteye removal for animals is significantly more difficult than redeye removal for humans, because peteyes can be any of a variety of colors, and human face detection cannot be used to localize the animal eyes. In many animals, including dogs and cats, the retina has a special reflective layer that can cause a variety of peteye colors, depending on the animal's breed, age, or fur color, etc. This makes the peteye correction more challenging. We have developed a semi-automatic algorithm for peteye removal that can detect peteyes based on the cursor position provided by the user and correct them by neutralizing the colors with glare reduction and glint retention.
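
    The abstract names three ingredients of the correction step: color neutralization, glare reduction, and glint retention. A rough sketch of how those operations could look on a detected pupil region follows; the published algorithm's actual operations are not described in the abstract, so everything below (luminance weights, percentile, gain) is assumed:

```python
import numpy as np

def neutralize_peteye(image, mask, glare_gain=0.35, glint_percentile=99.0):
    """Roughly neutralize a detected peteye region.

    image : float RGB array in [0, 1], shape (H, W, 3)
    mask  : boolean array (H, W), True inside the detected pupil region

    Illustrative only: colors are neutralized to gray, brightness is reduced
    (glare reduction), and the brightest pixels are preserved (glint retention).
    """
    out = image.copy()
    region = image[mask]                               # (N, 3) pixels
    luma = region @ np.array([0.299, 0.587, 0.114])    # per-pixel luminance
    glint = luma >= np.percentile(luma, glint_percentile)
    corrected = np.repeat((luma * glare_gain)[:, None], 3, axis=1)  # dark gray
    corrected[glint] = region[glint]                   # keep the catchlight
    out[mask] = corrected
    return out
```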

  16. Aureolegraph internal scattering correction.

    PubMed

    DeVore, John; Villanucci, Dennis; LePage, Andrew

    2012-11-20

    Two methods of determining instrumental scattering for correcting aureolegraph measurements of particulate solar scattering are presented. One involves subtracting measurements made with and without an external occluding ball and the other is a modification of the Langley Plot method and involves extrapolating aureolegraph measurements collected through a large range of solar zenith angles. Examples of internal scattering correction determinations using the latter method show similar power-law dependencies on scattering, but vary by roughly a factor of 8 and suggest that changing aerosol conditions during the determinations render this method problematic. Examples of corrections of scattering profiles using the former method are presented for a range of atmospheric particulate layers from aerosols to cumulus and cirrus clouds. PMID:23207299

  17. Hypoxic Conditioning as a New Therapeutic Modality

    PubMed Central

    Verges, Samuel; Chacaroun, Samarmar; Godin-Ribuot, Diane; Baillieul, Sébastien

    2015-01-01

    Preconditioning refers to a procedure by which a single noxious stimulus below the threshold of damage is applied to the tissue in order to increase resistance to the same or even different noxious stimuli given above the threshold of damage. Hypoxic preconditioning relies on complex and active defenses that organisms have developed to counter the adverse consequences of oxygen deprivation. The protection it confers against ischemic attack, for instance, as well as the underlying biological mechanisms have been extensively investigated in animal models. Based on these data, hypoxic conditioning (consisting of recurrent exposure to hypoxia) has been suggested as a potential non-pharmacological therapeutic intervention to enhance some physiological functions in individuals in whom acute or chronic pathological events are anticipated or existing. In addition to healthy subjects, some benefits have been reported in patients with cardiovascular and pulmonary diseases as well as in overweight and obese individuals. Hypoxic conditioning consisting of sessions of intermittent exposure to moderate hypoxia repeated over several weeks may induce hematological, vascular, metabolic, and neurological effects. This review addresses the existing evidence regarding the use of hypoxic conditioning as a potential therapeutic modality, and emphasizes the many remaining issues to clarify and the future research to be performed in the field. PMID:26157787

  18. Target Mass Corrections Revisited

    SciTech Connect

    W. Melnitchouk; F. Steffens

    2006-03-07

    We propose a new implementation of target mass corrections to nucleon structure functions which, unlike existing treatments, has the correct kinematic threshold behavior at finite Q^2 in the x → 1 limit. We illustrate the differences between the new approach and existing prescriptions by considering specific examples for the F_2 and F_L structure functions, and discuss the broader implications of our results, which call into question the notion of universal parton distribution at finite Q^2.
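
    For orientation (this is the standard kinematic variable, not the new prescription proposed in the paper), target mass corrections are usually organized in terms of the Nachtmann scaling variable. Its behaviour at x = 1 is the origin of the threshold problem the abstract refers to: because xi stays below 1 at finite Q^2, naively corrected structure functions need not vanish at the kinematic endpoint.

```latex
\xi \;=\; \frac{2x}{1+\sqrt{1+4x^{2}M^{2}/Q^{2}}}\,,
\qquad
\xi\big|_{x=1} \;=\; \frac{2}{1+\sqrt{1+4M^{2}/Q^{2}}} \;<\; 1
\quad \text{at finite } Q^{2},
\qquad
\xi \to x \ \ \text{as} \ \ M^{2}/Q^{2} \to 0 .
```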

  19. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1979-01-01

    Optical measurements of range and elevation angle are distorted by the earth's atmosphere. High precision refraction correction equations are presented which are ideally suited for surveying because their inputs are optically measured range and optically measured elevation angle. The outputs are true straight line range and true geometric elevation angle. The 'short distances' used in surveying allow the calculations of true range and true elevation angle to be quickly made using a programmable pocket calculator. Topics covered include the spherical form of Snell's Law; ray path equations; and integrating the equations. Short-, medium-, and long-range refraction corrections are presented in tables.
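
    A toy version of such a correction can be written directly from the spherical form of Snell's law, n(r) r sin z = const, by marching along the ray and then comparing the chord to the end point with the apparent direction. The exponential refractivity profile, the step size, and the treatment of the measured range as a path length are simplifying assumptions, not the report's high-precision equations:

```python
import math

R_EARTH = 6_371_000.0          # mean Earth radius, m
N0 = 320e-6                    # surface refractivity n - 1 (assumed)
H = 8_000.0                    # refractivity scale height, m (assumed)

def n_of_h(h):
    """Simple exponential refractive-index profile (an assumed model)."""
    return 1.0 + N0 * math.exp(-h / H)

def refraction_correct(apparent_elev_deg, measured_range_m, ds=1.0):
    """Trace the ray with the spherical Snell invariant n(r) r sin(z) = const
    and return (true straight-line range, true geometric elevation in degrees)
    to the point reached after the measured path length."""
    r = R_EARTH
    z = math.radians(90.0 - apparent_elev_deg)      # local zenith angle of the ray
    k = n_of_h(0.0) * r * math.sin(z)               # Snell invariant
    theta, s = 0.0, 0.0                             # geocentric angle, path length
    while s < measured_range_m:
        step = min(ds, measured_range_m - s)
        r += step * math.cos(z)                     # radial advance
        theta += step * math.sin(z) / r             # angular advance
        z = math.asin(min(1.0, k / (n_of_h(r - R_EARTH) * r)))
        s += step
    dx = r * math.sin(theta)                        # along local horizontal
    dy = r * math.cos(theta) - R_EARTH              # along local vertical
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

print(refraction_correct(apparent_elev_deg=5.0, measured_range_m=20_000.0))
```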

  20. Correction coil cable

    DOEpatents

    Wang, S.T.

    1994-11-01

    A wire cable assembly adapted for the winding of electrical coils is taught. A primary intended use is in particle tube assemblies for the Superconducting Super Collider. The correction coil cables have wires collected in a wire array with a center rib sandwiched therebetween to form a core assembly. The core assembly is surrounded by an assembly housing having an inner spiral wrap and a counter wound outer spiral wrap. An alternate embodiment of the invention is rolled into a keystoned shape to improve radial alignment of the correction coil cable on a particle tube in a particle tube assembly. 7 figs.

  1. Mitochondrial Energetics and Therapeutics

    PubMed Central

    Wallace, Douglas C.; Fan, Weiwei; Procaccio, Vincent

    2011-01-01

    Mitochondrial dysfunction has been linked to a wide range of degenerative and metabolic diseases, cancer, and aging. All these clinical manifestations arise from the central role of bioenergetics in cell biology. Although genetic therapies are maturing as the rules of bioenergetic genetics are clarified, metabolic therapies have been ineffectual. This failure results from our limited appreciation of the role of bioenergetics as the interface between the environment and the cell. A systems approach, which, ironically, was first successfully applied over 80 years ago with the introduction of the ketogenic diet, is required. Analysis of the many ways that a shift from carbohydrate glycolytic metabolism to fatty acid and ketone oxidative metabolism may modulate metabolism, signal transduction pathways, and the epigenome gives us an appreciation of the ketogenic diet and the potential for bioenergetic therapeutics. PMID:20078222

  2. Principles of therapeutics.

    PubMed

    Miller, T R

    1992-12-01

    Topical administration of drugs is the treatment of choice for diseases of the anterior segment. Drug levels attained by this means are usually of short duration, however, necessitating frequent therapy or continuous perfusion if prolonged drug levels are required. A drug-delivery device (collagen shield or contact lens) or subconjunctival injections can be used to augment topical therapy if frequent treatment is not possible. Subconjunctival injections are recommended for drugs that have low solubility and, hence, low corneal penetration. Retrobulbar injections are seldom indicated, except for regional anesthesia. Systemic administration is useful for anti-inflammatory therapy but it may be difficult to establish therapeutic levels of antibiotic agents in the eye because of the blood-ocular barrier. In severe cases, intraocular injection may be required. PMID:1458325

  3. Aptamers in Therapeutics

    PubMed Central

    2016-01-01

    Aptamers are single-stranded DNA or RNA molecules, selected by an iterative process known as Systematic Evolution of Ligands by Exponential Enrichment (SELEX). Various advantages of aptamers, such as high temperature stability, animal-free and cost-effective production, and high affinity and selectivity for their targets, make them attractive alternatives to monoclonal antibodies for use in diagnostics and therapeutics. An aptamer has been generated against vascular endothelial growth factor 165, which is involved in age-related macular degeneration. Macugen was the first FDA-approved aptamer-based drug to be commercialized. Later, other aptamers were also developed against blood clotting proteins, cancer proteins, antibody E, agents involved in diabetic nephropathy, autoantibodies involved in autoimmune disorders, etc. Aptamers have also been developed against viruses and could work with other antiviral agents in treating infections. PMID:27504277

  4. Antibody Engineering and Therapeutics

    PubMed Central

    Almagro, Juan Carlos; Gilliland, Gary L; Breden, Felix; Scott, Jamie K; Sok, Devin; Pauthner, Matthias; Reichert, Janice M; Helguera, Gustavo; Andrabi, Raiees; Mabry, Robert; Bléry, Mathieu; Voss, James E; Laurén, Juha; Abuqayyas, Lubna; Barghorn, Stefan; Ben-Jacob, Eshel; Crowe, James E; Huston, James S; Johnston, Stephen Albert; Krauland, Eric; Lund-Johansen, Fridtjof; Marasco, Wayne A; Parren, Paul WHI; Xu, Kai Y

    2014-01-01

    The 24th Antibody Engineering & Therapeutics meeting brought together a broad range of participants who were updated on the latest advances in antibody research and development. Organized by IBC Life Sciences, the gathering is the annual meeting of The Antibody Society, which serves as the scientific sponsor. Preconference workshops on 3D modeling and delineation of clonal lineages were featured, and the conference included sessions on a wide variety of topics relevant to researchers, including systems biology; antibody deep sequencing and repertoires; the effects of antibody gene variation and usage on antibody response; directed evolution; knowledge-based design; antibodies in a complex environment; polyreactive antibodies and polyspecificity; the interface between antibody therapy and cellular immunity in cancer; antibodies in cardiometabolic medicine; antibody pharmacokinetics, distribution and off-target toxicity; optimizing antibody formats for immunotherapy; polyclonals, oligoclonals and bispecifics; antibody discovery platforms; and antibody-drug conjugates. PMID:24589717

  5. Microfabricated therapeutic actuators

    SciTech Connect

    Lee, Abraham P.; Northrup, M. Allen; Ciarlo, Dino R.; Krulevitch, Peter A.; Benett, William J.

    1999-01-01

    Microfabricated therapeutic actuators are fabricated using a shape memory polymer (SMP), a polyurethane-based material that undergoes a phase transformation at a specified temperature (Tg). At a temperature above Tg, the material is soft and can be easily reshaped into another configuration. As the temperature is lowered below Tg, the new shape is fixed and locked in as long as the material stays below Tg. Upon reheating the material to a temperature above Tg, the material will return to its original shape. By the use of such SMP material, SMP microtubing can be used as a release actuator for the delivery of embolic coils through catheters into aneurysms, for example. The microtubing can be manufactured in various sizes, and the phase-change temperature Tg is selected for the intended temperature target and intended use.

  6. Microfabricated therapeutic actuators

    DOEpatents

    Lee, A.P.; Northrup, M.A.; Ciarlo, D.R.; Krulevitch, P.A.; Benett, W.J.

    1999-06-15

    Microfabricated therapeutic actuators are fabricated using a shape memory polymer (SMP), a polyurethane-based material that undergoes a phase transformation at a specified temperature (Tg). At a temperature above Tg, the material is soft and can be easily reshaped into another configuration. As the temperature is lowered below Tg, the new shape is fixed and locked in as long as the material stays below Tg. Upon reheating the material to a temperature above Tg, the material will return to its original shape. By the use of such SMP material, SMP microtubing can be used as a release actuator for the delivery of embolic coils through catheters into aneurysms, for example. The microtubing can be manufactured in various sizes, and the phase-change temperature Tg is selected for the intended temperature target and intended use. 8 figs.

  7. Homocystinuria: Therapeutic approach.

    PubMed

    Kumar, Tarun; Sharma, Gurumayum Suraj; Singh, Laishram Rajendrakumar

    2016-07-01

    Homocystinuria is a disorder of sulfur metabolism pathway caused by deficiency of cystathionine β-synthase (CBS). It is characterized by increased accumulation of homocysteine (Hcy) in the cells and plasma. Increased homocysteine results in various vascular and neurological complications. Present strategies to lower cellular and plasma homocysteine levels include vitamin B6 intake, dietary methionine restriction, betaine supplementation, folate and vitamin B12 administration. However, these strategies are inefficient for treatment of homocystinuria. In recent years, advances have been made towards developing new strategies to treat homocystinuria. These mainly include functional restoration to mutant CBS, enhanced clearance of Hcy from the body, prevention of N-homocysteinylation-induced toxicity and inhibition of homocysteine-induced oxidative stress. In this review, we have exclusively discussed the recent advances that have been achieved towards the treatment of homocystinuria. The review is an attempt to help clinicians in developing effective therapeutic strategies and designing novel drugs against homocystinuria. PMID:27059523

  8. Antioxidant therapeutics: Pandora's box.

    PubMed

    Day, Brian J

    2014-01-01

    Evolution has favored the utilization of dioxygen (O2) in the development of complex multicellular organisms. O2 is actually a toxic mutagenic gas that is highly oxidizing and combustible. It is thought that plants are largely to blame for polluting the earth's atmosphere with O2 owing to the development of photosynthesis by blue-green algae over 2 billion years ago. The rise of the plants and atmospheric O2 levels placed evolutionary stress on organisms to adapt or become extinct. This implies that all the surviving creatures on our planet are mutants that have adapted to the "abnormal biology" of O2. Much of the adaptation to the presence of O2 in biological systems comes from well-coordinated antioxidant and repair systems that focus on converting O2 to its most reduced form, water (H2O), and the repair and replacement of damaged cellular macromolecules. Biological systems have also harnessed O2's reactive properties for energy production, xenobiotic metabolism, and host defense and as a signaling messenger and redox modulator of a number of cell signaling pathways. Many of these systems involve electron transport systems and offer many different mechanisms by which antioxidant therapeutics can alternatively produce an antioxidant effect without directly scavenging oxygen-derived reactive species. It is likely that each agent will have a different set of mechanisms that may change depending on the model of oxidative stress, organ system, or disease state. An important point is that all biological processes of aerobes have coevolved with O2 and this creates a Pandora's box for trying to understand the mechanism(s) of action of antioxidants being developed as therapeutic agents. PMID:23856377

  9. GTI-2040. Lorus Therapeutics.

    PubMed

    Orr, R M

    2001-10-01

    Lorus Therapeutics (formerly GeneSense Therapeutics) is developing the antisense oligonucleotide GTI-2040, directed against the R2 component of ribonucleotide reductase, for the potential treatment of cancer [348194]. It is in phase I/II trials [353796] and Lorus had anticipated phase II trials would be initiated in July 2001. By August 2001, GTI-2040 was undergoing a phase II trial as a monotherapy for the potential treatment of renal cell carcinoma, and was about to enter a phase II combination study for this indication with capecitabine (Hoffmann-La Roche). At this time, the company was also planning a phase II trial to study the drug's potential in the treatment of colorectal cancer [418739]. GTI-2040 has been tested in nine different tumor models, including tumors derived from colon, liver, lung, breast, kidney and ovary. Depending on the tumor model, significant inhibition of tumor growth, disease stabilization and dramatic tumor regressions were observed [347683]. Lorus filed an IND to commence phase I/II trials with GTI-2040 in the US in November 1999 [347683], and received approval for the trials in December 1999 [349623]. As of January 2000, these trials had commenced at the University of Chicago Cancer Research Center; it was reported in February 2000 that dosing to date had been well tolerated with no apparent safety concerns [357449]. Lorus has entered into a strategic supply alliance with Proligo to provide the higher volumes of drug product required for the planned multiple phase II trials [385976]. In February 1998, GeneSense (now Lorus) received patent WO-09805769. Lorus also received a patent (subsequently identified as WO-00047733) from the USPTO in January 2000, entitled 'Antitumor antisense sequences directed against components of ribonucleotide reductase' covering the design and use of unique antisense anticancer drugs, including GTI-2040 and GTI-2501 [353538]. PMID:11890366

  10. Atmospheric Correction Algorithm for Hyperspectral Imagery

    SciTech Connect

    R. J. Pollina

    1999-09-01

    In December 1997, the US Department of Energy (DOE) established a Center of Excellence (Hyperspectral-Multispectral Algorithm Research Center, HyMARC) for promoting the research and development of algorithms to exploit spectral imagery. This center is located at the DOE Remote Sensing Laboratory in Las Vegas, Nevada, and is operated for the DOE by Bechtel Nevada. This paper presents the results to date of a research project begun at the center during 1998 to investigate the correction of hyperspectral data for atmospheric aerosols. Results of a project conducted by the Rochester Institute of Technology to define, implement, and test procedures for absolute calibration and correction of hyperspectral data to absolute units of high spectral resolution imagery will be presented. Hybrid techniques for atmospheric correction using image or spectral scene data coupled through radiative propagation models will be specifically addressed. Results of this effort to analyze HYDICE sensor data will be included. Preliminary results based on studying the performance of standard routines, such as Atmospheric Pre-corrected Differential Absorption and Nonlinear Least Squares Spectral Fit, in retrieving reflectance spectra show overall reflectance retrieval errors of approximately one to two reflectance units in the 0.4- to 2.5-micron-wavelength region (outside of the absorption features). These results are based on HYDICE sensor data collected from the Southern Great Plains Atmospheric Radiation Measurement site during overflights conducted in July of 1997. Results of an upgrade made in the model-based atmospheric correction techniques, which take advantage of updates made to the moderate resolution atmospheric transmittance model (MODTRAN 4.0) software, will also be presented. Data will be shown to demonstrate how the reflectance retrieval in the shorter wavelengths of the blue-green region will be improved because of enhanced modeling of multiple scattering effects.

  11. The Digital Correction Unit: A data correction/compaction chip

    SciTech Connect

    MacKenzie, S.; Nielsen, B.; Paffrath, L.; Russell, J.; Sherden, D.

    1986-10-01

    The Digital Correction Unit (DCU) is a semi-custom CMOS integrated circuit which corrects and compacts data for the SLD experiment. It performs a piece-wise linear correction to data, and implements two separate compaction algorithms. This paper describes the basic functionality of the DCU and its correction and compaction algorithms.
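
    A piece-wise linear correction of the kind described amounts to a breakpoint table plus interpolation. The sketch below shows the idea in software; the DCU's actual table format, resolution and hardware implementation are not described in the abstract, so the lookup structure and example numbers are assumptions:

```python
import bisect

def piecewise_linear_correct(raw, breakpoints, corrected_values):
    """Apply a piece-wise linear correction to a raw value.

    breakpoints      : sorted raw values at the segment boundaries
    corrected_values : corrected output at each breakpoint
    Values outside the table are extrapolated from the end segments."""
    i = bisect.bisect_right(breakpoints, raw) - 1
    i = max(0, min(i, len(breakpoints) - 2))        # clamp to a valid segment
    x0, x1 = breakpoints[i], breakpoints[i + 1]
    y0, y1 = corrected_values[i], corrected_values[i + 1]
    return y0 + (raw - x0) * (y1 - y0) / (x1 - x0)

# Example lookup table with made-up calibration points
print(piecewise_linear_correct(130, [0, 100, 200, 300], [0, 95, 210, 330]))
```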

  12. Counselor Education for Corrections.

    ERIC Educational Resources Information Center

    Parsigian, Linda

    Counselor education programs most often prepare their graduates to work in either a school setting, anywhere from the elementary level through higher education, or a community agency. There is little indication that counselor education programs have seriously undertaken the task of training counselors to enter the correctional field. If…

  13. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1980-01-01

    Optical measurements of range and elevation angles are distorted by refraction of Earth's atmosphere. Theoretical discussion of effect, along with equations for determining exact range and elevation corrections, is presented in report. Potentially useful in optical site surveying and related applications, analysis is easily programmed on pocket calculator. Input to equation is measured range and measured elevation; output is true range and true elevation.

  14. Correction and Communicative Activity.

    ERIC Educational Resources Information Center

    Williams, Huw P.

    1980-01-01

    In classes where the communicative approach to language teaching is taken and where learners are asked to form groups in order to communicate, the teacher should be ready to respond to requests, give immediate correction, and use a monitoring sheet to note errors. The sheet can also be used for individual students. (PJM)

  15. Writing: Revisions and Corrections

    ERIC Educational Resources Information Center

    Kohl, Herb

    1978-01-01

    A fifth grader wanted to know what he had to do to get all his ideas the way he wanted them in his story writing "and" have the spelling, punctuation and quotation marks correctly styled. His teacher encouraged him to think about writing as a process and provided the student with three steps as guidelines for effective writing. (Author/RK)

  16. Exposure Corrections for Macrophotography

    ERIC Educational Resources Information Center

    Nikolic, N. M.

    1976-01-01

    Describes a method for determining the exposure correction factors in close-up photography and macrophotography. The method eliminates all calculations during picture-taking, and allows the use of a light meter to obtain the proper f-stop/exposure time combinations. (Author/MLH)
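
    The standard relation behind such corrections (a well-known result, though the article's own tabulated method is not reproduced here) is the bellows-extension factor: for image magnification m with a simple symmetrical lens, the exposure factor is (1 + m)^2, i.e. 2 * log2(1 + m) extra stops:

```python
import math

def macro_exposure_correction(magnification):
    """Return (exposure factor, correction in f-stops) for image magnification m,
    using the bellows-extension relation factor = (1 + m)^2, valid for a simple
    symmetrical lens (pupil magnification of 1)."""
    factor = (1.0 + magnification) ** 2
    stops = 2.0 * math.log2(1.0 + magnification)
    return factor, stops

# At life-size reproduction (m = 1) the factor is 4x: open up 2 stops
# or multiply the exposure time by 4.
print(macro_exposure_correction(1.0))
```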

  17. Passive localization in ocean acoustics: A model-based approach

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1995-09-01

    A model-based approach is developed to solve the passive localization problem in ocean acoustics using the state-space formulation for the first time. It is shown that the inherent structure of the resulting processor consists of a parameter estimator coupled to a nonlinear optimization scheme. The parameter estimator is designed using the model-based approach, in which an ocean acoustic propagation model is used in developing the model-based processor required for localization. Recall that model-based signal processing is a well-defined methodology enabling the inclusion of environmental (propagation) models, measurement (sensor array) models, and noise (shipping, measurement) models into a sophisticated processing algorithm. Here the parameter estimator, or more appropriately the model-based identifier (MBID), is designed for a propagation model developed from a shallow water ocean experiment. After simulation, it is then applied to a set of experimental data, demonstrating the applicability of this approach. © 1995 Acoustical Society of America.

  18. Model-based ocean acoustic passive localization. Revision 1

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1994-06-01

    A model-based approach is developed (theoretically) to solve the passive localization problem. Here the authors investigate the design of a model-based identifier for a shallow water ocean acoustic problem characterized by a normal-mode model. In this problem they show how the processor can be structured to estimate the vertical wave numbers directly from measured pressure-field and sound speed measurements thereby eliminating the need for synthetic aperture processing or even a propagation model solution. Finally, they investigate various special cases of the source localization problem, designing a model-based localizer for each and evaluating the underlying structure with the expectation of gaining more and more insight into the general problem.

  19. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    NASA Technical Reports Server (NTRS)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but provides for support of Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  20. Management of antipsychotic treatment discontinuation and interruptions using model-based simulations

    PubMed Central

    Samtani, Mahesh N; Sheehan, John J; Fu, Dong-Jing; Remmerie, Bart; Sliwa, Jennifer Kern; Alphs, Larry

    2012-01-01

    Background: Medication nonadherence is a well described and prevalent clinical occurrence in schizophrenia. These pharmacokinetic model-based simulations analyze predicted antipsychotic plasma concentrations in nonadherence and treatment interruption scenarios and with treatment reinitiation. Methods: Starting from steady state, pharmacokinetic model-based simulations of active moiety plasma concentrations of oral, immediate-release risperidone 3 mg/day, risperidone long-acting injection 37.5 mg/14 days, oral paliperidone extended-release 6 mg/day, and paliperidone palmitate 117 mg (75 mg equivalents)/28 days were assessed under three treatment discontinuation/interruption scenarios, ie, complete discontinuation, one week of interruption, and four weeks of interruption. In the treatment interruption scenarios, pharmacokinetic simulations were performed using medication-specific reinitiation strategies. Results: Following complete treatment discontinuation, plasma concentrations persisted longest with paliperidone palmitate, followed by risperidone long-acting injection, while oral formulations exhibited the most rapid decrease. One week of oral paliperidone or risperidone interruption resulted in near complete elimination from the systemic circulation within that timeframe, reflecting the rapid elimination rate of the active moiety. After 1 and 4 weeks of interruption, minimum plasma concentrations were higher with paliperidone palmitate than risperidone long-acting injection over the simulated period. Four weeks of treatment interruption followed by reinitiation resulted in plasma levels returning to predicted therapeutic levels within 1 week. Conclusion: Due to the long half-life of paliperidone palmitate (25–49 days), putative therapeutic plasma concentrations persisted longest in simulated cases of complete discontinuation or treatment interruption. These simulations may help clinicians better conceptualize the impact of antipsychotic nonadherence on plasma
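
    The qualitative behaviour described here can be reproduced with a generic superposition model. The sketch below uses a one-compartment, first-order-absorption model with invented parameters, not the published population pharmacokinetic models for risperidone or paliperidone, purely to illustrate how interruption and reinitiation scenarios are simulated:

```python
import numpy as np

def simulate_oral_regimen(dose_times_h, dose_mg, ka, ke, v_d, t_end_h, dt=0.5):
    """Superposition of one-compartment, first-order-absorption oral doses.

    Illustrative only: the compartment structure and all parameter values
    (ka, ke in 1/h; v_d in L) are assumptions, not fitted population models."""
    t = np.arange(0.0, t_end_h, dt)
    conc = np.zeros_like(t)
    for td in dose_times_h:
        tau = np.clip(t - td, 0.0, None)            # time since this dose (0 before it)
        conc += (dose_mg * ka / (v_d * (ka - ke))
                 * (np.exp(-ke * tau) - np.exp(-ka * tau)))
    return t, conc

# Daily dosing for 14 days, a 7-day interruption, then reinitiation for 7 days.
doses = list(range(0, 14 * 24, 24)) + list(range(21 * 24, 28 * 24, 24))
t, c = simulate_oral_regimen(doses, dose_mg=6.0, ka=0.5, ke=0.03, v_d=400.0,
                             t_end_h=30 * 24)
print("trough before interruption:", round(c[np.searchsorted(t, 14 * 24)], 4), "mg/L")
print("level after the 7-day gap :", round(c[np.searchsorted(t, 21 * 24)], 4), "mg/L")
```

    Replacing the elimination rate constant with a much smaller value (mimicking a long-acting depot) shows the slower decay after discontinuation that the abstract reports for paliperidone palmitate.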

  1. When Does Model-Based Control Pay Off?

    PubMed

    Kool, Wouter; Cushman, Fiery A; Gershman, Samuel J

    2016-08-01

    Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to "model-free" and "model-based" strategies in reinforcement learning. Model-free strategies are computationally cheap, but sometimes inaccurate, because action values can be accessed by inspecting a look-up table constructed through trial-and-error. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, does not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand. PMID:27564094

  2. Therapeutic proteins: A to Z.

    PubMed

    Ozgur, Aykut; Tutar, Yusuf

    2013-12-01

    In recent years, therapeutic proteins have become an important and growing class of drugs in the pharmaceutical industry. The development of recombinant DNA technology has led to an appreciation of the therapeutic value of many proteins and peptides in medicine. Currently, approximately 100 therapeutic proteins have obtained approval from the Food and Drug Administration (FDA), and they are widely used in the treatment of various diseases such as cancer, diabetes, anemia and infections. This paper summarizes the production processes, pharmaceutical and physicochemical properties, and important classes of therapeutic proteins, with their potential use in clinical applications. PMID:24261980

  3. Therapeutic cloning: The ethical limits

    SciTech Connect

    Whittaker, Peter A. . E-mail: p.whittaker@lancaster.ac.uk

    2005-09-01

    A brief outline of stem cells, stem cell therapy and therapeutic cloning is given. The position of therapeutic cloning with regard to other embryonic manipulations - IVF-based reproduction, embryonic stem cell formation from IVF embryos and reproductive cloning - is indicated. The main ethically challenging stages in therapeutic cloning are considered to be the nuclear transfer process, including the source of eggs for this, and the destruction of an embryo to provide stem cells for therapeutic use. The extremely polarised nature of the debate regarding the status of an early human embryo is noted, and some potential alternative strategies for preparing immunocompatible pluripotent stem cells are indicated.

  4. Clinical applications of therapeutic phlebotomy

    PubMed Central

    Kim, Kyung Hee; Oh, Ki Young

    2016-01-01

    Phlebotomy is the removal of blood from the body, and therapeutic phlebotomy is the preferred treatment for blood disorders in which the removal of red blood cells or serum iron is the most efficient method for managing the symptoms and complications. Therapeutic phlebotomy is currently indicated for the treatment of hemochromatosis, polycythemia vera, porphyria cutanea tarda, sickle cell disease, and nonalcoholic fatty liver disease with hyperferritinemia. This review discusses therapeutic phlebotomy and the related disorders and also offers guidelines for establishing a therapeutic phlebotomy program. PMID:27486346

  5. In silico model-based inference: a contemporary approach for hypothesis testing in network biology

    PubMed Central

    Klinke, David J.

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900’s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. PMID:25139179
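
    As a concrete, minimal instance of the suggested blend of Bayesian statistics and chemical kinetics (the reaction network, noise model and sampler settings below are all assumptions made for illustration), one can infer a rate constant for a first-order decay with a random-walk Metropolis sampler and report a credible interval rather than a single test statistic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: first-order decay A -> B observed with Gaussian noise
# (a toy network standing in for the signaling models discussed in the paper).
k_true, sigma = 0.3, 0.05
t_obs = np.linspace(0.0, 10.0, 15)
y_obs = np.exp(-k_true * t_obs) + rng.normal(0.0, sigma, t_obs.size)

def log_posterior(log_k):
    """Gaussian likelihood for the decay model plus a flat prior on log k."""
    if not -5.0 < log_k < 5.0:
        return -np.inf
    resid = y_obs - np.exp(-np.exp(log_k) * t_obs)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis: the posterior samples of k quantify how confidently
# the postulated model explains the acquired data.
samples, log_k = [], np.log(0.1)
for _ in range(20000):
    proposal = log_k + rng.normal(0.0, 0.2)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(log_k):
        log_k = proposal
    samples.append(np.exp(log_k))

print(np.percentile(samples[5000:], [2.5, 50, 97.5]))  # credible interval for k
```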

  6. Paediatric models in motion: requirements for model-based decision support at the bedside.

    PubMed

    Barrett, Jeffrey S

    2015-01-01

    Optimal paediatric pharmacotherapy is reliant on a detailed understanding of the individual patient including their developmental status and disease state as well as the pharmaceutical agents he/she is receiving for treatment or management of side effects. Our appreciation for size and maturation effects on the pharmacokinetic/pharmacodynamic (PK/PD) phenomenon has improved to the point that we can develop predictive models that permit us to individualize therapy, especially in the situation where we are monitoring drug effects or therapeutic concentrations. The growth of efforts to guide paediatric pharmacotherapy via model-based decision support necessitates a coordinated and systematic approach to ensuring reliable and robust output to caregivers that represents the current standard of care and adheres to governance imposed by the host institution or coalition responsible. Model-based systems which guide caregivers on dosing paediatric patients in a more comprehensive manner are in development at several institutions. Care must be taken that these systems provide robust guidance with the current best practice. These systems must evolve as new information becomes available and ultimately are best constructed from diverse data representing global input on demographics, ethnic / racial diversity, diet and other lifestyle factors. Multidisciplinary involvement at the project team level is key to the ultimate clinical valuation. Likewise, early engagement of clinical champions is also critical for the success of model-based tools. Adherence to regulatory requirements as well as best practices with respect to software development and testing are essential if these tools are to be used as part of the routine standard of care. PMID:24251868

  7. Paediatric models in motion: requirements for model-based decision support at the bedside

    PubMed Central

    Barrett, Jeffrey S

    2015-01-01

    Optimal paediatric pharmacotherapy is reliant on a detailed understanding of the individual patient including their developmental status and disease state as well as the pharmaceutical agents he/she is receiving for treatment or management of side effects. Our appreciation for size and maturation effects on the pharmacokinetic/pharmacodynamic (PK/PD) phenomenon has improved to the point that we can develop predictive models that permit us to individualize therapy, especially in the situation where we are monitoring drug effects or therapeutic concentrations. The growth of efforts to guide paediatric pharmacotherapy via model-based decision support necessitates a coordinated and systematic approach to ensuring reliable and robust output to caregivers that represents the current standard of care and adheres to governance imposed by the host institution or coalition responsible. Model-based systems which guide caregivers on dosing paediatric patients in a more comprehensive manner are in development at several institutions. Care must be taken that these systems provide robust guidance with the current best practice. These systems must evolve as new information becomes available and ultimately are best constructed from diverse data representing global input on demographics, ethnic / racial diversity, diet and other lifestyle factors. Multidisciplinary involvement at the project team level is key to the ultimate clinical valuation. Likewise, early engagement of clinical champions is also critical for the success of model-based tools. Adherence to regulatory requirements as well as best practices with respect to software development and testing are essential if these tools are to be used as part of the routine standard of care. PMID:24251868

  8. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  9. Verification and Validation of Model-Based Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2001-01-01

    This paper presents a three year project (FY99 to FY01) on the verification and validation of model based autonomous systems. The topics include: 1) Project Profile; 2) Model-Based Autonomy; 3) The Livingstone MIR; 4) MPL2SMV; 5) Livingstone to SMV Translation; 6) Symbolic Model Checking; 7) From Livingstone Models to SMV Models; 8) Application In-Situ Propellant Production; 9) Closed-Loop Verification Principle; 10) Livingstone PathFinder (LPF); 11) Publications and Presentations; and 12) Future Directions. This paper is presented in viewgraph form.

  10. A Model Based Mars Climate Database for the Mission Design

    NASA Technical Reports Server (NTRS)

    2005-01-01

    A viewgraph presentation on a model based climate database is shown. The topics include: 1) Why a model based climate database?; 2) Mars Climate Database v3.1 Who uses it ? (approx. 60 users!); 3) The new Mars Climate database MCD v4.0; 4) MCD v4.0: what's new ? 5) Simulation of Water ice clouds; 6) Simulation of Water ice cycle; 7) A new tool for surface pressure prediction; 8) Access to the database MCD 4.0; 9) How to access the database; and 10) New web access

  11. Experimental Therapeutics for Dystonia

    PubMed Central

    Jinnah, H. A.; Hess, Ellen J.

    2008-01-01

    Dystonia is a neurological syndrome characterized by excessive involuntary muscle contractions leading to twisting movements and unnatural postures. It has many different clinical manifestations, and many different causes. More than 3 million people worldwide suffer from dystonia, yet there are few broadly effective treatments. In the past decade, progress in research has advanced our understanding of the pathogenesis of dystonia to a point where drug discovery efforts are now feasible. There are several strategies that can be used to develop novel therapeutics for dystonia. Existing therapies have only modest efficacy, but may be refined and improved to increase benefits while reducing side effects. Identifying rational targets for drug intervention based on the pathogenesis of dystonia is another strategy. The surge in both basic and clinical research discoveries has provided insights at all levels including etiological, physiological and nosological, to enable such a targeted approach. The empirical approach to drug discovery is complementary to the rational approach, whereby compounds are identified using a non-mechanistic strategy. With the recent development of multiple animal models of dystonia, it is now possible to develop assays and perform drug screens on vast numbers of compounds. This multifaceted approach to drug discovery in dystonia will likely provide lead compounds that can then be translated for clinical use. PMID:18394563

  12. Therapeutic Cancer Vaccines.

    PubMed

    Ye, Zhenlong; Li, Zhong; Jin, Huajun; Qian, Qijun

    2016-01-01

    Cancer is one of the leading causes of death. Prevention and treatment of cancer are important ways to decrease the incidence of tumorigenesis and prolong patients' lives. Breakthrough achievements in cancer immunotherapy have recently attracted much attention after many failures in basic and clinical research. Based on deep analysis of the genomics and proteomics of tumor antigens, a variety of cancer vaccines targeting tumor antigens have been tested in preclinical and human clinical trials. Many therapeutic cancer vaccines, alone or in combination with other conventional treatments for cancer, have shown remarkable efficacy, indicating tremendous potential for clinical application. With the illustration of the underlying mechanisms of cancer immune regulation, valid, controllable, and persistent cancer vaccines will play important roles in cancer treatment, survival extension, and relapse and cancer prevention. This chapter mainly summarizes recent progress and developments in cancer vaccine research and clinical application, exploring the existing obstacles in cancer vaccine research and ways to promote the efficacy of cancer vaccines. PMID:27240458

  13. [Hypercholesterolemia: a therapeutic approach].

    PubMed

    Moráis López, A; Lama More, R A; Dalmau Serra, J

    2009-05-01

    High blood cholesterol levels represent an important cardiovascular risk factor. Hypercholesterolemia is defined as levels of total cholesterol and low-density lipoprotein cholesterol above 95th percentile for age and gender. For the paediatric population, selective screening is recommended in children older than 2 years who are overweight, with a family history of early cardiovascular disease or whose parents have high cholesterol levels. Initial therapeutic approach includes diet therapy, appropriate physical activity and healthy lifestyle changes. Drug treatment should be considered in children from the age of 10 who, after having followed appropriate diet recommendations, still have very high LDL-cholesterol levels or moderately high levels with concomitant risk factors. In case of extremely high LDL-cholesterol levels, drug treatment should be taken into consideration at earlier ages (8 years old). Modest response is usually observed with bile acid-binding resins. Statins can be considered first-choice drugs, once evidence on their efficacy and safety has been shown. PMID:19427823

  14. Leech Therapeutic Applications

    PubMed Central

    Abdualkader, A. M.; Ghawi, A. M.; Alaama, M.; Awang, M.; Merzouk, A.

    2013-01-01

    Hematophagous animals including leeches have been known to possess biologically active compounds in their secretions, especially in their saliva. The blood-sucking annelids, leeches, have been used for therapeutic purposes since the beginning of civilization. Ancient Egyptian, Indian, Greek and Arab physicians used leeches for a wide range of diseases, starting from the conventional use for bleeding to systemic ailments such as skin diseases, nervous system abnormalities, urinary and reproductive system problems, inflammation, and dental problems. Recently, extensive research on leech saliva has unveiled the presence of a variety of bioactive peptides and proteins, including antithrombins (hirudin, bufrudin), antiplatelet agents (calin, saratin), factor Xa inhibitors (lefaxin), antibacterials (theromacin, theromyzin) and others. Consequently, the leech has made a comeback as a new remedy for many chronic and life-threatening abnormalities, such as cardiovascular problems, cancer, metastasis, and infectious diseases. In the 20th century, leech therapy established itself in plastic and microsurgery as a protective tool against venous congestion and served to salvage replanted digits and flaps. Many clinics for plastic surgery all over the world have started to use leeches for cosmetic purposes. Despite the efficacious properties of leech therapy, the safety and complications of leeching are still controversial. PMID:24019559

  15. Plasmids encoding therapeutic agents

    DOEpatents

    Keener, William K.

    2007-08-07

    Plasmids encoding anti-HIV and anti-anthrax therapeutic agents are disclosed. Plasmid pWKK-500 encodes a fusion protein containing DP178 as a targeting moiety, the ricin A chain, an HIV protease cleavable linker, and a truncated ricin B chain. N-terminal extensions of the fusion protein include the maltose binding protein and a Factor Xa protease site. C-terminal extensions include a hydrophobic linker, an L domain motif peptide, a KDEL ER retention signal, another Factor Xa protease site, an out-of-frame buforin II coding sequence, the lacZα peptide, and a polyhistidine tag. More than twenty derivatives of plasmid pWKK-500 are described. Plasmids pWKK-700 and pWKK-800 are similar to pWKK-500 wherein the DP178-encoding sequence is substituted by RANTES- and SDF-1-encoding sequences, respectively. Plasmid pWKK-900 is similar to pWKK-500 wherein the HIV protease cleavable linker is substituted by a lethal factor (LF) peptide-cleavable linker.

  16. [Liver metastasis: therapeutic strategy].

    PubMed

    Gennari, L; Doci, R; Bignami, P

    1996-01-01

    The liver is one of the most frequent sites of metastatic growth, in particular from digestive malignancies (DM). The first goal is to reduce the incidence of metastases. Adjuvant systemic chemotherapies have been demonstrated to reduce the recurrence rate and to improve survival in Dukes C colon cancer. Fluorouracil, modulated by leucovorin or levamisole, is the mainstay of adjuvant treatment. A short postoperative administration of fluorouracil by the intraportal route has been tested, but the results are controversial. Adjuvant treatments for different DM are under investigation. When hepatic metastases are clinically evident, therapeutic decisions depend on several factors: site and nature of the primary, extent of hepatic and extrahepatic disease, patient characteristics, and efficacy of treatments. A staging system should be adopted to allow a rational approach. In selected cases a locoregional treatment can achieve consistent results. Hepatic intra-arterial chemotherapy (HIAC) for colorectal metastases achieves objective responses in more than 50% of patients, and survival seems positively affected. When feasible, R0 hepatic resection is the most effective treatment, with a five-year survival rate of about 30% when metastases are from colorectal cancer. Since the liver is the most frequent site of recurrence after resection, repeat resections have been successfully performed. PMID:9214269

  17. OPC modeling and correction solutions for EUV lithography

    NASA Astrophysics Data System (ADS)

    Word, James; Zuniga, Christian; Lam, Michael; Habib, Mohamed; Adam, Kostas; Oliver, Michael

    2011-11-01

    The introduction of EUV lithography into the semiconductor fabrication process will enable a continuation of Moore's law below the 22nm technology node. EUV lithography will, however, introduce new sources of patterning distortions which must be accurately modeled and corrected with software. Flare caused by scattered light in the projection optics results in pattern density-dependent imaging errors. The combination of non-telecentric reflective optics with reflective reticles results in mask shadowing effects. Reticle absorber materials are likely to have non-zero reflectivity due to the need to balance absorber stack height with minimization of mask shadowing effects. Depending upon the placement of adjacent fields on the wafer, reflectivity along their border can result in inter-field imaging effects near the edge of neighboring exposure fields. Finally, there exist the ever-present optical proximity effects caused by diffraction-limited imaging and resist and etch process effects. To enable EUV lithography in production, it is expected that OPC will be called upon to compensate for most of these effects. With the anticipated small imaging error budgets at sub-22nm nodes it is highly likely that only full model-based OPC solutions will have the required accuracy. The authors will explore the current capabilities of model-based OPC software to model and correct for each of the EUV imaging effects. Modeling, simulation, and correction methodologies will be defined, and experimental results of a full model-based OPC flow for EUV lithography will be presented.
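
    The abstract above names the imaging effects to be corrected but not the correction loop itself. As a rough, hedged illustration of the generic model-based OPC feedback structure (simulate the printed pattern, measure the edge placement error, move the mask edge, repeat), a toy Python sketch follows; the one-dimensional "imaging model" with a fixed bias and a density-dependent flare term, the coefficients, and the function names are all invented for illustration and are not the authors' model or software.

        # Minimal sketch of an iterative model-based OPC correction loop.
        # The simulator is a toy 1-D stand-in (fixed bias + density-dependent
        # flare), NOT a real EUV imaging model; only the feedback structure
        # is meant to be illustrative.

        def toy_printed_width(mask_width, pattern_density,
                              bias=-4.0, flare_coeff=6.0):
            """Toy model: printed width = mask width + bias + flare term (nm)."""
            return mask_width + bias + flare_coeff * pattern_density

        def opc_correct(target_width, pattern_density,
                        max_iter=20, step=0.7, tol=0.1):
            """Adjust the mask width until the simulated printed width
            matches the design target within `tol` nanometres."""
            mask_width = target_width               # start from the design target
            for _ in range(max_iter):
                epe = toy_printed_width(mask_width, pattern_density) - target_width
                if abs(epe) < tol:                  # edge placement error small enough
                    break
                mask_width -= step * epe            # move the edge against the error
            return mask_width

        if __name__ == "__main__":
            corrected = opc_correct(target_width=22.0, pattern_density=0.35)
            print(f"corrected mask width: {corrected:.2f} nm")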

  18. Correction coil cable

    DOEpatents

    Wang, Sou-Tien

    1994-11-01

    A wire cable assembly (10, 310) adapted for the winding of electrical coils is taught. A primary intended use is in particle tube assemblies (532) for the Superconducting Super Collider. The correction coil cables (10, 310) have wires (14, 314) collected in wire arrays (12, 312) with a center rib (16, 316) sandwiched therebetween to form a core assembly (18, 318). The core assembly (18, 318) is surrounded by an assembly housing (20, 320) having an inner spiral wrap (22, 322) and a counter-wound outer spiral wrap (24, 324). An alternate embodiment (410) of the invention is rolled into a keystoned shape to improve radial alignment of the correction coil cable (410) on a particle tube (733) in a particle tube assembly (732).

  19. CTI Correction Code

    NASA Astrophysics Data System (ADS)

    Massey, Richard; Stoughton, Chris; Leauthaud, Alexie; Rhodes, Jason; Koekemoer, Anton; Ellis, Richard; Shaghoulian, Edgar

    2013-07-01

    Charge Transfer Inefficiency (CTI) due to radiation damage above the Earth's atmosphere creates spurious trailing in images from Charge-Coupled Device (CCD) imaging detectors. Radiation damage also creates unrelated warm pixels, which can be used to measure CTI. This code provides pixel-based correction for CTI and has proven effective on raw images from the Hubble Space Telescope Advanced Camera for Surveys, successfully reducing the CTI trails by a factor of ~30 everywhere in the CCD and at all flux levels. The core is written in Java for speed, and a front-end user interface is provided in IDL. The code operates on raw data by returning individual electrons to the pixels from which they were unintentionally dragged during readout. Correction takes about 25 minutes per ACS exposure, but is trivially parallelisable to multiple processors.
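
    The abstract states the principle (electrons are returned to the pixels from which they were dragged during readout) without giving the algorithm. One common way to realize such a pixel-based correction is to iterate a forward trailing model and subtract the predicted trail; the Python sketch below illustrates only that iterative structure. The toy trap parameters, the exponential trail shape, and the function names are assumptions for illustration and are not the actual Java/IDL implementation described above.

        import numpy as np

        def toy_add_cti_trail(column, trap_fraction=0.05, release=0.6, trail_len=5):
            """Toy forward model: each pixel loses a small fraction of its charge,
            which reappears as an exponential trail in pixels read out after it."""
            out = column.astype(float).copy()
            for i in range(len(column)):
                captured = trap_fraction * column[i]
                out[i] -= captured
                for k in range(1, trail_len + 1):
                    if i + k < len(column):
                        out[i + k] += captured * (1 - release) ** (k - 1) * release
            return out

        def correct_cti(observed, n_iter=5):
            """Iteratively estimate the un-trailed column: re-apply the forward
            trailing model to the current estimate and subtract the predicted trail."""
            estimate = observed.astype(float).copy()
            for _ in range(n_iter):
                predicted_trail = toy_add_cti_trail(estimate) - estimate
                estimate = observed - predicted_trail
            return estimate

        if __name__ == "__main__":
            truth = np.zeros(20)
            truth[5] = 1000.0                      # a single bright pixel
            observed = toy_add_cti_trail(truth)    # trailed during "readout"
            recovered = correct_cti(observed)
            print(np.round(recovered[4:12], 2))    # trail largely removed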

  20. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space-link communications performance. The study proposed to identify and analyze candidate codes that would complement the performance of the overall coding system, which uses the interleaved RS (255,223) code as the outer code.
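
    As a hedged aside, the sketch below illustrates one piece of machinery named in the abstract: the block interleaving that sits between the outer code and the channel, which spreads a burst of channel errors across many outer codewords so that each codeword sees only a few symbol errors. The toy interleaver depth and codeword length are arbitrary and unrelated to the RS (255,223) parameters; this is not the report's coding system.

        # Minimal sketch of block interleaving in a concatenated coding system:
        # symbols from several outer codewords are written row-by-row and read
        # column-by-column, so a burst on the channel is spread across codewords.

        def interleave(symbols, depth, length):
            """Write `depth` codewords of `length` symbols row-wise, read column-wise."""
            assert len(symbols) == depth * length
            rows = [symbols[i * length:(i + 1) * length] for i in range(depth)]
            return [rows[r][c] for c in range(length) for r in range(depth)]

        def deinterleave(symbols, depth, length):
            """Inverse of `interleave`."""
            cols = [symbols[c * depth:(c + 1) * depth] for c in range(length)]
            return [cols[c][r] for r in range(depth) for c in range(length)]

        if __name__ == "__main__":
            depth, length = 4, 8                      # 4 outer codewords, 8 symbols each
            data = list(range(depth * length))
            tx = interleave(data, depth, length)
            tx[10:14] = ["X"] * 4                     # a 4-symbol burst on the channel
            rx = deinterleave(tx, depth, length)
            # after de-interleaving, each codeword carries at most one bad symbol
            for r in range(depth):
                print(rx[r * length:(r + 1) * length])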

  1. Correction and updating.

    PubMed

    1994-03-01

    In the heading of David Cassidy's review of The Private Lives of Albert Einstein (18 February, p. 997) the price of the book as sold by its British publisher, Faber and Faber, was given incorrectly; the correct price is £15.99. The book is also to be published in the United States by St. Martin's Press, New York, in April, at a price of $23.95. PMID:17817438

  2. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length gets smaller, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and wafer SEM images, and are able to extract information on whole-chip CD variation. From these results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches, such as model-based OPC verification, are provided by EDA companies. Model-based verification is performed over the full-chip area using a well-calibrated model; its objective is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve a robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and a matched model-based verification system to find the optimum combination of the two. In our study, a large amount of wafer-result data was classified and analyzed by statistical methods and sorted into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed using the combination of design-based metrology and model-based verification tools.
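
    The abstract describes routing analysis results into OPC-feedback and design-feedback buckets without stating the rule used. Purely as an illustration of what such a per-site classification step could look like, the sketch below compares the measured CD (from design-based metrology) and the simulated CD (from model-based verification) against the target; the thresholds, bucket names, and the rule itself are hypothetical and are not taken from the paper.

        # Hypothetical per-site classification combining metrology and verification.
        # All CD values in nm; thresholds and buckets are invented for illustration.

        def classify_site(target_cd, measured_cd, simulated_cd, tol=3.0):
            """Route one measurement site to a feedback bucket."""
            measured_err = measured_cd - target_cd
            predicted_err = simulated_cd - target_cd
            if abs(measured_err) <= tol:
                return "in spec"
            if abs(predicted_err) > tol:
                # the calibrated model already predicts the failure: the OPC
                # recipe (or the layout it cannot fix) needs attention
                return "OPC/design feedback"
            # the model predicted a good result but the wafer disagrees:
            # more likely a process or metrology issue than the OPC recipe
            return "process feedback"

        if __name__ == "__main__":
            sites = [(50.0, 51.2, 50.5), (50.0, 44.0, 45.1), (50.0, 43.5, 49.8)]
            for t, m, s in sites:
                print(t, m, s, "->", classify_site(t, m, s))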

  3. Therapeutics in Huntington's Disease.

    PubMed

    Killoran, Annie; Biglan, Kevin M

    2012-02-01

    OPINION STATEMENT: There is no specific treatment for Huntington's disease (HD). Its many symptoms of motor, psychiatric, and cognitive deterioration are managed with symptomatic relief, rehabilitation, and support. The only drug approved by the US Food and Drug Administration (FDA) for the treatment of HD is an antichoreic agent, tetrabenazine, but this drug is used sparingly because of uneasiness regarding its propensity to cause depression and suicidality in this population, which is already at risk for these complications. Neuroleptics are still first-line treatments for chorea accompanied by comorbid depression and/or behavioral or psychotic symptoms, as is often the case. Psychiatric features, which have a significant impact on a patient's professional and personal life, often become the major focus of management. In addition to neuroleptics, commonly used medications include antidepressants, mood stabilizers, anxiolytics, and psychostimulants. In contrast, few treatment options are available for cognitive impairment in HD; this remains an important and largely unmet therapeutic need. HD patients typically lack insight into their disease manifestations, failing to recognize their need for treatment, and possibly even arguing against it. Multipurpose medications are employed advantageously to simplify the medication regimen, so as to facilitate compliance and not overwhelm the patient. For example, haloperidol can be prescribed for a patient with chorea, agitation, and anorexia, rather than targeting each symptom with a different drug. This approach also limits the potential for adverse effects, which can be difficult to distinguish from the features of the disease itself. Given its complexity, HD is best managed with a multidisciplinary approach that includes a movement disorders specialist, a genetic counselor, a mental health professional, a physical therapist, and a social worker for support and coordination of services. As the disease progresses, there

  4. Therapeutic Devices for Epilepsy

    PubMed Central

    Fisher, Robert S.

    2011-01-01

    Therapeutic devices provide new options for treating drug-resistant epilepsy. These devices act by a variety of mechanisms to modulate neuronal activity. Only vagus nerve stimulation, which continues to develop new technology, is approved for use in the United States. Deep brain stimulation (DBS) of the anterior thalamus for partial epilepsy recently was approved in Europe and several other countries. Responsive neurostimulation, which delivers stimuli to one or two seizure foci in response to a detected seizure, recently completed a successful multicenter trial. Several other trials of brain stimulation are planned or underway. Transcranial magnetic stimulation (TMS) may provide a noninvasive method to stimulate cortex. Controlled studies of TMS are split on efficacy, which may depend on whether a seizure focus is near a possible region for stimulation. Seizure detection devices in the form of “shake” detectors via portable accelerometers can provide notification of an ongoing tonic-clonic seizure, or peace of mind in the absence of notification. Prediction of seizures from various aspects of EEG is in its early stages. Prediction appears to be possible in a subpopulation of people with refractory seizures, and a clinical trial of an implantable prediction device is underway. Cooling of the neocortex or hippocampus can reversibly attenuate epileptiform EEG activity and seizures, but engineering problems remain in its implementation. Optogenetics is a new technique that can control the excitability of specific populations of neurons with light. Inhibition of epileptiform activity has been demonstrated in hippocampal slices, but use in humans will require more work. In general, devices provide useful palliation for otherwise uncontrollable seizures, but with a different risk profile than most drugs. Optimizing the place of devices in therapy for epilepsy will require further development and clinical experience. PMID:22367987

  5. Impact of Model-Based Teaching on Argumentation Skills

    ERIC Educational Resources Information Center

    Ogan-Bekiroglu, Feral; Belek, Deniz Eren

    2014-01-01

    The purpose of this study was to examine the effects of model-based teaching on students' argumentation skills. An experimental design guided the research. The participants of the study were pre-service physics teachers. The argumentative intervention lasted seven weeks. Data for this research were collected via video recordings and written…

  6. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  7. Educational Value and Models-Based Practice in Physical Education

    ERIC Educational Resources Information Center

    Kirk, David

    2013-01-01

    A models-based approach has been advocated as a means of overcoming the serious limitations of the traditional approach to physical education. One of the difficulties with this approach is that physical educators have sought to use it to achieve diverse and sometimes competing educational benefits, and these wide-ranging aspirations are rarely if…

  8. Expediting model-based optoacoustic reconstructions with tomographic symmetries

    SciTech Connect

    Lutzweiler, Christian; Deán-Ben, Xosé Luís; Razansky, Daniel

    2014-01-15

    Purpose: Image quantification in optoacoustic tomography implies the use of accurate forward models of the excitation, propagation, and detection of optoacoustic signals, while inversions with high spatial resolution usually involve very large matrices, leading to unreasonably long computation times. The development of fast and memory-efficient model-based approaches therefore represents an important challenge for advancing the quantitative and dynamic imaging capabilities of tomographic optoacoustic imaging. Methods: Herein, a method for the simplification and acceleration of model-based inversions, relying on inherent symmetries present in common tomographic acquisition geometries, is introduced. The method is showcased for the case of cylindrical symmetries by using a polar image discretization of the time-domain optoacoustic forward model combined with efficient storage and inversion strategies. Results: The suggested methodology is shown to render fast and accurate model-based inversions in both numerical simulations and post mortem small animal experiments. In the case of a full-view detection scheme, the memory requirements are reduced by one order of magnitude while high-resolution reconstructions are achieved at video rate. Conclusions: By considering the rotational symmetry present in many tomographic optoacoustic imaging systems, the proposed methodology exploits the advantages of model-based algorithms with feasible computational requirements and fast reconstruction times, so that its convenience and general applicability in optoacoustic imaging systems with tomographic symmetries are anticipated.
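
    The key idea above, that under a polar discretization a rotationally symmetric acquisition geometry lets one detector's forward-model block stand in for all of them, can be checked numerically. The Python sketch below uses a toy distance-dependent response rather than a real optoacoustic forward model; the grid sizes and kernel are assumptions made only to demonstrate the symmetry that reduces storage.

        import numpy as np

        # Toy check: on a polar grid with detectors equally spaced on a circle,
        # the model-matrix block of detector j is the block of detector 0 with
        # the angular pixel index cyclically shifted, so one block suffices.

        n_r, n_theta = 10, 36                       # polar image grid
        radii = np.linspace(0.1, 1.0, n_r)
        angles = 2 * np.pi * np.arange(n_theta) / n_theta
        R_det = 1.5                                 # detection circle radius

        def detector_block(j):
            """Toy model-matrix block for detector j (shape: n_r x n_theta)."""
            phi = 2 * np.pi * j / n_theta
            det = np.array([R_det * np.cos(phi), R_det * np.sin(phi)])
            px = np.stack([np.outer(radii, np.cos(angles)),
                           np.outer(radii, np.sin(angles))], axis=-1)
            dist = np.linalg.norm(px - det, axis=-1)
            return np.exp(-dist)                    # toy distance-dependent response

        A0 = detector_block(0)
        for j in (1, 5, 17):
            Aj = detector_block(j)
            # block j equals block 0 with the angular axis rolled by j samples
            assert np.allclose(Aj, np.roll(A0, shift=j, axis=1))
        print("rotational symmetry verified: one stored block suffices")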

  9. Problem Solving: Physics Modeling-Based Interactive Engagement

    ERIC Educational Resources Information Center

    Ornek, Funda

    2009-01-01

    The purpose of this study was to investigate how modeling-based instruction combined with an interactive-engagement teaching approach promotes students' problem-solving abilities. I focused on students in a calculus-based introductory physics course, based on the Matter and Interactions curriculum of Chabay & Sherwood (2002), at a large state…

  10. Cognitive control predicts use of model-based reinforcement learning.

    PubMed

    Otto, A Ross; Skatova, Anya; Madlon-Kay, Seth; Daw, Nathaniel D

    2015-02-01

    Accounts of decision-making and its neural substrates have long posited the operation of separate, competing valuation systems in the control of choice behavior. Recent theoretical and experimental work suggests that this classic distinction between behaviorally and neurally dissociable systems for habitual and goal-directed (or more generally, automatic and controlled) choice may arise from two computational strategies for reinforcement learning (RL), called model-free and model-based RL, but the cognitive or computational processes by which one system may dominate over the other in the control of behavior remain a matter of ongoing investigation. To elucidate this question, we leverage the theoretical framework of cognitive control, demonstrating that individual differences in the utilization of goal-related contextual information--in the service of overcoming habitual, stimulus-driven responses--in established cognitive control paradigms predict model-based behavior in a separate, sequential choice task. The behavioral correspondence between cognitive control and model-based RL compellingly suggests that a common set of processes may underpin the two behaviors. In particular, computational mechanisms originally proposed to underlie controlled behavior may be applicable to understanding the interactions between model-based and model-free choice behavior. PMID:25170791
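
    As a hedged illustration of the model-free/model-based distinction discussed above, the Python sketch below contrasts the two strategies in a toy two-step task loosely inspired by this literature: model-free values are learned by direct temporal-difference updates of the first-stage actions, while model-based values are computed by planning through a learned transition model and learned second-stage values. The task structure, probabilities, and learning rates are invented for illustration and are not the authors' task or analysis code.

        import random

        COMMON = 0.7                                 # probability of the common transition
        STAGE2_REWARD_P = {"B": 0.8, "C": 0.2}       # reward probability in each 2nd-stage state

        def step(action):
            """First-stage action 0 commonly leads to state B, action 1 to state C."""
            if action == 0:
                s2 = "B" if random.random() < COMMON else "C"
            else:
                s2 = "C" if random.random() < COMMON else "B"
            reward = 1.0 if random.random() < STAGE2_REWARD_P[s2] else 0.0
            return s2, reward

        def run(n_trials=5000, alpha=0.1):
            q_mf = [0.0, 0.0]                        # model-free first-stage values
            v_s2 = {"B": 0.0, "C": 0.0}              # learned second-stage values
            trans = {0: {"B": 0.5, "C": 0.5},        # learned transition model
                     1: {"B": 0.5, "C": 0.5}}
            for _ in range(n_trials):
                a = random.randrange(2)              # explore uniformly
                s2, r = step(a)
                # model-free: TD update of the chosen first-stage action only
                q_mf[a] += alpha * (r - q_mf[a])
                # model-based ingredients: update transition estimates and state values
                for s in trans[a]:
                    target = 1.0 if s == s2 else 0.0
                    trans[a][s] += alpha * (target - trans[a][s])
                v_s2[s2] += alpha * (r - v_s2[s2])
            # model-based first-stage values: plan through the learned model
            q_mb = [sum(trans[a][s] * v_s2[s] for s in ("B", "C")) for a in (0, 1)]
            return q_mf, q_mb

        if __name__ == "__main__":
            q_mf, q_mb = run()
            print("model-free  Q:", [round(q, 2) for q in q_mf])
            print("model-based Q:", [round(q, 2) for q in q_mb])

    In the actual paradigm the two strategies are distinguished by how choices adapt after rare transitions and reward changes; this sketch only shows how each set of values is computed under the stated toy assumptions.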