Science.gov

Sample records for model-based therapeutic correction

  1. Model-Based Therapeutic Correction of Hypothalamic-Pituitary-Adrenal Axis Dysfunction

    PubMed Central

    Ben-Zvi, Amos; Vernon, Suzanne D.; Broderick, Gordon

    2009-01-01

The hypothalamic-pituitary-adrenal (HPA) axis is a major system maintaining body homeostasis by regulating the neuroendocrine and sympathetic nervous systems as well as modulating immune function. Recent work has shown that the complex dynamics of this system accommodate several stable steady states, one of which corresponds to the hypocortisol state observed in patients with chronic fatigue syndrome (CFS). At present these dynamics are not formally considered in the development of treatment strategies. Here we use model-based predictive control (MPC) methodology to estimate robust treatment courses for displacing the HPA axis from an abnormal hypocortisol steady state back to a healthy cortisol level. This approach was applied to a recent model of HPA axis dynamics incorporating glucocorticoid receptor kinetics. A candidate treatment that displays robust properties in the face of significant biological variability and measurement uncertainty requires that cortisol be further suppressed for a short period until adrenocorticotropic hormone levels exceed 30% of baseline. Treatment may then be discontinued, and the HPA axis will naturally progress to a stable attractor defined by normal hormone levels. Suppression of biologically available cortisol may be achieved through the use of binding proteins such as CBG and certain metabolizing enzymes, thus offering possible avenues for deployment in a clinical setting. Treatment strategies can therefore be designed that maximally exploit system dynamics to provide a robust response to treatment and ensure a positive outcome over a wide range of conditions. Perhaps most importantly, a treatment course involving further reduction in cortisol, even if transient, is quite counterintuitive and challenges the conventional strategy of supplementing cortisol levels, an approach based on steady-state reasoning. PMID:19165314
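The counterintuitive result above rests on the bistability of the underlying dynamics. As an illustration only (a generic toy system, not the paper's HPA model), a one-dimensional bistable system shows how a transient input can displace the state from one stable attractor to the other, after which the input can be discontinued and the system settles on its own:

```python
def simulate(x0, u_fn, t_end=50.0, dt=0.01):
    """Euler-integrate the toy bistable system x' = x - x**3 + u(t),
    which has stable fixed points at x = -1 and x = +1."""
    x, t = x0, 0.0
    while t < t_end:
        x += dt * (x - x**3 + u_fn(t))
        t += dt
    return x

# With no input the state stays in the "abnormal" basin at x = -1.
no_input = simulate(-1.0, lambda t: 0.0)

# A transient input applied only for t < 5 pushes the state over the
# barrier; once discontinued, the system relaxes to the other attractor.
pulse = simulate(-1.0, lambda t: 1.5 if t < 5.0 else 0.0)
```

In the paper's setting the displacing input corresponds to transient cortisol suppression rather than this generic pulse; the point of the sketch is only that a temporary input can produce a permanent change of steady state.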

  2. The application criterion of model-based optical proximity correction in a low k1 process

    NASA Astrophysics Data System (ADS)

    Lee, Doo-Youl; Kim, In-Sung; Jung, Sung-Gon; Jung, Myoung-Ho; Park, Joo-On; Oh, Seok-Hwan; Woo, Sang-Gyun; Cho, Han-Ku; Moon, Joo-Tae

    2005-05-01

As the k1 factor approaches the theoretical limit, the optical proximity correction (OPC) treatments necessary to maintain dimensional tolerances involve increasingly complex correction shapes, which translates into more detailed, and therefore larger, mask pattern databases. Moreover, the development of exposure tools lags behind device shrinkage, so the process margin of the lithographic process may dwindle despite the use of all available resolution enhancement techniques (RETs). Model-based OPC is recognized as a robust tool for coping with layout diversity, but it loses effectiveness as the photolithographic process margin narrows. To enhance the usefulness of OPC, many obstacles must be overcome. The original layout should be designed to be lithography-friendly, enhancing the process margin through aggressive RETs, and then amended by model-based OPC to suppress proximity effects. However, some constraints only emerge during the OPC procedure: unless the original lithography-friendly layout (LFL) is corrected in terms of pitches and shapes, the lithography process falls outside the process window and pattern fidelity suffers. This paper emphasizes that applying model-based OPC in a low k1 process requires a particular and unique layout configuration to preserve the process margin.

  3. Quantitative fully 3D PET via model-based scatter correction

    SciTech Connect

    Ollinger, J.M.

    1994-05-01

We have investigated the quantitative accuracy of fully 3D PET using model-based scatter correction by measuring the half-life of Ga-68 in the presence of scatter from F-18. The inner chamber of a Data Spectrum cardiac phantom was filled with 18.5 MBq of Ga-68. The outer chamber was filled with an equivalent amount of F-18. The cardiac phantom was placed in a 22 × 30.5 cm elliptical phantom containing anthropomorphic lung inserts filled with a water-Styrofoam mixture. Ten frames of dynamic data were collected over 13.6 hours on a Siemens-CTI 953B scanner with the septa retracted. The data were corrected using model-based scatter correction, which uses the emission images, transmission images, and an accurate physical model to directly calculate the scatter distribution. Both uncorrected and corrected data were reconstructed using the Promis algorithm. The scatter correction required 4.3% of the total reconstruction time. The scatter fraction in a small volume of interest in the center of the inner chamber of the cardiac insert rose from 4.0% in the first interval to 46.4% in the last interval as the ratio of F-18 activity to Ga-68 activity rose from 1:1 to 33:1. Fitting a single exponential to the last three data points yields estimates of the half-life of Ga-68 of 77.01 minutes and 68.79 minutes for uncorrected and corrected data, respectively. Thus, scatter correction reduces the error from 13.3% to 1.2%. This suggests that model-based scatter correction is accurate in the heterogeneous attenuating medium found in the chest, making possible quantitative, fully 3D PET in the body.
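The half-life check described above amounts to fitting a single exponential to late-time data. A minimal sketch with hypothetical, noiseless count data (the accepted Ga-68 half-life is about 67.7 min); the fit is done as a linear regression on the log of the counts:

```python
import numpy as np

# Hypothetical decay data: times in minutes, counts following a clean
# exponential with a 67.7 min half-life (no scatter contamination).
t = np.array([0.0, 60.0, 120.0])
counts = 1000.0 * 0.5 ** (t / 67.7)

# Fit A * exp(-lam * t) by regressing log(counts) on t; the slope is -lam.
slope, intercept = np.polyfit(t, np.log(counts), 1)
half_life = np.log(2.0) / -slope
```

With real data the scatter contribution from the longer-lived F-18 biases the apparent slope, which is exactly the effect the phantom experiment quantifies.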

  4. Reduction of large set data transmission using algorithmically corrected model-based techniques for bandwidth efficiency

    NASA Astrophysics Data System (ADS)

    Khair, Joseph Daniel

Communication requirements and demands on deployed systems are increasing daily. This increase is due both to the desire for more capability and to the changing landscape of threats to remote vehicles. As such, it is important that we continue to find new and innovative ways to transmit data to and from these remote systems, consistent with this changing landscape. Specifically, this research shows that data can be transmitted to a remote system effectively and efficiently with a model-based approach using real-time updates, called the Algorithmically Corrected Model-based Technique (ACMBT), resulting in substantial savings in communications overhead. To demonstrate this model-based data transmission technique, a hardware-based test fixture was designed and built. Execution and analysis software was created to perform a series of characterizations demonstrating the effectiveness of the new transmission method. The new approach was compared to a traditional transmission approach in the same environment, and the results were analyzed and presented. A Figure of Merit (FOM) was devised and presented to allow standardized comparison of traditional and proposed data transmission methodologies alongside bandwidth utilization metrics. The results of this research have successfully shown the model-based technique to be feasible. Additionally, this research has opened the trade space for future discussion and implementation of this technique.
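The general idea, as described, is to let a model shared by sender and receiver carry most of the signal and to transmit corrections only when the model's prediction error grows too large. A minimal sketch under that assumption; the predictor, tolerance, and data here are hypothetical illustrations, not the dissertation's actual protocol:

```python
def transmit(samples, predict, tol=0.5):
    """Send only corrections where the shared model's prediction misses by
    more than tol; the receiver reconstructs everything else from the model."""
    sent = {}
    for i, x in enumerate(samples):
        if abs(x - predict(i)) > tol:
            sent[i] = x          # correction transmitted over the link
    return sent

def receive(n, sent, predict):
    """Receiver side: use the transmitted correction when present,
    otherwise fall back to the shared model's prediction."""
    return [sent.get(i, predict(i)) for i in range(n)]

model = lambda i: 2.0 * i               # shared model: hypothetical linear trend
data = [0.0, 2.1, 4.0, 9.0, 8.0]        # sample 3 deviates strongly from the model
corrections = transmit(data, model)      # only index 3 needs to be sent
recovered = receive(len(data), corrections, model)
```

The bandwidth saving comes from sending one correction instead of five samples; the reconstruction error elsewhere is bounded by the chosen tolerance.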

  5. Automated model-based bias field correction of MR images of the brain.

    PubMed

    Van Leemput, K; Maes, F; Vandermeulen, D; Suetens, P

    1999-10-01

We propose a model-based method for fully automated bias field correction of MR brain images. The MR signal is modeled as a realization of a random process with a parametric probability distribution that is corrupted by a smooth polynomial inhomogeneity, or bias field. The method we propose applies an iterative expectation-maximization (EM) strategy that interleaves pixel classification with estimation of class distribution and bias field parameters, improving the likelihood of the model parameters at each iteration. The algorithm, which can handle multichannel data and slice-by-slice constant intensity offsets, is initialized with information from a digital brain atlas about the a priori expected location of tissue classes. This allows full automation of the method without the need for user interaction, yielding more objective and reproducible results. We have validated the bias correction algorithm on simulated data and illustrate its performance on various MR images with significant field inhomogeneities. We also relate the proposed algorithm to other bias correction algorithms. PMID:10628948
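A heavily simplified 1D sketch of the interleaving idea: classify each pixel by its nearest class mean, fit a smooth polynomial to the residuals as the bias-field estimate, subtract it, and repeat. The actual method uses a probabilistic EM formulation, atlas initialization, and multichannel data; the additive bias and hard classification below are simplifications for illustration:

```python
import numpy as np

def correct_bias(y, x, class_means, degree=2, n_iter=5):
    """Alternate (a) nearest-mean pixel classification and (b) polynomial
    bias-field estimation from the classification residuals."""
    y = y.copy()
    means = np.asarray(class_means, dtype=float)
    for _ in range(n_iter):
        labels = np.argmin(np.abs(y[:, None] - means[None, :]), axis=1)
        residual = y - means[labels]          # what the classes can't explain
        coeffs = np.polyfit(x, residual, degree)
        y = y - np.polyval(coeffs, x)         # remove the smooth bias estimate
    return y

x = np.linspace(-1.0, 1.0, 200)
true = np.where(x > 0, 10.0, 0.0)             # two "tissue classes"
bias = 1.5 * x**2 - 1.0 * x                   # smooth additive bias field
corrected = correct_bias(true + bias, x, [0.0, 10.0])
```

Because the bias here is small relative to the class separation, the initial classification is already correct and the loop converges in one pass; the EM formulation handles the harder case where bias and classification must be disentangled gradually.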

  6. Autoregressive model based algorithm for correcting motion and serially correlated errors in fNIRS

    PubMed Central

    Barker, Jeffrey W.; Aarabi, Ardalan; Huppert, Theodore J.

    2013-01-01

Systemic physiology and motion-induced artifacts represent two major sources of confounding noise in functional near infrared spectroscopy (fNIRS) imaging that can reduce the performance of analyses and inflate false positive rates (i.e., type I errors) of detecting evoked hemodynamic responses. In this work, we demonstrated a general algorithm for solving the general linear model (GLM) for both deconvolution (finite impulse response) and canonical regression models based on designing optimal pre-whitening filters using autoregressive models and employing iteratively reweighted least squares. We evaluated the performance of the new method by performing receiver operating characteristic (ROC) analyses using synthetic data, in which serial correlations, motion artifacts, and evoked responses were controlled via simulations, as well as using experimental data from children (3-5 years old) as a source of baseline physiological noise and motion artifacts. The new method outperformed ordinary least squares (OLS) with no motion correction, wavelet-based motion correction, or spline-interpolation-based motion correction in the presence of physiological and motion related noise. In the experimental data, false positive rates were as high as 37% when the estimated p-value was 0.05 for the OLS methods. The false positive rate was reduced to 5-9% with the proposed method. Overall, the method improves control of type I errors and increases performance when motion artifacts are present. PMID:24009999
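The core pre-whitening step can be sketched with an AR(1) model: estimate the lag-1 autocorrelation of the OLS residuals, filter both sides of the GLM with (1 - ρL), and re-fit. The paper iterates this with higher-order AR models and iteratively reweighted least squares; this sketch shows a single AR(1) pass on synthetic serially correlated data:

```python
import numpy as np

def ar1_prewhiten_glm(y, X):
    """Fit OLS, estimate a lag-1 residual autocorrelation, apply the
    (1 - rho*L) whitening filter to y and X, then re-fit the GLM."""
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ beta_ols
    rho = np.corrcoef(r[:-1], r[1:])[0, 1]     # lag-1 autocorrelation
    yw = y[1:] - rho * y[:-1]                  # whitened response
    Xw = X[1:] - rho * X[:-1]                  # whitened design matrix
    beta = np.linalg.lstsq(Xw, yw, rcond=None)[0]
    return beta, rho

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), np.sin(np.arange(n) / 10.0)])
e = np.zeros(n)
for t in range(1, n):                          # AR(1) noise, rho = 0.8
    e[t] = 0.8 * e[t - 1] + rng.standard_normal()
y = X @ np.array([1.0, 2.0]) + e
beta, rho = ar1_prewhiten_glm(y, X)
```

The whitening step is what restores the validity of the GLM's p-values; without it, the serially correlated residuals inflate the type I errors the abstract reports.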

  7. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics.

    PubMed

    Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin

    2016-01-01

For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large dynamic aberrations that vary with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by a few low-order Lukosz modes. To optimize the dynamic correction, we can correct only the dominant Lukosz modes, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of a conformal window with a scanning rate of 10 degrees per second. A 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10⁻⁵ in optimized correction and 1.427 × 10⁻⁵ in un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method. PMID:27598161
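The metric function described, the low spatial frequency content of the image spectral density, can be sketched as the power-spectrum mass inside a small low-frequency disc (DC excluded). Blurring an image, as an aberration does, attenuates every nonzero spatial frequency, so the metric drops. The disc radius and the box-blur stand-in for an aberration below are arbitrary choices for illustration:

```python
import numpy as np

def low_freq_metric(img, radius=4):
    """Sum of power-spectrum energy inside a small low-frequency disc,
    excluding the DC term (which is insensitive to aberration)."""
    F = np.fft.fftshift(np.fft.fft2(img))
    P = np.abs(F) ** 2
    cy, cx = img.shape[0] // 2, img.shape[1] // 2
    yy, xx = np.ogrid[:img.shape[0], :img.shape[1]]
    r2 = (yy - cy) ** 2 + (xx - cx) ** 2
    mask = (r2 > 0) & (r2 <= radius ** 2)
    return P[mask].sum()

rng = np.random.default_rng(1)
sharp = rng.random((64, 64))
# A 3x3 circular box blur stands in for an aberrated point spread function.
blurred = sum(np.roll(np.roll(sharp, dy, 0), dx, 1)
              for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
```

Restricting the metric to a small disc is what allows the image size (and hence the metric's computation time) to be reduced, as the abstract notes.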

  8. Sandmeier model based topographic correction to lunar spectral profiler (SP) data from KAGUYA satellite.

    PubMed

    Chen, Sheng-Bo; Wang, Jing-Ran; Guo, Peng-Ju; Wang, Ming-Chang

    2014-09-01

The Moon may be considered the frontier base for deep space exploration, and spectral analysis is one of the key techniques for determining the rock and mineral compositions of the lunar surface. However, lunar topographic relief is more pronounced than that of the Earth, so lunar spectral data must be topographically corrected before they are used to retrieve compositions. In the present paper, a lunar Sandmeier model is proposed that accounts for the radiance contributions of both macro and ambient topographic relief, and a reflectance correction model is derived from it. The Spectral Profiler (SP) data from the KAGUYA satellite over the Sinus Iridum quadrangle are taken as an example, with digital elevation data from the Lunar Orbiter Laser Altimeter used to calculate the slope, aspect, incidence and emergence angles, and the terrain-viewing factor for the topographic correction. The lunar surface reflectance from the SP data was then corrected by the proposed model after the direct component of irradiance on a horizontal surface was derived. As a result, high spectral reflectance on slopes facing the sun is decreased and low spectral reflectance on slopes facing away from the sun is compensated. The histogram of corrected pixel reflectance values follows a Gaussian distribution. The model is therefore robust for correcting the lunar topographic effect and estimating lunar surface reflectance. PMID:25532366
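The direct-beam part of such a correction can be sketched with the standard cosine correction, in which apparent reflectance is rescaled by the ratio of the flat-surface illumination cosine to the actual local illumination cosine. The full Sandmeier model additionally treats diffuse sky irradiance and the terrain-viewing factor, which this sketch omits:

```python
import numpy as np

def cosine_correction(refl, slope, aspect, sun_zenith, sun_azimuth):
    """Direct-beam topographic correction (angles in radians). cos_i is the
    cosine of the local solar incidence angle on the tilted surface."""
    cos_i = (np.cos(slope) * np.cos(sun_zenith)
             + np.sin(slope) * np.sin(sun_zenith)
             * np.cos(sun_azimuth - aspect))
    return refl * np.cos(sun_zenith) / cos_i

# Sun at 30 deg zenith, azimuth 0. A 20-deg slope facing the sun is
# over-illuminated, so its apparent reflectance is reduced; a slope
# facing away from the sun is compensated upward.
sz, sa = np.radians(30.0), 0.0
toward = cosine_correction(0.2, np.radians(20.0), 0.0, sz, sa)
away = cosine_correction(0.2, np.radians(20.0), np.pi, sz, sa)
```

This reproduces the qualitative behavior the abstract reports: sun-facing reflectance decreased, sun-averted reflectance boosted.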

  9. Correcting encoder interpolation error on the Green Bank Telescope using an iterative model based identification algorithm

    NASA Astrophysics Data System (ADS)

    Franke, Timothy; Weadon, Tim; Ford, John; Garcia-Sanz, Mario

    2015-10-01

Various forms of measurement error limit telescope tracking performance in practice. A new method for identifying the correction coefficients for encoder interpolation error is developed. The algorithm corrects the encoder measurement by identifying a harmonic model of the system and using that model to compute the necessary correction parameters. The approach improves upon others by explicitly modeling the unknown dynamics of the structure and controller and by not requiring a separate system identification to be performed. Experience gained from pin-pointing the source of encoder error on the Green Bank Radio Telescope (GBT) is presented, and several tell-tale indicators of encoder error are discussed. Experimental data from the telescope, tested with two different encoders, are presented. Demonstration of the identification methodology on the GBT, as well as details of its implementation, is discussed. A root-mean-square tracking error reduction from 0.68 arc seconds to 0.21 arc seconds was achieved by changing encoders, and the error was further reduced to 0.10 arc seconds with the calibration algorithm. In particular, the results show the ubiquity of this error source and how, by careful correction, it is possible to go beyond the advertised accuracy of an encoder.
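The harmonic-model idea can be sketched as a least-squares fit of sinusoids of the encoder angle to the measured error, after which the fitted model is subtracted from subsequent readings. The harmonic order and the synthetic error profile below are hypothetical, not the GBT's identified coefficients:

```python
import numpy as np

def fit_harmonics(theta, err, order=2):
    """Least-squares fit of err(theta) ~ sum_k a_k sin(k*theta) + b_k cos(k*theta)."""
    cols = []
    for k in range(1, order + 1):
        cols += [np.sin(k * theta), np.cos(k * theta)]
    A = np.column_stack(cols)
    coef = np.linalg.lstsq(A, err, rcond=None)[0]
    return lambda th: sum(
        coef[2 * (k - 1)] * np.sin(k * th) + coef[2 * (k - 1) + 1] * np.cos(k * th)
        for k in range(1, order + 1))

theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
true_err = 0.3 * np.sin(theta) + 0.1 * np.cos(2 * theta)   # hypothetical error
model = fit_harmonics(theta, true_err)
corrected = true_err - model(theta)                         # residual after correction
```

The paper's contribution is identifying these coefficients through the closed-loop structure and controller dynamics rather than from a directly measured error signal, which this sketch assumes is available.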

  10. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror

    PubMed Central

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-01-01

In this paper, the actuation behavior of a two-axis electrothermal MEMS (Microelectromechanical Systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that the actuation characteristics of the four actuators differ by as much as 4.0% due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open-loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror depends solely on the model prediction and does not require real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system. PMID:26690432
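The open-loop scheme described amounts to inverting a calibrated actuation model. As a sketch only: if a linearized model is a 4 × 4 matrix mapping the four actuator drives to the four actuator-end positions, with off-diagonal coupling terms and small per-actuator asymmetries standing in for the ~4% process variation, the drive for a target pose is found by solving the linear system. The real electrothermal response is nonlinear, and the matrix here is hypothetical:

```python
import numpy as np

# Hypothetical linearized actuation matrix: element (i, j) is the vertical
# displacement of actuator-end i per unit drive on actuator j. Off-diagonal
# terms model thermal-mechanical coupling; the unequal diagonal entries
# stand in for process variation among the four actuators.
M = np.array([[1.00, 0.10, 0.02, 0.10],
              [0.10, 0.98, 0.10, 0.02],
              [0.02, 0.10, 1.03, 0.10],
              [0.10, 0.02, 0.10, 0.97]])

def open_loop_drive(z_target):
    """Invert the model to find the drive that yields the target positions."""
    return np.linalg.solve(M, z_target)

z_target = np.array([1.0, 1.0, -1.0, -1.0])   # tilt about one axis
u = open_loop_drive(z_target)
z_achieved = M @ u                             # what the model predicts
```

Because the asymmetries are baked into the model, the computed drives compensate for them without position feedback, which is the point of the paper's open-loop approach.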

  12. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking

    PubMed Central

    Lee, Min Su; Ju, Hojin; Song, Jin Woo; Park, Chan Gook

    2015-01-01

In this paper, we present a method for finding the enhanced heading and position of pedestrians by fusing Zero velocity UPdaTe (ZUPT)-based pedestrian dead reckoning (PDR) with the kinematic constraints of the lower human body. ZUPT is a well-known algorithm for PDR and provides a sufficiently accurate position solution over short periods, but it cannot guarantee a stable and reliable heading because it suffers from magnetic disturbance in determining heading angles, which degrades the overall position accuracy as time passes. The basic idea of the proposed algorithm is to integrate the left and right foot positions obtained by ZUPTs with the heading and position information from an IMU mounted on the waist. To integrate this information, a kinematic model of the lower human body, calculated using orientation sensors mounted on both thighs and calves, is adopted. We note that the positions of the left and right feet cannot drift apart because of the kinematic constraints of the body, so the kinematic model generates new measurements for the waist position. An Extended Kalman Filter (EKF) on the waist data estimates and corrects error states using these measurements together with magnetic heading measurements, which enhances the heading accuracy. The updated position information is fed back into the foot-mounted sensors, and reupdate processes are performed to correct the position error of each foot. The proposed update-reupdate technique consequently ensures improved observability of error states and position accuracy. Moreover, the proposed method provides all the information about the lower human body, so that it can be applied more effectively to motion tracking. The effectiveness of the proposed algorithm is verified via experimental results, which show that a 1.25% Return Position Error (RPE) with respect to walking distance is achieved. PMID:26561814
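The ZUPT building block relies on detecting the stance phase of each foot, typically by thresholding the accelerometer and gyroscope magnitudes; at those samples a zero-velocity pseudo-measurement is fed to the filter. A minimal detector sketch with hypothetical thresholds (real detectors usually also smooth over a short window):

```python
import numpy as np

def detect_stance(acc, gyro, g=9.81, acc_tol=0.4, gyro_tol=0.3):
    """Flag samples as stance phase when the accelerometer magnitude is
    close to gravity and the angular rate is small. ZUPT applies a
    zero-velocity update to the navigation filter at these samples."""
    acc_mag = np.linalg.norm(acc, axis=1)     # m/s^2
    gyro_mag = np.linalg.norm(gyro, axis=1)   # rad/s
    return (np.abs(acc_mag - g) < acc_tol) & (gyro_mag < gyro_tol)

acc = np.array([[0.0, 0.0, 9.80],    # foot flat on the ground
                [3.0, 1.0, 12.0],    # swing phase
                [0.1, 0.0, 9.75]])   # next stance
gyro = np.array([[0.01, 0.0, 0.02],
                 [1.5, 2.0, 0.5],
                 [0.05, 0.01, 0.0]])
stance = detect_stance(acc, gyro)
```

The paper's contribution lies downstream of this step: fusing the two feet's ZUPT solutions with the waist IMU through the lower-body kinematic model.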

  14. Dixon sequence with superimposed model-based bone compartment provides highly accurate PET/MR attenuation correction of the brain

    PubMed Central

    Koesters, Thomas; Friedman, Kent P.; Fenchel, Matthias; Zhan, Yiqiang; Hermosillo, Gerardo; Babb, James; Jelescu, Ileana O.; Faul, David; Boada, Fernando E.; Shepherd, Timothy M.

    2016-01-01

Simultaneous PET/MR of the brain is a promising new technology for characterizing patients with suspected cognitive impairment or epilepsy. Unlike CT, though, MR signal intensities do not provide a direct correlate for PET photon attenuation correction (AC), and inaccurate radiotracer standard uptake value (SUV) estimation could limit future PET/MR clinical applications. We tested a novel AC method that supplements standard Dixon-based tissue segmentation with a superimposed model-based bone compartment. Methods: We directly compared SUV estimation for MR-based AC methods to reference CT AC in 16 patients undergoing same-day, single 18FDG dose PET/CT and PET/MR for suspected neurodegeneration. Three Dixon-based MR AC methods were compared to CT: standard Dixon 4-compartment segmentation alone, Dixon with a superimposed model-based bone compartment, and Dixon with a superimposed bone compartment and linear attenuation correction optimized specifically for brain tissue. The brain was segmented using a 3D T1-weighted volumetric MR sequence, and SUV estimations were compared to CT AC for the whole image, the whole brain, and 91 FreeSurfer-based regions of interest. Results: Modifying the linear AC value specifically for brain and superimposing a model-based bone compartment reduced the whole-brain SUV estimation bias of Dixon-based PET/MR AC by 95% compared to reference CT AC (P < 0.05), leaving a residual −0.3% whole-brain mean SUV bias. Further, regional analysis demonstrated only 3 frontal lobe regions with an SUV estimation bias of 5% or greater (P < 0.05). These biases appeared to correlate with high individual variability in frontal bone thickness and pneumatization. Conclusion: The bone compartment and linear AC modifications result in a highly accurate MR AC method in subjects with suspected neurodegeneration. This prototype MR AC solution appears equivalent to other recently proposed solutions, and does not require additional MR sequences or scan time.

  15. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities

    ERIC Educational Resources Information Center

    Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining high quality social climates in prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms, and peer correction of behavior contrary to TC norms, will lead to…

  16. Efficient model-based dummy-fill OPC correction flow for deep sub-micron technology nodes

    NASA Astrophysics Data System (ADS)

    Hamouda, Ayman; Salama, Mohamed

    2014-09-01

Dummy fill insertion is a necessary step in modern semiconductor technologies to achieve homogeneous pattern density per layer. This benefits several fabrication process steps, including but not limited to Chemical Mechanical Polishing (CMP), etching, and packaging. As the technology keeps shrinking, fill shapes become more challenging to pattern and require aggressive model-based optical proximity correction (MBOPC) to achieve better design fidelity. MBOPC on fill is a challenge to mask data prep runtime and final mask shot count, which affect the total turnaround time (TAT) and mask cost. In our work, we introduce a novel flow that achieves a robust and computationally efficient fill-handling methodology during mask data prep, keeping both the runtime and shot count within acceptable levels. In this flow, fill shapes undergo a smart MBOPC step that improves the final wafer printing quality and topography uniformity without degrading the final shot count or the OPC cycle runtime. This flow is tested on both front end of line (FEOL) and back end of line (BEOL) layers, and results in improved final printing of the fill patterns while consuming less than 2% of the full MBOPC flow runtime.

  17. [Therapeutic correction of mild cognitive impairment in patients with chronic cerebral ischemia].

    PubMed

    Odinak, M M; Kashin, A V; Ememlin, A Iu; Lupanov, I A

    2013-01-01

Neurodegenerative and cerebrovascular diseases are among the most significant causes of cognitive impairment in the elderly. Vascular cognitive impairment is not limited to dementia, but represents a heterogeneous group in both pathogenic and clinical terms. The article discusses new principles for classifying vascular cognitive impairment and reviews options for its therapeutic correction. It includes the results of a 12-week open randomized controlled study of the efficacy and safety of Vitrum Memory in patients with mild vascular cognitive impairment. The therapy significantly improved the neurodynamic and regulatory functions of patients with stage I-II dyscirculatory encephalopathy. PMID:23739499

  18. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities.

    PubMed

    Warren, Keith L; Doogan, Nathan; De Leon, George; Phillips, Gary S; Moody, James; Hodge, Ashleigh

    2013-01-01

Therapeutic communities (TCs) have a strong record of maintaining a high-quality social climate on prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms and peer correction of behavior contrary to TC norms will lead to increased resident prosocial behavior. Laboratory experiments have demonstrated that such peer monitoring can lead to cooperation, but there has been no quantitative test of this hypothesis in an actual TC. In this article we test this assumption by using the affirmations that residents of three different TCs send as a measure of prosocial behavior following the receipt of peer affirmations and corrections. At all three facilities residents send more affirmations following the receipt of both affirmations and corrections, with this relationship being stronger and longer lasting after receiving affirmations. No other variable consistently predicts the number of affirmations that residents send to peers. These findings imply that mutual monitoring among TC residents can lead to increased levels of prosocial behavior within the facility, and that prosocial behavior in response to peer affirmations plays a key role. PMID:23935258

  20. A three-dimensional model-based partial volume correction strategy for gated cardiac mouse PET imaging

    NASA Astrophysics Data System (ADS)

    Dumouchel, Tyler; Thorn, Stephanie; Kordos, Myra; DaSilva, Jean; Beanlands, Rob S. B.; deKemp, Robert A.

    2012-07-01

Quantification in cardiac mouse positron emission tomography (PET) imaging is limited by the imaging spatial resolution. Spillover of left ventricle (LV) myocardial activity into adjacent organs results in partial volume (PV) losses, leading to underestimation of myocardial activity. A PV correction method was developed to restore accuracy of the activity distribution for FDG mouse imaging. The PV correction model was based on convolving an LV image estimate with a 3D point spread function. The LV model was described regionally by a five-parameter profile including myocardial, background, and blood activities, which were separated into three compartments by the endocardial radius and myocardium wall thickness. The PV correction was tested with digital simulations and a physical 3D mouse LV phantom. In vivo cardiac FDG mouse PET imaging was also performed. Following imaging, the mice were sacrificed and the tracer biodistribution in the LV and liver tissue was measured using a gamma counter. The PV correction algorithm improved recovery from 50% to within 5% of the truth for the simulated and measured phantom data and improved image uniformity by 5-13%. The PV correction algorithm improved the mean myocardial LV recovery from 0.56 (0.54) to 1.13 (1.10) without (with) scatter and attenuation corrections. The mean image uniformity was improved from 26% (26%) to 17% (16%) without (with) scatter and attenuation corrections applied. Scatter and attenuation corrections were not observed to significantly impact PV-corrected myocardial recovery or image uniformity. The image-based PV correction algorithm can increase the accuracy of PET image activity and improve the uniformity of the activity distribution in normal mice. The algorithm may be applied using different tracers, in transgenic models that affect myocardial uptake, or in different species, provided there is sufficient image quality and similar contrast between the myocardium and surrounding structures.
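The correction idea, forward-modeling the blurred measurement by convolving a compartmental profile with the scanner point spread function and fitting the profile's parameters, can be sketched in 1D with a radial three-compartment profile and a Gaussian PSF. Here only the myocardial activity is searched over a grid, whereas the paper fits the full five-parameter regional profile; all numbers are hypothetical:

```python
import numpy as np

def profile(r, myo, blood, bg, r_endo, wall):
    """Radial three-compartment model: blood pool, myocardial wall, background."""
    out = np.where(r < r_endo, blood, bg)
    return np.where((r >= r_endo) & (r < r_endo + wall), myo, out)

def blur(y, r, fwhm):
    """Convolve a radial profile with a Gaussian PSF (row-normalized kernel)."""
    sigma = fwhm / 2.3548
    k = np.exp(-0.5 * ((r[:, None] - r[None, :]) / sigma) ** 2)
    k /= k.sum(axis=1, keepdims=True)
    return k @ y

r = np.linspace(0.0, 20.0, 400)                     # radius, mm
truth = profile(r, myo=1.0, blood=0.2, bg=0.1, r_endo=6.0, wall=1.2)
measured = blur(truth, r, fwhm=1.8)                 # PV losses flatten the wall peak

# Grid-search the myocardial activity whose blurred model best matches the data.
cands = np.linspace(0.4, 1.4, 101)
errs = [np.sum((blur(profile(r, m, 0.2, 0.1, 6.0, 1.2), r, 1.8) - measured) ** 2)
        for m in cands]
myo_hat = cands[int(np.argmin(errs))]
```

The blurred peak falls well below the true wall activity, yet fitting through the forward model recovers it, which is exactly the recovery-restoration behavior the abstract reports.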

  1. Model-Based Assessment of Plasma Citrate Flux Into the Liver: Implications for NaCT as a Therapeutic Target.

    PubMed

    Li, Z; Erion, D M; Maurer, T S

    2016-03-01

    Cytoplasmic citrate serves as an important regulator of gluconeogenesis and carbon source for de novo lipogenesis in the liver. For this reason, the sodium-coupled citrate transporter (NaCT), a plasma membrane transporter that governs hepatic influx of plasma citrate in human, is being explored as a potential therapeutic target for metabolic disorders. As cytoplasmic citrate also originates from intracellular mitochondria, the relative contribution of these two pathways represents critical information necessary to underwrite confidence in this target. In this work, hepatic influx of plasma citrate was quantified via pharmacokinetic modeling of published clinical data. The influx was then compared to independent literature estimates of intracellular citrate flux in human liver. The results indicate that, under normal conditions, <10% of hepatic citrate originates from plasma. Similar estimates were determined experimentally in mice and rats. This suggests that NaCT inhibition will have a limited impact on hepatic citrate concentrations across species. PMID:27069776

  2. Evaluation of model-based deformation correction in image-guided liver surgery via tracked intraoperative ultrasound.

    PubMed

    Clements, Logan W; Collins, Jarrod A; Weis, Jared A; Simpson, Amber L; Adams, Lauryn B; Jarnagin, William R; Miga, Michael I

    2016-01-01

    Soft-tissue deformation represents a significant error source in current surgical navigation systems used for open hepatic procedures. While numerous algorithms have been proposed to rectify the tissue deformation that is encountered during open liver surgery, clinical validation of the proposed methods has been limited to surface-based metrics, and subsurface validation has largely been performed via phantom experiments. The proposed method involves the analysis of two deformation-correction algorithms for open hepatic image-guided surgery systems via subsurface targets digitized with tracked intraoperative ultrasound (iUS). Intraoperative surface digitizations were acquired via a laser range scanner and an optically tracked stylus for the purposes of computing the physical-to-image space registration and for use in retrospective deformation-correction algorithms. Upon completion of surface digitization, the organ was interrogated with a tracked iUS transducer where the iUS images and corresponding tracked locations were recorded. Mean closest-point distances between the feature contours delineated in the iUS images and corresponding three-dimensional anatomical model generated from preoperative tomograms were computed to quantify the extent to which the deformation-correction algorithms improved registration accuracy. The results for six patients, including eight anatomical targets, indicate that deformation correction can facilitate reduction in target error of [Formula: see text]. PMID:27081664
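
    The registration-accuracy metric used here, the mean closest-point distance between iUS feature contours and the preoperative model, reduces to a few lines (the point lists below are hypothetical stand-ins for the digitized contour and model points, in mm):

```python
import math

def mean_closest_point_distance(contour_pts, model_pts):
    """Mean, over contour points, of the distance to the nearest model point."""
    return sum(min(math.dist(c, m) for m in model_pts)
               for c in contour_pts) / len(contour_pts)
```

    For example, `mean_closest_point_distance([(0, 0, 1), (0, 0, 2)], [(0, 0, 0), (0, 0, 4)])` returns 1.5.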

  3. Mapping hydrological environments in central Amazonia: ground validation and surface model based on SRTM DEM data corrected for deforestation

    NASA Astrophysics Data System (ADS)

    Moulatlet, G. M.; Rennó, C. D.; Costa, F. R. C.; Emilio, T.; Schietti, J.

    2015-03-01

    One of the most important freely available digital elevation models (DEMs) for Amazonia is the one obtained by the Shuttle Radar Topography Mission (SRTM). However, since SRTM tends to represent the vegetation surface instead of the ground surface, the broad use of the SRTM DEM as a framework for terrain description in Amazonia is hampered by the presence of deforested areas. We present here two data sets: (1) a deforestation-corrected SRTM DEM for the interfluve between the Purus and Madeira rivers, in central Amazonia, in which different environments were carefully identified and deforestation features were corrected by a new method of increasing pixel values of the DEM (Rennó, 2009); and (2) a set of 18 hydrological-topographic descriptors based on the corrected SRTM DEM. Deforestation features are related to the opening of an 800 km road in the central part of the interfluve and occupancy of its vicinity. We used topographic profiles from the pristine forest to the deforested feature to evaluate the recovery of the original canopy coverage by minimizing canopy height variation (corrections ranged from 1 to 38 m). The hydrological-topographic description was obtained by the Height Above the Nearest Drainage (HAND) algorithm, which normalizes the terrain elevation (above sea level) by the elevation of the nearest hydrologically connected drainage. The validation of the HAND data set was done by in situ hydrological description of 110 km of walking trails also available in this data set. The new SRTM DEM expands the applicability of SRTM data for landscape modelling; the data sets of hydrological features based on topographic modelling are undoubtedly appropriate for ecological modelling and an important contribution to environmental mapping of Amazonia. The deforestation-corrected SRTM DEM is available at http://ppbio.inpa.gov.br/knb/metacat/naman.318.3/ppbio; the
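
    The HAND normalization at the core of the descriptor set is simple to sketch. The version below substitutes Euclidean proximity for the flow-path ("hydrologically connected") drainage search of the real algorithm, so it is only a toy illustration on a hypothetical DEM:

```python
import numpy as np

def hand_simplified(dem, drainage_mask):
    """Height Above the Nearest Drainage, simplified: subtract from each cell
    the elevation of its nearest drainage cell (Euclidean nearest, standing in
    for the flow-direction connectivity used by the actual HAND algorithm)."""
    drains = np.argwhere(drainage_mask)
    out = np.empty_like(dem, dtype=float)
    for (i, j), z in np.ndenumerate(dem):
        k = np.argmin(((drains - (i, j)) ** 2).sum(axis=1))
        out[i, j] = z - dem[tuple(drains[k])]
    return out

dem = np.array([[10.0, 12.0],
                [ 5.0,  9.0]])          # elevations above sea level (m)
drain = np.array([[False, False],
                  [True,  False]])      # the 5 m cell is the drainage
hand = hand_simplified(dem, drain)      # terrain height above local drainage
```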

  4. MODEL-BASED CORRECTION OF TISSUE COMPRESSION FOR TRACKED ULTRASOUND IN SOFT TISSUE IMAGE-GUIDED SURGERY

    PubMed Central

    Pheiffer, Thomas S.; Thompson, Reid C.; Rucker, Daniel C.; Simpson, Amber L.; Miga, Michael I.

    2014-01-01

    Acquisition of ultrasound data negatively affects image registration accuracy during image-guided therapy because of tissue compression by the probe. We present a novel compression correction method that models sub-surface tissue displacement resulting from application of a tracked probe to the tissue surface. Patient landmarks are first used to register the probe pose to pre-operative imaging. The ultrasound probe geometry is used to provide boundary conditions to a biomechanical model of the tissue. The deformation field solution of the model is inverted to non-rigidly transform the ultrasound images to an estimation of the tissue geometry before compression. Experimental results with gel phantoms indicated that the proposed method reduced the tumor margin modified Hausdorff distance (MHD) from 5.0 ± 1.6 to 1.9 ± 0.6 mm, and reduced tumor centroid alignment error from 7.6 ± 2.6 to 2.0 ± 0.9 mm. The method was applied to a clinical case and reduced the tumor margin MHD error from 5.4 ± 0.1 to 2.6 ± 0.1 mm and the centroid alignment error from 7.2 ± 0.2 to 3.5 ± 0.4 mm. PMID:24412172
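
    The tumor-margin metric reported here, the modified Hausdorff distance (MHD) in the sense of Dubuisson and Jain, is the larger of the two directed mean closest-point distances between point sets; a minimal implementation (toy 2D points, but any dimension works):

```python
import math

def modified_hausdorff(a, b):
    """MHD(A, B) = max(mean over A of nearest-in-B distance,
                       mean over B of nearest-in-A distance)."""
    d_ab = sum(min(math.dist(p, q) for q in b) for p in a) / len(a)
    d_ba = sum(min(math.dist(p, q) for p in a) for q in b) / len(b)
    return max(d_ab, d_ba)
```

    Unlike the classical Hausdorff distance, the mean (rather than maximum) over each set makes the metric robust to single outlier points on a delineated margin.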

  5. Model-based correction of tissue compression for tracked ultrasound in soft tissue image-guided surgery.

    PubMed

    Pheiffer, Thomas S; Thompson, Reid C; Rucker, Daniel C; Simpson, Amber L; Miga, Michael I

    2014-04-01

    Acquisition of ultrasound data negatively affects image registration accuracy during image-guided therapy because of tissue compression by the probe. We present a novel compression correction method that models sub-surface tissue displacement resulting from application of a tracked probe to the tissue surface. Patient landmarks are first used to register the probe pose to pre-operative imaging. The ultrasound probe geometry is used to provide boundary conditions to a biomechanical model of the tissue. The deformation field solution of the model is inverted to non-rigidly transform the ultrasound images to an estimation of the tissue geometry before compression. Experimental results with gel phantoms indicated that the proposed method reduced the tumor margin modified Hausdorff distance (MHD) from 5.0 ± 1.6 to 1.9 ± 0.6 mm, and reduced tumor centroid alignment error from 7.6 ± 2.6 to 2.0 ± 0.9 mm. The method was applied to a clinical case and reduced the tumor margin MHD error from 5.4 ± 0.1 to 2.6 ± 0.1 mm and the centroid alignment error from 7.2 ± 0.2 to 3.5 ± 0.4 mm. PMID:24412172

  6. A Correction for the IRI Topside Electron Density Model Based on Alouette/ISIS Topside Sounder Data

    NASA Technical Reports Server (NTRS)

    Bilitza, D.

    2004-01-01

    The topside segment of the International Reference Ionosphere (IRI) electron density model (and also of the Bent model) is based on the limited amount of topside data available at the time (40,000 Alouette 1 profiles). Being established from such a small database, it is not surprising that the models have well-known shortcomings, for example, at high solar activities. Meanwhile a large database of close to 200,000 topside profiles from Alouette 1, 2, and ISIS 1, 2 has become available online. A program of automated scaling and inversion of a large volume of digitized ionograms adds continuously to this data pool. We have used the currently available ISIS/Alouette topside profiles to evaluate the IRI topside model and to investigate ways of improving it. The IRI model performs generally well at middle latitudes but shows discrepancies at low and high latitudes, and these discrepancies are largest during high solar activity. In the upper topside, IRI consistently overestimates the measurements. Based on averages of the data-model ratios, we have established correction factors for the IRI model. These factors vary with altitude, modified dip latitude, and local time.
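
    The correction scheme the abstract describes, averaging data/model ratios and turning them into multiplicative factors, can be sketched directly (altitudes, ratios, and bin edges below are hypothetical):

```python
import numpy as np

def correction_factors(alt_km, data_over_model, bin_edges):
    """Average sounder/IRI density ratio per altitude bin.  Multiplying the
    model by a bin's factor pulls it toward the data; a factor < 1 means the
    model overestimates there (as IRI does in the upper topside)."""
    idx = np.digitize(alt_km, bin_edges)
    return np.array([data_over_model[idx == k].mean()
                     for k in range(1, len(bin_edges))])

alt = np.array([600.0, 700.0, 1100.0, 1200.0])   # sample altitudes (km)
ratio = np.array([0.95, 1.05, 0.70, 0.80])       # measured / modelled density
factors = correction_factors(alt, ratio, [500.0, 1000.0, 1500.0])
```

    In the actual correction the factors are tabulated against altitude, modified dip latitude, and local time jointly rather than altitude alone.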

  7. Therapeutic NOTCH3 cysteine correction in CADASIL using exon skipping: in vitro proof of concept.

    PubMed

    Rutten, Julie W; Dauwerse, Hans G; Peters, Dorien J M; Goldfarb, Andrew; Venselaar, Hanka; Haffner, Christof; van Ommen, Gert-Jan B; Aartsma-Rus, Annemieke M; Lesnik Oberstein, Saskia A J

    2016-04-01

    Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy, or CADASIL, is a hereditary cerebral small vessel disease caused by characteristic cysteine-altering missense mutations in the NOTCH3 gene. NOTCH3 mutations in CADASIL result in an uneven number of cysteine residues in one of the 34 epidermal growth factor-like repeat (EGFr) domains of the NOTCH3 protein. The consequence of an unpaired cysteine residue in an EGFr domain is an increased multimerization tendency of mutant NOTCH3, leading to toxic accumulation of the protein in the (cerebro)vasculature, and ultimately reduced cerebral blood flow, recurrent stroke and vascular dementia. There is no therapy to delay or alleviate symptoms in CADASIL. We hypothesized that exclusion of the mutant EGFr domain from NOTCH3 would abolish the detrimental effect of the unpaired cysteine and thus prevent toxic NOTCH3 accumulation and the negative cascade of events leading to CADASIL. To accomplish this NOTCH3 cysteine correction by EGFr domain exclusion, we used pre-mRNA antisense-mediated skipping of specific NOTCH3 exons. Selection of these exons was achieved using in silico studies and based on the criterion that skipping of a particular exon or exon pair would modulate the protein in such a way that the mutant EGFr domain is eliminated, without otherwise corrupting NOTCH3 structure and function. Remarkably, we found that this strategy closely mimics evolutionary events, where the elimination and fusion of NOTCH EGFr domains led to the generation of four functional NOTCH homologues. We modelled a selection of exon skip strategies using cDNA constructs and show that the skip proteins retain normal protein processing, can bind ligand and be activated by ligand. We then determined the technical feasibility of targeted NOTCH3 exon skipping, by designing antisense oligonucleotides targeting exons 2-3, 4-5 and 6, which together harbour the majority of distinct CADASIL-causing mutations.
Transfection of

  8. Optimal Model-Based Fault Estimation and Correction for Particle Accelerators and Industrial Plants Using Combined Support Vector Machines and First Principles Models

    SciTech Connect

    Sayyar-Rodsari, Bijan; Schweiger, Carl; /SLAC /Pavilion Technologies, Inc., Austin, TX

    2010-08-25

    Timely estimation of deviations from optimal performance in complex systems and the ability to identify corrective measures in response to the estimated parameter deviations have been the subject of extensive research over the past four decades. The implications in terms of lost revenue from costly industrial processes, operation of large-scale public works projects and the volume of the published literature on this topic clearly indicate the significance of the problem. Applications range from manufacturing industries (integrated circuits, automotive, etc.), to large-scale chemical plants, pharmaceutical production, power distribution grids, and avionics. In this project we investigated a new framework for building parsimonious models that are suited for diagnosis and fault estimation of complex technical systems. We used Support Vector Machines (SVMs) to model potentially time-varying parameters of a First-Principles (FP) description of the process. The combined SVM & FP model was built (i.e. model parameters were trained) using constrained optimization techniques. We used the trained models to estimate faults affecting simulated beam lifetime. In the case where a large number of process inputs are required for model-based fault estimation, the proposed framework performs an optimal nonlinear principal component analysis of the large-scale input space, and creates a lower dimension feature space in which fault estimation results can be effectively presented to the operation personnel. To fulfill the main technical objectives of the Phase I research, our Phase I efforts have focused on: (1) SVM Training in a Combined Model Structure - We developed the software for the constrained training of the SVMs in a combined model structure, and successfully modeled the parameters of a first-principles model for beam lifetime with support vectors. (2) Higher-order Fidelity of the Combined Model - We used constrained training to ensure that the output of the SVM (i.e. the
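
    The combined structure, a data-driven sub-model supplying a time-varying parameter inside a first-principles equation, can be sketched with an exponential beam-decay law whose lifetime comes from a fitted model. Everything here is hypothetical: an ordinary least-squares line stands in for the SVM, and the pressure-lifetime data are invented.

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (the data-driven sub-model)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical training data: vacuum pressure -> observed beam lifetime (h)
pressure = [1.0, 2.0, 3.0, 4.0]
lifetime = [10.0, 8.0, 6.0, 4.0]
a, b = fit_line(pressure, lifetime)

def beam_current(t, i0, p):
    """First-principles decay law I(t) = I0 * exp(-t / tau), with the lifetime
    tau predicted by the fitted sub-model rather than fixed a priori."""
    tau = a + b * p
    return i0 * math.exp(-t / tau)
```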

  9. Whole-Body PET/MR Imaging: Quantitative Evaluation of a Novel Model-Based MR Attenuation Correction Method Including Bone

    PubMed Central

    Paulus, Daniel H.; Quick, Harald H.; Geppert, Christian; Fenchel, Matthias; Zhan, Yiqiang; Hermosillo, Gerardo; Faul, David; Boada, Fernando; Friedman, Kent P.; Koesters, Thomas

    2016-01-01

    In routine whole-body PET/MR hybrid imaging, attenuation correction (AC) is usually performed by segmentation methods based on a Dixon MR sequence providing up to 4 different tissue classes. Because of the lack of bone information with the Dixon-based MR sequence, bone is currently considered as soft tissue. Thus, the aim of this study was to evaluate a novel model-based AC method that considers bone in whole-body PET/MR imaging. Methods: The new method (“Model”) is based on a regular 4-compartment segmentation from a Dixon sequence (“Dixon”). Bone information is added using a model-based bone segmentation algorithm, which includes a set of prealigned MR image and bone mask pairs for each major body bone individually. Model was quantitatively evaluated on 20 patients who underwent whole-body PET/MR imaging. As a standard of reference, CT-based μ-maps were generated for each patient individually by nonrigid registration to the MR images based on PET/CT data. This step allowed for a quantitative comparison of all μ-maps based on a single PET emission raw dataset of the PET/MR system. Volumes of interest were drawn on normal tissue, soft-tissue lesions, and bone lesions; standardized uptake values were quantitatively compared. Results: In soft-tissue regions with background uptake, the average bias of SUVs in background volumes of interest was 2.4% ± 2.5% and 2.7% ± 2.7% for Dixon and Model, respectively, compared with CT-based AC. For bony tissue, the −25.5% ± 7.9% underestimation observed with Dixon was reduced to −4.9% ± 6.7% with Model. In bone lesions, the average underestimation was −7.4% ± 5.3% and −2.9% ± 5.8% for Dixon and Model, respectively. For soft-tissue lesions, the biases were 5.1% ± 5.1% for Dixon and 5.2% ± 5.2% for Model. Conclusion: The novel MR-based AC method for whole-body PET/MR imaging, combining Dixon-based soft-tissue segmentation and model-based bone estimation, improves PET quantification in whole-body hybrid PET
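
    The figure of merit throughout is percent SUV bias against the CT-derived reference:

```python
def suv_bias_percent(suv_mr_ac, suv_ct_ac):
    """Percent SUV bias of an MR-based attenuation correction relative to
    CT-based AC, the comparison metric used in the study."""
    return 100.0 * (suv_mr_ac - suv_ct_ac) / suv_ct_ac
```

    For example, a bone SUV of 0.745 under Dixon AC against 1.0 under CT-based AC gives a bias of −25.5%, matching the mean bone underestimation reported (the SUVs themselves are hypothetical).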

  10. Correction.

    PubMed

    2015-11-01

    In the article by Heuslein et al, which published online ahead of print on September 3, 2015 (DOI: 10.1161/ATVBAHA.115.305775), a correction was needed. Brett R. Blackman was added as the penultimate author of the article. The article has been corrected for publication in the November 2015 issue. PMID:26490278

  11. Model-based correction for scatter and tailing effects in simultaneous 99mTc and 123I imaging for a CdZnTe cardiac SPECT camera

    NASA Astrophysics Data System (ADS)

    Holstensson, M.; Erlandsson, K.; Poludniowski, G.; Ben-Haim, S.; Hutton, B. F.

    2015-04-01

    An advantage of semiconductor-based dedicated cardiac single photon emission computed tomography (SPECT) cameras when compared to conventional Anger cameras is superior energy resolution. This provides the potential for improved separation of the photopeaks in dual radionuclide imaging, such as the combined use of 99mTc and 123I. There is, however, the added complexity of tailing effects in the detectors that must be accounted for. In this paper we present a model-based correction algorithm which extracts the useful primary counts of 99mTc and 123I from projection data. Equations describing the in-patient scatter and tailing effects in the detectors are iteratively solved for both radionuclides simultaneously using a maximum a posteriori probability algorithm with one-step-late evaluation. Energy window-dependent parameters for the equations describing in-patient scatter are estimated using Monte Carlo simulations. Parameters for the equations describing tailing effects are estimated using virtually scatter-free experimental measurements on a dedicated cardiac SPECT camera with CdZnTe detectors. When applied to a phantom study with both 99mTc and 123I, results show that the estimated spatial distribution of events from 99mTc in the 99mTc photopeak energy window is very similar to that measured in a single 99mTc phantom study. The extracted images of primary events display increased cold lesion contrasts for both 99mTc and 123I.
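
    The window crosstalk being corrected can be shown with a toy linear two-window version (the crosstalk fractions and counts below are invented; the paper's actual method solves spatially varying scatter and tailing equations iteratively with a MAP one-step-late algorithm):

```python
def separate_two_isotopes(w_tc, w_i, f_tc_in_i, f_i_in_tc):
    """Solve the 2x2 crosstalk system
         w_tc = p_tc + f_i_in_tc * p_i
         w_i  = p_i  + f_tc_in_i * p_tc
    for the primary counts p_tc, p_i in the two photopeak windows."""
    det = 1.0 - f_tc_in_i * f_i_in_tc
    p_tc = (w_tc - f_i_in_tc * w_i) / det
    p_i = (w_i - f_tc_in_i * w_tc) / det
    return p_tc, p_i

# 100 primary Tc counts and 50 primary I counts, with 10% of Tc spilling into
# the I window and 20% of I tailing into the Tc window, give window totals
# of 110 and 60; the solver recovers the primaries:
p_tc, p_i = separate_two_isotopes(110.0, 60.0, 0.1, 0.2)
```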

  12. Correction.

    PubMed

    2015-12-01

    In the article by Narayan et al (Narayan O, Davies JE, Hughes AD, Dart AM, Parker KH, Reid C, Cameron JD. Central aortic reservoir-wave analysis improves prediction of cardiovascular events in elderly hypertensives. Hypertension. 2015;65:629–635. doi: 10.1161/HYPERTENSIONAHA.114.04824), which published online ahead of print December 22, 2014, and appeared in the March 2015 issue of the journal, some corrections were needed. On page 632, Figure, panel A, the label PRI has been corrected to read RPI. In panel B, the text by the upward arrow, "10% increase in kd," has been corrected to read, "10% decrease in kd." The corrected figure is shown below. The authors apologize for these errors. PMID:26558821

  13. Correction

    NASA Astrophysics Data System (ADS)

    1995-04-01

    Seismic images of the Brooks Range, Arctic Alaska, reveal crustal-scale duplexing: Correction. Geology, v. 23, p. 65–68 (January 1995). The correct Figure 4A, for the loose insert, is given here. See Figure 4A below. Corrected inserts will be available to those requesting copies of the article from the senior author, Gary S. Fuis, U.S. Geological Survey, 345 Middlefield Road, Menlo Park, CA 94025. Figure 4A. P-wave velocity model of Brooks Range region (thin gray contours) with migrated wide-angle reflections (heavy red lines) and migrated vertical-incidence reflections (short black lines) superimposed. Velocity contour interval is 0.25 km/s; 4, 5, and 6 km/s contours are labeled. Estimated error in velocities is one contour interval. Symbols on faults shown at top are as in Figure 2 caption.

  14. Correction.

    PubMed

    2016-02-01

    Neogi T, Jansen TLTA, Dalbeth N, et al. 2015 Gout classification criteria: an American College of Rheumatology/European League Against Rheumatism collaborative initiative. Ann Rheum Dis 2015;74:1789–98. The name of the 20th author was misspelled. The correct spelling is Janitzia Vazquez-Mellado. We regret the error. PMID:26881284

  15. Model-based correction for scatter and tailing effects in simultaneous 99mTc and 123I imaging for a CdZnTe cardiac SPECT camera.

    PubMed

    Holstensson, M; Erlandsson, K; Poludniowski, G; Ben-Haim, S; Hutton, B F

    2015-04-21

    An advantage of semiconductor-based dedicated cardiac single photon emission computed tomography (SPECT) cameras when compared to conventional Anger cameras is superior energy resolution. This provides the potential for improved separation of the photopeaks in dual radionuclide imaging, such as the combined use of (99m)Tc and (123)I. There is, however, the added complexity of tailing effects in the detectors that must be accounted for. In this paper we present a model-based correction algorithm which extracts the useful primary counts of (99m)Tc and (123)I from projection data. Equations describing the in-patient scatter and tailing effects in the detectors are iteratively solved for both radionuclides simultaneously using a maximum a posteriori probability algorithm with one-step-late evaluation. Energy window-dependent parameters for the equations describing in-patient scatter are estimated using Monte Carlo simulations. Parameters for the equations describing tailing effects are estimated using virtually scatter-free experimental measurements on a dedicated cardiac SPECT camera with CdZnTe detectors. When applied to a phantom study with both (99m)Tc and (123)I, results show that the estimated spatial distribution of events from (99m)Tc in the (99m)Tc photopeak energy window is very similar to that measured in a single (99m)Tc phantom study. The extracted images of primary events display increased cold lesion contrasts for both (99m)Tc and (123)I. PMID:25803643

  16. Correction.

    PubMed

    2016-02-01

    In the article by Guessous et al (Guessous I, Pruijm M, Ponte B, Ackermann D, Ehret G, Ansermot N, Vuistiner P, Staessen J, Gu Y, Paccaud F, Mohaupt M, Vogt B, Pechère-Bertschi A, Martin PY, Burnier M, Eap CB, Bochud M. Associations of ambulatory blood pressure with urinary caffeine and caffeine metabolite excretions. Hypertension. 2015;65:691–696. doi: 10.1161/HYPERTENSIONAHA.114.04512), which published online ahead of print December 8, 2014, and appeared in the March 2015 issue of the journal, a correction was needed. One of the author surnames was misspelled. Antoinette Pechère-Berstchi has been corrected to read Antoinette Pechère-Bertschi. The authors apologize for this error. PMID:26763012

  17. Gene Transfer Corrects Acute GM2 Gangliosidosis—Potential Therapeutic Contribution of Perivascular Enzyme Flow

    PubMed Central

    Cachón-González, M Begoña; Wang, Susan Z; McNair, Rosamund; Bradley, Josephine; Lunn, David; Ziegler, Robin; Cheng, Seng H; Cox, Timothy M

    2012-01-01

    The GM2 gangliosidoses are fatal lysosomal storage diseases principally affecting the brain. Absence of β-hexosaminidase A and B activities in the Sandhoff mouse causes neurological dysfunction and recapitulates the acute Tay–Sachs (TSD) and Sandhoff diseases (SD) in infants. Intracranial coinjection of recombinant adeno-associated viral vectors (rAAV), serotype 2/1, expressing human β-hexosaminidase α (HEXA) and β (HEXB) subunits into 1-month-old Sandhoff mice gave unprecedented survival to 2 years and prevented disease throughout the brain and spinal cord. Classical manifestations of disease, including spasticity—as opposed to tremor-ataxia—were resolved by localized gene transfer to the striatum or cerebellum, respectively. Abundant biosynthesis of β-hexosaminidase isozymes and their global distribution via axonal, perivascular, and cerebrospinal fluid (CSF) spaces, as well as diffusion, account for the sustained phenotypic rescue—long-term protein expression by transduced brain parenchyma, choroid plexus epithelium, and dorsal root ganglia neurons supplies the corrective enzyme. Prolonged survival permitted expression of cryptic disease in organs not accessed by intracranial vector delivery. We contend that infusion of rAAV into CSF space and intraparenchymal administration by convection-enhanced delivery at a few strategic sites will optimally treat neurodegeneration in many diseases affecting the nervous system. PMID:22453766

  18. Gene transfer corrects acute GM2 gangliosidosis--potential therapeutic contribution of perivascular enzyme flow.

    PubMed

    Cachón-González, M Begoña; Wang, Susan Z; McNair, Rosamund; Bradley, Josephine; Lunn, David; Ziegler, Robin; Cheng, Seng H; Cox, Timothy M

    2012-08-01

    The GM2 gangliosidoses are fatal lysosomal storage diseases principally affecting the brain. Absence of β-hexosaminidase A and B activities in the Sandhoff mouse causes neurological dysfunction and recapitulates the acute Tay-Sachs (TSD) and Sandhoff diseases (SD) in infants. Intracranial coinjection of recombinant adeno-associated viral vectors (rAAV), serotype 2/1, expressing human β-hexosaminidase α (HEXA) and β (HEXB) subunits into 1-month-old Sandhoff mice gave unprecedented survival to 2 years and prevented disease throughout the brain and spinal cord. Classical manifestations of disease, including spasticity-as opposed to tremor-ataxia-were resolved by localized gene transfer to the striatum or cerebellum, respectively. Abundant biosynthesis of β-hexosaminidase isozymes and their global distribution via axonal, perivascular, and cerebrospinal fluid (CSF) spaces, as well as diffusion, account for the sustained phenotypic rescue-long-term protein expression by transduced brain parenchyma, choroid plexus epithelium, and dorsal root ganglia neurons supplies the corrective enzyme. Prolonged survival permitted expression of cryptic disease in organs not accessed by intracranial vector delivery. We contend that infusion of rAAV into CSF space and intraparenchymal administration by convection-enhanced delivery at a few strategic sites will optimally treat neurodegeneration in many diseases affecting the nervous system. PMID:22453766

  19. Correction.

    PubMed

    2015-05-22

    The Circulation Research article by Keith and Bolli (“String Theory” of c-kitpos Cardiac Cells: A New Paradigm Regarding the Nature of These Cells That May Reconcile Apparently Discrepant Results. Circ Res. 2015;116:1216-1230. doi: 10.1161/CIRCRESAHA.116.305557) states that van Berlo et al (2014) observed that large numbers of fibroblasts and adventitial cells, some smooth muscle and endothelial cells, and rare cardiomyocytes originated from c-kit positive progenitors. However, van Berlo et al reported that only occasional fibroblasts and adventitial cells derived from c-kit positive progenitors in their studies. Accordingly, the review has been corrected to indicate that van Berlo et al (2014) observed that large numbers of endothelial cells, with some smooth muscle cells and fibroblasts, and more rarely cardiomyocytes, originated from c-kit positive progenitors in their murine model. The authors apologize for this error, and the error has been noted and corrected in the online version of the article, which is available at http://circres.ahajournals.org/content/116/7/1216.full. PMID:25999426

  20. Correction

    NASA Astrophysics Data System (ADS)

    1998-12-01

    Alleged mosasaur bite marks on Late Cretaceous ammonites are limpet (patellogastropod) home scars: Geology, v. 26, p. 947–950 (October 1998). This article had the following printing errors: p. 947, Abstract, line 11, “sepia” should be “septa”; p. 947, 1st paragraph under Introduction, line 2, “creep” should be “deep”; p. 948, column 1, 2nd paragraph, line 7, “creep” should be “deep”; p. 949, column 1, 1st paragraph, line 1, “creep” should be “deep”; p. 949, column 1, 1st paragraph, line 5, “19774” should be “1977)”; p. 949, column 1, 4th paragraph, line 7, “in particular” should be “In particular”. CORRECTION: Mammalian community response to the latest Paleocene thermal maximum: An isotaphonomic study in the northern Bighorn Basin, Wyoming: Geology, v. 26, p. 1011–1014 (November 1998). An error appeared in the References Cited. The correct reference appears below: Fricke, H. C., Clyde, W. C., O'Neil, J. R., and Gingerich, P. D., 1998, Evidence for rapid climate change in North America during the latest Paleocene thermal maximum: Oxygen isotope compositions of biogenic phosphate from the Bighorn Basin (Wyoming): Earth and Planetary Science Letters, v. 160, p. 193–208.

  1. Therapeutic correction of ApoER2 splicing in Alzheimer's disease mice using antisense oligonucleotides.

    PubMed

    Hinrich, Anthony J; Jodelka, Francine M; Chang, Jennifer L; Brutman, Daniella; Bruno, Angela M; Briggs, Clark A; James, Bryan D; Stutzmann, Grace E; Bennett, David A; Miller, Steven A; Rigo, Frank; Marr, Robert A; Hastings, Michelle L

    2016-01-01

    Apolipoprotein E receptor 2 (ApoER2) is an apolipoprotein E receptor involved in long-term potentiation, learning, and memory. Given its role in cognition and its association with the Alzheimer's disease (AD) risk gene, apoE, ApoER2 has been proposed to be involved in AD, though a role for the receptor in the disease is not clear. ApoER2 signaling requires amino acids encoded by alternatively spliced exon 19. Here, we report that the balance of ApoER2 exon 19 splicing is deregulated in postmortem brain tissue from AD patients and in a transgenic mouse model of AD. To test the role of deregulated ApoER2 splicing in AD, we designed an antisense oligonucleotide (ASO) that increases exon 19 splicing. Treatment of AD mice with a single dose of ASO corrected ApoER2 splicing for up to 6 months and improved synaptic function and learning and memory. These results reveal an association between ApoER2 isoform expression and AD, and provide preclinical evidence for the utility of ASOs as a therapeutic approach to mitigate Alzheimer's disease symptoms by improving ApoER2 exon 19 splicing. PMID:26902204

  2. Travel cost demand model based river recreation benefit estimates with on-site and household surveys: Comparative results and a correction procedure

    NASA Astrophysics Data System (ADS)

    Loomis, John

    2003-04-01

    Past recreation studies have noted that on-site or visitor intercept surveys are subject to over-sampling of avid users (i.e., endogenous stratification) and have offered econometric solutions to correct for this. However, past papers do not estimate the empirical magnitude of the bias in benefit estimates with a real data set, nor do they compare the corrected estimates to benefit estimates derived from a population sample. This paper empirically examines the magnitude of the recreation benefits per trip bias by comparing estimates from an on-site river visitor intercept survey to a household survey. The difference in average benefits is quite large, with the on-site visitor survey yielding $24 per day trip, while the household survey yields $9.67 per day trip. A simple econometric correction for endogenous stratification in our count data model lowers the benefit estimate to $9.60 per day trip, a mean value nearly identical and not statistically different from the household survey estimate.
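
    The correction referred to has a well-known closed form in the simplest case: if trips in the population are Poisson(lam), an on-site (visitor-intercept) sample observes each user with probability proportional to trips, and the resulting counts are distributed as 1 + Poisson(lam). The corrected maximum-likelihood estimate is therefore mean(y) - 1 (the trip counts below are hypothetical):

```python
# Hypothetical on-site sample of trip counts: every respondent has >= 1 trip
# by construction, and avid users are over-represented.
trips = [1, 1, 2, 3, 5, 8, 4, 2, 1, 3]

naive_lam = sum(trips) / len(trips)   # on-site MLE, biased upward by avidity
corrected_lam = naive_lam - 1.0       # endogenous-stratification-corrected MLE
```

    In the regression setting the same logic applies: fitting an ordinary Poisson model to y - 1 instead of y yields the corrected travel-cost coefficient, which is what lowers the benefit estimate from the avidity-biased on-site figure toward the household-survey value.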

  3. Fiducial marker-based correction for involuntary motion in weight-bearing C-arm CT scanning of knees. Part I. Numerical model-based optimization

    PubMed Central

    Choi, Jang-Hwan; Fahrig, Rebecca; Keil, Andreas; Besier, Thor F.; Pal, Saikat; McWalter, Emily J.; Beaupré, Gary S.; Maier, Andreas

    2013-01-01

    Purpose: Human subjects in standing positions are apt to show much more involuntary motion than in supine positions. The authors aimed to simulate a complicated realistic lower body movement using the four-dimensional (4D) digital extended cardiac-torso (XCAT) phantom. The authors also investigated fiducial marker-based motion compensation methods in two-dimensional (2D) and three-dimensional (3D) space. The level of involuntary movement-induced artifacts and image quality improvement were investigated after applying each method. Methods: An optical tracking system with eight cameras and seven retroreflective markers enabled us to track involuntary motion of the lower body of nine healthy subjects holding a squat position at 60° of flexion. The XCAT-based knee model was developed using the 4D XCAT phantom and the optical tracking data acquired at 120 Hz. The authors divided the lower body in the XCAT into six parts and applied unique affine transforms to each so that the motion (6 degrees of freedom) could be synchronized with the optical markers’ location at each time frame. The control points of the XCAT were tessellated into triangles and 248 projection images were created based on intersections of each ray and monochromatic absorption. The tracking data sets with the largest motion (Subject 2) and the smallest motion (Subject 5) among the nine data sets were used to animate the XCAT knee model. The authors defined eight skin control points well distributed around the knees as pseudo-fiducial markers which functioned as a reference in motion correction. Motion compensation was done in the following ways: (1) simple projection shifting in 2D, (2) deformable projection warping in 2D, and (3) rigid body warping in 3D. Graphics hardware accelerated filtered backprojection was implemented and combined with the three correction methods in order to speed up the simulation process. Correction fidelity was evaluated as a function of number of markers used (4–12) and
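
    The first and simplest of the three compensation schemes, rigid 2D projection shifting, amounts to translating each projection by the tracked marker displacement. A whole-pixel version on a toy image is below (the study's deformable 2D warping and 3D rigid-body variants are more elaborate):

```python
def shift_projection(proj, dx, dy):
    """Rigidly shift a 2D projection by integer pixel offsets (dx columns,
    dy rows), zero-filling the exposed border."""
    rows, cols = len(proj), len(proj[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            rr, cc = r - dy, c - dx
            if 0 <= rr < rows and 0 <= cc < cols:
                out[r][c] = proj[rr][cc]
    return out
```

    For example, shifting `[[1, 2], [3, 4]]` one column to the right gives `[[0, 1], [0, 3]]`.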

  4. Influence of the partial volume correction method on 18F-fluorodeoxyglucose brain kinetic modelling from dynamic PET images reconstructed with resolution model based OSEM

    NASA Astrophysics Data System (ADS)

    Bowen, Spencer L.; Byars, Larry G.; Michel, Christian J.; Chonde, Daniel B.; Catana, Ciprian

    2013-10-01

Kinetic parameters estimated from dynamic 18F-fluorodeoxyglucose (18F-FDG) PET acquisitions have been used frequently to assess brain function in humans. Neglecting partial volume correction (PVC) for a dynamic series has been shown to produce significant bias in model estimates. Accurate PVC requires a space-variant model describing the reconstructed image spatial point spread function (PSF) that accounts for resolution limitations, including non-uniformities across the field of view due to the parallax effect. For ordered subsets expectation maximization (OSEM), image resolution convergence is local and influenced significantly by the number of iterations, the count density, and the background-to-target ratio. As both count density and background-to-target values for a brain structure can change during a dynamic scan, the local image resolution may also vary concurrently. When PVC is applied post-reconstruction, kinetic parameter estimates may therefore be biased if the frame-dependent resolution is neglected. We explored the influence of the PVC method and implementation on kinetic parameters estimated by fitting 18F-FDG dynamic data acquired on a dedicated brain PET scanner and reconstructed with and without PSF modelling in the OSEM algorithm. The performance of several PVC algorithms was quantified with a phantom experiment, an anthropomorphic Monte Carlo simulation, and a patient scan. Using only the last-frame reconstructed image for regional spread function (RSF) generation, as opposed to computing RSFs for each frame independently, and applying perturbation geometric transfer matrix PVC with PSF-based OSEM produced kinetic parameter estimates with the lowest-magnitude bias in most instances, although at the cost of increased noise compared to the PVC methods utilizing conventional OSEM. 
Use of the last frame RSFs for PVC with no PSF modelling in the OSEM algorithm produced the lowest bias in cerebral metabolic rate of glucose estimates, although by less than 5% in most
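The geometric transfer matrix (GTM) family of PVC methods mentioned above corrects regional means by inverting a spill-over system. A minimal sketch, with a hypothetical two-region transfer matrix rather than one derived from real RSFs:

```python
import numpy as np

# Hypothetical 2-region geometric transfer matrix: entry (i, j) is the
# fraction of region j's activity that the PSF spills into region i.
G = np.array([[0.85, 0.10],
              [0.15, 0.90]])

true_activity = np.array([4.0, 1.0])     # ground-truth regional means
observed = G @ true_activity             # PSF-blurred regional means

# GTM correction inverts the spill-over system to recover the true means
corrected = np.linalg.solve(G, observed)
print(corrected)                          # recovers [4.0, 1.0]
```

In practice each entry of G is computed by projecting a region's RSF onto the other regions, which is why the choice of frame used for RSF generation (last frame versus per-frame) matters for dynamic data.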

  5. The impact of dosimetric optimization using respiratory gating and inhomogeneity corrections on potential therapeutic gain in patients with lung cancer

    NASA Astrophysics Data System (ADS)

    de La Fuente Herman, Tania

    Early stage lung cancer is found with increasing frequency by screening high risk patients. Recently, the use of Stereotactic Body Radiation Therapy (SBRT) has been found to be highly successful. The hypothesis being tested here is that the use of respiratory gating and tissue heterogeneity corrections are necessary to optimize tumor and normal tissue dose distributions for SBRT.

  6. Concurrent progress of reprogramming and gene correction to overcome therapeutic limitation of mutant ALK2-iPSC.

    PubMed

    Kim, Bu-Yeo; Jeong, SangKyun; Lee, Seo-Young; Lee, So Min; Gweon, Eun Jeong; Ahn, Hyunjun; Kim, Janghwan; Chung, Sun-Ku

    2016-01-01

Fibrodysplasia ossificans progressiva (FOP) syndrome is caused by mutation of the gene ACVR1, which encodes a constitutively active bone morphogenetic protein type I receptor (also called ALK2) that induces heterotopic ossification in patients. To correct the mutation genetically, we attempted to generate mutant ALK2-iPSCs (mALK2-iPSCs) from FOP human dermal fibroblasts. However, mALK2 impairs pluripotency maintenance and clonogenic potential after single-cell dissociation, an unavoidable step when applying gene-correction tools to induced pluripotent stem cells (iPSCs). The current iPSC-based gene therapy approach is therefore not readily applicable to iPSCs carrying the ALK2 mutation. Here we developed a simplified one-step procedure that simultaneously introduces reprogramming and gene-editing components into human fibroblasts derived from a patient with FOP syndrome. The mixture comprises reprogramming episomal vectors, CRISPR/Cas9-expressing vectors, and a single-stranded oligodeoxynucleotide harboring the normal base to correct ALK2 c.617G>A. The resulting one-step ALK2 gene-corrected iPSCs restored the global gene expression pattern, as well as mineralization, to the level of normal iPSCs. This procedure not only saves time, labor, and costs but also opens a new paradigm beyond current gene-editing methodologies, which are hampered by pluripotency-maintenance requirements and the vulnerability of single-cell-dissociated iPSCs. PMID:27256111

  7. A Budget Impact Analysis of Newly Available Hepatitis C Therapeutics and the Financial Burden on a State Correctional System.

    PubMed

    Nguyen, John T; Rich, Josiah D; Brockmann, Bradley W; Vohr, Fred; Spaulding, Anne; Montague, Brian T

    2015-08-01

Hepatitis C virus (HCV) infection continues to disproportionately affect incarcerated populations. New HCV drugs present both opportunities and challenges for addressing HCV in corrections. The goal of this study was to evaluate the impact of HCV treatment costs on a state correctional population through a budget impact analysis comparing differing treatment strategies. Electronic and paper medical records were reviewed to estimate the prevalence of hepatitis C within the Rhode Island Department of Corrections. Three treatment strategies were evaluated: (1) treating all chronically infected persons, (2) treating only patients with demonstrated fibrosis, and (3) treating only patients with advanced fibrosis. Budget impact was computed as the percentage of pharmacy and overall healthcare expenditures accrued by total drug costs, assuming entirely interferon-free therapy. Sensitivity analyses assessed potential variance in costs related to variability in HCV prevalence, genotype, estimated variation in market pricing, length of stay for the sentenced population, and uptake of newly available regimens. Chronic HCV prevalence was estimated at 17% of the total population. Treating all sentenced inmates with at least 6 months remaining on their sentence would cost about $34 million, roughly 13 times the pharmacy budget and almost twice the overall healthcare budget. Treating only inmates with advanced fibrosis would cost about $15 million. Even with a hypothetical 50% reduction in total drug costs for future therapies, treating all eligible inmates would cost $17 million. With such immense projected costs, it is unlikely that correctional facilities will have the capacity to treat all those afflicted with HCV. Alternative payment strategies in collaboration with outside programs may be necessary to curb this epidemic. 
In order to improve care and treatment delivery, drug costs also need to be seriously reevaluated to be more accessible and equitable now that HCV
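The budget-impact arithmetic behind comparisons like the one above is straightforward: eligible patients per strategy times per-course drug cost. The counts and price below are assumptions chosen only to reproduce the order of magnitude reported, not figures from the study.

```python
# Illustrative budget-impact arithmetic; the eligible-patient counts and
# per-course drug price below are assumed, not taken from the study.
price_per_course = 84_000          # assumed interferon-free regimen cost ($)

strategies = {
    "treat all chronic":      400,  # assumed eligible patients per strategy
    "fibrosis only":          250,
    "advanced fibrosis only": 180,
}

for name, n in strategies.items():
    total = n * price_per_course
    print(f"{name}: ${total / 1e6:.1f}M")

# A hypothetical 50% drug-price cut halves every strategy's total
half_price_total = 400 * price_per_course * 0.5
```

Sensitivity analyses then vary each input (prevalence, price, length of stay) over a plausible range and recompute the totals.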

  8. Concurrent progress of reprogramming and gene correction to overcome therapeutic limitation of mutant ALK2-iPSC

    PubMed Central

    Kim, Bu-Yeo; Jeong, SangKyun; Lee, Seo-Young; Lee, So Min; Gweon, Eun Jeong; Ahn, Hyunjun; Kim, Janghwan; Chung, Sun-Ku

    2016-01-01

Fibrodysplasia ossificans progressiva (FOP) syndrome is caused by mutation of the gene ACVR1, which encodes a constitutively active bone morphogenetic protein type I receptor (also called ALK2) that induces heterotopic ossification in patients. To correct the mutation genetically, we attempted to generate mutant ALK2-iPSCs (mALK2-iPSCs) from FOP human dermal fibroblasts. However, mALK2 impairs pluripotency maintenance and clonogenic potential after single-cell dissociation, an unavoidable step when applying gene-correction tools to induced pluripotent stem cells (iPSCs). The current iPSC-based gene therapy approach is therefore not readily applicable to iPSCs carrying the ALK2 mutation. Here we developed a simplified one-step procedure that simultaneously introduces reprogramming and gene-editing components into human fibroblasts derived from a patient with FOP syndrome. The mixture comprises reprogramming episomal vectors, CRISPR/Cas9-expressing vectors, and a single-stranded oligodeoxynucleotide harboring the normal base to correct ALK2 c.617G>A. The resulting one-step ALK2 gene-corrected iPSCs restored the global gene expression pattern, as well as mineralization, to the level of normal iPSCs. This procedure not only saves time, labor, and costs but also opens a new paradigm beyond current gene-editing methodologies, which are hampered by pluripotency-maintenance requirements and the vulnerability of single-cell-dissociated iPSCs. PMID:27256111

  9. Key factors which concur to the correct therapeutic evaluation of herbal products in free radical-induced diseases

    PubMed Central

    Mancuso, Cesare

    2015-01-01

For many years now the world's scientific literature has been replete with articles on the therapeutic potential of natural products, the vast majority of which have herbal origins, as in the case of free radical-induced diseases. What is often overlooked is the effort of researchers who undertake the preclinical and clinical evaluation of these herbal products in order to demonstrate their therapeutic efficacy and safety. The first critical issue to be addressed in the early stages of preclinical studies is the sometimes unfavorable pharmacokinetics of some of these products, which limits bioavailability after oral intake. In this regard, it is worth underlining that it is often unethical to propose the therapeutic efficacy of a compound on the basis of preclinical results obtained at far higher concentrations than those which could realistically be achieved in the organs and tissues of subjects taking these products by mouth. The most widely used approach to overcoming low bioavailability involves complexing the active ingredients of herbal products with non-toxic carriers that facilitate absorption and distribution. The induction or inhibition of drug-metabolizing enzymes by herbal products, and the consequent variations in plasma concentrations of co-administered drugs, are also phenomena to be carefully evaluated, as they can give rise to side effects. This risk is all the greater given that people lack any perception of risk from overuse of herbal products, which, by their very nature, are considered risk-free. PMID:25954201

  10. Key factors which concur to the correct therapeutic evaluation of herbal products in free radical-induced diseases.

    PubMed

    Mancuso, Cesare

    2015-01-01

For many years now the world's scientific literature has been replete with articles on the therapeutic potential of natural products, the vast majority of which have herbal origins, as in the case of free radical-induced diseases. What is often overlooked is the effort of researchers who undertake the preclinical and clinical evaluation of these herbal products in order to demonstrate their therapeutic efficacy and safety. The first critical issue to be addressed in the early stages of preclinical studies is the sometimes unfavorable pharmacokinetics of some of these products, which limits bioavailability after oral intake. In this regard, it is worth underlining that it is often unethical to propose the therapeutic efficacy of a compound on the basis of preclinical results obtained at far higher concentrations than those which could realistically be achieved in the organs and tissues of subjects taking these products by mouth. The most widely used approach to overcoming low bioavailability involves complexing the active ingredients of herbal products with non-toxic carriers that facilitate absorption and distribution. The induction or inhibition of drug-metabolizing enzymes by herbal products, and the consequent variations in plasma concentrations of co-administered drugs, are also phenomena to be carefully evaluated, as they can give rise to side effects. This risk is all the greater given that people lack any perception of risk from overuse of herbal products, which, by their very nature, are considered risk-free. PMID:25954201

  11. Lack of correlation between outcomes of membrane repair assay and correction of dystrophic changes in experimental therapeutic strategy in dysferlinopathy.

    PubMed

    Lostal, William; Bartoli, Marc; Roudaut, Carinne; Bourg, Nathalie; Krahn, Martin; Pryadkina, Marina; Borel, Perrine; Suel, Laurence; Roche, Joseph A; Stockholm, Daniel; Bloch, Robert J; Levy, Nicolas; Bashir, Rumaisa; Richard, Isabelle

    2012-01-01

Mutations in the dysferlin gene cause Limb-girdle Muscular Dystrophy type 2B and Miyoshi Myopathy. The dysferlin protein has been implicated in sarcolemmal resealing, leading to the idea that the pathophysiology of dysferlin deficiencies is due to a deficit in membrane repair. Here, we show using two different approaches that restoring membrane repair, as assayed by laser wounding, is not sufficient to alleviate the dysferlin-deficient pathology. First, we generated a transgenic mouse overexpressing myoferlin to test the hypothesis that myoferlin, which is homologous to dysferlin, can compensate for the absence of dysferlin. The myoferlin overexpressors show no skeletal muscle abnormalities, and crossing them with a dysferlin-deficient model rescues the membrane fusion defect present in dysferlin-deficient mice in vitro. However, myoferlin overexpression does not correct muscle histology in vivo. Second, we report that AAV-mediated transfer of a minidysferlin, previously shown to correct the membrane repair deficit in vitro, also fails to improve muscle histology. Furthermore, neither myoferlin nor the minidysferlin prevented myofiber degeneration following eccentric exercise. Our data suggest that the pathogenicity of dysferlin deficiency is not solely related to impairment in sarcolemmal repair and highlight the care needed in selecting assays to assess potential therapies for dysferlinopathies. PMID:22666441

  12. Lack of Correlation between Outcomes of Membrane Repair Assay and Correction of Dystrophic Changes in Experimental Therapeutic Strategy in Dysferlinopathy

    PubMed Central

    Krahn, Martin; Pryadkina, Marina; Borel, Perrine; Suel, Laurence; Roche, Joseph A.; Stockholm, Daniel; Bloch, Robert J.; Levy, Nicolas; Bashir, Rumaisa; Richard, Isabelle

    2012-01-01

Mutations in the dysferlin gene cause Limb-girdle Muscular Dystrophy type 2B and Miyoshi Myopathy. The dysferlin protein has been implicated in sarcolemmal resealing, leading to the idea that the pathophysiology of dysferlin deficiencies is due to a deficit in membrane repair. Here, we show using two different approaches that restoring membrane repair, as assayed by laser wounding, is not sufficient to alleviate the dysferlin-deficient pathology. First, we generated a transgenic mouse overexpressing myoferlin to test the hypothesis that myoferlin, which is homologous to dysferlin, can compensate for the absence of dysferlin. The myoferlin overexpressors show no skeletal muscle abnormalities, and crossing them with a dysferlin-deficient model rescues the membrane fusion defect present in dysferlin-deficient mice in vitro. However, myoferlin overexpression does not correct muscle histology in vivo. Second, we report that AAV-mediated transfer of a minidysferlin, previously shown to correct the membrane repair deficit in vitro, also fails to improve muscle histology. Furthermore, neither myoferlin nor the minidysferlin prevented myofiber degeneration following eccentric exercise. Our data suggest that the pathogenicity of dysferlin deficiency is not solely related to impairment in sarcolemmal repair and highlight the care needed in selecting assays to assess potential therapies for dysferlinopathies. PMID:22666441

  13. Model based manipulator control

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1989-01-01

    The feasibility of using model based control (MBC) for robotic manipulators was investigated. A double inverted pendulum system was constructed as the experimental system for a general study of dynamically stable manipulation. The original interest in dynamically stable systems was driven by the objective of high vertical reach (balancing), and the planning of inertially favorable trajectories for force and payload demands. The model-based control approach is described and the results of experimental tests are summarized. Results directly demonstrate that MBC can provide stable control at all speeds of operation and support operations requiring dynamic stability such as balancing. The application of MBC to systems with flexible links is also discussed.
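The core of model-based manipulator control is using the plant's dynamic model to cancel known dynamics, then stabilizing the remaining error with feedback (computed-torque control). A minimal single-pendulum sketch follows; the plant parameters and gains are illustrative assumptions, and the double inverted pendulum of the experiment would use the same pattern with matrix-valued dynamics.

```python
import math

# Assumed plant parameters for a single pendulum (mass, length, gravity, damping)
m, l, g, b = 1.0, 0.5, 9.81, 0.1

def computed_torque(q, qd, q_des, qd_des, qdd_des, kp=25.0, kd=10.0):
    """Model-based (computed-torque) control law: use the dynamic model
    to cancel gravity and damping, then apply PD feedback on the error."""
    v = qdd_des + kd * (qd_des - qd) + kp * (q_des - q)
    inertia = m * l * l
    # tau = M(q) * v + gravity + damping, from the assumed model
    return inertia * v + m * g * l * math.sin(q) + b * qd

# With zero tracking error at the hanging equilibrium (q = 0), the
# commanded torque is exactly zero
tau = computed_torque(0.0, 0.0, 0.0, 0.0, 0.0)
```

Because the model cancels the nonlinear terms, the closed-loop error dynamics become linear, which is what allows stable operation at all speeds and dynamically stable behaviors such as balancing.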

  14. Model-Based Systems

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    2007-01-01

Engineers who design systems using text specification documents focus their work on the completed system to meet performance, schedule, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system, and more difficult still as several systems are combined into higher-level systems, maintained over decades, and evolved technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in schedule and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data, interrelating it to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. Because the consistency and integrity of the model are assured, the consistency and integrity of the various specification documents are assured as well. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed, and how those needs are being addressed by international standards writing teams.

  15. Model Based Definition

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  16. Principles of models based engineering

    SciTech Connect

    Dolin, R.M.; Hefele, J.

    1996-11-01

This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  17. Are therapeutic communities therapeutic for women?

    PubMed Central

    Eliason, Michele J

    2006-01-01

    This paper addresses the growing phenomena of therapeutic community (TC) treatment approaches for women in correctional settings. Although rapidly increasing in number across the country, there is very little empirical research to support the effectiveness of TC treatment for women. Therefore, the literature on the efficacy and effectiveness of TC treatment for women is reviewed in relation to the literature on women's treatment issues. The literature review highlights the gaps where TC treatment ignores or exacerbates issues that are common to addicted women, or uses methods that may be contradictory to women's recovery. PMID:16722560

  18. Model-Based Fault Tolerant Control

    NASA Technical Reports Server (NTRS)

    Kumar, Aditya; Viassolo, Daniel

    2008-01-01

The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm into an overall estimate of the identified fault type and magnitude. Identifying the fault type and magnitude enabled an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in their presence. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
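The multiple-hypothesis idea above can be sketched as matching an observed filter residual against known fault signatures. The signatures, threshold, and two-dimensional residual below are toy assumptions, not the MBFTC task's actual models.

```python
import numpy as np

# Toy residual-based fault isolation: each hypothesis is a known residual
# signature; pick the one closest to the observed Kalman-filter residual.
signatures = {
    "no fault":      np.array([0.0, 0.0]),
    "sensor bias":   np.array([1.0, 0.0]),
    "actuator loss": np.array([0.0, -1.0]),
}

def classify(residual, threshold=0.5):
    """Return the hypothesis whose signature best matches the residual,
    or "unknown" if no signature is within the acceptance threshold."""
    best = min(signatures, key=lambda k: np.linalg.norm(residual - signatures[k]))
    if np.linalg.norm(residual - signatures[best]) > threshold:
        return "unknown"
    return best

fault = classify(np.array([0.9, 0.1]))   # matches the sensor-bias signature
```

A fusion stage, as in the abstract, would combine such a classification with a second detector's output (e.g. a neural network) before triggering accommodation.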

  19. Model-based tomographic reconstruction

    DOEpatents

    Chambers, David H.; Lehman, Sean K.; Goodman, Dennis M.

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
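The estimation criterion described above, minimizing the mean-square error between a predicted signal and the data, can be illustrated in one dimension. This sketch estimates a single wall's round-trip delay with an assumed Gaussian echo model; it omits layer stripping, stacking, and material-parameter estimation.

```python
import numpy as np

# Model-based estimate of one wall position: predict an echo at each
# candidate delay and keep the delay that minimizes mean-square error.
t = np.arange(200)
echo = lambda delay: np.exp(-0.5 * ((t - delay) / 3.0) ** 2)  # assumed pulse shape

data = echo(120.0)                          # simulated return from a wall
candidates = np.arange(50, 150)
errors = [np.mean((data - echo(d)) ** 2) for d in candidates]
wall_delay = candidates[int(np.argmin(errors))]
print(wall_delay)                           # 120
```

Layer stripping extends this by subtracting each detected wall's predicted contribution before searching the time-gated data for the next one.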

  20. Model-based machine learning

    PubMed Central

    Bishop, Christopher M.

    2013-01-01

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications. PMID:23277612
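The model-based methodology above separates a bespoke generative model from a generic inference engine. A minimal stand-in, using grid-based Bayesian inference for a biased-coin model rather than Infer.NET's factor-graph machinery:

```python
import numpy as np

# Bespoke model: a coin with unknown bias, flat prior over the bias.
# Generic inference: evaluate the posterior on a grid and normalize.
def posterior(heads, tails, grid=np.linspace(0.01, 0.99, 99)):
    like = grid**heads * (1 - grid)**tails   # likelihood at each candidate bias
    return grid, like / like.sum()

grid, post = posterior(heads=7, tails=3)
mean_bias = float(np.sum(grid * post))       # posterior mean, ~ (7+1)/(10+2)
```

Changing the model (say, to a time-varying bias) changes only the likelihood specification; the same inference routine still applies, which is the point of the model-based approach.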

  1. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  2. Model-based machine learning.

    PubMed

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications. PMID:23277612

  3. MACROMOLECULAR THERAPEUTICS

    PubMed Central

    Yang, Jiyuan; Kopeček, Jindřich

    2014-01-01

    This review covers water-soluble polymer-drug conjugates and macromolecules that possess biological activity without attached low molecular weight drugs. The main design principles of traditional and backbone degradable polymer-drug conjugates as well as the development of a new paradigm in nanomedicines – (low molecular weight) drug-free macromolecular therapeutics are discussed. To address the biological features of cancer, macromolecular therapeutics directed to stem/progenitor cells and the tumor microenvironment are deliberated. Finally, the future perspectives of the field are briefly debated. PMID:24747162

  4. Pigeon therapeutics.

    PubMed

    Harlin, R W

    2000-01-01

    This article examines therapeutics for pigeons, discussing their physiology and reproduction, housing, and nutrition. The author also looks at ways to prevent infection, while discussing treatments for various viral diseases, such as paramyxovirus and pigeon herpesvirus, bacterial infections, such as paratyphoid, and parasitic diseases. Drug dosages are listed for antibiotics, antifungals, antiparasitics, and vaccines. PMID:11228828

  5. Therapeutic Nanodevices

    NASA Astrophysics Data System (ADS)

    Lee, Stephen; Ruegsegger, Mark; Barnes, Philip; Smith, Bryan; Ferrari, Mauro

Therapeutic nanotechnology offers minimally invasive therapies with high densities of function concentrated in small volumes, features that may reduce patient morbidity and mortality. Unlike other areas of nanotechnology, novel physical properties associated with nanoscale dimensionality are not the raison d'être of therapeutic nanotechnology, whereas the aggregation of multiple biochemical (or comparably precise) functions into controlled nanoarchitectures is. Multifunctionality is a hallmark of emerging nanotherapeutic devices, and multifunctionality can allow nanotherapeutic devices to perform multistep work processes, with each functional component contributing to one or more nanodevice subroutine such that, in aggregate, subroutines sum to a cogent work process. Canonical nanotherapeutic subroutines include tethering (targeting) to sites of disease, dispensing measured doses of drug (or bioactive compound), detection of residual disease after therapy and communication with an external clinician/operator. Emerging nanotherapeutics thus blur the boundaries between medical devices and traditional pharmaceuticals. Assembly of therapeutic nanodevices generally exploits either (bio)material self-assembly properties or chemoselective bioconjugation techniques, or both. Given the complexity, composition, and the necessity for their tight chemical and structural definition inherent in the nature of nanotherapeutics, their cost of goods (COGs) might exceed that of (already expensive) biologics. Early therapeutic nanodevices will likely be applied to disease states which exhibit significant unmet patient need (cancer and cardiovascular disease), while application to other disease states well-served by conventional therapy may await perfection of nanotherapeutic design and assembly protocols.

  7. Feedlot therapeutics.

    PubMed

    Apley, M D; Fajt, V R

    1998-07-01

    This article discusses therapeutic approaches to conditions commonly encountered in feedlots. Challenges discussed include bovine respiratory complex, tracheal edema, atypical interstitial pneumonia, footrot, toe abscesses, mycoplasma arthritis, cardiovascular disease, lactic acidosis, bloat, coccidiosis, central nervous system diseases, abscesses and cellulitis, pregnancy management and abortion, and ocular disease. PMID:9704416

  8. Model-based reasoning: Troubleshooting

    NASA Astrophysics Data System (ADS)

    Davis, Randall; Hamscher, Walter C.

    1988-07-01

    To determine why something has stopped working, it's useful to know how it was supposed to work in the first place. That simple observation underlies much of the considerable interest generated in recent years in model-based reasoning, particularly its application to diagnosis and troubleshooting. This paper surveys the current state of the art, reviewing areas that are well understood and exploring areas that present challenging research topics. It views the fundamental paradigm as the interaction of prediction and observation, and explores it by examining three fundamental subproblems: generating hypotheses by reasoning from a symptom to a collection of components whose misbehavior may plausibly have caused that symptom; testing each hypothesis to see whether it can account for all available observations of device behavior; and discriminating among the hypotheses that survive testing. We analyze each of these independently at the knowledge level, i.e., attempting to understand what reasoning capabilities arise from the different varieties of knowledge available to the program. We find that while a wide range of apparently diverse model-based systems have been built for diagnosis and troubleshooting, they are for the most part variations on the central theme outlined here. Their diversity lies primarily in the varying amounts and kinds of knowledge they bring to bear at each stage of the process; the underlying paradigm is fundamentally the same.

  9. Platelet-delivered therapeutics.

    PubMed

    Lyde, R; Sabatino, D; Sullivan, S K; Poncz, M

    2015-06-01

    We have proposed that modified platelets could potentially be used to correct intrinsic platelet defects as well as for targeted delivery of therapeutic molecules to sites of vascular injury. Ectopic expression of proteins within α-granules prior to platelet activation has been achieved for several proteins, including urokinase, factor (F) VIII, and partially for FIX. Potential uses of platelet-directed therapeutics will be discussed, focusing on targeted delivery of urokinase as a thromboprophylactic agent and FVIII for the treatment of hemophilia A patients with intractable inhibitors. This presentation will discuss new strategies that may be useful in the care of patients with vascular injury as well as remaining challenges and limitations of these approaches. PMID:26149015

  10. Therapeutic perspectives

    PubMed Central

    Fiore, Carmelo E.; Pennisi, Pietra; Tinè, Marianna

    2008-01-01

    Osteoporosis and atherosclerosis are linked by biological association. This encourages the search for therapeutic strategies having both cardiovascular and skeletal beneficial effects. Among the drugs that may concordantly enhance bone density and reduce the progression of atherosclerosis are bisphosphonates (BP), statins, β-blockers, and possibly anti-RANKL antibodies. Available data come from experimental animals and human studies. All these treatments, however, lack controlled clinical studies designed to demonstrate dual-action effects. PMID:22460845

  11. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  12. Model-based phase-shifting interferometer

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A model-based phase-shifting interferometer (MPI) is developed in which a novel calculation technique replaces the traditionally complicated system structure to achieve versatile, high-precision, quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With a set of alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing construction (ROR) method is employed for the retrace-error correction of the non-null test, as well as for figure-error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical tests are presented to validate the flexibility and accuracy. The test results are compared with those of a ZYGO interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI holds large potential in modern optical shop testing.
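    The final figure-error extraction step, fitting wavefront data to Zernike polynomials, can be illustrated with a least-squares fit over a few low-order terms; the four-term basis and synthetic wavefront below are hypothetical simplifications, not the MPI's actual ROR procedure.

```python
import numpy as np

# Minimal sketch: express a wavefront as coefficients of low-order
# Zernike-style terms via least squares. Only piston, tilts, and
# defocus are fitted here; real surface tests use many more terms.

def fit_terms(wavefront):
    n = wavefront.shape[0]
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    basis = [np.ones_like(x),          # piston
             x,                        # x tilt
             y,                        # y tilt
             2 * (x**2 + y**2) - 1]    # defocus
    A = np.stack([b.ravel() for b in basis], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, wavefront.ravel(), rcond=None)
    return coeffs

# Synthetic wavefront: 0.1 waves of x tilt plus 0.5 waves of defocus.
n = 64
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
wf = 0.1 * x + 0.5 * (2 * (x**2 + y**2) - 1)
c = fit_terms(wf)
print(np.round(c, 3))  # piston 0, x tilt 0.1, y tilt 0, defocus 0.5
```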

  13. Therapeutic alliance.

    PubMed

    Fox, Valerie

    2002-01-01

    I have been very fortunate in my journey of mental illness. I respond well to medication, but I don't think that is the complete answer to living successfully with serious, persistent mental illness. I believe a person's environment is also of utmost importance, enabling the person suffering with mental illness to continually grow in life. I found early in my struggle with mental illness a psychiatrist with whom I have always had a very good rapport. Until recently I didn't know that what I have with this psychiatrist is professionally known as a therapeutic alliance. Over the years, when I need someone to talk over anything that is troubling to me, I seek my psychiatrist. A therapeutic alliance is non-judgmental; it is nourishing; and finally it is a relationship of complete trust. Perhaps persons reading this article who have never experienced this alliance will seek it. I believe it can make an insecure person secure; a frightened person less frightened; and allow a person to continue the journey of mental health with a sense of belief in oneself. PMID:12433224

  14. Model-based Utility Functions

    NASA Astrophysics Data System (ADS)

    Hibbard, Bill

    2012-05-01

    Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment, so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that, if provided with the possibility of modifying their utility functions, agents will not choose to do so, under some common assumptions.

  15. Speech Correction in the Schools.

    ERIC Educational Resources Information Center

    Eisenson, Jon; Ogilvie, Mardel

    An introduction to the problems and therapeutic needs of school age children whose speech requires remedial attention, the text is intended for both the classroom teacher and the speech correctionist. General considerations include classification and incidence of speech defects, speech correction services, the teacher as a speaker, the mechanism…

  16. Jitter Correction

    NASA Technical Reports Server (NTRS)

    Waegell, Mordecai J.; Palacios, David M.

    2011-01-01

    Jitter_Correct.m is a MATLAB function that automatically measures and corrects inter-frame jitter in an image sequence to a user-specified precision. In addition, the algorithm dynamically adjusts the image sample size to increase the accuracy of the measurement. The Jitter_Correct.m function takes an image sequence with unknown frame-to-frame jitter and computes the translations of each frame (column and row, in pixels) relative to a chosen reference frame with sub-pixel accuracy. The translations are measured using a cross-correlation Fourier-transform method in which the relative phase of the two transformed images is fit to a plane. The measured translations are then used to correct the inter-frame jitter of the image sequence. The function also dynamically expands the image sample size over which the cross-correlation is measured to increase the accuracy of the measurement. This increases the robustness of the measurement to variable magnitudes of inter-frame jitter.
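    The core translation measurement can be sketched with NumPy; this illustrative version locates the cross-correlation peak to integer-pixel accuracy only, whereas the function described above additionally fits the relative phase to a plane for sub-pixel accuracy.

```python
import numpy as np

def measure_shift(ref, frame):
    """Estimate the (row, col) translation of `frame` relative to
    `ref` by locating the peak of the FFT-based cross-correlation.
    Integer-pixel accuracy only; sub-pixel phase fitting omitted."""
    cross_power = np.conj(np.fft.fft2(ref)) * np.fft.fft2(frame)
    corr = np.fft.ifft2(cross_power)
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # Wrap peaks past the midpoint around to negative shifts.
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, corr.shape))

# Jittered copy of a test frame: shifted down 3 rows, right 5 columns.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
frame = np.roll(ref, shift=(3, 5), axis=(0, 1))
print(measure_shift(ref, frame))  # (3, 5)
```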

  17. Hot blast stove process model and model-based controller

    SciTech Connect

    Muske, K.R.; Howse, J.W.; Hansen, G.A.; Cagliostro, D.J.; Chaubal, P.C.

    1998-12-31

    This paper describes the process model and model-based control techniques implemented on the hot blast stoves for the No. 7 Blast Furnace at the Inland Steel facility in East Chicago, Indiana. A detailed heat transfer model of the stoves is developed and verified using plant data. This model is used as part of a predictive control scheme to determine the minimum amount of fuel necessary to achieve the blast air requirements. The model is also used to predict maximum and minimum temperature constraint violations within the stove so that the controller can take corrective actions while still achieving the required stove performance.

  18. Therapeutic Drug Monitoring

    MedlinePlus

    What is therapeutic drug monitoring? Therapeutic drug monitoring is the measurement ...

  19. A CORRECTION.

    PubMed

    Johnson, D

    1940-03-22

    In a recently published volume on "The Origin of Submarine Canyons" the writer inadvertently credited to A. C. Veatch an excerpt from a submarine chart actually contoured by P. A. Smith, of the U. S. Coast and Geodetic Survey. The chart in question is Chart IVB of Special Paper No. 7 of the Geological Society of America entitled "Atlantic Submarine Valleys of the United States and the Congo Submarine Valley, by A. C. Veatch and P. A. Smith," and the excerpt appears as Plate III of the volume first cited above. In view of the heavy labor involved in contouring the charts accompanying the paper by Veatch and Smith and the beauty of the finished product, it would be unfair to Mr. Smith to permit the error to go uncorrected. Excerpts from two other charts are correctly ascribed to Dr. Veatch. PMID:17839404

  20. Kitaev models based on unitary quantum groupoids

    SciTech Connect

    Chang, Liang

    2014-04-15

    We establish a generalization of Kitaev models based on unitary quantum groupoids. In particular, when inputting a Kitaev-Kong quantum groupoid H{sub C}, we show that the ground state manifold of the generalized model is canonically isomorphic to that of the Levin-Wen model based on a unitary fusion category C. Therefore, the generalized Kitaev models provide realizations of the target space of the Turaev-Viro topological quantum field theory based on C.

  1. Model-Based Prognostics of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhury, Indranil; Bregon, Anibal

    2015-01-01

    Model-based prognostics has become a popular approach to solving the prognostics problem. However, almost all work has focused on prognostics of systems with continuous dynamics. In this paper, we extend the model-based prognostics framework to hybrid systems models that combine both continuous and discrete dynamics. In general, most systems are hybrid in nature, including those that combine physical processes with software. We generalize the model-based prognostics formulation to hybrid systems, and describe the challenges involved. We present a general approach for modeling hybrid systems, and overview methods for solving estimation and prediction in hybrid systems. As a case study, we consider the problem of conflict (i.e., loss of separation) prediction in the National Airspace System, in which the aircraft models are hybrid dynamical systems.

  2. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  3. Model-based satellite acquisition and tracking

    NASA Technical Reports Server (NTRS)

    Casasent, David; Lee, Andrew J.

    1988-01-01

    A model-based optical processor is introduced for the acquisition and tracking of a satellite in close proximity to an imaging sensor of a space robot. The type of satellite is known in advance, and a model of the satellite (which exists from its design) is used in this task. The model base is used to generate multiple smart filters of the various parts of the satellite, which are used in a symbolic multi-filter optical correlator. The output from the correlator is then treated as a symbolic description of the object, which is operated upon by an optical inference processor to determine the position and orientation of the satellite and to track it as a function of time. The knowledge and model base also serves to generate the rules used by the inference machine. The inference machine allows for feedback to optical correlators or feature extractors to locate the individual parts of the satellite and their orientations.

  4. Multimode model based defect characterization in composites

    NASA Astrophysics Data System (ADS)

    Roberts, R.; Holland, S.; Gregory, E.

    2016-02-01

    A newly-initiated research program for model-based defect characterization in CFRP composites is summarized. The work utilizes computational models of the interaction of NDE probing energy fields (ultrasound and thermography), to determine 1) the measured signal dependence on material and defect properties (forward problem), and 2) an assessment of performance-critical defect properties from analysis of measured NDE signals (inverse problem). Work is reported on model implementation for inspection of CFRP laminates containing delamination and porosity. Forward predictions of measurement response are presented, as well as examples of model-based inversion of measured data for the estimation of defect parameters.

  5. Model-based internal wave processing

    SciTech Connect

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal-wave signal processing problem, based on state-space representations of the normal-mode vertical velocity and plane-wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Väisälä frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, model-based solutions to the signal enhancement problem for internal waves are investigated.

  6. Model-Based Inquiries in Chemistry

    ERIC Educational Resources Information Center

    Khan, Samia

    2007-01-01

    In this paper, instructional strategies for sustaining model-based inquiry in an undergraduate chemistry class were analyzed through data collected from classroom observations, a student survey, and in-depth problem-solving sessions with the instructor and students. Analysis of teacher-student interactions revealed a cyclical pattern in which…

  7. What's Missing in Model-Based Teaching

    ERIC Educational Resources Information Center

    Khan, Samia

    2011-01-01

    In this study, the author investigated how four science teachers employed model-based teaching (MBT) over a 1-year period. The purpose of the research was to develop a baseline of the fundamental and specific dimensions of MBT that are present and absent in science teaching. Teacher interviews, classroom observations, and pre and post-student…

  8. Sandboxes for Model-Based Inquiry

    ERIC Educational Resources Information Center

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-01-01

    In this article, we introduce a class of constructionist learning environments that we call "Emergent Systems Sandboxes" ("ESSs"), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual…

  9. Reducing Centroid Error Through Model-Based Noise Reduction

    NASA Technical Reports Server (NTRS)

    Lee, Shinhak

    2006-01-01

    A method of processing the digitized output of a charge-coupled device (CCD) image detector has been devised to enable reduction of the error in computed centroid of the image of a point source of light. The method involves model-based estimation of, and correction for, the contributions of bias and noise to the image data. The method could be used to advantage in any of a variety of applications in which there are requirements for measuring precise locations of, and/or precisely aiming optical instruments toward, point light sources. In the present method, prior to normal operations of the CCD, one measures the point-spread function (PSF) of the telescope or other optical system used to project images on the CCD. The PSF is used to construct a database of spot models representing the nominal CCD pixel outputs for a point light source projected onto the CCD at various positions incremented by small fractions of a pixel.
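    The bias-corrected centroid computation at the heart of this method can be sketched as follows; the spot values and bias level below are hypothetical, and the PSF spot-model database described in the abstract is omitted.

```python
import numpy as np

def centroid(pixels, bias):
    """Intensity-weighted centroid (row, col) of a spot image after
    subtracting an estimated bias level. Clipping negative residuals
    keeps noise from dragging the centroid estimate."""
    corrected = np.clip(pixels - bias, 0.0, None)
    total = corrected.sum()
    rows, cols = np.indices(pixels.shape)
    return (float((rows * corrected).sum() / total),
            float((cols * corrected).sum() / total))

# A symmetric 5x5 test spot centered at (2, 2) riding on a bias of 10.
spot = 10.0 + np.array([[0, 0, 0, 0, 0],
                        [0, 1, 2, 1, 0],
                        [0, 2, 4, 2, 0],
                        [0, 1, 2, 1, 0],
                        [0, 0, 0, 0, 0]], dtype=float)
print(centroid(spot, bias=10.0))  # (2.0, 2.0)
```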

  10. Model-Based Systems Engineering Approach to Managing Mass Margin

    NASA Technical Reports Server (NTRS)

    Chung, Seung H.; Bayer, Todd J.; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Christopher; Lam, Doris

    2012-01-01

    When designing a flight system from concept through implementation, one of the fundamental systems engineering tasks is managing the mass margin and mass equipment list (MEL) of the flight system. While generating a MEL and computing a mass margin is conceptually a trivial task, maintaining consistent and correct MELs and mass margins can be challenging due to the current practices of maintaining duplicate information in various forms, such as diagrams and tables, and in various media, such as files and emails. We have overcome this challenge through a model-based systems engineering (MBSE) approach within which we allow only a single source of truth. In this paper we describe the modeling patterns used to capture the single source of truth and the views that have been developed for the Europa Habitability Mission (EHM) project, a mission concept study, at the Jet Propulsion Laboratory (JPL).

  11. Prediction model based on decision tree analysis for laccase mediators.

    PubMed

    Medina, Fabiola; Aguila, Sergio; Baratto, Maria Camilla; Martorana, Andrea; Basosi, Riccardo; Alderete, Joel B; Vazquez-Duhalt, Rafael

    2013-01-10

    A Structure Activity Relationship (SAR) study for laccase mediator systems was performed in order to correctly classify different natural phenolic mediators. Decision tree (DT) classification models with a set of five quantum-chemically calculated molecular descriptors were used. These descriptors included redox potential (ɛ°), ionization energy (E(i)), pK(a), enthalpy of radical formation (Δ(f)H), and O-H bond dissociation energy (D(O-H)). The rationale for selecting these descriptors is derived from the laccase-mediator mechanism. To validate the DT predictions, the kinetic constants of different compounds as laccase substrates, their ability for pesticide transformation as laccase mediators, and radical stability were experimentally determined using Coriolopsis gallica laccase and the pesticide dichlorophen. The prediction capability of the DT model based on three proposed descriptors showed complete agreement with the experimental results. PMID:23199741
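    A decision-tree classification over such descriptors can be illustrated with a hand-built stump; the splits, thresholds, and descriptor values below are hypothetical, not the fitted model from the study.

```python
# Illustrative decision stump over quantum-chemical descriptors.
# Both split thresholds and the example values are hypothetical;
# the study fitted its tree to descriptors calculated for real
# phenolic mediators.

def classify_mediator(descriptors):
    """Classify a candidate laccase mediator from two descriptors:
    redox potential (V) and O-H bond dissociation energy (kcal/mol)."""
    if descriptors["redox_potential"] > 0.9:   # hypothetical split
        return "non-mediator"                  # too hard to oxidize
    if descriptors["d_oh"] > 90.0:             # hypothetical split
        return "non-mediator"                  # radical too costly
    return "mediator"

print(classify_mediator({"redox_potential": 0.6, "d_oh": 82.0}))
# prints "mediator"
```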

  12. A Model-Based System For Force Structure Analysis

    NASA Astrophysics Data System (ADS)

    Levitt, Tod S.; Kirby, Robert L.; Muller, Hans E.

    1985-04-01

    Given a set of image-derived vehicle detections and/or recognized military vehicles, SIGINT cues and a priori analysis of terrain, the force structure analysis (FSA) problem is to utilize knowledge of tactical doctrine and spatial deployment information to infer the existence of military forces such as batteries, companies, battalions, regiments, divisions, etc. A model-based system for FSA has been developed. It performs symbolic reasoning about force structures represented as geometric models. The FSA system is a stand-alone module which has also been developed as part of a larger system, the Advanced Digital Radar Image Exploitation System (ADRIES) for automated SAR image exploitation. The models recursively encode the component military units of a force structure, their expected spatial deployment, search priorities for model components, prior match probabilities, and type hierarchies for uncertain recognition. Partial and uncertain matching of models against data is the basic tool for building up hypotheses of the existence of force structures. Hypothesis management includes the functions of matching models against data, predicting the existence and location of unobserved force components, localization of search areas and resolution of conflicts between competing hypotheses. A subjective Bayesian inference calculus is used to accrue certainty of force structure hypotheses and resolve conflicts. Reasoning from uncertain vehicle level data, the system has successfully inferred the correct locations and components of force structures up to the battalion level. Key words: Force structure analysis, SAR, model-based reasoning, hypothesis management, search, matching, conflict resolution, Bayesian inference, uncertainty.

  13. Model-based Processing of Microcantilever Sensor Arrays

    SciTech Connect

    Tringe, J W; Clague, D S; Candy, J V; Sinensky, A K; Lee, C L; Rudd, R E; Burnham, A K

    2005-04-27

    We have developed a model-based processor (MBP) for a microcantilever-array sensor to detect target species in solution. We perform a proof-of-concept experiment, fit model parameters to the measured data, and use them to develop a Gauss-Markov simulation. We then investigate two cases of interest: averaged deflection data and multi-channel data. For this evaluation we extract model parameters via model-based estimation, perform a Gauss-Markov simulation, design the optimal MBP, and apply it to measured experimental data. The performance of the MBP in the multi-channel case is evaluated by comparison to a "smoother" (averager) typically used for microcantilever signal analysis. It is shown that the MBP not only provides a significant gain (≈80 dB) in signal-to-noise ratio (SNR), but also consistently outperforms the smoother by 40-60 dB. Finally, we apply the processor to the smoothed experimental data and demonstrate its capability for chemical detection. The MBP performs quite well, apart from a correctable systematic bias error.

  14. Therapeutic drug levels

    MedlinePlus

    Therapeutic drug levels are lab tests to look for the presence ...

  15. Efficient Model-Based Diagnosis Engine

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Vatan, Farrokh; Barrett, Anthony; James, Mark; Mackey, Ryan; Williams, Colin

    2009-01-01

    An efficient diagnosis engine - a combination of mathematical models and algorithms - has been developed for identifying faulty components in a possibly complex engineering system. This model-based diagnosis engine embodies a twofold approach to reducing, relative to prior model-based diagnosis engines, the amount of computation needed to perform a thorough, accurate diagnosis. The first part of the approach involves a reconstruction of the general diagnostic engine to reduce the complexity of the mathematical-model calculations and of the software needed to perform them. The second part of the approach involves algorithms for computing a minimal diagnosis (the term "minimal diagnosis" is defined below). A somewhat lengthy background discussion is prerequisite to a meaningful summary of the innovative aspects of the present efficient model-based diagnosis engine. In model-based diagnosis, the function of each component and the relationships among all the components of the engineering system to be diagnosed are represented as a logical system denoted the system description (SD). Hence, the expected normal behavior of the engineering system is the set of logical consequences of the SD. Faulty components lead to inconsistencies between the observed behaviors of the system and the SD (see figure). Diagnosis - the task of finding faulty components - is reduced to finding those components, the abnormalities of which could explain all the inconsistencies. The solution of the diagnosis problem should be a minimal diagnosis, which is a minimal set of faulty components. A minimal diagnosis stands in contradistinction to the trivial solution, in which all components are deemed to be faulty, and which, therefore, always explains all inconsistencies.
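    The minimal-diagnosis notion defined above can be sketched as finding a smallest hitting set over conflict sets; the components and conflicts below are hypothetical illustrations, and the brute-force search shown is exactly what the engine's efficient algorithms are designed to avoid.

```python
from itertools import combinations

# Brute-force minimal diagnosis: find a smallest set of components
# whose assumed failure explains every observed inconsistency.
# Components and conflict sets are hypothetical.

def minimal_diagnosis(components, conflicts):
    """Each conflict is a set of components, at least one of which
    must be faulty. Return a smallest hitting set of all conflicts."""
    for size in range(len(components) + 1):
        for candidate in combinations(sorted(components), size):
            if all(set(candidate) & conflict for conflict in conflicts):
                return set(candidate)
    return None

# Two conflicts derived from discrepant observations; blaming A2
# alone explains both, so the trivial "everything is faulty"
# diagnosis is avoided.
conflicts = [{"A1", "A2"}, {"A2", "M1"}]
print(minimal_diagnosis({"A1", "A2", "M1"}, conflicts))  # {'A2'}
```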

  16. Systems Engineering Interfaces: A Model Based Approach

    NASA Technical Reports Server (NTRS)

    Fosse, Elyse; Delp, Christopher

    2013-01-01

    Currently: Ops Rev develops and maintains a framework that includes interface-specific language, patterns, and viewpoints, and implements the framework to design MOS 2.0 and its 5 Mission Services. Implementation de-couples interfaces and instances of interaction. Future: a Mission MOSE implements the approach and uses the model-based artifacts for reviews; the framework extends further into the ground data layers and provides a unified methodology.

  17. Enzyme therapeutics for systemic detoxification.

    PubMed

    Liu, Yang; Li, Jie; Lu, Yunfeng

    2015-08-01

    Life relies on numerous biochemical processes working synergistically and correctly. Certain substances disrupt these processes, driving a living organism into an abnormal state termed intoxication. Managing intoxication usually requires intervention, which is referred to as detoxification. Decades of development in detoxification reveal the potential of enzymes as ideal therapeutics and antidotes, because their high substrate specificity and catalytic efficiency are essential for clearing intoxicating substances without adverse effects. However, intrinsic shortcomings of enzymes, including low stability and high immunogenicity, are major hurdles, which could be overcome by delivering enzymes with specially designed nanocarriers. Extensive investigations of protein delivery indicate three types of enzyme-nanocarrier architectures that show more promise than others for systemic detoxification: liposome-wrapped enzymes, polymer-enzyme conjugates, and polymer-encapsulated enzymes. This review highlights recent advances in these nano-architectures and discusses their applications in systemic detoxification. The therapeutic potential of various enzymes, as well as associated challenges in achieving their effective delivery, will also be discussed. PMID:25980935

  18. Radiometric terrain correction of SPOT5 image

    NASA Astrophysics Data System (ADS)

    Feng, Xiuli; Zhang, Feng; Wang, Ke

    2007-06-01

The terrain correction model based on the rationale of moment matching is more effective at reducing shade effects than the traditional C correction approach, especially in complex, undulating mountainous areas with extensive shading. Conversely, the traditional C correction approach gives better results in plain areas with little shading. Moreover, the accuracy of the DEM data and the registration accuracy between the image and the DEM data also influence the final correction accuracy. To achieve higher radiometric terrain correction accuracy, high-spatial-resolution DEM data are preferred.
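The moment-matching rationale can be sketched in a few lines: the digital numbers of a shaded region are rescaled so that their mean and standard deviation match those of a sunlit reference region. This is a minimal illustration under assumed inputs, not the paper's exact model; region selection and any DEM-driven weighting are left out.

```python
from statistics import mean, pstdev

def moment_match(shaded, reference):
    """Rescale shaded-area digital numbers so that their first two
    moments (mean, standard deviation) match those of a sunlit
    reference area."""
    m_s, s_s = mean(shaded), pstdev(shaded)
    m_r, s_r = mean(reference), pstdev(reference)
    gain = s_r / s_s if s_s else 1.0  # avoid division by zero on flat regions
    return [(x - m_s) * gain + m_r for x in shaded]
```

After correction, the shaded pixels share the reference region's first two moments, which is the core of the shade-reduction effect described above.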

  19. Model-based pulse shape correction for CdTe detectors

    NASA Astrophysics Data System (ADS)

    Bargholtz, Chr.; Fumero, E.; Mårtensson, L.

    1999-02-01

We present a systematic method to improve the energy resolution of CdTe-detector systems with full control of the efficiency. Sampled pulses and multiple-amplifier data are fitted by a model of the pulse shape that includes the deposited energy and the interaction point within the detector as parameters. We show that decisive improvements in spectral resolution and photo-peak efficiency are obtained without distortion of the spectral shape. The information on the interaction depth of individual events can be used to discriminate between beta particles and gamma quanta.

  20. MODEL-BASED IMAGE RECONSTRUCTION FOR MRI

    PubMed Central

    Fessler, Jeffrey A.

    2010-01-01

    Magnetic resonance imaging (MRI) is a sophisticated and versatile medical imaging modality. Traditionally, MR images are reconstructed from the raw measurements by a simple inverse 2D or 3D fast Fourier transform (FFT). However, there are a growing number of MRI applications where a simple inverse FFT is inadequate, e.g., due to non-Cartesian sampling patterns, non-Fourier physical effects, nonlinear magnetic fields, or deliberate under-sampling to reduce scan times. Such considerations have led to increasing interest in methods for model-based image reconstruction in MRI. PMID:21135916
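The shift the abstract describes, from a simple inverse FFT to model-based reconstruction, amounts to posing image recovery as an inverse problem over an explicit forward model A. A minimal sketch, with a toy dense matrix standing in for the (normally Fourier-like) system model and gradient descent on the least-squares cost; all names and step sizes are illustrative assumptions:

```python
def lsq_gd(A, y, iters=500, step=0.1):
    """Model-based reconstruction sketch: minimize ||A x - y||^2 by
    gradient descent, where A encodes the (possibly non-Fourier,
    possibly undersampled) forward model instead of assuming an
    invertible FFT."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = A x - y
        r = [sum(A[i][k] * x[k] for k in range(n)) - y[i] for i in range(m)]
        # gradient direction A^T r
        g = [sum(A[i][k] * r[i] for i in range(m)) for k in range(n)]
        x = [x[k] - step * g[k] for k in range(n)]
    return x
```

Real MRI solvers replace the dense products with fast operators (NUFFT, field-map models) and add regularization, but the estimation structure is the same.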

  1. Model-based multiple patterning layout decomposition

    NASA Astrophysics Data System (ADS)

    Guo, Daifeng; Tian, Haitong; Du, Yuelin; Wong, Martin D. F.

    2015-10-01

As one of the most promising next-generation lithography technologies, multiple patterning lithography (MPL) plays an important role in keeping pace with the 10 nm technology node and beyond. As feature sizes keep shrinking, it has become impossible to print dense layouts within a single exposure. As a result, MPL variants such as double patterning lithography (DPL) and triple patterning lithography (TPL) have been widely adopted. There is a large volume of literature on DPL/TPL layout decomposition, and the current approach is to formulate the problem as a classical graph-coloring problem: layout features (polygons) are represented by vertices in a graph G, and there is an edge between two vertices if and only if the distance between the two corresponding features is less than a minimum distance threshold dmin. The problem is to color the vertices of G using k colors (k = 2 for DPL, k = 3 for TPL) such that no two vertices connected by an edge are given the same color. This is a rule-based approach, which imposes a geometric distance as a minimum constraint and simply decomposes polygons within that distance into different masks. It is undesirable in practice because this criterion cannot completely capture the behavior of the optics. For example, it lacks information such as the optical source characteristics and the effects between polygons beyond the minimum distance. To remedy this deficiency, a model-based layout decomposition approach that bases the decomposition criteria on simulation results was first introduced at SPIE 2013 [1]. However, that algorithm rests on simplified assumptions about the optical simulation model, and its usage on real layouts is therefore limited. Recently, AMSL [2] also proposed a model-based approach to layout decomposition that iteratively simulates the layout, which requires excessive computational resources and may lead to sub-optimal solutions. That approach also potentially generates too many stitches.
In this
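The rule-based formulation described above can be sketched directly: build the conflict graph from pairwise feature distances, then search for a k-coloring. This is the baseline the model-based approaches improve upon; the coordinates and dmin below are illustrative.

```python
from itertools import combinations

def build_conflict_graph(centers, dmin):
    """Edge between two features iff their distance is below dmin."""
    adj = {i: set() for i in range(len(centers))}
    for i, j in combinations(range(len(centers)), 2):
        (x1, y1), (x2, y2) = centers[i], centers[j]
        if ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 < dmin:
            adj[i].add(j)
            adj[j].add(i)
    return adj

def color(adj, k):
    """Backtracking k-coloring; returns a vertex -> mask-index map,
    or None if no legal k-mask decomposition exists."""
    colors = {}
    def assign(v):
        if v == len(adj):
            return True
        for c in range(k):
            if all(colors.get(u) != c for u in adj[v]):
                colors[v] = c
                if assign(v + 1):
                    return True
                del colors[v]
        return False
    return colors if assign(0) else None
```

Three mutually close features form a triangle in G, so they are DPL-infeasible (k = 2) but TPL-feasible (k = 3), illustrating why TPL is needed for dense layouts.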

  2. Vector space model based on semantic relatedness

    NASA Astrophysics Data System (ADS)

    Bondarchuk, Dmitry; Timofeeva, Galina

    2015-11-01

Most data-mining methods are based on the vector space model of knowledge representation. The vector space model uses the frequency of a term to determine its relevance in a document. Terms can be similar in semantic meaning yet lexicographically different, so classification based on term frequency does not give the desired results in some subject areas, such as vacancy selection. A modified vector space model based on semantic relatedness is suggested for data-mining in this area. Evaluation results show that the proposed algorithm performs better than one based on the standard vector space model.
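The standard vector space model that the authors modify can be sketched as term-frequency vectors compared by cosine similarity; a semantic-relatedness extension would replace the exact term overlap in the dot product with a relatedness-weighted one. A minimal baseline sketch:

```python
import math
from collections import Counter

def tf_vector(tokens):
    """Sparse term-frequency vector of a tokenized document."""
    return Counter(tokens)

def cosine(u, v):
    """Cosine similarity between two sparse term-frequency vectors.
    Note the dot product only credits exact term matches; this is
    precisely the limitation the semantic-relatedness model targets."""
    dot = sum(u[t] * v[t] for t in u.keys() & v.keys())
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

Synonymous vacancies ("developer" vs. "programmer") score 0.0 here despite near-identical meaning, which motivates the modified model.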

  3. Model-based Tomographic Reconstruction Literature Search

    SciTech Connect

    Chambers, D H; Lehman, S K

    2005-11-30

    In the process of preparing a proposal for internal research funding, a literature search was conducted on the subject of model-based tomographic reconstruction (MBTR). The purpose of the search was to ensure that the proposed research would not replicate any previous work. We found that the overwhelming majority of work on MBTR which used parameterized models of the object was theoretical in nature. Only three researchers had applied the technique to actual data. In this note, we summarize the findings of the literature search.

  4. Model-based vision using geometric hashing

    NASA Astrophysics Data System (ADS)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed on the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm -- invariant under translation, scale, and 3D rotations of the target -- hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture and is thus potentially implementable in real time.

  5. Student Modeling Based on Problem Solving Times

    ERIC Educational Resources Information Center

    Pelánek, Radek; Jarušek, Petr

    2015-01-01

    Student modeling in intelligent tutoring systems is mostly concerned with modeling correctness of students' answers. As interactive problem solving activities become increasingly common in educational systems, it is useful to focus also on timing information associated with problem solving. We argue that the focus on timing is natural for certain…

  6. Sandboxes for Model-Based Inquiry

    NASA Astrophysics Data System (ADS)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-04-01

In this article, we introduce a class of constructionist learning environments that we call Emergent Systems Sandboxes (ESSs), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual construction environment that support students in creating, exploring, and sharing computational models of dynamic systems that exhibit emergent phenomena. They provide learners with "entity"-level construction primitives that reflect an underlying scientific model. These primitives can be directly "painted" into a sandbox space, where they can then be combined, arranged, and manipulated to construct complex systems and explore the emergent properties of those systems. We argue that ESSs offer a means of addressing some of the key barriers to adopting rich, constructionist model-based inquiry approaches in science classrooms at scale. Situating the ESS in a large-scale science modeling curriculum we are implementing across the USA, we describe how the unique "entity-level" primitive design of an ESS facilitates knowledge system refinement at both an individual and social level, we describe how it supports flexible modeling practices by providing both continuous and discrete modes of executability, and we illustrate how it offers students a variety of opportunities for validating their qualitative understandings of emergent systems as they develop.

  7. Model based 3D segmentation and OCT image undistortion of percutaneous implants.

    PubMed

    Müller, Oliver; Donner, Sabine; Klinder, Tobias; Dragon, Ralf; Bartsch, Ivonne; Witte, Frank; Krüger, Alexander; Heisterkamp, Alexander; Rosenhahn, Bodo

    2011-01-01

Optical Coherence Tomography (OCT) is a noninvasive imaging technique which is used here for in vivo biocompatibility studies of percutaneous implants. A prerequisite for a morphometric analysis of the OCT images is the correction of optical distortions caused by the refractive index of the tissue. We propose a fully automatic approach for 3D segmentation of percutaneous implants using Markov random fields. Refraction correction is done by using the subcutaneous implant base as a prior for model-based estimation of the refractive index using a generalized Hough transform. Experiments show the competitiveness of our algorithm with manual segmentations performed by experts. PMID:22003731

  8. Model-based Processing of Micro-cantilever Sensor Arrays

    SciTech Connect

    Tringe, J W; Clague, D S; Candy, J V; Lee, C L; Rudd, R E; Burnham, A K

    2004-11-17

We develop a model-based processor (MBP) for a micro-cantilever array sensor to detect target species in solution. After discussing the generalized framework for this problem, we develop the specific model used in this study. We perform a proof-of-concept experiment, fit the model parameters to the measured data and use them to develop a Gauss-Markov simulation. We then investigate two cases of interest: (1) averaged deflection data, and (2) multi-channel data. In both cases the evaluation proceeds by first performing a model-based parameter estimation to extract the model parameters, next performing a Gauss-Markov simulation, designing the optimal MBP and finally applying it to measured experimental data. The simulation is used to evaluate the performance of the MBP in the multi-channel case and compare it to a "smoother" ("averager") typically used in this application. It was shown that the MBP not only provides a significant gain (~80 dB) in signal-to-noise ratio (SNR), but also consistently outperforms the smoother by 40-60 dB. Finally, we apply the processor to the smoothed experimental data and demonstrate its capability for chemical detection. The MBP performs quite well, though it includes a correctable systematic bias error. The project's primary accomplishment was the successful application of model-based processing to signals from micro-cantilever arrays: 40-60 dB improvement vs. the smoother algorithm was demonstrated. This result was achieved through the development of appropriate mathematical descriptions for the chemical and mechanical phenomena, and incorporation of these descriptions directly into the model-based signal processor. A significant challenge was the development of the framework which would maximize the usefulness of the signal processing algorithms while ensuring the accuracy of the mathematical description of the chemical-mechanical signal. Experimentally, the difficulty was to identify and characterize the non
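Model-based processors built on a Gauss-Markov model are typically Kalman-type estimators. A minimal scalar sketch of the idea (the state/measurement gains and noise variances here are illustrative assumptions, not the study's fitted parameters):

```python
def kalman_1d(z, a=1.0, c=1.0, q=1e-4, r=1e-2, x0=0.0, p0=1.0):
    """Scalar Gauss-Markov model-based processor (Kalman filter):
    state     x[k] = a * x[k-1] + w,  w ~ N(0, q)
    measure   z[k] = c * x[k]   + v,  v ~ N(0, r)
    Returns the filtered state estimates."""
    x, p = x0, p0
    out = []
    for zk in z:
        # predict through the state model
        x, p = a * x, a * a * p + q
        # innovation and Kalman gain
        e = zk - c * x
        k = p * c / (c * c * p + r)
        # measurement update
        x, p = x + k * e, (1 - k * c) * p
        out.append(x)
    return out
```

Because the gain is derived from the model and noise statistics rather than a fixed window, the processor can outperform a plain smoother/averager, which is the comparison the abstract reports.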

  9. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

Real-time, model-based, deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  10. Therapeutic Recreation Practicum Manual.

    ERIC Educational Resources Information Center

    Schneegas, Kay

    This manual provides information on the practicum program offered by Moraine Valley Community College (MVCC) for students in its therapeutic recreation program. Sections I and II outline the rationale and goals for providing practical, on-the-job work experiences for therapeutic recreation students. Section III specifies MVCC's responsibilities…

  11. Chicanoizing the Therapeutic Community

    ERIC Educational Resources Information Center

    Aron, William S.; And Others

    1974-01-01

    Focusing on the drug addiction problem and its antecedent conditions in a Chicano population, the article examines several therapeutic interventions suggested by these conditions and indicates how they might be incorporated into a drug addiction Therapeutic Community treatment program designed to meet the needs of Chicano drug addicts. (Author/NQ)

  12. 77 FR 72199 - Technical Corrections; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    ...) is correcting a final rule that was published in the Federal Register on July 6, 2012 (77 FR 39899... . SUPPLEMENTARY INFORMATION: On July 6, 2012 (77 FR 39899), the NRC published a final rule in the Federal Register... typographical and spelling errors, and making other edits and conforming changes. This correcting amendment...

  13. Rx for Pedagogical Correctness: Professional Correctness.

    ERIC Educational Resources Information Center

    Lasley, Thomas J.

    1993-01-01

    Describes the difficulties caused by educators holding to a view of teaching that assumes that there is one "pedagogically correct" way of running a classroom. Provides three examples of harmful pedagogical correctness ("untracked" classes, cooperative learning, and testing and test-wiseness). Argues that such dogmatic views of education limit…

  14. Concept Modeling-based Drug Repositioning

    PubMed Central

    Patchala, Jagadeesh; Jegga, Anil G

    2015-01-01

    Our hypothesis is that drugs and diseases sharing similar biomedical and genomic concepts are likely to be related, and thus repositioning opportunities can be identified by ranking drugs based on the incidence of shared similar concepts with diseases and vice versa. To test this, we constructed a probabilistic topic model based on the Unified Medical Language System (UMLS) concepts that appear in the disease and drug related abstracts in MEDLINE. The resulting probabilistic topic associations were used to measure the similarity between disease and drugs. The success of the proposed model is evaluated using a set of repositioned drugs, and comparing a drug’s ranking based on its similarity to the original and new indication. We then applied the model to rare disorders and compared them to all approved drugs to facilitate “systematically serendipitous” discovery of relationships between rare diseases and existing drugs, some of which could be potential repositioning candidates. PMID:26306277
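The ranking step described above can be sketched once topic distributions for drugs and a disease are in hand: score each drug by the closeness of its topic distribution to the disease's and sort. The Hellinger-based affinity and all names below are illustrative assumptions, not the paper's exact similarity measure.

```python
import math

def rank_drugs(disease_topics, drug_topics):
    """Rank candidate drugs by the similarity of their topic
    distributions to a disease's distribution. Similarity here is
    1 - Hellinger distance between probability vectors (assumed
    measure; the paper's own metric may differ)."""
    def hellinger(p, q):
        return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                                   for a, b in zip(p, q)))
    scored = [(1.0 - hellinger(disease_topics, t), name)
              for name, t in drug_topics.items()]
    return [name for _, name in sorted(scored, reverse=True)]
```

Drugs whose MEDLINE-derived topic mix resembles the disease's rise to the top of the list, surfacing repositioning candidates for manual review.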

  15. Model Based Reconstruction of UT Array Data

    NASA Astrophysics Data System (ADS)

    Calmon, P.; Iakovleva, E.; Fidahoussen, A.; Ribay, G.; Chatillon, S.

    2008-02-01

Beyond the detection of defects, their characterization (identification, positioning, sizing) is a goal of great importance often assigned to the analysis of NDT data. In the case of ultrasonic testing, the first step of such analysis amounts to imaging the detected echoes within the part. This operation is generally achieved by considering times of flight and applying simplified algorithms that are often valid only in canonical situations. In this communication we present an overview of different imaging techniques studied at CEA LIST, based on the exploitation of direct models, which make it possible to address complex configurations and are available in the CIVA software platform. We discuss in particular ray-model-based algorithms, algorithms derived from classical synthetic focusing, and processing of the full inter-element matrix (MUSIC algorithm).
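Synthetic-focusing processing of the full inter-element matrix can be sketched as delay-and-sum imaging: for each pixel, sum every transmit-receive A-scan at the sample corresponding to the round-trip time of flight. A minimal sketch assuming a homogeneous medium, straight rays, and consistent units (all names are illustrative):

```python
import math

def tfm_image(fmc, elem_pos, grid, c, fs):
    """Delay-and-sum pixel amplitudes from full-matrix-capture data.
    fmc[i][j] is the digitized A-scan for transmitter i, receiver j;
    elem_pos gives (x, y) element positions, c the wave speed, fs the
    sampling rate."""
    image = []
    for (gx, gy) in grid:
        acc = 0.0
        for i, (xi, yi) in enumerate(elem_pos):
            di = math.hypot(gx - xi, gy - yi)
            for j, (xj, yj) in enumerate(elem_pos):
                dj = math.hypot(gx - xj, gy - yj)
                n = int(round((di + dj) / c * fs))  # round-trip delay in samples
                if 0 <= n < len(fmc[i][j]):
                    acc += fmc[i][j][n]
        image.append(abs(acc))
    return image
```

Pixels at true scatterer locations accumulate coherently across all element pairs, while other pixels sum incoherently; the direct-model approaches in the abstract refine the delay law beyond this straight-ray assumption.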

  16. Model-based reasoning in SSF ECLSS

    NASA Technical Reports Server (NTRS)

    Miller, J. K.; Williams, George P. W., Jr.

    1992-01-01

    The interacting processes and reconfigurable subsystems of the Space Station Freedom Environmental Control and Life Support System (ECLSS) present a tremendous technical challenge to Freedom's crew and ground support. ECLSS operation and problem analysis is time-consuming for crew members and difficult for current computerized control, monitoring, and diagnostic software. These challenges can be at least partially mitigated by the use of advanced techniques such as Model-Based Reasoning (MBR). This paper will provide an overview of MBR as it is being applied to Space Station Freedom ECLSS. It will report on work being done to produce intelligent systems to help design, control, monitor, and diagnose Freedom's ECLSS. Specifically, work on predictive monitoring, diagnosability, and diagnosis, with emphasis on the automated diagnosis of the regenerative water recovery and air revitalization processes will be discussed.

  17. Model-based vision for space applications

    NASA Technical Reports Server (NTRS)

    Chaconas, Karen; Nashman, Marilyn; Lumia, Ronald

    1992-01-01

    This paper describes a method for tracking moving image features by combining spatial and temporal edge information with model based feature information. The algorithm updates the two-dimensional position of object features by correlating predicted model features with current image data. The results of the correlation process are used to compute an updated model. The algorithm makes use of a high temporal sampling rate with respect to spatial changes of the image features and operates in a real-time multiprocessing environment. Preliminary results demonstrate successful tracking for image feature velocities between 1.1 and 4.5 pixels every image frame. This work has applications for docking, assembly, retrieval of floating objects and a host of other space-related tasks.

  18. Model-Based Vision Using Relational Summaries

    NASA Astrophysics Data System (ADS)

    Lu, Haiyuan; Shapiro, Linda G.

    1989-03-01

A CAD-to-vision system is a computer system that inputs a CAD model of an object and outputs a vision model and matching procedure by which that object can be recognized and/or its position and orientation determined. CAD-model-based systems are extremely useful for industrial vision tasks where a number of different manufactured parts must be automatically manipulated and/or inspected. Another area where vision systems based on CAD models are becoming important is the United States space program. Since the space station and space vehicles are recent or even current designs, we can expect to have CAD models of these objects to work with. Vision tasks in space such as docking and tracking of vehicles, guided assembly tasks, and inspection of the space station itself for cracks and other problems can rely on model-directed vision techniques.

  19. Model-based reconfiguration: Diagnosis and recovery

    NASA Technical Reports Server (NTRS)

    Crow, Judy; Rushby, John

    1994-01-01

    We extend Reiter's general theory of model-based diagnosis to a theory of fault detection, identification, and reconfiguration (FDIR). The generality of Reiter's theory readily supports an extension in which the problem of reconfiguration is viewed as a close analog of the problem of diagnosis. Using a reconfiguration predicate 'rcfg' analogous to the abnormality predicate 'ab,' we derive a strategy for reconfiguration by transforming the corresponding strategy for diagnosis. There are two obvious benefits of this approach: algorithms for diagnosis can be exploited as algorithms for reconfiguration and we have a theoretical framework for an integrated approach to FDIR. As a first step toward realizing these benefits we show that a class of diagnosis engines can be used for reconfiguration and we discuss algorithms for integrated FDIR. We argue that integrating recovery and diagnosis is an essential next step if this technology is to be useful for practical applications.

  20. Cytokines and therapeutic oligonucleotides.

    PubMed

    Hartmann, G; Bidlingmaier, M; Eigler, A; Hacker, U; Endres, S

    1997-12-01

    Therapeutic oligonucleotides - short strands of synthetic nucleic acids - encompass antisense and aptamer oligonucleotides. Antisense oligonucleotides are designed to bind to target RNA by complementary base pairing and to inhibit translation of the target protein. Antisense oligonucleotides enable specific inhibition of cytokine synthesis. In contrast, aptamer oligonucleotides are able to bind directly to specific proteins. This binding depends on the sequence of the oligonucleotide. Aptamer oligonucleotides with CpG motifs can exert strong immunostimulatory effects. Both kinds of therapeutic oligonucleotides - antisense and aptamer oligonucleotides - provide promising tools to modulate immunological functions. Recently, therapeutic oligonucleotides have moved towards clinical application. An antisense oligonucleotide directed against the proinflammatory intercellular adhesion molecule 1 (ICAM-1) is currently being tested in clinical trials for therapy of inflammatory disease. Immunostimulatory aptamer oligonucleotides are in preclinical development for immunotherapy. In the present review we summarize the application of therapeutic oligonucleotides to modulate immunological functions. We include technological aspects as well as current therapeutic concepts and clinical studies. PMID:9740353

  1. Model-based ocean acoustic passive localization

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1994-01-24

The detection, localization and classification of acoustic sources (targets) in a hostile ocean environment is a difficult problem -- especially in light of the improved design of modern submarines and the continual improvement in quieting technology. Further, the advent of more and more diesel-powered vessels makes the detection problem more formidable than ever before. It has recently been recognized that the incorporation of a mathematical model that accurately represents the phenomenology under investigation can vastly improve the performance of any processor, assuming, of course, that the model is accurate. Therefore, it is necessary to incorporate more knowledge about the ocean environment into detection and localization algorithms in order to enhance the overall signal-to-noise ratios and improve performance. An alternative methodology to matched-field/matched-mode processing is the so-called model-based processor, which is based on a state-space representation of the normal-mode propagation model. If state-space solutions can be accomplished, then many of the current ocean acoustic processing problems can be analyzed and solved within this framework, with performance results evaluated on firm statistical and system-theoretic grounds. The model-based approach is (simply) "incorporating mathematical models of both the physical phenomenology and the measurement processes, including noise, into the processor to extract the desired information." In this application, we seek techniques to incorporate the: (1) ocean acoustic propagation model; (2) sensor array measurement model; and (3) noise models (ambient, shipping, surface and measurement) into a processor to solve the associated localization/detection problems.

  2. Fast Algorithms for Model-Based Diagnosis

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Barrett, Anthony; Vatan, Farrokh; Mackey, Ryan

    2005-01-01

Two improved new methods for automated diagnosis of complex engineering systems involve the use of novel algorithms that are more efficient than prior algorithms used for the same purpose. Both the recently developed algorithms and the prior algorithms in question are instances of model-based diagnosis, which is based on exploring the logical inconsistency between an observation and a description of a system to be diagnosed. As engineering systems grow more complex and increasingly autonomous in their functions, the need for automated diagnosis increases concomitantly. In model-based diagnosis, the function of each component and the interconnections among all the components of the system to be diagnosed (for example, see figure) are represented as a logical system, called the system description (SD). Hence, the expected behavior of the system is the set of logical consequences of the SD. Faulty components lead to inconsistency between the observed behaviors of the system and the SD. The task of finding the faulty components (diagnosis) reduces to finding the components, the abnormalities of which could explain all the inconsistencies. Of course, the meaningful solution should be a minimal set of faulty components (called a minimal diagnosis), because the trivial solution, in which all components are assumed to be faulty, always explains all inconsistencies. Although the prior algorithms in question implement powerful methods of diagnosis, they are not practical because they essentially require exhaustive searches among all possible combinations of faulty components and therefore entail amounts of computation that grow exponentially with the number of components of the system.
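The search for minimal diagnoses can be sketched as finding minimal hitting sets of the conflict sets (Reiter's characterization), enumerated by increasing cardinality. This brute-force sketch is exactly the exponential baseline that the improved algorithms avoid:

```python
from itertools import combinations

def minimal_diagnoses(conflicts, components):
    """Enumerate minimal diagnoses as minimal hitting sets of the
    conflict sets: candidate sets of components, by increasing size,
    that intersect every conflict and contain no smaller diagnosis."""
    diagnoses = []
    for size in range(len(components) + 1):
        for cand in combinations(sorted(components), size):
            s = set(cand)
            if all(s & c for c in conflicts):          # hits every conflict
                if not any(set(d) <= s for d in diagnoses):  # minimality
                    diagnoses.append(cand)
    return diagnoses
```

With conflicts {A, B} and {B, C}, the minimal diagnoses are {B} alone or {A, C} together; the combinatorial loop over all component subsets is what grows exponentially with system size.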

  3. Reporting therapeutic discourse in a therapeutic community.

    PubMed

    Chapman, G E

    1988-03-01

    Research in nurses' communications has concentrated on nurse to patient interactions. Those few studies which focus on nurse to nurse communications seem to be generated by a pragmatic and normative concern with effective information sharing. In this paper, which describes one aspect of a larger case study of a hospital-based therapeutic community, the description and analysis of nurses' reports flows not from a normative model of professional practice, but rather an exploration of how professional practice is articulated as discourse in nurses' written accounts. Foucault's ideas about therapeutic discourse inform the theoretical framework of the research. Ethnomethodological concerns with the importance of documentary analysis provide the methodological rationale for examining nurses' 24-hour report documents, as official discourse, reflecting therapeutic practice in this setting. A content analysis of nurses' reports, collected over a period of 4 months, demonstrated the importance of domesticity and ordinary everyday activities in nurses' accounts of hospital life. Disruption to the 'life as usual' domesticity in the community seemed to be associated with admission to and discharge from the hospital when interpersonal and interactional changes between patients occur. It is suggested that nurses in general hospital wards and more orthodox psychiatric settings might usefully consider the impact of admissions and discharges on the group of patients they manage, and make this a discursive focus of their work. PMID:3372900

  4. An application of model-based reasoning to accounting systems

    SciTech Connect

    Nado, R.; Chams, M.; Delisio, J.; Hamscher, W.

    1996-12-31

An important problem faced by auditors is gauging how much reliance can be placed on the accounting systems that process millions of transactions to produce the numbers summarized in a company's financial statements. Accounting systems contain internal controls, procedures designed to detect and correct errors and irregularities that may occur in the processing of transactions. In a complex accounting system, it can be an extremely difficult task for the auditor to anticipate the possible errors that can occur and to evaluate the effectiveness of the controls at detecting them. An accurate analysis must take into account the unique features of each company's business processes. To cope with this complexity and variability, the Comet system applies a model-based reasoning approach to the analysis of accounting systems and their controls. An auditor uses Comet to create a hierarchical flowchart model that describes the intended processing of business transactions by an accounting system and the operation of its controls. Comet uses the constructed model to automatically analyze the effectiveness of the controls in detecting potential errors. Price Waterhouse auditors have used Comet on a variety of real audits in several countries around the world.

  5. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.

  6. [Therapeutic neuromodulation in primary headaches].

    PubMed

    May, A; Jürgens, T P

    2011-06-01

Neuromodulatory techniques have developed rapidly in the therapeutic management of refractory headaches. Invasive procedures comprise peripheral nerve stimulation (particularly occipital nerve stimulation), vagus nerve stimulation, cervical spinal cord stimulation and hypothalamic deep brain stimulation. Transcutaneous electrical nerve stimulation, repetitive transcranial magnetic stimulation and transcranial direct current stimulation are noninvasive variants. Based on current neuroimaging, neurophysiological and clinical studies, occipital nerve stimulation and hypothalamic deep brain stimulation are recommended for patients with chronic cluster headache. Less convincing evidence can be found for their use in other refractory headaches such as chronic migraine. No clear recommendation can be given for the other neuromodulatory techniques. The emerging concept of intermittent stimulation of the sphenopalatine ganglion is nonetheless promising. Robust randomized and sham-controlled multicenter studies are needed before these therapeutic approaches are widely implemented. Given their experimental nature, all patients should be treated within clinical studies. It is essential to confirm the correct headache diagnosis and the refractory nature of the condition before an invasive approach is considered. Patients should generally be referred to specialized interdisciplinary outpatient departments which closely collaborate with neurosurgeons who are experienced in the implantation of neuromodulatory devices. It is crucial to ensure competent postoperative follow-up with optimization of stimulation parameters and adjustment of medication. PMID:20972665

  7. Biomimetic Particles as Therapeutics

    PubMed Central

    Green, Jordan J.

    2015-01-01

    In recent years, there have been major advances in the development of novel nanoparticle and microparticle-based therapeutics. An emerging paradigm is the incorporation of biomimetic features into these synthetic therapeutic constructs to enable them to better interface with biological systems. Through the control of size, shape, and material consistency, particle cores have been generated that better mimic natural cells and viruses. In addition, there have been significant advances in biomimetic surface functionalization of particles through the integration of bio-inspired artificial cell membranes and naturally derived cell membranes. Biomimetic technologies enable therapeutic particles to have increased potency to benefit human health. PMID:26277289

  8. An overview of correctional psychiatry.

    PubMed

    Metzner, Jeffrey; Dvoskin, Joel

    2006-09-01

    Supermax facilities may be an unfortunate and unpleasant necessity in modern corrections. Because of the serious dangers posed by prison gangs, they are unlikely to disappear completely from the correctional landscape any time soon. But such units should be carefully reserved for those inmates who pose the most serious danger to the prison environment. Further, the constitutional duty to provide medical and mental health care does not end at the supermax door. There is a great deal of common ground between the opponents of such environments and those who view them as a necessity. No one should want these expensive beds to be used for people who could be more therapeutically and safely managed in mental health treatment environments. No one should want people with serious mental illnesses to be punished for their symptoms. Finally, no one wants these units to make people more, instead of less, dangerous. It is in everyone's interests to learn as much as possible about the potential of these units for good and for harm. Corrections is a profession, and professions base their practices on data. If we are to avoid the most egregious and harmful effects of supermax confinement, we need to understand them far better than we currently do. Though there is a role for advocacy from those supporting or opposed to such environments, there is also a need for objective, scientifically rigorous study of these units and the people who live there. PMID:16904510

  9. Is the concept of corrective emotional experience still topical?

    PubMed

    Palvarini, Paolo

    2010-01-01

    This article gives a historical review of the literature concerned with the role of emotional factors in psychoanalysis. The author then focuses on Alexander's milestone contribution and above all, on the concept he developed of corrective emotional experience. Alexander moves gradually over time from the classical position, which gives insight a place of choice, to a more radical view, in which, the most effective therapeutic factor is represented by the emotional experience within the therapeutic relationship. The article includes a review of the literature relevant to the concept of corrective emotional experience. Finally, Experiential-Dynamic Psychotherapy, a therapeutic approach giving a prominent role to the therapeutic power of corrective emotional experience is considered. Two vignettes from a psychotherapy carried out according to the principles of Experiential-Dynamic Psychotherapy provide examples of how this model is applied clinically. PMID:20617789

  10. Eyeglasses for Vision Correction

    MedlinePlus

    Eyeglasses for Vision Correction. Dec. 12, 2015. Wearing eyeglasses is an easy way to correct refractive errors. Improving your vision with eyeglasses offers the opportunity to select from ...

  11. Engineering antibody therapeutics.

    PubMed

    Chiu, Mark L; Gilliland, Gary L

    2016-06-01

    The successful introduction of antibody-based protein therapeutics into the arsenal of treatments for patients has, within a few decades, fostered intense innovation in the production and engineering of antibodies. Reviewed here are the methods currently used to produce antibodies, along with how our knowledge of the structural and functional characterization of immunoglobulins has enabled the engineering of antibodies into protein therapeutics with unique biological and biophysical properties that are leading to novel therapeutic approaches. Antibody engineering includes the introduction of the antibody combining site (variable regions) into a host of architectures, including bi- and multi-specific formats, that further shape the therapeutic properties, leading to additional advantages and successes in patient treatment. PMID:27525816

  12. [Fast spectral modeling based on Voigt peaks].

    PubMed

    Li, Jin-rong; Dai, Lian-kui

    2012-03-01

    Indirect hard modeling (IHM) is a recently introduced method for quantitative spectral analysis, which has been applied to the analysis of nonlinear relations between mixture spectra and component concentrations. In addition, IHM is an effective technique for analyzing the components of mixtures with molecular interactions and strongly overlapping bands. Before the regression model is established, IHM must model the measured spectrum as a sum of Voigt peaks. The precision of the spectral model has an immediate impact on the accuracy of the regression model. A spectrum often includes dozens or even hundreds of Voigt peaks, which means that spectral modeling is, in fact, a high-dimensional optimization problem. Consequently, large computational overhead is required, and the solution may not be numerically unique because the optimization problem is ill-conditioned. An improved spectral modeling method is presented in the present paper, which reduces the dimensionality of the optimization problem by determining the overlapped peaks in the spectrum. Experimental results show that spectral modeling based on the new method is more accurate and requires a much shorter running time than the conventional method. PMID:22582612
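    The core pre-processing step, modeling a measured spectrum as a sum of Voigt peaks, can be sketched as a nonlinear least-squares fit. The following is a minimal illustration, not the paper's method: the synthetic two-peak spectrum, the peak parameters, and the initial guesses are all invented, and SciPy's `voigt_profile` and `curve_fit` stand in for whatever solver the authors used.

```python
# Minimal sketch: fit a spectrum as a sum of Voigt peaks (illustrative only).
import numpy as np
from scipy.special import voigt_profile
from scipy.optimize import curve_fit

def voigt_sum(x, *params):
    """Sum of Voigt peaks; params = (amplitude, center, sigma, gamma) per peak."""
    y = np.zeros_like(x)
    for i in range(0, len(params), 4):
        amp, center, sigma, gamma = params[i:i + 4]
        y += amp * voigt_profile(x - center, sigma, gamma)
    return y

x = np.linspace(0, 100, 2000)
true = [10.0, 40.0, 1.5, 1.0,   6.0, 55.0, 2.0, 0.5]   # two overlapping peaks
rng = np.random.default_rng(0)
spectrum = voigt_sum(x, *true) + rng.normal(0, 0.01, x.size)

# Reasonable initial guesses keep this high-dimensional fit well-posed; poor
# guesses are exactly where the ill-conditioning described above bites.
guess = [8.0, 38.0, 1.0, 1.0,   5.0, 57.0, 1.0, 1.0]
popt, _ = curve_fit(voigt_sum, x, spectrum, p0=guess)
print(popt[1], popt[5])  # recovered peak centers, close to 40 and 55
```

    With only two peaks the problem is benign; with hundreds of peaks, as the abstract notes, grouping overlapped peaks to cut the parameter count becomes essential.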

  13. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. However, these methods can only predict the most probable faulty sensors, and those predictions are subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). It is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts from a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. It builds on the concept of analytical redundant relations (ARRs).
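    The ARR idea behind the method can be illustrated with a toy example. Everything below is an invented sketch, not the algorithm from the paper: three sensors are tied together by three hypothetical redundancy relations, and a sensor is inferred faulty when it is implicated by every violated relation and exonerated by no satisfied one.

```python
# Toy ARR-style sensor validation (illustrative relations and readings).
# Each ARR: (name, sensors involved, residual function over a readings dict);
# a healthy relation has residual ~ 0.
arrs = [
    ("r1", {"s1", "s2"}, lambda z: z["s1"] - z["s2"]),          # duplicated sensors
    ("r2", {"s2", "s3"}, lambda z: z["s2"] + z["s3"] - 10.0),   # conservation law
    ("r3", {"s1", "s3"}, lambda z: z["s1"] + z["s3"] - 10.0),
]

def diagnose(readings, tol=0.1):
    violated = [sensors for _, sensors, f in arrs if abs(f(readings)) > tol]
    satisfied = [sensors for _, sensors, f in arrs if abs(f(readings)) <= tol]
    if not violated:
        return set()
    # Candidates: sensors common to every violated relation...
    candidates = set.intersection(*violated)
    # ...minus any sensor vouched for by a satisfied relation.
    for sensors in satisfied:
        candidates -= sensors
    return candidates

# s2 drifts: r1 and r2 fire while r3 still holds, so s2 is logically isolated.
print(diagnose({"s1": 5.0, "s2": 6.3, "s3": 5.0}))  # {'s2'}
```

    Note the contrast with the probabilistic approach described above: no failure priors are needed; the verdict follows purely from which relations are consistent with the observations.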

  14. Model-based target and background characterization

    NASA Astrophysics Data System (ADS)

    Mueller, Markus; Krueger, Wolfgang; Heinze, Norbert

    2000-07-01

    Up to now, most approaches to target and background characterization (and exploitation) concentrate solely on the information given by pixels. In many cases this is a complex and unprofitable task. During the development of automatic exploitation algorithms, the main goal is the optimization of certain performance parameters. These parameters are measured during test runs in which one algorithm with one parameter set is applied to images consisting of image domains with very different characteristics (targets and various types of background clutter). Model-based geocoding and registration approaches provide means for utilizing the information stored in GIS (Geographical Information Systems). The geographical information stored in the various GIS layers can define ROEs (Regions of Expectation) and may allow for dedicated algorithm parametrization and development. ROI (Region of Interest) detection algorithms (in most cases MMO (Man-Made Object) detection) use implicit target and/or background models. These detection algorithms utilize gradient direction models that have to be matched with transformed image domain data; in most cases simple threshold calculations on the match results discriminate target object signatures from the background. The geocoding approaches extract line-like structures (street signatures) from the image domain and match the graph constellation against a vector model extracted from a GIS database. Apart from geocoding, the algorithms can also be used for image-to-image registration (multi-sensor and data fusion) and for the creation and validation of geographical maps.

  15. Enhancing model based forecasting of geomagnetic storms

    NASA Astrophysics Data System (ADS)

    Webb, Alla G.

    Modern society is increasingly dependent on the smooth operation of large scale technology supporting Earth based activities such as communication, electricity distribution, and navigation. This technology is potentially threatened by global geomagnetic storms, which are caused by the impact of plasma ejected from the Sun upon the protective magnetic field that surrounds the Earth. Forecasting the timing and magnitude of these geomagnetic storms is part of the emerging discipline of space weather. The most severe geomagnetic storms are caused by magnetic clouds, whose properties and characteristics are important variables in space weather forecasting systems. The methodology presented here is the development of a new statistical approach to characterize the physical properties (variables) of the magnetic clouds and to examine the extent to which theoretical models can be used in describing both of these physical properties, as well as their evolution in space and time. Since space weather forecasting is a complex system, a systems engineering approach is used to perform analysis, validation, and verification of the magnetic cloud models (subsystem of the forecasting system) using a model-based methodology. This research demonstrates that in order to validate magnetic cloud models, it is important to categorize the data by physical parameters such as velocity and distance travelled. This understanding will improve the modeling accuracy of magnetic clouds in space weather forecasting systems and hence increase forecasting accuracy of geomagnetic storms and their impact on earth systems.

  16. Model based systems engineering for astronomical projects

    NASA Astrophysics Data System (ADS)

    Karban, R.; Andolfato, L.; Bristow, P.; Chiozzi, G.; Esselborn, M.; Schilling, M.; Schmid, C.; Sommer, H.; Zamparelli, M.

    2014-08-01

    Model Based Systems Engineering (MBSE) is an emerging field of systems engineering for which the Systems Modeling Language (SysML) is a key enabler for descriptive, prescriptive and predictive models. This paper surveys some of the capabilities, expectations and peculiarities of tool-assisted MBSE experienced in real-life astronomical projects. The examples range in depth and scope across a wide spectrum of applications (for example documentation, requirements, analysis, trade studies) and purposes (addressing a particular development need, or accompanying a project throughout many, if not all, of its lifecycle phases, fostering reuse and minimizing ambiguity). From the beginnings of the Active Phasing Experiment, through VLT instrumentation, VLTI infrastructure, and the Telescope Control System for the E-ELT, to Wavefront Control for the E-ELT, we show how stepwise refinements of tools, processes and methods have provided tangible benefits to customary systems engineering activities like requirement flow-down, design trade studies, interface definition, and validation, by means of a variety of approaches (like Model Checking, Simulation, and Model Transformation) and methodologies (like OOSEM and State Analysis).

  17. Research in Correctional Rehabilitation.

    ERIC Educational Resources Information Center

    Rehabilitation Services Administration (DHEW), Washington, DC.

    Forty-three leaders in corrections and rehabilitation participated in the seminar planned to provide an indication of the status of research in correctional rehabilitation. Papers include: (1) "Program Trends in Correctional Rehabilitation" by John P. Conrad, (2) "Federal Offenders Rehabilitation Program" by Percy B. Bell and Merlyn Mathews, (3)…

  18. An operator model-based filtering scheme

    SciTech Connect

    Sawhney, R.S.; Dodds, H.L.; Schryer, J.C.

    1990-01-01

    This paper presents a diagnostic model developed at Oak Ridge National Laboratory (ORNL) for off-normal nuclear power plant events. The diagnostic model is intended to serve as an embedded module of a cognitive model of the human operator, one application of which could be to assist control room operators in responding correctly to off-normal events by providing a rapid and accurate assessment of alarm patterns and parameter trends. The sequential filter model comprises two distinct subsystems: an alarm analysis followed by an analysis of interpreted plant signals. During the alarm analysis phase, the alarm pattern is evaluated to generate hypotheses of possible initiating events in order of likelihood of occurrence. Each hypothesis is then evaluated further through analysis of the current trends of state variables in order to validate or reject it (in the form of an increased or decreased certainty factor). 7 refs., 4 figs.
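    The two-stage structure described above (alarm-pattern ranking followed by trend-based adjustment of certainty factors) can be sketched roughly as follows. The events, alarm signatures, expected trends, and the ±0.3 certainty increments are all invented for illustration and are not taken from the ORNL model.

```python
# Toy two-stage filter: alarms rank hypotheses, trends adjust certainty.
# Stage 1 scores each candidate event by overlap with the observed alarms.
events = {
    "loss_of_feedwater": {"alarms": {"SG_LEVEL_LO", "FW_FLOW_LO"},
                          "trends": {"sg_level": "falling"}},
    "steam_line_break":  {"alarms": {"SG_PRESS_LO", "FW_FLOW_HI"},
                          "trends": {"sg_pressure": "falling"}},
}

def rank_hypotheses(observed_alarms):
    scores = {e: len(d["alarms"] & observed_alarms) / len(d["alarms"])
              for e, d in events.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Stage 2 validates/rejects a hypothesis against current parameter trends,
# nudging its certainty factor up or down (clamped to [0, 1]).
def certainty(event, observed_trends, base):
    cf = base
    for var, expected in events[event]["trends"].items():
        cf += 0.3 if observed_trends.get(var) == expected else -0.3
    return max(0.0, min(1.0, cf))

ranked = rank_hypotheses({"SG_LEVEL_LO", "FW_FLOW_LO"})
top, score = ranked[0]
print(top, certainty(top, {"sg_level": "falling"}, base=score))
```

    The key property mirrored here is the sequencing: trend analysis is only used to confirm or demote hypotheses that the alarm pattern has already proposed.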

  19. Corrective Action Glossary

    SciTech Connect

    Not Available

    1992-07-01

    The glossary of technical terms was prepared to facilitate the use of the Corrective Action Plan (CAP) issued by OSWER on November 14, 1986. The CAP presents model scopes of work for all phases of a corrective action program, including the RCRA Facility Investigation (RFI), Corrective Measures Study (CMS), Corrective Measures Implementation (CMI), and interim measures. The Corrective Action Glossary includes brief definitions of the technical terms used in the CAP and explains how they are used. In addition, expected ranges (where applicable) are provided. Parameters or terms not discussed in the CAP, but commonly associated with site investigations or remediations are also included.

  20. Model Based Autonomy for Robust Mars Operations

    NASA Technical Reports Server (NTRS)

    Kurien, James A.; Nayak, P. Pandurang; Williams, Brian C.; Lau, Sonie (Technical Monitor)

    1998-01-01

    Space missions have historically relied upon a large ground staff, numbering in the hundreds for complex missions, to maintain routine operations. When an anomaly occurs, this small army of engineers attempts to identify and work around the problem. A piloted Mars mission, with its multiyear duration, cost pressures, half-hour communication delays and two-week blackouts, cannot be closely controlled by a battalion of engineers on Earth. Flight crew involvement in routine system operations must also be minimized to maximize science return. It also may be unrealistic to require that the crew have the expertise in each mission subsystem needed to diagnose a system failure and effect a timely repair, as engineers did for Apollo 13. Enter model-based autonomy, which allows complex systems to autonomously maintain operation despite failures or anomalous conditions, contributing to safe, robust, and minimally supervised operation of spacecraft, life support, In Situ Resource Utilization (ISRU) and power systems. Autonomous reasoning is central to the approach. A reasoning algorithm uses a logical or mathematical model of a system to infer how to operate the system, diagnose failures and generate appropriate behavior to repair or reconfigure the system in response. The 'plug and play' nature of the models enables low-cost development of autonomy for multiple platforms. Declarative, reusable models capture relevant aspects of the behavior of simple devices (e.g. valves or thrusters). Reasoning algorithms combine device models to create a model of the system-wide interactions and behavior of a complex, unique artifact such as a spacecraft. Rather than requiring engineers to anticipate all possible interactions and failures at design time or perform analysis during the mission, the reasoning engine generates the appropriate response to the current situation, taking into account its system-wide knowledge, the current state, and even sensor failures or unexpected behavior.

  1. Development of explicit diffraction corrections for absolute measurements of acoustic nonlinearity parameters in the quasilinear regime.

    PubMed

    Jeong, Hyunjo; Zhang, Shuzeng; Cho, Sungjong; Li, Xiongbing

    2016-08-01

    In absolute measurements of acoustic nonlinearity parameters, amplitudes of harmonics must be corrected for diffraction effects. In this study, we develop explicit multi-Gaussian beam (MGB) model-based diffraction corrections for the first three harmonics in weakly nonlinear, axisymmetric sound beams. The effects of making diffraction corrections on nonlinearity parameter estimation are investigated by defining "total diffraction correction (TDC)". The results demonstrate that TDC cannot be neglected even for harmonic generation experiments in the nearfield region. PMID:27186964

  2. A model-based approach for making ecological inference from distance sampling data.

    PubMed

    Johnson, Devin S; Laake, Jeffrey L; Ver Hoef, Jay M

    2010-03-01

    We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method of making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework to assess the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and analysis of the Dubbo weed data set. In addition, a simple ad hoc method for handling overdispersion is also proposed. The simulation study showed that the model-based approach compared favorably to conventional distance sampling methods for abundance estimation. In addition, the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike's information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function. PMID:19459840
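    The thinned-point-process view of distance sampling described above can be illustrated with a small simulation: animals arise from a Poisson process in a strip around the transect, and each is detected with a probability that decays with perpendicular distance. The half-normal detection function and all parameter values below are illustrative assumptions, not the paper's settings.

```python
# Sketch of distance sampling as a thinned spatial point process.
import numpy as np

rng = np.random.default_rng(42)
intensity = 50.0     # animals per unit area (assumed)
width = 2.0          # strip half-width around the transect
length = 10.0        # transect length
sigma = 0.6          # half-normal detection-scale parameter

# Simulate the full (unobserved) Poisson point process in the strip.
n = rng.poisson(intensity * length * 2 * width)
dist = rng.uniform(-width, width, n)   # perpendicular distances

# Thinning: keep each animal with probability g(d) = exp(-d^2 / (2 sigma^2)).
g = np.exp(-dist**2 / (2 * sigma**2))
detected = dist[rng.uniform(size=n) < g]

# Average detection probability over the strip, then a Horvitz-Thompson-style
# abundance estimate N_hat = n_detected / p_bar.
grid = np.linspace(-width, width, 10_000)
p_bar = np.mean(np.exp(-grid**2 / (2 * sigma**2)))
print(n, len(detected) / p_bar)
```

    The same likelihood machinery lets covariates enter the intensity, which is what gives the model-based approach its ability to relate abundance to habitat.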

  3. Therapeutic Antioxidant Medical Gas

    PubMed Central

    Nakao, Atsunori; Sugimoto, Ryujiro; Billiar, Timothy R; McCurry, Kenneth R

    2009-01-01

    Medical gases are pharmaceutical gaseous molecules which offer solutions to medical needs and include traditional gases, such as oxygen and nitrous oxide, as well as gases with recently discovered roles as biological messenger molecules, such as carbon monoxide, nitric oxide and hydrogen sulphide. Medical gas therapy is a relatively unexplored field of medicine; however, a recent increase in the number of publications on medical gas therapies clearly indicates that there are significant opportunities for the use of gases as therapeutic tools for a variety of disease conditions. In this article, we review the recent advances in research on medical gases with antioxidant properties and discuss their clinical applications and therapeutic properties. PMID:19177183

  4. Therapeutics for cognitive aging

    PubMed Central

    Shineman, Diana W.; Salthouse, Timothy A.; Launer, Lenore J.; Hof, Patrick R.; Bartzokis, George; Kleiman, Robin; Luine, Victoria; Buccafusco, Jerry J.; Small, Gary W.; Aisen, Paul S.; Lowe, David A.; Fillit, Howard M.

    2011-01-01

    This review summarizes the scientific talks presented at the conference “Therapeutics for Cognitive Aging,” hosted by the New York Academy of Sciences and the Alzheimer’s Drug Discovery Foundation on May 15, 2009. Attended by scientists from industry and academia, as well as by a number of lay people—approximately 200 in all—the conference specifically tackled the many aspects of developing therapeutic interventions for cognitive impairment. Discussion also focused on how to define cognitive aging and whether it should be considered a treatable, tractable disease. PMID:20392284

  5. Advances in Therapeutic Cholangioscopy

    PubMed Central

    Moura, Renata Nobre; de Moura, Eduardo Guimarães Hourneaux

    2016-01-01

    Nowadays, cholangioscopy is an established modality for the diagnosis and treatment of pancreaticobiliary diseases. Its more widespread use and the recent development of new technologies and accessories have renewed interest in endoscopic visualization of the biliary tract, increasing the range of indications and therapeutic procedures, such as the diagnosis of indeterminate biliary strictures, lithotripsy of difficult bile duct stones, ablative techniques for intraductal malignancies, removal of foreign bodies and gallbladder drainage. These endoscopic interventions will probably be the last frontier in the near future. This paper presents new advances in therapeutic cholangioscopy, focusing on current clinical applications and on areas of research. PMID:27403156

  6. DELIVERY OF THERAPEUTIC PROTEINS

    PubMed Central

    Pisal, Dipak S.; Kosloski, Matthew P.; Balu-Iyer, Sathy V.

    2009-01-01

    The safety and efficacy of protein therapeutics are limited by three interrelated pharmaceutical issues: in vitro and in vivo instability, immunogenicity, and short half-lives. Novel drug modifications for overcoming these issues are under investigation and include covalent attachment of poly(ethylene glycol) (PEG), polysialic acid, or glycolic acid, as well as the development of new formulations containing nanoparticulate or colloidal systems (e.g. liposomes, polymeric microspheres, polymeric nanoparticles). Such strategies have the potential to yield the next generation of protein therapeutics. This review includes a general discussion of these delivery approaches. PMID:20049941

  7. MACE: model based analysis of ChIP-exo.

    PubMed

    Wang, Liguo; Chen, Junsheng; Wang, Chen; Uusküla-Reimand, Liis; Chen, Kaifu; Medina-Rivera, Alejandra; Young, Edwin J; Zimmermann, Michael T; Yan, Huihuang; Sun, Zhifu; Zhang, Yuji; Wu, Stephen T; Huang, Haojie; Wilson, Michael D; Kocher, Jean-Pierre A; Li, Wei

    2014-11-10

    Understanding the role of a given transcription factor (TF) in regulating gene expression requires precise mapping of its binding sites in the genome. Chromatin immunoprecipitation-exo, an emerging technique using λ exonuclease to digest TF unbound DNA after ChIP, is designed to reveal transcription factor binding site (TFBS) boundaries with near-single nucleotide resolution. Although ChIP-exo promises deeper insights into transcription regulation, no dedicated bioinformatics tool exists to leverage its advantages. Most ChIP-seq and ChIP-chip analytic methods are not tailored for ChIP-exo, and thus cannot take full advantage of high-resolution ChIP-exo data. Here we describe a novel analysis framework, termed MACE (model-based analysis of ChIP-exo) dedicated to ChIP-exo data analysis. The MACE workflow consists of four steps: (i) sequencing data normalization and bias correction; (ii) signal consolidation and noise reduction; (iii) single-nucleotide resolution border peak detection using the Chebyshev Inequality and (iv) border matching using the Gale-Shapley stable matching algorithm. When applied to published human CTCF, yeast Reb1 and our own mouse ONECUT1/HNF6 ChIP-exo data, MACE is able to define TFBSs with high sensitivity, specificity and spatial resolution, as evidenced by multiple criteria including motif enrichment, sequence conservation, direct sequence pileup, nucleosome positioning and open chromatin states. In addition, we show that the fundamental advance of MACE is the identification of two boundaries of a TFBS with high resolution, whereas other methods only report a single location of the same event. The two boundaries help elucidate the in vivo binding structure of a given TF, e.g. whether the TF may bind as dimers or in a complex with other co-factors. PMID:25249628
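    Of the four MACE steps, the border-matching step (iv) is the most self-contained to illustrate. The sketch below runs the Gale-Shapley stable matching algorithm on invented border coordinates, with both sides simply preferring nearby partners; MACE's actual preference scoring is richer than plain genomic distance, and equal numbers of left and right borders are assumed here.

```python
# Sketch of Gale-Shapley stable matching for pairing left (forward-strand)
# and right (reverse-strand) TFBS border positions (illustrative only).
def stable_match(left, right):
    """Left borders propose; both sides prefer the partner at smaller distance."""
    prefs = {l: sorted(right, key=lambda r: abs(r - l)) for l in left}
    next_choice = {l: 0 for l in left}   # index of next right border to try
    engaged = {}                         # right border -> left border
    free = list(left)
    while free:
        l = free.pop()
        r = prefs[l][next_choice[l]]
        next_choice[l] += 1
        if r not in engaged:
            engaged[r] = l
        elif abs(r - l) < abs(r - engaged[r]):   # r prefers the closer proposer
            free.append(engaged[r])
            engaged[r] = l
        else:
            free.append(l)
    return sorted((l, r) for r, l in engaged.items())

left_borders = [100, 520, 980]
right_borders = [118, 545, 1010]
print(stable_match(left_borders, right_borders))
# [(100, 118), (520, 545), (980, 1010)]
```

    Each matched pair delimits one binding event, which is how MACE reports two boundaries per TFBS rather than a single peak summit.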

  8. MACE: model based analysis of ChIP-exo

    PubMed Central

    Wang, Liguo; Chen, Junsheng; Wang, Chen; Uusküla-Reimand, Liis; Chen, Kaifu; Medina-Rivera, Alejandra; Young, Edwin J.; Zimmermann, Michael T.; Yan, Huihuang; Sun, Zhifu; Zhang, Yuji; Wu, Stephen T.; Huang, Haojie; Wilson, Michael D.; Kocher, Jean-Pierre A.; Li, Wei

    2014-01-01

    Understanding the role of a given transcription factor (TF) in regulating gene expression requires precise mapping of its binding sites in the genome. Chromatin immunoprecipitation-exo, an emerging technique using λ exonuclease to digest TF unbound DNA after ChIP, is designed to reveal transcription factor binding site (TFBS) boundaries with near-single nucleotide resolution. Although ChIP-exo promises deeper insights into transcription regulation, no dedicated bioinformatics tool exists to leverage its advantages. Most ChIP-seq and ChIP-chip analytic methods are not tailored for ChIP-exo, and thus cannot take full advantage of high-resolution ChIP-exo data. Here we describe a novel analysis framework, termed MACE (model-based analysis of ChIP-exo) dedicated to ChIP-exo data analysis. The MACE workflow consists of four steps: (i) sequencing data normalization and bias correction; (ii) signal consolidation and noise reduction; (iii) single-nucleotide resolution border peak detection using the Chebyshev Inequality and (iv) border matching using the Gale-Shapley stable matching algorithm. When applied to published human CTCF, yeast Reb1 and our own mouse ONECUT1/HNF6 ChIP-exo data, MACE is able to define TFBSs with high sensitivity, specificity and spatial resolution, as evidenced by multiple criteria including motif enrichment, sequence conservation, direct sequence pileup, nucleosome positioning and open chromatin states. In addition, we show that the fundamental advance of MACE is the identification of two boundaries of a TFBS with high resolution, whereas other methods only report a single location of the same event. The two boundaries help elucidate the in vivo binding structure of a given TF, e.g. whether the TF may bind as dimers or in a complex with other co-factors. PMID:25249628

  9. 78 FR 75449 - Miscellaneous Corrections; Corrections

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... INFORMATION: The NRC published a final rule in the Federal Register on June 7, 2013 (78 FR 34245), to make.... The final rule contained minor errors in grammar, punctuation, and referencing. This document corrects... specifying metric units. The final rule inadvertently included additional errors in grammar and...

  10. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  11. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  12. 75 FR 68407 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ... 67013, the Presidential Determination number should read ``2010-12'' (Presidential Sig.) [FR Doc. C1... Migration Needs Resulting from Violence in Kyrgyzstan Correction In Presidential document...

  13. On prismatic corrections

    NASA Astrophysics Data System (ADS)

    Bartkowski, Zygmunt; Bartkowska, Janina

    2006-02-01

    Prismatic corrections describe the differences between the nominal prism and the effective (interior) prism, or the compensating tilts of the eye needed to fixate straight ahead (Augenausgleichbewegung). In astigmatic corrections, if the prism does not lie in a principal section of the cylinder, the directions of the two effects differ. In corrections of horizontal strabismus, a vertical component of the interior prism appears. Approximate formulae describing these phenomena are presented. An appropriate setting can improve the quality of vision in the direction most important to the patient.

  14. Antibody Therapeutics in Oncology

    PubMed Central

    Wold, Erik D; Smider, Vaughn V; Felding, Brunhilde H

    2016-01-01

    One of the newer classes of targeted cancer therapeutics is monoclonal antibodies. Monoclonal antibody therapeutics are a successful and rapidly expanding drug class due to their high specificity, activity, favourable pharmacokinetics, and standardized manufacturing processes. Antibodies are capable of recruiting the immune system to attack cancer cells through complement-dependent cytotoxicity or antibody-dependent cellular cytotoxicity. In an ideal scenario, the initial tumor cell destruction induced by administration of a therapeutic antibody can result in uptake of tumor-associated antigens by antigen-presenting cells, establishing a prolonged memory effect. Mechanisms of direct tumor cell killing by antibodies include antibody recognition of cell-surface-bound enzymes to neutralize enzyme activity and signaling, or induction of receptor agonist or antagonist activity. Both approaches result in cellular apoptosis. In another and very direct approach, antibodies are used to deliver drugs to target cells and cause cell death. Such antibody drug conjugates (ADCs) direct cytotoxic compounds to tumor cells after selective binding to cell surface antigens, internalization, and intracellular drug release. The efficacy and safety of ADCs for cancer therapy have recently been greatly advanced based on innovative approaches for site-specific drug conjugation to the antibody structure. This technology enabled rational optimization of function and pharmacokinetics of the resulting conjugates, and is now beginning to yield therapeutics with defined, uniform molecular characteristics, and unprecedented promise to advance cancer treatment. PMID:27081677

  15. Therapeutic Recombinant Monoclonal Antibodies

    ERIC Educational Resources Information Center

    Bakhtiar, Ray

    2012-01-01

    During the last two decades, the rapid growth of biotechnology-derived techniques has led to a myriad of therapeutic recombinant monoclonal antibodies with significant clinical benefits. Recombinant monoclonal antibodies can be obtained from a number of natural sources such as animal cell cultures using recombinant DNA engineering. In contrast to…

  16. Models-Based Practice: Great White Hope or White Elephant?

    ERIC Educational Resources Information Center

    Casey, Ashley

    2014-01-01

    Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…

  17. Model-Based Software Testing for Object-Oriented Software

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  18. Learning of Chemical Equilibrium through Modelling-Based Teaching

    ERIC Educational Resources Information Center

    Maia, Poliana Flavia; Justi, Rosaria

    2009-01-01

    This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students learning…

  19. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  20. Emerging Mitochondrial Therapeutic Targets in Optic Neuropathies.

    PubMed

    Lopez Sanchez, M I G; Crowston, J G; Mackey, D A; Trounce, I A

    2016-09-01

    Optic neuropathies are an important cause of blindness worldwide. The study of the most common inherited mitochondrial optic neuropathies, Leber hereditary optic neuropathy (LHON) and autosomal dominant optic atrophy (ADOA), has highlighted a fundamental role for mitochondrial function in the survival of the affected neuron, the retinal ganglion cell. A picture is now emerging that links mitochondrial dysfunction to optic nerve disease and other neurodegenerative processes. Insights gained from the peculiar susceptibility of retinal ganglion cells to mitochondrial dysfunction are likely to inform therapeutic development for glaucoma and other common neurodegenerative diseases of aging. Although this is a fast-evolving field of research, a lack of access to human ocular tissues and limited animal models of mitochondrial disease have prevented direct retinal ganglion cell experimentation and delayed the development of efficient therapeutic strategies to prevent vision loss. Currently, there are no approved treatments for mitochondrial disease, including optic neuropathies caused by primary or secondary mitochondrial dysfunction. Recent advances in eye research have provided important insights into the molecular mechanisms that mediate pathogenesis, and new therapeutic strategies, including gene correction approaches, are currently being investigated. Here, we review the general principles of mitochondrial biology relevant to retinal ganglion cell function and provide an overview of the major optic neuropathies with mitochondrial involvement, LHON and ADOA, whilst highlighting the emerging link between mitochondrial dysfunction and glaucoma. The pharmacological strategies currently being trialled to improve mitochondrial function in these optic neuropathies are discussed, in addition to emerging therapeutic approaches to preserve retinal ganglion cell function. PMID:27288727

  1. Therapeutic antibodies against cancer

    PubMed Central

    Adler, Mark J.; Dimitrov, Dimiter S.

    2012-01-01

    Antibody-based therapeutics against cancer are highly successful in the clinic and currently enjoy unprecedented recognition of their potential; 13 monoclonal antibodies (mAbs) have been approved for clinical use in the European Union and in the United States (one, Mylotarg, was withdrawn from the market in 2010). Three of the mAbs (bevacizumab, rituximab, trastuzumab) are among the top six selling protein therapeutics, each with sales of more than $5 billion in 2010. Hundreds of mAbs, including bispecific mAbs and multispecific fusion proteins, mAbs conjugated with small-molecule drugs, and mAbs with optimized pharmacokinetics, are in clinical trials. However, challenges remain, and it appears that a deeper understanding of mechanisms is needed to overcome major problems including resistance to therapy, access to targets, the complexity of biological systems, and individual variation. PMID:22520975

  2. Multistage vector (MSV) therapeutics.

    PubMed

    Wolfram, Joy; Shen, Haifa; Ferrari, Mauro

    2015-12-10

    One of the greatest challenges in the field of medicine is obtaining controlled distribution of systemically administered therapeutic agents within the body. Indeed, biological barriers such as physical compartmentalization, pressure gradients, and excretion pathways adversely affect localized delivery of drugs to pathological tissue. The diverse nature of these barriers requires the use of multifunctional drug delivery vehicles that can overcome a wide range of sequential obstacles. In this review, we explore the role of multifunctionality in nanomedicine by primarily focusing on multistage vectors (MSVs). The MSV is an example of a promising therapeutic platform that incorporates several components, including a microparticle, nanoparticles, and small molecules. In particular, these components are activated in a sequential manner in order to successively address transport barriers. PMID:26264836

  3. Therapeutic Hypothermia for Neuroprotection

    PubMed Central

    Karnatovskaia, Lioudmila V.; Wartenberg, Katja E.

    2014-01-01

    The earliest recorded application of therapeutic hypothermia in medicine dates back about 5000 years; however, its use has become widespread only since 2002, following the demonstration of both safety and efficacy of regimens requiring only a mild (32°C-35°C) degree of cooling after cardiac arrest. We review the mechanisms by which hypothermia confers neuroprotection as well as its physiological effects by body system and its associated risks. With regard to clinical applications, we present evidence on the role of hypothermia in traumatic brain injury, intracranial pressure elevation, stroke, subarachnoid hemorrhage, spinal cord injury, hepatic encephalopathy, and neonatal peripartum encephalopathy. Based on the current knowledge and areas undergoing or in need of further exploration, we feel that therapeutic hypothermia holds promise in the treatment of patients with various forms of neurologic injury; however, additional quality studies are needed before its true role is fully known. PMID:24982721

  4. Strategies for therapeutic hypometabothermia

    PubMed Central

    Liu, Shimin; Chen, Jiang-Fan

    2013-01-01

    Although therapeutic hypothermia and metabolic suppression have shown robust neuroprotection in experimental brain ischemia, systemic complications have limited their use in treating acute stroke patients. Core temperature and basal metabolic rate are tightly regulated and maintained at very stable levels in mammals. Simply lowering body temperature or metabolic rate is therefore a blunt intervention that may cause systemic as well as regional problems rather than providing protection; such problems are commonly seen in hypothermia and barbiturate coma. The main innovative concept of this review is to propose a thermogenically optimal and synergistic reduction of core temperature and metabolic rate, termed therapeutic hypometabothermia, using novel and clinically practical approaches. When metabolism and body temperature are reduced in a systemically synergistic manner, the outcome will be maximal protection and safe recovery, as occurs in natural processes such as hibernation, daily torpor and estivation. PMID:24179563

  5. Global orbit corrections

    SciTech Connect

    Symon, K.

    1987-11-01

    There are various reasons for preferring local (e.g., three-bump) orbit correction methods to global corrections. One is the difficulty of solving the mN equations for the required mN correcting bumps, where N is the number of superperiods and m is the number of bumps per superperiod. The latter is not a valid reason for avoiding global corrections, since we can take advantage of the superperiod symmetry to reduce the mN simultaneous equations to N separate problems, each involving only m simultaneous equations. Previously, I have shown how to solve the general problem when the machine contains unknown magnet errors of known probability distribution: we make measurements of known precision of the orbit displacements at a set of points, and we wish to apply correcting bumps to minimize the weighted rms orbit deviations. In this report, we consider two simpler problems, using similar methods. We consider the case in which we make M beam position measurements per superperiod and wish to apply an equal number M of orbit-correcting bumps to reduce the measured position errors to zero. We also consider the problem in which the number of correcting bumps is less than the number of measurements, and we wish to minimize the weighted rms position errors. We will see that the latter problem involves solving equations of a different form, but involving the same matrices as the former problem.
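
    The two cases described above reduce to linear algebra: with a response matrix relating bump strengths to orbit shifts, zeroing M measurements with M bumps is a square solve, and the under-actuated case becomes a weighted least-squares problem. Below is a minimal sketch of the latter; the random response matrix and sizes are invented stand-ins, not machine data.

```python
import numpy as np

# Hypothetical response matrix R: R[i, j] is the orbit shift at monitor i
# per unit strength of corrector bump j. Sizes and values are illustrative.
rng = np.random.default_rng(0)
M, m = 6, 4                      # 6 position monitors, 4 corrector bumps
R = rng.normal(size=(M, m))      # stand-in for a measured/modelled response
x_meas = rng.normal(size=M)      # measured closed-orbit errors at monitors
w = np.ones(M)                   # per-monitor weights (e.g., 1/sigma_i)

# Fewer bumps than measurements: minimize the weighted rms residual
# || diag(w) (R b + x_meas) || over bump strengths b (normal equations).
W = np.diag(w)
b, *_ = np.linalg.lstsq(W @ R, -W @ x_meas, rcond=None)

residual = R @ b + x_meas        # orbit error remaining after correction
rms_before = np.sqrt(np.mean((w * x_meas) ** 2))
rms_after = np.sqrt(np.mean((w * residual) ** 2))
assert rms_after <= rms_before   # least squares can only shrink the rms
```

    With M bumps and M monitors (a square, invertible R) the same call drives the residual to zero exactly, which is the first problem in the abstract.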

  6. Polycyclic peptide therapeutics.

    PubMed

    Baeriswyl, Vanessa; Heinis, Christian

    2013-03-01

    Owing to their excellent binding properties, high stability, and low off-target toxicity, polycyclic peptides are an attractive molecule format for the development of therapeutics. Currently, only a handful of polycyclic peptides are used in the clinic; examples include the antibiotic vancomycin, the anticancer drugs actinomycin D and romidepsin, and the analgesic agent ziconotide. All clinically used polycyclic peptide drugs are derived from natural sources, such as soil bacteria in the case of vancomycin, actinomycin D and romidepsin, or the venom of a fish-hunting cone snail in the case of ziconotide. Unfortunately, nature provides peptide macrocyclic ligands for only a small fraction of therapeutic targets. For the generation of ligands of targets of choice, researchers have inserted artificial binding sites into natural polycyclic peptide scaffolds, such as cystine knot proteins, using rational design or directed evolution approaches. More recently, large combinatorial libraries of genetically encoded bicyclic peptides have been generated de novo and screened by phage display. In this Minireview, the properties of existing polycyclic peptide drugs are discussed and related to their interesting molecular architectures. Furthermore, technologies that allow the development of unnatural polycyclic peptide ligands are discussed. Recent application of these technologies has generated promising results, suggesting that polycyclic peptide therapeutics could potentially be developed for a broad range of diseases. PMID:23355488

  7. Model-Based Reasoning in Humans Becomes Automatic with Training

    PubMed Central

    Lübbert, Annika; Guitart-Masip, Marc; Dolan, Raymond J.

    2015-01-01

    Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load—a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders. PMID:26379239
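
    The model-based/model-free distinction drawn above can be made concrete in a toy setting (not the authors' two-step task; the MDP and all numbers are invented): a model-free agent updates action values incrementally from sampled transitions, while a model-based agent plans by value iteration over a model of the dynamics, and both converge to the same optimal values.

```python
import numpy as np

# Toy 2-state, 2-action deterministic MDP (illustrative only).
np.random.seed(0)
n_states, n_actions, gamma = 2, 2, 0.9
P = np.array([[1, 0], [0, 1]])           # next state for each (s, a)
Rw = np.array([[0.0, 1.0], [1.0, 0.0]])  # reward for each (s, a)

# Model-free: incremental Q-learning from uniformly sampled transitions.
Q = np.zeros((n_states, n_actions))
alpha = 0.5
for _ in range(2000):
    s = np.random.randint(n_states)
    a = np.random.randint(n_actions)
    s2, r = P[s, a], Rw[s, a]
    Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])

# Model-based: plan by value iteration over the (here, known) model.
V = np.zeros(n_states)
for _ in range(200):
    V = np.max(Rw + gamma * V[P], axis=1)

# Both routes agree on the optimal state values (10.0 in this toy MDP).
assert np.allclose(Q.max(axis=1), V, atol=0.05)
```

    The contrast the abstract draws is about cost at decision time: the model-free table lookup is cheap, while the planning loop must be rerun whenever the model changes, which is why model-based control is usually identified with slow, deliberative processing.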

  8. 75 FR 68409 - Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    Correction: In Presidential document 2010-27673 (... Migration Needs Resulting From Flooding In Pakistan), the Presidential Determination number should read ``2010-14''. (Presidential Sig.) [FR Doc. C1-2010...]

  9. Corrected Age for Preemies

    MedlinePlus

  10. Correcting Hubble Vision.

    ERIC Educational Resources Information Center

    Shaw, John M.; Sheahen, Thomas P.

    1994-01-01

    Describes the theory behind the workings of the Hubble Space Telescope, the spherical aberration in the primary mirror that caused a reduction in image quality, and the corrective device that compensated for the error. (JRH)

  11. Adaptable DC offset correction

    NASA Technical Reports Server (NTRS)

    Golusky, John M. (Inventor); Muldoon, Kelly P. (Inventor)

    2009-01-01

    Methods and systems for adaptable DC offset correction are provided. An exemplary adaptable DC offset correction system evaluates an incoming baseband signal to determine an appropriate DC offset removal scheme; removes the DC offset from the incoming baseband signal according to the selected scheme; and outputs a reduced-DC baseband signal.
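
    A minimal sketch of the idea, not the patented system: the "evaluation" step is reduced to a coarse offset estimate that selects a faster or slower tracking constant, and removal is a running-average subtraction. All function names, thresholds, and constants here are invented.

```python
import numpy as np

def remove_dc(signal, fast_alpha=0.05, slow_alpha=0.005, threshold=0.5):
    """Estimate and subtract a slowly varying DC offset (illustrative)."""
    coarse = np.mean(signal[:min(64, len(signal))])   # quick evaluation step
    alpha = fast_alpha if abs(coarse) > threshold else slow_alpha
    out = np.empty_like(signal, dtype=float)
    dc = coarse
    for i, x in enumerate(signal):
        dc += alpha * (x - dc)       # exponential moving average of the offset
        out[i] = x - dc              # offset-corrected output sample
    return out

t = np.linspace(0, 1, 1000)
sig = np.sin(2 * np.pi * 50 * t) + 2.0   # tone riding on a +2.0 DC offset
clean = remove_dc(sig)
assert abs(np.mean(clean[200:])) < 0.1   # residual DC after settling is small
```

    In a real receiver the tracking constant trades settling speed against distortion of low-frequency signal content, which is why an adaptable scheme is attractive.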

  12. Model-based HSF using by target point control function

    NASA Astrophysics Data System (ADS)

    Kim, Seongjin; Do, Munhoe; An, Yongbae; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu

    2015-03-01

    As the technology node shrinks, ArF immersion lithography is reaching the limits of wafer patterning, and weak points are easily generated during mask processing. To obtain robust patterning results, the design house performs lithography rule checking (LRC). Despite LRC, weak points are still found at the verification stage of optical proximity correction (OPC); such a point is called a hot spot point (HSP). Many studies have addressed fixing HSPs. One of the most common hot spot fixing (HSF) methods is modification biasing, which consists of "line resizing" and "space resizing". In addition to rule-based biasing, resolution enhancement techniques (RET), including inverse lithography technology (ILT) and model-based assist features (MBAF), have been adopted to remove hot spots and maximize the process window. If an HSP is found during the OPC verification stage, various HSF methods can be applied; however, an HSF process added to the regular OPC procedure increases the OPC turn-around time (TAT). In this paper, we introduce a new HSF method that achieves a shorter OPC TAT than common HSF methods. The new method rests on two concepts. The first is that the OPC target point is moved to fix the HSP: the target point should be shifted to the optimum position, at which the edge placement error (EPE) becomes 0 at critical points. Many parameters, such as model accuracy or the OPC recipe, can cause a larger EPE. The second is control of the model offset error through target point adjustment. Figure 1 shows a case in which the EPE is not 0, meaning that the simulation contour did not match the target after OPC. Figure 2, in contrast, shows the target point moved by -2.5nm using the target point control function; as a result, the simulation contour matches the original layout. This function can be readily applied to the OPC procedures of memory and logic devices.
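
    The target point control idea can be illustrated with a toy calculation (all numbers invented; `simulate_edge` is a trivial stand-in for real lithography simulation): a systematic model offset appears as a nonzero EPE, and shifting the target point by -EPE makes the simulated contour land back on the design edge.

```python
# Toy sketch of target-point control: shift the OPC target by -EPE so the
# simulated contour coincides with the original design edge.

def simulate_edge(target_nm, model_offset_nm=2.5):
    # stand-in for lithography simulation: the printed contour lands at a
    # fixed (invented) offset from wherever the target point is placed
    return target_nm + model_offset_nm

design_edge = 100.0                       # desired edge position (nm)
epe = simulate_edge(design_edge) - design_edge
assert epe == 2.5                         # the model offset shows up as EPE

adjusted_target = design_edge - epe       # move the target point by -2.5 nm
assert simulate_edge(adjusted_target) == design_edge   # EPE is now 0
```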

  13. Psychodynamic Perspective on Therapeutic Boundaries

    PubMed Central

    Bridges, Nancy A.

    1999-01-01

    Discussion of boundaries in therapeutic work most often focuses on boundary maintenance, risk management factors, and boundary violations. The psychodynamic meaning and clinical management of boundaries in therapeutic relationships remains a neglected area of discourse. Clinical vignettes will illustrate a psychodynamic, developmental-relational perspective using boundary dilemmas to deepen and advance the therapeutic process. This article contributes to the dialogue about the process of making meaning and constructing therapeutically useful and creative boundaries that further the psychotherapeutic process. PMID:10523432

  14. Frankincense--therapeutic properties.

    PubMed

    Al-Yasiry, Ali Ridha Mustafa; Kiczorowska, Bożena

    2016-01-01

    Recently, increasing interest has been observed in natural dietary and therapeutic preparations used as dietary supplements. One of them is frankincense. This traditional medicine of the East is believed to have anti-inflammatory, expectorant, antiseptic, and even anxiolytic and anti-neurotic effects. The present study aims to verify the reported therapeutic properties of Boswellia resin and describe its chemical composition based on available scientific studies. The main component of frankincense is oil (60%). It contains mono- (13%) and diterpenes (40%) as well as ethyl acetate (21.4%), octyl acetate (13.4%) and methylanisole (7.6%). Among the terpenes, the highest biological activity is shown by 11-keto-β-boswellic acid, acetyl-11-keto-β-boswellic acid and acetyl-α-boswellic acid. Contemporary studies have shown that the resin indeed has analgesic, tranquilising and antibacterial effects. From the point of view of therapeutic properties, extracts from Boswellia serrata and Boswellia carterii are reported to be particularly useful. They reduce inflammatory conditions in the course of rheumatism by inhibiting leukocyte elastase and degrading glycosaminoglycans. Boswellia preparations inhibit 5-lipoxygenase and prevent the release of leukotrienes, thus having an anti-inflammatory effect in ulcerative colitis, irritable bowel syndrome, bronchitis and sinusitis. Inhalation and consumption of Boswellia olibanum reduces the risk of asthma. In addition, boswellic acids have an antiproliferative effect on tumours: they inhibit the proliferation of leukaemia and glioblastoma tumour cells, and they have an anti-tumour effect since they inhibit topoisomerases I and II-alpha and stimulate programmed cell death (apoptosis). PMID:27117114

  15. Cystic Fibrosis Therapeutics

    PubMed Central

    Ramsey, Bonnie W.

    2013-01-01

    A great deal of excitement and hope has followed the successful trials and US Food and Drug Administration approval of the drug ivacaftor (Kalydeco), the first therapy available that targets the underlying defect that causes cystic fibrosis (CF). Although this drug has currently demonstrated a clinical benefit for a small minority of the CF population, the developmental pathway established by ivacaftor paves the way for other CF transmembrane conductance regulator (CFTR) modulators that may benefit many more patients. In addition to investigating CFTR modulators, researchers are actively developing numerous other innovative CF therapies. In this review, we use the catalog of treatments currently under evaluation with the support of the Cystic Fibrosis Foundation, known as the Cystic Fibrosis Foundation Therapeutics Pipeline, as a platform to discuss the variety of candidate treatments for CF lung disease that promise to improve CF care. Many of these approaches target the individual components of the relentless cycle of airway obstruction, inflammation, and infection characteristic of lung disease in CF, whereas others are aimed directly at the gene defect, or the resulting dysfunctional protein, that instigates this cycle. We discuss how new findings from the laboratory have informed not only the development of novel therapeutics, but also the rationales for their use and the outcomes used to measure their effects. By reviewing the breadth of candidate treatments currently in development, as well as the recent progress in CF therapies reflected by the evolution of the therapeutics pipeline over the past few years, we hope to build upon the optimism and anticipation generated by the recent success of Kalydeco. PMID:23276843

  16. Telomerase and cancer therapeutics.

    PubMed

    Harley, Calvin B

    2008-03-01

    Telomerase is an attractive cancer target as it appears to be required in essentially all tumours for immortalization of a subset of cells, including cancer stem cells. Moreover, differences in telomerase expression, telomere length and cell kinetics between normal and tumour tissues suggest that targeting telomerase would be relatively safe. Clinical trials are ongoing with a potent and specific telomerase inhibitor, GRN163L, and with several versions of telomerase therapeutic vaccines. The prospect of adding telomerase-based therapies to the growing list of new anticancer products is promising, but what are the advantages and limitations of different approaches, and which patients are the most likely to respond? PMID:18256617

  17. [Achievement of therapeutic objectives].

    PubMed

    Mantilla, Teresa

    2014-07-01

    Therapeutic objectives for patients with atherogenic dyslipidemia are achieved by improving patient compliance and adherence. Clinical practice guidelines address the importance of treatment compliance for achieving objectives. The combination of a fixed dose of pravastatin and fenofibrate increases adherence by simplifying the drug regimen and reducing the number of daily doses. Good tolerance, the cost of the combination, and the possibility of adjusting administration to the patient's lifestyle help achieve the objectives for these patients with high cardiovascular risk. PMID:25043543

  18. Revitalizing Psychiatric Therapeutics

    PubMed Central

    Hyman, Steven E

    2014-01-01

    Despite high prevalence and enormous unmet medical need, the pharmaceutical industry has recently de-emphasized neuropsychiatric disorders as 'too difficult' a challenge to warrant major investment. Here I describe major obstacles to drug discovery and development including a lack of new molecular targets, shortcomings of current animal models, and the lack of biomarkers for clinical trials. My major focus, however, is on new technologies and scientific approaches to neuropsychiatric disorders that give promise for revitalizing therapeutics and may thus answer industry's concerns. PMID:24317307

  19. [Is therapeutic deadlock inevitable?].

    PubMed

    Vignat, Jean-Pierre

    2016-01-01

    Many long-term treatments appear to be an expression of therapeutic deadlock. This situation calls into question the concept of chronicity and invites us to identify the determining factors of situations that appear blocked, in which the search for solutions takes a back seat to taking action. The interaction between the patient's mental apparatus and the care apparatus lies at the heart of the question, examined from institutional, collective and individual perspectives, supported by a clinical and psychopathological approach and a return to giving priority to thought. PMID:27389427

  20. The Therapeutic Roller Coaster

    PubMed Central

    CHU, JAMES A.

    1992-01-01

    Survivors of severe childhood abuse often encounter profound difficulties. In addition to posttraumatic and dissociative symptomatology, abuse survivors frequently have characterologic problems, particularly regarding self-care and maintaining relationships. Backgrounds of abuse, abandonment, and betrayal are often recapitulated and reenacted in therapy, making the therapeutic experience arduous and confusing for therapists and patients. Efforts must be directed at building an adequate psychotherapeutic foundation before undertaking exploration and abreaction of past traumatic experiences. This discussion sets out a model for treatment of childhood abuse survivors, describing stages of treatment and suggesting interventions. Common treatment dilemmas or "traps" are discussed, with recommendations for their resolution. PMID:22700116

  1. Therapeutic Endoscopic Ultrasound

    PubMed Central

    Cheriyan, Danny

    2015-01-01

    Endoscopic ultrasound (EUS) technology has evolved dramatically over the past 20 years, from being a supplementary diagnostic aid available only in large medical centers to being a core diagnostic and therapeutic tool that is widely available. Although formal recommendations and practice guidelines have not been developed, there are considerable data supporting the use of EUS for its technical accuracy in diagnosing pancreaticobiliary and gastrointestinal pathology. Endosonography is now routine practice not only for pathologic diagnosis and tumor staging but also for drainage of cystic lesions and celiac plexus neurolysis. In this article, we cover the use of EUS in biliary and pancreatic intervention, ablative therapy, enterostomy, and vascular intervention. PMID:27118942

  2. Distributed real-time model-based diagnosis

    NASA Technical Reports Server (NTRS)

    Barrett, A. C.; Chung, S. H.

    2003-01-01

    This paper presents an approach to onboard anomaly diagnosis that combines the simplicity and real-time guarantee of a rule-based diagnosis system with the specification ease and coverage guarantees of a model-based diagnosis system.

  3. Geological Corrections in Gravimetry

    NASA Astrophysics Data System (ADS)

    Mikuška, J.; Marušiak, I.

    2015-12-01

    Applying corrections for the known geology to gravity data can be traced back to the first quarter of the 20th century. Later on, mostly in areas with sedimentary cover, at local and regional scales, the correction known as gravity stripping has been in use since the mid 1960s, provided that there was enough geological information. Stripping at regional to global scales became possible after the release of the CRUST 2.0 and CRUST 1.0 models in 2000 and 2013, respectively. Especially the latter model provides quite a new view of the relevant geometries, of the topographic and crustal densities, and of the crust/mantle density contrast. Thus, the isostatic corrections often used in the past can now be replaced by procedures working with independent information interpreted primarily from seismic studies. We have developed software for performing geological corrections in the space domain, based on a-priori geometry and density grids, which can be of either rectangular or spherical/ellipsoidal type with cells shaped as rectangles, tesseroids or triangles. It enables us to calculate the required gravitational effects not only in the form of surface maps or profiles but, for instance, also along vertical lines, which can shed some additional light on the nature of the geological correction. The software can work at a variety of scales and considers the input information out to an optional distance from the calculation point, up to the antipodes. Our main objective is to treat the geological correction as an alternative to accounting for the topography with varying densities, since the bottoms of the topographic masses, namely the geoid or ellipsoid, generally do not represent geological boundaries. We would also like to call attention to the possible distortions of the corrected gravity anomalies. This work was supported by the Slovak Research and Development Agency under the contract APVV-0827-12.
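
    As a crude illustration of such a correction, and not the authors' software: each cell of an a-priori density grid can be approximated by a point mass, and the vertical gravitational effects summed at a station. The grid geometry, densities, and function name below are invented for the sketch; real implementations use prism or tesseroid formulas rather than point masses.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def vertical_effect(station, centers, densities, cell_volume):
    """Vertical attraction g_z at `station` from point masses at `centers`.

    Coordinates in metres, +z pointing down; a point-mass stand-in for
    proper prism/tesseroid formulas (illustrative only).
    """
    d = centers - station                        # displacement vectors
    r = np.linalg.norm(d, axis=1)                # distance to each cell
    masses = densities * cell_volume             # mass of each cell, kg
    return np.sum(G * masses * d[:, 2] / r**3)   # sum of vertical components

# Invented grid: a 2 x 2 layer of 100 m cells, 200 m below the station.
cell = 100.0
centers = np.array([[i * cell, j * cell, 200.0]
                    for i in range(2) for j in range(2)])
densities = np.full(len(centers), 2670.0)        # kg/m^3, typical crustal value
gz = vertical_effect(np.zeros(3), centers, densities, cell**3)
assert 0.0 < gz < 1e-4   # downward pull, on the order of 1 mGal (1e-5 m/s^2)
```

    Subtracting such an effect from observed gravity is the "stripping" step; evaluating it along a vertical line of stations, as the abstract suggests, only requires varying the `station` argument.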

  4. Qualitative model-based diagnosis using possibility theory

    NASA Technical Reports Server (NTRS)

    Joslyn, Cliff

    1994-01-01

    The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.

  5. Epigenomes as therapeutic targets.

    PubMed

    Hamm, Christopher A; Costa, Fabricio F

    2015-07-01

    Epigenetics is a molecular phenomenon that pertains to heritable changes in gene expression that do not involve changes in the DNA sequence. Epigenetic modifications in a whole genome, known as the epigenome, play an essential role in the regulation of gene expression in both normal development and disease. Traditional epigenetic changes include DNA methylation and histone modifications. Recent evidence reveals that other players, such as non-coding RNAs, may have an epigenetic regulatory role. Aberrant epigenetic signaling is increasingly recognized as a central component of human disease, and the reversible nature of epigenetic modifications provides an exciting opportunity for the development of clinically relevant therapeutics. Current epigenetic therapies provide a clinical benefit by disrupting DNA methyltransferases or histone deacetylases. However, the emergence of next-generation epigenetic therapies provides an opportunity to more effectively disrupt epigenetic disease states. Novel epigenetic therapies may improve drug targeting and drug delivery, optimize dosing schedules, and improve the efficacy of preexisting treatment modalities (chemotherapy, radiation, and immunotherapy). This review discusses the epigenetic mechanisms that contribute to disease, available epigenetic therapies, epigenetic therapies currently in development, and the potential future use of epigenetic therapeutics in a clinical setting. PMID:25797698

  6. AMUM LECTURE: Therapeutic ultrasound

    NASA Astrophysics Data System (ADS)

    Crum, Lawrence A.

    2004-01-01

    The use of ultrasound in medicine is now quite commonplace, especially with the recent introduction of small, portable and relatively inexpensive, hand-held diagnostic imaging devices. Moreover, ultrasound has expanded beyond the imaging realm, with methods and applications extending to novel therapeutic and surgical uses. These applications broadly include: tissue ablation, acoustocautery, lipoplasty, site-specific and ultrasound-mediated drug activity, extracorporeal lithotripsy, and the enhancement of natural physiological functions such as wound healing and tissue regeneration. A particularly attractive aspect of this technology is that diagnostic and therapeutic systems can be combined to produce totally non-invasive, image-guided therapy. This general lecture will review a number of these exciting new applications of ultrasound and address some of the basic scientific questions and future challenges in developing these methods and technologies for general use in our society. We shall particularly emphasize the use of High Intensity Focused Ultrasound (HIFU) in the treatment of benign and malignant tumors as well as the introduction of acoustic hemostasis, especially in organs which are difficult to treat using conventional medical and surgical techniques.

  7. Winnicott's therapeutic consultations revisited.

    PubMed

    Brafman, A H

    1997-08-01

    Winnicott described in his book 'Therapeutic Consultations' (1971) how a diagnostic assessment of a referred child developed into a fruitful therapeutic intervention when he was able to discover the unconscious fantasy that underlay the child's symptoms. Because these were children who were, essentially, developing normally, he used the word 'knot' to depict the obstacle the child had met. Any conflicts the parents might have were not explored in that context. This work presents cases in which child and parents are seen together for the diagnostic assessment. The child's feelings about his world and his difficulties are explored through a variety of techniques including drawings. In the same interview, an analytic enquiry into the parents' history and also their views of the child reveals how the child's fantasies and the parents' past experiences interact and create a mutually reinforcing vicious circle. In other words, the 'knot' involves all of them. If the child's unconscious fantasy can be verbalised and if the parents are able to approach the child in a manner that acknowledges the child's real needs, the 'knot' disappears and normal development can be resumed. PMID:9306188

  8. Engineering therapeutic protein disaggregases.

    PubMed

    Shorter, James

    2016-05-15

    Therapeutic agents are urgently required to cure several common and fatal neurodegenerative disorders caused by protein misfolding and aggregation, including amyotrophic lateral sclerosis (ALS), Parkinson's disease (PD), and Alzheimer's disease (AD). Protein disaggregases that reverse protein misfolding and restore proteins to native structure, function, and localization could mitigate neurodegeneration by simultaneously reversing 1) any toxic gain of function of the misfolded form and 2) any loss of function due to misfolding. Potentiated variants of Hsp104, a hexameric AAA+ ATPase and protein disaggregase from yeast, have been engineered to robustly disaggregate misfolded proteins connected with ALS (e.g., TDP-43 and FUS) and PD (e.g., α-synuclein). However, Hsp104 has no metazoan homologue. Metazoa possess protein disaggregase systems distinct from Hsp104, including Hsp110, Hsp70, and Hsp40, as well as HtrA1, which might be harnessed to reverse deleterious protein misfolding. Nevertheless, vicissitudes of aging, environment, or genetics conspire to negate these disaggregase systems in neurodegenerative disease. Thus, engineering potentiated human protein disaggregases or isolating small-molecule enhancers of their activity could yield transformative therapeutics for ALS, PD, and AD. PMID:27255695

  9. Engineering therapeutic protein disaggregases

    PubMed Central

    Shorter, James

    2016-01-01

    Therapeutic agents are urgently required to cure several common and fatal neurodegenerative disorders caused by protein misfolding and aggregation, including amyotrophic lateral sclerosis (ALS), Parkinson’s disease (PD), and Alzheimer’s disease (AD). Protein disaggregases that reverse protein misfolding and restore proteins to native structure, function, and localization could mitigate neurodegeneration by simultaneously reversing 1) any toxic gain of function of the misfolded form and 2) any loss of function due to misfolding. Potentiated variants of Hsp104, a hexameric AAA+ ATPase and protein disaggregase from yeast, have been engineered to robustly disaggregate misfolded proteins connected with ALS (e.g., TDP-43 and FUS) and PD (e.g., α-synuclein). However, Hsp104 has no metazoan homologue. Metazoa possess protein disaggregase systems distinct from Hsp104, including Hsp110, Hsp70, and Hsp40, as well as HtrA1, which might be harnessed to reverse deleterious protein misfolding. Nevertheless, vicissitudes of aging, environment, or genetics conspire to negate these disaggregase systems in neurodegenerative disease. Thus, engineering potentiated human protein disaggregases or isolating small-molecule enhancers of their activity could yield transformative therapeutics for ALS, PD, and AD. PMID:27255695

  10. Pharmacogenetics approach to therapeutics.

    PubMed

    Koo, Seok Hwee; Lee, Edmund Jon Deoon

    2006-01-01

    1. Pharmacogenetics refers to the study of genetically controlled variations in drug response. Functional variants caused by single nucleotide polymorphisms (SNPs) in genes encoding drug-metabolising enzymes, transporters, ion channels and drug receptors have been known to be associated with interindividual and interethnic variation in drug response. Genetic variations in these genes play a role in influencing the efficacy and toxicity of medications. 2. Rapid, precise and cost-effective high-throughput technological platforms are essential for performing large-scale mutational analysis of genetic markers involved in the aetiology of variable responses to drug therapy. 3. The application of a pharmacogenetics approach to therapeutics in general clinical practice is still far from being achieved today owing to various constraints, such as limited accessibility of technology, inadequate knowledge, ambiguity of the role of variants and ethical concerns. 4. Drug actions are determined by the interplay of several genes encoding different proteins involved in various biochemical pathways. With rapidly emerging SNP discovery technological platforms and widespread knowledge on the role of SNPs in disease susceptibility and variability in drug response, the pharmacogenetics approach to therapeutics is anticipated to take off in the not-too-distant future. This will present profound clinical, economic and social implications for health care. PMID:16700889

  11. Person-centered Therapeutics

    PubMed Central

    Cloninger, C. Robert; Cloninger, Kevin M.

    2015-01-01

    A clinician’s effectiveness in treatment depends substantially on his or her attitude toward -- and understanding of -- the patient as a person endowed with self-awareness and the will to direct his or her own future. The assessment of personality in the therapeutic encounter is a crucial foundation for forming an effective working alliance with shared goals. Helping a person to reflect on their personality provides a mirror image of their strengths and weaknesses in adapting to life’s many challenges. The Temperament and Character Inventory (TCI) provides an effective way to describe personality thoroughly and to predict both the positive and negative aspects of health. Strengths and weaknesses in TCI personality traits allow strong predictions of individual differences in all aspects of well-being. Diverse therapeutic techniques, such as diet, exercise, mood self-regulation, meditation, or acts of kindness, influence health and personality development in ways that are largely indistinguishable from one another or from effective allopathic treatments. Hence the development of well-being appears to be the result of activating a synergistic set of mechanisms of well-being, which are expressed as fuller functioning, plasticity, and virtue in adapting to life’s challenges. PMID:26052429

  12. Mechanisms of Plasma Therapeutics

    NASA Astrophysics Data System (ADS)

    Graves, David

    2015-09-01

    In this talk, I address research directed towards biomedical applications of atmospheric pressure plasma such as sterilization, surgery, wound healing and anti-cancer therapy. The field has seen remarkable growth in the last 3-5 years, but the mechanisms responsible for the biomedical effects have remained mysterious. It is known that plasmas readily create reactive oxygen species (ROS) and reactive nitrogen species (RNS). ROS and RNS (or RONS), in addition to a suite of other radical and non-radical reactive species, are essential actors in an important sub-field of aerobic biology termed "redox" (or oxidation-reduction) biology. It is postulated that cold atmospheric plasma (CAP) can trigger a therapeutic shielding response in tissue in part by creating a time- and space-localized, burst-like form of oxy-nitrosative stress on near-surface exposed cells through the flux of plasma-generated RONS. RONS-exposed surface layers of cells communicate to the deeper levels of tissue via a form of the "bystander effect," similar to responses to other forms of cell stress. In this proposed model of CAP therapeutics, the plasma stimulates a cellular survival mechanism through which aerobic organisms shield themselves from infection and other challenges.

  13. Reduced model-based decision-making in schizophrenia.

    PubMed

    Culbreth, Adam J; Westbrook, Andrew; Daw, Nathaniel D; Botvinick, Matthew; Barch, Deanna M

    2016-08-01

    Individuals with schizophrenia have a diminished ability to use reward history to adaptively guide behavior. However, tasks traditionally used to assess such deficits often rely on multiple cognitive and neural processes, leaving etiology unresolved. In the current study, we adopted recent computational formalisms of reinforcement learning to distinguish between model-based and model-free decision-making in hopes of specifying mechanisms associated with reinforcement-learning dysfunction in schizophrenia. Under this framework, decision-making is model-free to the extent that it relies solely on prior reward history, and model-based if it relies on prospective information such as motivational state, future consequences, and the likelihood of obtaining various outcomes. Model-based and model-free decision-making was assessed in 33 schizophrenia patients and 30 controls using a 2-stage 2-alternative forced choice task previously demonstrated to discern individual differences in reliance on the 2 forms of reinforcement-learning. We show that, compared with controls, schizophrenia patients demonstrate decreased reliance on model-based decision-making. Further, parameter estimates of model-based behavior correlate positively with IQ and working memory measures, suggesting that model-based deficits seen in schizophrenia may be partially explained by higher-order cognitive deficits. These findings demonstrate specific reinforcement-learning and decision-making deficits and thereby provide valuable insights for understanding disordered behavior in schizophrenia. (PsycINFO Database Record) PMID:27175984
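
    Two-stage tasks of this kind are commonly analyzed with a hybrid reinforcement-learning model in which first-stage choice values blend model-free and model-based estimates. The sketch below illustrates that standard blending; the transition probabilities, values, and weighting are invented for illustration and are not taken from the study.

```python
import numpy as np

# Common two-step transition structure: each first-stage action leads
# "commonly" to one second-stage state and "rarely" to the other.
T = np.array([[0.7, 0.3],    # P(second-stage state | action 0)
              [0.3, 0.7]])   # P(second-stage state | action 1)

q_stage2 = np.array([0.6, 0.2])   # learned values of the second-stage states
q_mf = np.array([0.5, 0.4])       # model-free first-stage action values

def hybrid_q(w):
    """Blend model-based and model-free values; w = reliance on planning."""
    q_mb = T @ q_stage2               # expected second-stage value per action
    return w * q_mb + (1.0 - w) * q_mf

print(hybrid_q(1.0))  # purely model-based valuation
print(hybrid_q(0.0))  # purely model-free valuation
```

    Fitting the weight w to observed choices yields a per-subject index of model-based control; the result above corresponds to lower fitted w in patients than in controls.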

  14. Peteye detection and correction

    NASA Astrophysics Data System (ADS)

    Yen, Jonathan; Luo, Huitao; Tretter, Daniel

    2007-01-01

    Redeyes are caused by the camera flash light reflecting off the retina. Peteyes refer to similar artifacts in the eyes of other mammals caused by camera flash. In this paper we present a peteye removal algorithm for detecting and correcting peteye artifacts in digital images. Peteye removal for animals is significantly more difficult than redeye removal for humans, because peteyes can be any of a variety of colors, and human face detection cannot be used to localize the animal eyes. In many animals, including dogs and cats, the retina has a special reflective layer that can cause a variety of peteye colors, depending on the animal's breed, age, and fur color. This makes the peteye correction more challenging. We have developed a semi-automatic algorithm for peteye removal that can detect peteyes based on the cursor position provided by the user and correct them by neutralizing the colors with glare reduction and glint retention.
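
    The correction step can be sketched as follows. This is an illustrative reconstruction, not the authors' published algorithm: the fixed search radius, glint threshold, and darkening factor are all invented parameters.

```python
import numpy as np

def correct_peteye(img, cx, cy, radius=10, glint_thresh=0.9, darken=0.4):
    """Neutralize peteye color near the user's cursor (cx, cy).

    img: float RGB array with values in [0, 1].
    Pixels inside the radius are replaced by a darkened gray (glare
    reduction), except very bright pixels, which are kept as the specular
    glint (glint retention).
    """
    out = img.copy()
    h, w, _ = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2
    lum = img.mean(axis=2)                    # crude luminance estimate
    glint = mask & (lum >= glint_thresh)      # bright highlight: retain
    fix = mask & ~glint                       # colored pupil: neutralize
    out[fix] = (darken * lum[fix])[:, None]   # neutral gray, all channels
    return out
```

    A real implementation would segment the pupil region from the image content around the cursor rather than use a fixed radius.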

  15. Aureolegraph internal scattering correction.

    PubMed

    DeVore, John; Villanucci, Dennis; LePage, Andrew

    2012-11-20

    Two methods of determining instrumental scattering for correcting aureolegraph measurements of particulate solar scattering are presented. One involves subtracting measurements made with and without an external occluding ball and the other is a modification of the Langley Plot method and involves extrapolating aureolegraph measurements collected through a large range of solar zenith angles. Examples of internal scattering correction determinations using the latter method show similar power-law dependencies on scattering, but vary by roughly a factor of 8 and suggest that changing aerosol conditions during the determinations render this method problematic. Examples of corrections of scattering profiles using the former method are presented for a range of atmospheric particulate layers from aerosols to cumulus and cirrus clouds. PMID:23207299
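
    The first (occluding-ball) method amounts to a per-angle subtraction. A minimal sketch, under the assumption that blocking the direct solar beam removes the instrument's internal scattering of that beam, so the difference of unblocked and blocked scans estimates the instrumental term; the numbers are invented.

```python
import numpy as np

def instrumental_scattering(unblocked, blocked):
    """Estimate per-angle instrumental scattering from paired scans
    taken without and with the external occluding ball."""
    return np.asarray(unblocked, dtype=float) - np.asarray(blocked, dtype=float)

def corrected_profile(raw, instrumental):
    """Remove the instrumental term from an aureole scattering profile."""
    return np.asarray(raw, dtype=float) - instrumental
```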

  16. Therapeutic Community in a California Prison: Treatment Outcomes after 5 Years

    ERIC Educational Resources Information Center

    Zhang, Sheldon X.; Roberts, Robert E. L.; McCollister, Kathryn E.

    2011-01-01

    Therapeutic communities have become increasingly popular among correctional agencies with drug-involved offenders. This quasi-experimental study followed a group of inmates who participated in a prison-based therapeutic community in a California state prison, with a comparison group of matched offenders, for more than 5 years after their initial…

  17. Hypoxic Conditioning as a New Therapeutic Modality

    PubMed Central

    Verges, Samuel; Chacaroun, Samarmar; Godin-Ribuot, Diane; Baillieul, Sébastien

    2015-01-01

    Preconditioning refers to a procedure by which a single noxious stimulus below the threshold of damage is applied to the tissue in order to increase resistance to the same or even different noxious stimuli given above the threshold of damage. Hypoxic preconditioning relies on complex and active defenses that organisms have developed to counter the adverse consequences of oxygen deprivation. The protection it confers against ischemic attack, for instance, as well as the underlying biological mechanisms, have been extensively investigated in animal models. Based on these data, hypoxic conditioning (consisting of recurrent exposure to hypoxia) has been suggested as a potential non-pharmacological therapeutic intervention to enhance some physiological functions in individuals in whom acute or chronic pathological events are anticipated or existing. In addition to healthy subjects, some benefits have been reported in patients with cardiovascular and pulmonary diseases as well as in overweight and obese individuals. Hypoxic conditioning consisting of sessions of intermittent exposure to moderate hypoxia repeated over several weeks may induce hematological, vascular, metabolic, and neurological effects. This review addresses the existing evidence regarding the use of hypoxic conditioning as a potential therapeutic modality, and emphasizes the many remaining issues to clarify and future research to be performed in the field. PMID:26157787

  18. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1979-01-01

    Optical measurements of range and elevation angle are distorted by the earth's atmosphere. High precision refraction correction equations are presented which are ideally suited for surveying because their inputs are optically measured range and optically measured elevation angle. The outputs are true straight line range and true geometric elevation angle. The 'short distances' used in surveying allow the calculations of true range and true elevation angle to be quickly made using a programmable pocket calculator. Topics covered include the spherical form of Snell's Law; ray path equations; and integrating the equations. Short-, medium-, and long-range refraction corrections are presented in tables.
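
    The "spherical form of Snell's Law" mentioned above is the standard invariant for a spherically symmetric atmosphere: n(r) * r * sin(z) is constant along a ray, where z is the local zenith angle. A minimal sketch of propagating a ray between atmospheric shells using that invariant (the shell values in the test are illustrative):

```python
import math

def zenith_at_shell(n1, r1, z1, n2, r2):
    """Local zenith angle (radians) at shell (n2, r2), given angle z1 at
    shell (n1, r1), from the invariant n * r * sin(z) = const."""
    s = n1 * r1 * math.sin(z1) / (n2 * r2)
    return math.asin(s)
```

    Integrating this relation shell by shell along the ray path yields the refraction corrections tabulated in the report.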

  19. Correction coil cable

    DOEpatents

    Wang, S.T.

    1994-11-01

    A wire cable assembly adapted for the winding of electrical coils is taught. A primary intended use is in particle tube assemblies for the Superconducting Super Collider. The correction coil cables have wires collected in a wire array with a center rib sandwiched therebetween to form a core assembly. The core assembly is surrounded by an assembly housing having an inner spiral wrap and a counter wound outer spiral wrap. An alternate embodiment of the invention is rolled into a keystoned shape to improve radial alignment of the correction coil cable on a particle tube in a particle tube assembly. 7 figs.

  20. Target Mass Corrections Revisited

    SciTech Connect

    W. Melnitchouk; F. Steffens

    2006-03-07

    We propose a new implementation of target mass corrections to nucleon structure functions which, unlike existing treatments, has the correct kinematic threshold behavior at finite Q^2 in the x → 1 limit. We illustrate the differences between the new approach and existing prescriptions by considering specific examples for the F_2 and F_L structure functions, and discuss the broader implications of our results, which call into question the notion of universal parton distributions at finite Q^2.
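
    For orientation, conventional operator-product-expansion treatments of target mass corrections are organized around the Nachtmann scaling variable; this is a standard textbook result, stated here for context rather than taken from the paper:

```latex
\xi = \frac{2x}{1 + \sqrt{1 + 4 x^2 M^2 / Q^2}}
```

    where M is the nucleon mass. Because \xi < 1 at x = 1 for finite Q^2, conventionally corrected structure functions remain nonzero in the kinematically forbidden region x > 1; this is the threshold problem the proposed implementation addresses.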

  1. Microfabricated therapeutic actuators

    SciTech Connect

    Lee, Abraham P.; Northrup, M. Allen; Ciarlo, Dino R.; Krulevitch, Peter A.; Benett, William J.

    1999-01-01

    Microfabricated therapeutic actuators are fabricated using a shape memory polymer (SMP), a polyurethane-based material that undergoes a phase transformation at a specified temperature (Tg). At a temperature above Tg, the material is soft and can be easily reshaped into another configuration. As the temperature is lowered below Tg, the new shape is fixed and locked in as long as the material stays below Tg. Upon reheating the material to a temperature above Tg, the material returns to its original shape. Using such SMP material, SMP microtubing can serve as a release actuator for the delivery of embolic coils through catheters into aneurysms, for example. The microtubing can be manufactured in various sizes, and the phase change temperature Tg can be set for an intended temperature target and intended use.

  2. Microfabricated therapeutic actuators

    DOEpatents

    Lee, A.P.; Northrup, M.A.; Ciarlo, D.R.; Krulevitch, P.A.; Benett, W.J.

    1999-06-15

    Microfabricated therapeutic actuators are fabricated using a shape memory polymer (SMP), a polyurethane-based material that undergoes a phase transformation at a specified temperature (Tg). At a temperature above Tg, the material is soft and can be easily reshaped into another configuration. As the temperature is lowered below Tg, the new shape is fixed and locked in as long as the material stays below Tg. Upon reheating the material to a temperature above Tg, the material returns to its original shape. Using such SMP material, SMP microtubing can serve as a release actuator for the delivery of embolic coils through catheters into aneurysms, for example. The microtubing can be manufactured in various sizes, and the phase change temperature Tg can be set for an intended temperature target and intended use. 8 figs.

  3. Aptamers in Therapeutics

    PubMed Central

    2016-01-01

    Aptamers are single-stranded DNA or RNA molecules selected by an iterative process known as Systematic Evolution of Ligands by Exponential Enrichment (SELEX). Advantages such as high temperature stability, animal-free and cost-effective production, and high affinity and selectivity for their targets make aptamers attractive alternatives to monoclonal antibodies for use in diagnostic and therapeutic purposes. An aptamer has been generated against vascular endothelial growth factor 165, which is involved in age-related macular degeneration. Macugen was the first FDA-approved aptamer-based drug to be commercialized. Later, other aptamers were also developed against blood clotting proteins, cancer proteins, antibody E, agents involved in diabetic nephropathy, autoantibodies involved in autoimmune disorders, etc. Aptamers have also been developed against viruses and could work with other antiviral agents in treating infections. PMID:27504277

  4. Antibody Engineering and Therapeutics

    PubMed Central

    Almagro, Juan Carlos; Gilliland, Gary L; Breden, Felix; Scott, Jamie K; Sok, Devin; Pauthner, Matthias; Reichert, Janice M; Helguera, Gustavo; Andrabi, Raiees; Mabry, Robert; Bléry, Mathieu; Voss, James E; Laurén, Juha; Abuqayyas, Lubna; Barghorn, Stefan; Ben-Jacob, Eshel; Crowe, James E; Huston, James S; Johnston, Stephen Albert; Krauland, Eric; Lund-Johansen, Fridtjof; Marasco, Wayne A; Parren, Paul WHI; Xu, Kai Y

    2014-01-01

    The 24th Antibody Engineering & Therapeutics meeting brought together a broad range of participants who were updated on the latest advances in antibody research and development. Organized by IBC Life Sciences, the gathering is the annual meeting of The Antibody Society, which serves as the scientific sponsor. Preconference workshops on 3D modeling and delineation of clonal lineages were featured, and the conference included sessions on a wide variety of topics relevant to researchers, including systems biology; antibody deep sequencing and repertoires; the effects of antibody gene variation and usage on antibody response; directed evolution; knowledge-based design; antibodies in a complex environment; polyreactive antibodies and polyspecificity; the interface between antibody therapy and cellular immunity in cancer; antibodies in cardiometabolic medicine; antibody pharmacokinetics, distribution and off-target toxicity; optimizing antibody formats for immunotherapy; polyclonals, oligoclonals and bispecifics; antibody discovery platforms; and antibody-drug conjugates. PMID:24589717

  5. Homocystinuria: Therapeutic approach.

    PubMed

    Kumar, Tarun; Sharma, Gurumayum Suraj; Singh, Laishram Rajendrakumar

    2016-07-01

    Homocystinuria is a disorder of the sulfur metabolism pathway caused by deficiency of cystathionine β-synthase (CBS). It is characterized by increased accumulation of homocysteine (Hcy) in the cells and plasma. Increased homocysteine results in various vascular and neurological complications. Present strategies to lower cellular and plasma homocysteine levels include vitamin B6 intake, dietary methionine restriction, betaine supplementation, and folate and vitamin B12 administration. However, these strategies are inefficient for treatment of homocystinuria. In recent years, advances have been made towards developing new strategies to treat homocystinuria. These mainly include functional restoration of mutant CBS, enhanced clearance of Hcy from the body, prevention of N-homocysteinylation-induced toxicity and inhibition of homocysteine-induced oxidative stress. In this review, we have exclusively discussed the recent advances that have been achieved towards the treatment of homocystinuria. The review is an attempt to help clinicians in developing effective therapeutic strategies and designing novel drugs against homocystinuria. PMID:27059523

  6. Mitochondrial Energetics and Therapeutics

    PubMed Central

    Wallace, Douglas C.; Fan, Weiwei; Procaccio, Vincent

    2011-01-01

    Mitochondrial dysfunction has been linked to a wide range of degenerative and metabolic diseases, cancer, and aging. All these clinical manifestations arise from the central role of bioenergetics in cell biology. Although genetic therapies are maturing as the rules of bioenergetic genetics are clarified, metabolic therapies have been ineffectual. This failure results from our limited appreciation of the role of bioenergetics as the interface between the environment and the cell. A systems approach, which, ironically, was first successfully applied over 80 years ago with the introduction of the ketogenic diet, is required. Analysis of the many ways that a shift from carbohydrate glycolytic metabolism to fatty acid and ketone oxidative metabolism may modulate metabolism, signal transduction pathways, and the epigenome gives us an appreciation of the ketogenic diet and the potential for bioenergetic therapeutics. PMID:20078222

  7. Principles of therapeutics.

    PubMed

    Miller, T R

    1992-12-01

    Topical administration of drugs is the treatment of choice for diseases of the anterior segment. Drug levels attained by this means are usually of short duration, however, necessitating frequent therapy or continuous perfusion if prolonged drug levels are required. A drug-delivery device (collagen shield or contact lens) or subconjunctival injections can be used to augment topical therapy if frequent treatment is not possible. Subconjunctival injections are recommended for drugs that have low solubility and, hence, low corneal penetration. Retrobulbar injections are seldom indicated, except for regional anesthesia. Systemic administration is useful for anti-inflammatory therapy but it may be difficult to establish therapeutic levels of antibiotic agents in the eye because of the blood-ocular barrier. In severe cases, intraocular injection may be required. PMID:1458325

  8. Antioxidant therapeutics: Pandora's box.

    PubMed

    Day, Brian J

    2014-01-01

    Evolution has favored the utilization of dioxygen (O2) in the development of complex multicellular organisms. O2 is actually a toxic mutagenic gas that is highly oxidizing and combustible. It is thought that plants are largely to blame for polluting the earth's atmosphere with O2 owing to the development of photosynthesis by blue-green algae over 2 billion years ago. The rise of the plants and atmospheric O2 levels placed evolutionary stress on organisms to adapt or become extinct. This implies that all the surviving creatures on our planet are mutants that have adapted to the "abnormal biology" of O2. Much of the adaptation to the presence of O2 in biological systems comes from well-coordinated antioxidant and repair systems that focus on converting O2 to its most reduced form, water (H2O), and the repair and replacement of damaged cellular macromolecules. Biological systems have also harnessed O2's reactive properties for energy production, xenobiotic metabolism, and host defense and as a signaling messenger and redox modulator of a number of cell signaling pathways. Many of these systems involve electron transport systems and offer many different mechanisms by which antioxidant therapeutics can alternatively produce an antioxidant effect without directly scavenging oxygen-derived reactive species. It is likely that each agent will have a different set of mechanisms that may change depending on the model of oxidative stress, organ system, or disease state. An important point is that all biological processes of aerobes have coevolved with O2 and this creates a Pandora's box for trying to understand the mechanism(s) of action of antioxidants being developed as therapeutic agents. PMID:23856377

  9. GTI-2040. Lorus Therapeutics.

    PubMed

    Orr, R M

    2001-10-01

    Lorus Therapeutics (formerly GeneSense Therapeutics) is developing the antisense oligonucleotide GTI-2040, directed against the R2 component of ribonucleotide reductase, for the potential treatment of cancer [348194]. It is in phase I/II trials [353796] and Lorus had anticipated phase II trials would be initiated in July 2001. By August 2001, GTI-2040 was undergoing a phase II trial as a monotherapy for the potential treatment of renal cell carcinoma, and was about to enter a phase II combination study for this indication with capecitabine (Hoffmann-La Roche). At this time, the company was also planning a phase II trial to study the drug's potential in the treatment of colorectal cancer [418739]. GTI-2040 has been tested in nine different tumor models, including tumors derived from colon, liver, lung, breast, kidney and ovary. Depending on the tumor model, significant inhibition of tumor growth, disease stabilization and dramatic tumor regressions were observed [347683]. Lorus filed an IND to commence phase I/II trials with GTI-2040 in the US in November 1999 [347683], and received approval for the trials in December 1999 [349623]. As of January 2000, these trials had commenced at the University of Chicago Cancer Research Center; it was reported in February 2000 that dosing to date had been well tolerated with no apparent safety concerns [357449]. Lorus has entered into a strategic supply alliance with Proligo to provide the higher volumes of drug product required for the planned multiple phase II trials [385976]. In February 1998, GeneSense (now Lorus) received patent WO-09805769. Lorus also received a patent (subsequently identified as WO-00047733) from the USPTO in January 2000, entitled 'Antitumor antisense sequences directed against components of ribonucleotide reductase' covering the design and use of unique antisense anticancer drugs, including GTI-2040 and GTI-2501 [353538]. PMID:11890366

  10. Atmospheric Correction Algorithm for Hyperspectral Imagery

    SciTech Connect

    R. J. Pollina

    1999-09-01

    In December 1997, the US Department of Energy (DOE) established a Center of Excellence (Hyperspectral-Multispectral Algorithm Research Center, HyMARC) for promoting the research and development of algorithms to exploit spectral imagery. This center is located at the DOE Remote Sensing Laboratory in Las Vegas, Nevada, and is operated for the DOE by Bechtel Nevada. This paper presents the results to date of a research project begun at the center during 1998 to investigate the correction of hyperspectral data for atmospheric aerosols. Results of a project conducted by the Rochester Institute of Technology to define, implement, and test procedures for absolute calibration and correction of hyperspectral data to absolute units of high spectral resolution imagery will be presented. Hybrid techniques for atmospheric correction using image or spectral scene data coupled through radiative propagation models will be specifically addressed. Results of this effort to analyze HYDICE sensor data will be included. Preliminary results based on studying the performance of standard routines, such as Atmospheric Pre-corrected Differential Absorption and Nonlinear Least Squares Spectral Fit, in retrieving reflectance spectra show overall reflectance retrieval errors of approximately one to two reflectance units in the 0.4- to 2.5-micron-wavelength region (outside of the absorption features). These results are based on HYDICE sensor data collected from the Southern Great Plains Atmospheric Radiation Measurement site during overflights conducted in July of 1997. Results of an upgrade made in the model-based atmospheric correction techniques, which take advantage of updates made to the moderate resolution atmospheric transmittance model (MODTRAN 4.0) software, will also be presented. Data will be shown to demonstrate how the reflectance retrieval in the shorter wavelengths of the blue-green region will be improved because of enhanced modeling of multiple scattering effects.

  11. The Digital Correction Unit: A data correction/compaction chip

    SciTech Connect

    MacKenzie, S.; Nielsen, B.; Paffrath, L.; Russell, J.; Sherden, D.

    1986-10-01

    The Digital Correction Unit (DCU) is a semi-custom CMOS integrated circuit which corrects and compacts data for the SLD experiment. It performs a piece-wise linear correction to data, and implements two separate compaction algorithms. This paper describes the basic functionality of the DCU and its correction and compaction algorithms.
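
    A piece-wise linear correction of the kind the DCU performs in hardware can be sketched in software as interpolation through a table of calibration breakpoints; the table values below are invented for illustration.

```python
import bisect

BREAKPOINTS = [0, 64, 128, 255]   # raw values at segment boundaries
CORRECTED = [0, 70, 120, 255]     # calibrated outputs at those points

def correct(raw):
    """Map a raw value through the piece-wise linear calibration table."""
    i = bisect.bisect_right(BREAKPOINTS, raw) - 1
    i = min(i, len(BREAKPOINTS) - 2)          # clamp to the last segment
    x0, x1 = BREAKPOINTS[i], BREAKPOINTS[i + 1]
    y0, y1 = CORRECTED[i], CORRECTED[i + 1]
    return y0 + (raw - x0) * (y1 - y0) / (x1 - x0)
```

    For example, correct(96) interpolates halfway between the 64 and 128 breakpoints. In the chip the segment lookup and interpolation are of course done in fixed-point logic rather than floating point.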

  12. Refraction corrections for surveying

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1980-01-01

    Optical measurements of range and elevation angles are distorted by refraction of Earth's atmosphere. Theoretical discussion of effect, along with equations for determining exact range and elevation corrections, is presented in report. Potentially useful in optical site surveying and related applications, analysis is easily programmed on pocket calculator. Input to equation is measured range and measured elevation; output is true range and true elevation.

  13. Correction and Communicative Activity.

    ERIC Educational Resources Information Center

    Williams, Huw P.

    1980-01-01

    In classes where the communicative approach to language teaching is taken and where learners are asked to form groups in order to communicate, the teacher should be ready to respond to requests, give immediate correction, and use a monitoring sheet to note errors. The sheet can also be used for individual students. (PJM)

  14. Writing: Revisions and Corrections

    ERIC Educational Resources Information Center

    Kohl, Herb

    1978-01-01

    A fifth grader wanted to know what he had to do to get all his ideas the way he wanted them in his story writing "and" have the spelling, punctuation and quotation marks correctly styled. His teacher encouraged him to think about writing as a process and provided the student with three steps as guidelines for effective writing. (Author/RK)

  15. Counselor Education for Corrections.

    ERIC Educational Resources Information Center

    Parsigian, Linda

    Counselor education programs most often prepare their graduates to work in either a school setting, anywhere from the elementary level through higher education, or a community agency. There is little indication that counselor education programs have seriously undertaken the task of training counselors to enter the correctional field. If…

  16. Exposure Corrections for Macrophotography

    ERIC Educational Resources Information Center

    Nikolic, N. M.

    1976-01-01

    Describes a method for determining the exposure correction factors in close-up photography and macrophotography. The method eliminates all calculations during picture-taking, and allows the use of a light meter to obtain the proper f-stop/exposure time combinations. (Author/MLH)
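    The usual way to express this correction (the standard bellows-factor relation, assumed here rather than taken from the article itself) is that required exposure grows as (1 + m)^2 with magnification m:

    ```python
    import math

    # Bellows-factor correction for close-up work: at magnification m the
    # required exposure grows by (1 + m)^2, i.e. lengthen the metered exposure
    # time by that factor or open up 2*log2(1 + m) stops.
    def exposure_factor(magnification):
        return (1.0 + magnification) ** 2

    def stops_to_open(magnification):
        return 2.0 * math.log2(1.0 + magnification)

    def corrected_time(metered_seconds, magnification):
        return metered_seconds * exposure_factor(magnification)
    ```

    For example, at life-size reproduction (m = 1) the factor is 4, i.e. two full stops over the light-meter reading.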

  17. Passive localization in ocean acoustics: A model-based approach

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1995-09-01

    A model-based approach is developed to solve the passive localization problem in ocean acoustics using the state-space formulation for the first time. It is shown that the inherent structure of the resulting processor consists of a parameter estimator coupled to a nonlinear optimization scheme. The parameter estimator is designed using the model-based approach, in which an ocean acoustic propagation model is used in developing the model-based processor required for localization. Recall that model-based signal processing is a well-defined methodology enabling the inclusion of environmental (propagation) models, measurement (sensor array) models, and noise (shipping, measurement) models in a sophisticated processing algorithm. Here the parameter estimator, or more appropriately the model-based identifier (MBID), is designed for a propagation model developed from a shallow water ocean experiment. After simulation, it is then applied to a set of experimental data, demonstrating the applicability of this approach. © 1995 Acoustical Society of America.
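    The coupling of a propagation model to an estimator can be caricatured in a few lines. This toy uses simple spherical spreading and a grid search over candidate ranges, not the paper's normal-mode model or MBID; all numbers are invented.

    ```python
    # Toy model-based localization: predict the field amplitude at a sensor
    # from a candidate source range, then pick the range whose prediction
    # best matches the measurement (grid search as the "optimization").
    def predicted_amplitude(range_m, source_level=1000.0):
        return source_level / range_m            # 1/r spherical spreading

    def localize(measured, candidates):
        return min(candidates, key=lambda r: abs(predicted_amplitude(r) - measured))

    best = localize(measured=2.0, candidates=[100.0, 250.0, 500.0, 1000.0])
    ```

    Real model-based processors replace the 1/r model with a full propagation model and the grid search with a recursive (state-space) estimator, but the structure, a model inside an optimizer, is the same.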

  18. Model-based ocean acoustic passive localization. Revision 1

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1994-06-01

    A model-based approach is developed (theoretically) to solve the passive localization problem. Here the authors investigate the design of a model-based identifier for a shallow water ocean acoustic problem characterized by a normal-mode model. In this problem they show how the processor can be structured to estimate the vertical wave numbers directly from pressure-field and sound-speed measurements, thereby eliminating the need for synthetic aperture processing or even a propagation model solution. Finally, they investigate various special cases of the source localization problem, designing a model-based localizer for each and evaluating the underlying structure with the expectation of gaining more and more insight into the general problem.

  19. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    NASA Technical Reports Server (NTRS)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  20. Management of antipsychotic treatment discontinuation and interruptions using model-based simulations

    PubMed Central

    Samtani, Mahesh N; Sheehan, John J; Fu, Dong-Jing; Remmerie, Bart; Sliwa, Jennifer Kern; Alphs, Larry

    2012-01-01

    Background Medication nonadherence is a well described and prevalent clinical occurrence in schizophrenia. These pharmacokinetic model-based simulations analyze predicted antipsychotic plasma concentrations in nonadherence and treatment interruption scenarios and with treatment reinitiation. Methods Starting from steady state, pharmacokinetic model-based simulations of active moiety plasma concentrations of oral, immediate-release risperidone 3 mg/day, risperidone long-acting injection 37.5 mg/14 days, oral paliperidone extended-release 6 mg/day, and paliperidone palmitate 117 mg (75 mg equivalents)/28 days were assessed under three treatment discontinuation/interruption scenarios, ie, complete discontinuation, one week of interruption, and four weeks of interruption. In the treatment interruption scenarios, pharmacokinetic simulations were performed using medication-specific reinitiation strategies. Results Following complete treatment discontinuation, plasma concentrations persisted longest with paliperidone palmitate, followed by risperidone long-acting injection, while oral formulations exhibited the most rapid decrease. One week of oral paliperidone or risperidone interruption resulted in near complete elimination from the systemic circulation within that timeframe, reflecting the rapid elimination rate of the active moiety. After 1 and 4 weeks of interruption, minimum plasma concentrations were higher with paliperidone palmitate than risperidone long-acting injection over the simulated period. Four weeks of treatment interruption followed by reinitiation resulted in plasma levels returning to predicted therapeutic levels within 1 week. Conclusion Due to the long half-life of paliperidone palmitate (25–49 days), putative therapeutic plasma concentrations persisted longest in simulated cases of complete discontinuation or treatment interruption. These simulations may help clinicians better conceptualize the impact of antipsychotic nonadherence on plasma
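    The qualitative contrast in the Results can be reproduced with a toy one-compartment decay model. The half-lives below are illustrative stand-ins (a short one for an oral drug, a long one in the range quoted for paliperidone palmitate), not the published population-PK models used in the study.

    ```python
    import math

    # One-compartment elimination after discontinuation: plasma concentration
    # decays exponentially with the drug's half-life (values illustrative).
    def concentration(c0, half_life_days, t_days):
        return c0 * math.exp(-math.log(2.0) * t_days / half_life_days)

    oral_week  = concentration(100.0, 1.0, 7.0)    # short half-life: nearly gone
    depot_week = concentration(100.0, 35.0, 7.0)   # long half-life: mostly persists
    ```

    After one week the short-half-life drug retains under 1% of its steady-state level while the depot retains most of it, mirroring the simulated persistence ordering of oral formulations versus long-acting injectables.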

  1. When Does Model-Based Control Pay Off?

    PubMed

    Kool, Wouter; Cushman, Fiery A; Gershman, Samuel J

    2016-08-01

    Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to "model-free" and "model-based" strategies in reinforcement learning. Model-free strategies are computationally cheap, but sometimes inaccurate, because action values can be accessed by inspecting a look-up table constructed through trial-and-error. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, does not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand. PMID:27564094
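    The distinction the abstract draws can be made concrete with a toy one-step example (all numbers invented): the model-free value is a cached table lookup, while the model-based value is computed by planning over a causal model of outcomes.

    ```python
    # Model-free: a cached action-value table learned by trial and error.
    Q_TABLE = {("s", "left"): 0.4, ("s", "right"): 0.6}

    # Model-based: a causal model mapping (state, action) to (probability,
    # reward) outcomes, over which values are computed by planning.
    MODEL = {("s", "left"):  [(1.0, 0.5)],
             ("s", "right"): [(0.5, 1.0), (0.5, 0.0)]}

    def model_free_value(state, action):
        return Q_TABLE[(state, action)]                        # cheap, possibly stale

    def model_based_value(state, action):
        return sum(p * r for p, r in MODEL[(state, action)])   # accurate, costlier
    ```

    Here the cached value for "right" (0.6) disagrees with the value the model computes (0.5): exactly the kind of inaccuracy a model-free system tolerates in exchange for cheap computation.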

  2. Clinical applications of therapeutic phlebotomy

    PubMed Central

    Kim, Kyung Hee; Oh, Ki Young

    2016-01-01

    Phlebotomy is the removal of blood from the body, and therapeutic phlebotomy is the preferred treatment for blood disorders in which the removal of red blood cells or serum iron is the most efficient method for managing the symptoms and complications. Therapeutic phlebotomy is currently indicated for the treatment of hemochromatosis, polycythemia vera, porphyria cutanea tarda, sickle cell disease, and nonalcoholic fatty liver disease with hyperferritinemia. This review discusses therapeutic phlebotomy and the related disorders and also offers guidelines for establishing a therapeutic phlebotomy program. PMID:27486346

  3. Therapeutic proteins: A to Z.

    PubMed

    Ozgur, Aykut; Tutar, Yusuf

    2013-12-01

    In recent years, therapeutic proteins have become an important and growing class of drugs in the pharmaceutical industry. The development of recombinant DNA technology has led to an appreciation of the therapeutic value of many proteins and peptides in medicine. Currently, approximately 100 therapeutic proteins have obtained approval from the Food and Drug Administration (FDA), and they are widely used in the treatment of various diseases such as cancer, diabetes, anemia and infections. This paper summarizes the production processes, pharmaceutical and physicochemical properties, and important classes of therapeutic proteins with their potential use in clinical applications. PMID:24261980

  4. Therapeutic cloning: The ethical limits

    SciTech Connect

    Whittaker, Peter A. E-mail: p.whittaker@lancaster.ac.uk

    2005-09-01

    A brief outline of stem cells, stem cell therapy and therapeutic cloning is given. The position of therapeutic cloning with regard to other embryonic manipulations - IVF-based reproduction, embryonic stem cell formation from IVF embryos and reproductive cloning - is indicated. The main ethically challenging stages in therapeutic cloning are considered to be the nuclear transfer process, including the source of eggs for this, and the destruction of an embryo to provide stem cells for therapeutic use. The extremely polarised nature of the debate regarding the status of an early human embryo is noted, and some potential alternative strategies for preparing immunocompatible pluripotent stem cells are indicated.

  5. In silico model-based inference: a contemporary approach for hypothesis testing in network biology

    PubMed Central

    Klinke, David J.

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900’s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. PMID:25139179

  6. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
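    As a rough illustration of model-based test generation (the state machine below is invented, not the paper's intermediate representation), one can enumerate event sequences in a model that reach a target state and emit each path as a test case:

    ```python
    from collections import deque

    # A tiny invented state-machine model: (state, event) -> next state.
    TRANSITIONS = {("idle", "arm"): "armed",
                   ("armed", "fire"): "firing",
                   ("armed", "abort"): "idle"}

    def generate_tests(start, target, max_depth=4):
        """Breadth-first enumeration of event sequences reaching `target`."""
        tests, queue = [], deque([(start, [])])
        while queue:
            state, path = queue.popleft()
            if state == target:
                tests.append(path)
                continue
            if len(path) < max_depth:
                for (s, event), nxt in TRANSITIONS.items():
                    if s == state:
                        queue.append((nxt, path + [event]))
        return tests

    tests = generate_tests("idle", "firing")
    ```

    Model checkers and symbolic executors refine this brute-force search with property-guided pruning and constraint solving, but the underlying idea, deriving concrete test inputs from paths through a model, is the same.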

  7. Verification and Validation of Model-Based Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2001-01-01

    This paper presents a three year project (FY99 to FY01) on the verification and validation of model based autonomous systems. The topics include: 1) Project Profile; 2) Model-Based Autonomy; 3) The Livingstone MIR; 4) MPL2SMV; 5) Livingstone to SMV Translation; 6) Symbolic Model Checking; 7) From Livingstone Models to SMV Models; 8) Application In-Situ Propellant Production; 9) Closed-Loop Verification Principle; 10) Livingstone PathFinder (LPF); 11) Publications and Presentations; and 12) Future Directions. This paper is presented in viewgraph form.

  8. A Model Based Mars Climate Database for the Mission Design

    NASA Technical Reports Server (NTRS)

    2005-01-01

    A viewgraph presentation on a model based climate database is shown. The topics include: 1) Why a model based climate database?; 2) Mars Climate Database v3.1 Who uses it? (approx. 60 users!); 3) The new Mars Climate database MCD v4.0; 4) MCD v4.0: what's new?; 5) Simulation of Water ice clouds; 6) Simulation of Water ice cycle; 7) A new tool for surface pressure prediction; 8) Access to the database MCD 4.0; 9) How to access the database; and 10) New web access.

  9. [Hypercholesterolemia: a therapeutic approach].

    PubMed

    Moráis López, A; Lama More, R A; Dalmau Serra, J

    2009-05-01

    High blood cholesterol levels represent an important cardiovascular risk factor. Hypercholesterolemia is defined as levels of total cholesterol and low-density lipoprotein cholesterol above 95th percentile for age and gender. For the paediatric population, selective screening is recommended in children older than 2 years who are overweight, with a family history of early cardiovascular disease or whose parents have high cholesterol levels. Initial therapeutic approach includes diet therapy, appropriate physical activity and healthy lifestyle changes. Drug treatment should be considered in children from the age of 10 who, after having followed appropriate diet recommendations, still have very high LDL-cholesterol levels or moderately high levels with concomitant risk factors. In case of extremely high LDL-cholesterol levels, drug treatment should be taken into consideration at earlier ages (8 years old). Modest response is usually observed with bile acid-binding resins. Statins can be considered first-choice drugs, once evidence on their efficacy and safety has been shown. PMID:19427823

  10. [Liver metastasis: therapeutic strategy].

    PubMed

    Gennari, L; Doci, R; Bignami, P

    1996-01-01

    The liver is one of the most frequent sites of metastatic growth, in particular from digestive malignancies (DM). The first goal is to reduce the incidence of metastases. Adjuvant systemic chemotherapies have been demonstrated to reduce the recurrence rate and to improve survival in Dukes C colon cancer. Fluorouracil is the pivot of adjuvant treatment, modulated by Leucovorin or Levamisole. A short postoperative administration of fluorouracil by the intraportal route has been tested, but the results are controversial. Adjuvant treatments for different DM are under investigation. When hepatic metastases are clinically evident, therapeutic decisions depend on several factors: site and nature of the primary, extent of hepatic and extrahepatic disease, patient characteristics, and efficacy of treatments. A staging system should be adopted to allow a rational approach. In selected cases a locoregional treatment can achieve consistent results. Hepatic Intraarterial Chemotherapy (HIAC) for colorectal metastases achieves objective responses in more than 50% of patients. Survival seems positively affected. When feasible, R0 hepatic resection is the most effective treatment, the five-year survival rate being about 30% when metastases are from colorectal cancer. Since the liver is the most frequent site of recurrence after resection, repeat resections have been successfully performed. PMID:9214269

  11. Plasmids encoding therapeutic agents

    DOEpatents

    Keener, William K.

    2007-08-07

    Plasmids encoding anti-HIV and anti-anthrax therapeutic agents are disclosed. Plasmid pWKK-500 encodes a fusion protein containing DP178 as a targeting moiety, the ricin A chain, an HIV protease cleavable linker, and a truncated ricin B chain. N-terminal extensions of the fusion protein include the maltose binding protein and a Factor Xa protease site. C-terminal extensions include a hydrophobic linker, an L domain motif peptide, a KDEL ER retention signal, another Factor Xa protease site, an out-of-frame buforin II coding sequence, the lacZ.alpha. peptide, and a polyhistidine tag. More than twenty derivatives of plasmid pWKK-500 are described. Plasmids pWKK-700 and pWKK-800 are similar to pWKK-500 wherein the DP178-encoding sequence is substituted by RANTES- and SDF-1-encoding sequences, respectively. Plasmid pWKK-900 is similar to pWKK-500 wherein the HIV protease cleavable linker is substituted by a lethal factor (LF) peptide-cleavable linker.

  12. Leech Therapeutic Applications

    PubMed Central

    Abdualkader, A. M.; Ghawi, A. M.; Alaama, M.; Awang, M.; Merzouk, A.

    2013-01-01

    Hematophagous animals, including leeches, have been known to possess biologically active compounds in their secretions, especially in their saliva. The blood-sucking annelids have been used for therapeutic purposes since the beginning of civilization. Ancient Egyptian, Indian, Greek and Arab physicians used leeches for a wide range of diseases, starting from the conventional use for bleeding to systemic ailments such as skin diseases, nervous system abnormalities, urinary and reproductive system problems, inflammation, and dental problems. Recently, extensive research on leech saliva has unveiled the presence of a variety of bioactive peptides and proteins, including antithrombins (hirudin, bufrudin), antiplatelet agents (calin, saratin), factor Xa inhibitors (lefaxin), antibacterials (theromacin, theromyzin) and others. Consequently, the leech has made a comeback as a new remedy for many chronic and life-threatening abnormalities, such as cardiovascular problems, cancer, metastasis, and infectious diseases. In the 20th century, leech therapy established itself in plastic and microsurgery as a protective tool against venous congestion and served to salvage replanted digits and flaps. Many plastic surgery clinics all over the world have started to use leeches for cosmetic purposes. Despite the efficacious properties of leech therapy, its safety and complications are still controversial. PMID:24019559

  13. Therapeutic Cancer Vaccines.

    PubMed

    Ye, Zhenlong; Li, Zhong; Jin, Huajun; Qian, Qijun

    2016-01-01

    Cancer is one of the leading causes of death. Prevention and treatment of cancer are important ways to decrease the incidence of tumorigenesis and prolong patients' lives. After many failures in basic and clinical research, recent breakthrough achievements in cancer immunotherapy have attracted much attention. Based on deep analysis of the genomics and proteomics of tumor antigens, a variety of cancer vaccines targeting tumor antigens have been tested in preclinical and human clinical trials. Many therapeutic cancer vaccines, alone or in combination with other conventional cancer treatments, have obtained spectacular efficacy, indicating their tremendous potential for clinical application. With the elucidation of the underlying mechanisms of cancer immune regulation, valid, controllable, and persistent cancer vaccines will play important roles in cancer treatment, survival extension, relapse prevention and cancer prevention. This chapter summarizes recent progress and developments in cancer vaccine research and clinical application, exploring the existing obstacles in cancer vaccine research and ways to promote the efficacy of cancer vaccines. PMID:27240458

  14. Experimental Therapeutics for Dystonia

    PubMed Central

    Jinnah, H. A.; Hess, Ellen J.

    2008-01-01

    Dystonia is a neurological syndrome characterized by excessive involuntary muscle contractions leading to twisting movements and unnatural postures. It has many different clinical manifestations and many different causes. More than 3 million people worldwide suffer from dystonia, yet there are few broadly effective treatments. In the past decade, progress in research has advanced our understanding of the pathogenesis of dystonia to a point where drug discovery efforts are now feasible. There are several strategies that can be used to develop novel therapeutics for dystonia. Existing therapies have only modest efficacy, but may be refined and improved to increase benefits while reducing side effects. Identifying rational targets for drug intervention based on the pathogenesis of dystonia is another strategy. The surge in both basic and clinical research discoveries has provided insights at all levels, including etiological, physiological and nosological, to enable such a targeted approach. The empirical approach to drug discovery is complementary to the rational approach, whereby compounds are identified using a non-mechanistic strategy. With the recent development of multiple animal models of dystonia, it is now possible to develop assays and perform drug screens on vast numbers of compounds. This multifaceted approach to drug discovery in dystonia will likely provide lead compounds that can then be translated for clinical use. PMID:18394563

  15. Paediatric models in motion: requirements for model-based decision support at the bedside.

    PubMed

    Barrett, Jeffrey S

    2015-01-01

    Optimal paediatric pharmacotherapy relies on a detailed understanding of the individual patient, including their developmental status and disease state, as well as the pharmaceutical agents he/she is receiving for treatment or management of side effects. Our appreciation of size and maturation effects on pharmacokinetic/pharmacodynamic (PK/PD) phenomena has improved to the point that we can develop predictive models that permit us to individualize therapy, especially where we are monitoring drug effects or therapeutic concentrations. The growth of efforts to guide paediatric pharmacotherapy via model-based decision support necessitates a coordinated and systematic approach to ensuring reliable and robust output to caregivers that represents the current standard of care and adheres to the governance imposed by the responsible host institution or coalition. Model-based systems which guide caregivers on dosing paediatric patients in a more comprehensive manner are in development at several institutions. Care must be taken that these systems provide robust guidance consistent with current best practice. These systems must evolve as new information becomes available and ultimately are best constructed from diverse data representing global input on demographics, ethnic/racial diversity, diet and other lifestyle factors. Multidisciplinary involvement at the project team level is key to the ultimate clinical evaluation. Likewise, early engagement of clinical champions is also critical for the success of model-based tools. Adherence to regulatory requirements as well as best practices with respect to software development and testing is essential if these tools are to be used as part of the routine standard of care. PMID:24251868
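    One building block such dosing decision-support systems commonly use for the size effects mentioned above is allometric scaling. The sketch below uses the standard 0.75-power convention on a 70 kg adult reference, assumed here rather than taken from the paper; all parameter values are illustrative.

    ```python
    # Allometric size scaling (standard 0.75-power convention, assumed):
    # scale an adult clearance to a patient's body weight, then derive a
    # maintenance dose for a target average steady-state concentration
    # via dose-per-interval = Css * CL * tau.
    def scaled_clearance(adult_cl_l_per_h, weight_kg):
        return adult_cl_l_per_h * (weight_kg / 70.0) ** 0.75

    def maintenance_dose(target_css_mg_per_l, weight_kg, adult_cl_l_per_h, tau_h=24.0):
        return target_css_mg_per_l * scaled_clearance(adult_cl_l_per_h, weight_kg) * tau_h
    ```

    A production system would layer maturation (age) functions, covariate models and therapeutic drug monitoring feedback on top of this, which is exactly why the governance and validation concerns the abstract raises matter.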

  17. Correction coil cable

    DOEpatents

    Wang, Sou-Tien

    1994-11-01

    A wire cable assembly (10, 310) adapted for the winding of electrical coils is taught. A primary intended use is for use in particle tube assemblies (532) for the superconducting super collider. The correction coil cables (10, 310) have wires (14, 314) collected in wire arrays (12, 312) with a center rib (16, 316) sandwiched therebetween to form a core assembly (18, 318). The core assembly (18, 318) is surrounded by an assembly housing (20, 320) having an inner spiral wrap (22, 322) and a counter wound outer spiral wrap (24, 324). An alternate embodiment (410) of the invention is rolled into a keystoned shape to improve radial alignment of the correction coil cable (410) on a particle tube (733) in a particle tube assembly (732).

  18. CTI Correction Code

    NASA Astrophysics Data System (ADS)

    Massey, Richard; Stoughton, Chris; Leauthaud, Alexie; Rhodes, Jason; Koekemoer, Anton; Ellis, Richard; Shaghoulian, Edgar

    2013-07-01

    Charge Transfer Inefficiency (CTI) due to radiation damage above the Earth's atmosphere creates spurious trailing in images from Charge-Coupled Device (CCD) imaging detectors. Radiation damage also creates unrelated warm pixels, which can be used to measure CTI. This code provides pixel-based correction for CTI and has proven effective in Hubble Space Telescope Advanced Camera for Surveys raw images, successfully reducing the CTI trails by a factor of ~30 everywhere in the CCD and at all flux levels. The core is written in java for speed, and a front-end user interface is provided in IDL. The code operates on raw data by returning individual electrons to pixels from which they were unintentionally dragged during readout. Correction takes about 25 minutes per ACS exposure, but is trivially parallelisable to multiple processors.
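    The idea of returning dragged electrons can be illustrated with a toy readout model, a single fixed trailing fraction per transfer, which is far simpler than the charge-trap species and release timescales the actual code models.

    ```python
    # Toy CTI model: readout drags a fixed fraction `a` of each pixel's charge
    # into the trailing pixel. Correction inverts that model sequentially along
    # the readout direction, returning electrons to their source pixels.
    def add_trail(column, a=0.1):
        out, prev = [], 0.0
        for q in column:
            out.append((1 - a) * q + a * prev)
            prev = q
        return out

    def correct_trail(column, a=0.1):
        out, prev = [], 0.0
        for q in column:
            restored = (q - a * prev) / (1 - a)
            out.append(restored)
            prev = restored
        return out
    ```

    Because correction runs pixel by pixel in readout order, it parallelises trivially across columns, which is the same property that makes the real correction "trivially parallelisable to multiple processors".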

  19. OPC modeling and correction solutions for EUV lithography

    NASA Astrophysics Data System (ADS)

    Word, James; Zuniga, Christian; Lam, Michael; Habib, Mohamed; Adam, Kostas; Oliver, Michael

    2011-11-01

    The introduction of EUV lithography into the semiconductor fabrication process will enable a continuation of Moore's law below the 22nm technology node. EUV lithography will, however, introduce new sources of patterning distortions which must be accurately modeled and corrected with software. Flare caused by scattered light in the projection optics results in pattern density-dependent imaging errors. The combination of non-telecentric reflective optics with reflective reticles results in mask shadowing effects. Reticle absorber materials are likely to have non-zero reflectivity due to a need to balance absorber stack height with minimization of mask shadowing effects. Depending upon the placement of adjacent fields on the wafer, reflectivity along their border can result in inter-field imaging effects near the edge of neighboring exposure fields. Finally, there exist the ever-present optical proximity effects caused by diffraction-limited imaging and resist and etch process effects. To enable EUV lithography in production, it is expected that OPC will be called upon to compensate for most of these effects. With the anticipated small imaging error budgets at sub-22nm nodes it is highly likely that only full model-based OPC solutions will have the required accuracy. The authors will explore the current capabilities of model-based OPC software to model and correct for each of the EUV imaging effects. Modeling, simulation, and correction methodologies will be defined, and experimental results of a full model-based OPC flow for EUV lithography will be presented.

  20. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
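    The principle of concatenation can be sketched with a deliberately tiny inner code: a 3x repetition code with majority-vote decoding, standing in for the modulation block codes under study. In a real system this inner layer would sit inside the RS(255,223) outer code, which mops up the residual symbol errors the inner decoder lets through.

    ```python
    # Toy inner code for a concatenated system: 3x repetition encoding with
    # majority-vote decoding, able to correct one flipped bit per 3-bit group.
    def inner_encode(bits):
        return [b for bit in bits for b in (bit, bit, bit)]

    def inner_decode(coded):
        return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]
    ```

    Repetition is the weakest possible inner code, but it shows the division of labor: the inner code cleans up most channel errors so the burst-correcting outer code sees a much lower error rate.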

  1. Correction and updating.

    PubMed

    1994-03-01

    In the heading of David Cassidy's review of The Private Lives of Albert Einstein (18 February, p. 997) the price of the book as sold by its British publisher, Faber and Faber, was given incorrectly; the correct price is £15.99. The book is also to be published in the United States by St. Martin's Press, New York, in April, at a price of $23.95. PMID:17817438

  2. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length gets smaller, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group involved, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and wafer SEM images, and can extract whole-chip CD variation information. According to the results, OPC abnormalities were identified and design feedback items were also disclosed. A complementary approach, offered by EDA companies, is model-based OPC verification: using a well-calibrated model, verification is performed over the full chip area to predict potential weak points on the wafer and to feed back quickly to OPC and design before reticle fabrication. In order to achieve robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and a matched model-based verification system to find the optimum combination of the two. In our study, a huge amount of data from wafer results was classified and analyzed by statistical methods into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed using the combination of design-based metrology and model-based verification tools.

  3. Therapeutics in Huntington's Disease.

    PubMed

    Killoran, Annie; Biglan, Kevin M

    2012-02-01

    OPINION STATEMENT: There is no specific treatment for Huntington's disease (HD). Its many symptoms of motor, psychiatric, and cognitive deterioration are managed with symptomatic relief, rehabilitation, and support. The only drug approved by the US Food and Drug Administration (FDA) for the treatment of HD is an antichoreic agent, tetrabenazine, but this drug is used sparingly because of uneasiness regarding its propensity to cause depression and suicidality in this population, which is already at risk for these complications. Neuroleptics are still first-line treatments for chorea accompanied by comorbid depression and/or behavioral or psychotic symptoms, as is often the case. Psychiatric features, which have a significant impact on a patient's professional and personal life, often become the major focus of management. In addition to neuroleptics, commonly used medications include antidepressants, mood stabilizers, anxiolytics, and psychostimulants. In contrast, few treatment options are available for cognitive impairment in HD; this remains an important and largely unmet therapeutic need. HD patients typically lack insight into their disease manifestations, failing to recognize their need for treatment, and possibly even arguing against it. Multipurpose medications are employed advantageously to simplify the medication regimen, so as to facilitate compliance and not overwhelm the patient. For example, haloperidol can be prescribed for a patient with chorea, agitation, and anorexia, rather than targeting each symptom with a different drug. This approach also limits the potential for adverse effects, which can be difficult to distinguish from the features of the disease itself. With HD's complexity, it is best managed with a multidisciplinary approach that includes a movement disorders specialist, a genetic counselor, a mental health professional, a physical therapist, and a social worker for support and coordination of services. As the disease progresses, there

  4. Therapeutic Devices for Epilepsy

    PubMed Central

    Fisher, Robert S.

    2011-01-01

    Therapeutic devices provide new options for treating drug-resistant epilepsy. These devices act by a variety of mechanisms to modulate neuronal activity. Only vagus nerve stimulation, which continues to develop new technology, is approved for use in the United States. Deep brain stimulation (DBS) of the anterior thalamus for partial epilepsy recently was approved in Europe and several other countries. Responsive neurostimulation, which delivers stimuli to one or two seizure foci in response to a detected seizure, recently completed a successful multicenter trial. Several other trials of brain stimulation are planned or underway. Transcranial magnetic stimulation (TMS) may provide a noninvasive method to stimulate cortex. Controlled studies of TMS are split on efficacy; results may depend on whether a seizure focus is near a possible region for stimulation. Seizure detection devices in the form of “shake” detectors via portable accelerometers can provide notification of an ongoing tonic-clonic seizure, or peace of mind in the absence of notification. Prediction of seizures from various aspects of EEG is in its early stages. Prediction appears to be possible in a subpopulation of people with refractory seizures, and a clinical trial of an implantable prediction device is underway. Cooling of the neocortex or hippocampus can reversibly attenuate epileptiform EEG activity and seizures, but engineering problems remain in its implementation. Optogenetics is a new technique that can control the excitability of specific populations of neurons with light. Inhibition of epileptiform activity has been demonstrated in hippocampal slices, but use in humans will require more work. In general, devices provide useful palliation for otherwise uncontrollable seizures, but with a different risk profile than most drugs. Optimizing the place of devices in therapy for epilepsy will require further development and clinical experience. PMID:22367987

  5. Problem Solving: Physics Modeling-Based Interactive Engagement

    ERIC Educational Resources Information Center

    Ornek, Funda

    2009-01-01

    The purpose of this study was to investigate how modeling-based instruction combined with an interactive-engagement teaching approach promotes students' problem solving abilities. I focused on students in a calculus-based introductory physics course, based on the matter and interactions curriculum of Chabay & Sherwood (2002) at a large state…

  6. Cognitive control predicts use of model-based reinforcement learning.

    PubMed

    Otto, A Ross; Skatova, Anya; Madlon-Kay, Seth; Daw, Nathaniel D

    2015-02-01

    Accounts of decision-making and its neural substrates have long posited the operation of separate, competing valuation systems in the control of choice behavior. Recent theoretical and experimental work suggests that this classic distinction between behaviorally and neurally dissociable systems for habitual and goal-directed (or more generally, automatic and controlled) choice may arise from two computational strategies for reinforcement learning (RL), called model-free and model-based RL, but the cognitive or computational processes by which one system comes to dominate the other in the control of behavior remain a matter of ongoing investigation. To elucidate this question, we leverage the theoretical framework of cognitive control, demonstrating that individual differences in the utilization of goal-related contextual information, in the service of overcoming habitual, stimulus-driven responses, in established cognitive control paradigms predict model-based behavior in a separate, sequential choice task. The behavioral correspondence between cognitive control and model-based RL compellingly suggests that a common set of processes may underpin the two behaviors. In particular, computational mechanisms originally proposed to underlie controlled behavior may be applicable to understanding the interactions between model-based and model-free choice behavior. PMID:25170791

  7. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    SciTech Connect

    Noonan, Nicholas James

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  8. Impact of Model-Based Teaching on Argumentation Skills

    ERIC Educational Resources Information Center

    Ogan-Bekiroglu, Feral; Belek, Deniz Eren

    2014-01-01

    The purpose of this study was to examine effects of model-based teaching on students' argumentation skills. Experimental design guided to the research. The participants of the study were pre-service physics teachers. The argumentative intervention lasted seven weeks. Data for this research were collected via video recordings and written…

  9. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  10. Educational Value and Models-Based Practice in Physical Education

    ERIC Educational Resources Information Center

    Kirk, David

    2013-01-01

    A models-based approach has been advocated as a means of overcoming the serious limitations of the traditional approach to physical education. One of the difficulties with this approach is that physical educators have sought to use it to achieve diverse and sometimes competing educational benefits, and these wide-ranging aspirations are rarely if…

  11. Expediting model-based optoacoustic reconstructions with tomographic symmetries

    SciTech Connect

    Lutzweiler, Christian; Deán-Ben, Xosé Luís; Razansky, Daniel

    2014-01-15

    Purpose: Image quantification in optoacoustic tomography implies the use of accurate forward models of excitation, propagation, and detection of optoacoustic signals, while inversions with high spatial resolution usually involve very large matrices, leading to unreasonably long computation times. The development of fast and memory-efficient model-based approaches thus represents an important challenge for advancing the quantitative and dynamic imaging capabilities of tomographic optoacoustic imaging. Methods: Herein, a method for the simplification and acceleration of model-based inversions, relying on inherent symmetries present in common tomographic acquisition geometries, is introduced. The method is showcased for the case of cylindrical symmetries by using a polar image discretization of the time-domain optoacoustic forward model combined with efficient storage and inversion strategies. Results: The suggested methodology is shown to render fast and accurate model-based inversions in both numerical simulations and post mortem small animal experiments. In the case of a full-view detection scheme, the memory requirements are reduced by one order of magnitude while high-resolution reconstructions are achieved at video rate. Conclusions: By considering the rotational symmetry present in many tomographic optoacoustic imaging systems, the proposed methodology allows exploiting the advantages of model-based algorithms with feasible computational requirements and fast reconstruction times, so its convenience and general applicability to optoacoustic imaging systems with tomographic symmetries are anticipated.
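    The storage saving can be illustrated with a toy polar forward model (the model entries below are entirely made up): when the detector positions share the angular grid's rotational symmetry, the matrix block for detector j is an angular roll of the block for detector 0, so only one block ever needs to be stored.

```python
import math

P, R = 8, 4                      # angular and radial grid sizes
N = P * R                        # number of image unknowns

def kernel(dtheta, r):
    # toy forward-model entry depending only on angular offset and radius
    return math.exp(-dtheta ** 2) / (1.0 + r)

# Store the block for detector 0 only: base[i][r] couples it to pixel (i, r).
base = [[kernel(min(i, P - i), r) for r in range(R)] for i in range(P)]

def forward_sym(x):
    """y[j] from the stored block: detector j's row is base rolled by j."""
    y = []
    for j in range(P):
        s = 0.0
        for i in range(P):
            for r in range(R):
                s += base[(i - j) % P][r] * x[i * R + r]
        y.append(s)
    return y

# Reference: build the full P x N matrix explicitly and multiply.
A = [[base[(i - j) % P][r] for i in range(P) for r in range(R)] for j in range(P)]

def forward_full(x):
    return [sum(A[j][k] * x[k] for k in range(N)) for j in range(P)]
```

The symmetric version stores P*R entries instead of P*P*R, a factor-of-P reduction, while producing the same projections.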

  12. Model-based diagnostics for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Martin, Eric R.; Lerutte, Marcel G.

    1991-01-01

    An innovative approach to fault management was recently demonstrated for the NASA LeRC Space Station Freedom (SSF) power system testbed. This project capitalized on research in model-based reasoning, which uses knowledge of a system's behavior to monitor its health. The fault management system (FMS) can isolate failures online, or in a post analysis mode, and requires no knowledge of failure symptoms to perform its diagnostics. An in-house tool called MARPLE was used to develop and run the FMS. MARPLE's capabilities are similar to those available from commercial expert system shells, although MARPLE is designed to build model-based as opposed to rule-based systems. These capabilities include functions for capturing behavioral knowledge, a reasoning engine that implements a model-based technique known as constraint suspension, and a tool for quickly generating new user interfaces. The prototype produced by applying MARPLE to SSF not only demonstrated that model-based reasoning is a valuable diagnostic approach, but it also suggested several new applications of MARPLE, including an integration and testing aid, and a complement to state estimation.
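    Constraint suspension can be sketched as follows: each component contributes one constraint; suspending a component frees its output, and a component remains a fault candidate only if the remaining constraints stay consistent with the observations. The toy below uses the classic two-multiplier-level circuit with forward propagation only (MARPLE's actual implementation also reasons backward through constraints, which prunes candidates further):

```python
# Components: name -> (output variable, function, input variables)
COMPONENTS = {
    "M1": ("x", lambda v: v["a"] * v["c"], ("a", "c")),
    "M2": ("y", lambda v: v["b"] * v["d"], ("b", "d")),
    "M3": ("z", lambda v: v["c"] * v["e"], ("c", "e")),
    "A1": ("f", lambda v: v["x"] + v["y"], ("x", "y")),
    "A2": ("g", lambda v: v["y"] + v["z"], ("y", "z")),
}

def propagate(inputs, suspended):
    """Forward-propagate values; the suspended component's output stays unknown."""
    v = dict(inputs)
    changed = True
    while changed:
        changed = False
        for name, (out, fn, ins) in COMPONENTS.items():
            if name == suspended or out in v:
                continue
            if all(i in v for i in ins):
                v[out] = fn(v)
                changed = True
    return v

def candidates(inputs, observed):
    """Components whose suspension leaves every determined output consistent."""
    result = set()
    for name in COMPONENTS:
        v = propagate(inputs, suspended=name)
        if all(var not in v or v[var] == val for var, val in observed.items()):
            result.add(name)
    return result

inputs = {"a": 3, "b": 2, "c": 2, "d": 3, "e": 3}
observed = {"f": 10, "g": 12}   # a fault-free circuit would give f = 12, g = 12
diagnosis = candidates(inputs, observed)
```

Here the wrong value of f implicates M1, M2, or A1, while M3 and A2 are exonerated because suspending them still predicts the discrepant f.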

  13. An Integrated Framework for Model-Based Distributed Diagnosis and Prognosis

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Daigle, Matthew J.; Roychoudhury, Indranil

    2012-01-01

    Diagnosis and prognosis are necessary tasks for system reconfiguration and fault-adaptive control in complex systems. Diagnosis consists of detection, isolation and identification of faults, while prognosis consists of prediction of the remaining useful life of systems. This paper presents a novel integrated framework for model-based distributed diagnosis and prognosis, where system decomposition is used to enable the diagnosis and prognosis tasks to be performed in a distributed way. We show how different submodels can be automatically constructed to solve the local diagnosis and prognosis problems. We illustrate our approach using a simulated four-wheeled rover for different fault scenarios. Our experiments show that our approach correctly performs distributed fault diagnosis and prognosis in an efficient and robust manner.

  14. Model-based near-wall reconstructions for immersed-boundary methods

    NASA Astrophysics Data System (ADS)

    Posa, Antonio; Balaras, Elias

    2014-08-01

    In immersed-boundary methods, the cost of resolving the thin boundary layers on a solid boundary at high Reynolds numbers is prohibitive. In the present work, we propose a new model-based, near-wall reconstruction to account for the lack of resolution and provide the correct wall shear stress and hydrodynamic forces. The models are analytical variants of a generalization of the two-layer model developed by Balaras et al. (AIAA J 34:1111-1119, 1996) for large-eddy simulations. We will present results for the flow around a cylinder and a sphere, using Cartesian and cylindrical coordinate grids. We will demonstrate that the proposed treatment reproduces the wall stress very accurately on grids that are one order of magnitude coarser than those of well-resolved simulations.

  15. Duration Model-Based Post-processing for the Performance Improvement of a Keyword Spotting System

    NASA Astrophysics Data System (ADS)

    Lee, Min Ji; Yoon, Jae Sam; Oh, Yoo Rhee; Kim, Hong Kook; Choi, Song Ha; Kim, Ji Woon; Kim, Myeong Bo

    In this paper, we propose a post-processing method based on a duration model to improve the performance of a keyword spotting system. The proposed duration model-based post-processing method is performed after detecting a keyword. To detect the keyword, we first combine a keyword model, a non-keyword model, and a silence model. Using the information on the detected keyword, the proposed post-processing method is then applied to determine whether or not the correct keyword is detected. To this end, we generate the duration model using Gaussian distribution in order to accommodate different duration characteristics of each phoneme. Comparing the performance of the proposed method with those of conventional anti-keyword scoring methods, it is shown that the false acceptance and the false rejection rates are reduced.
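    The duration check can be sketched as a per-phoneme Gaussian log-likelihood over the segment durations reported for a detected keyword, accepting the detection only if the average log-likelihood clears a threshold. A minimal Python sketch (the phoneme names, duration statistics, and threshold are invented for illustration, not taken from the record):

```python
import math

# Hypothetical per-phoneme duration models: phoneme -> (mean, std dev) in frames.
DURATION_MODELS = {"k": (7.0, 2.0), "ae": (12.0, 3.0), "t": (6.0, 2.0)}

def gaussian_loglik(d, mu, sigma):
    return -0.5 * ((d - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def duration_score(segments):
    """Average per-phoneme duration log-likelihood for a detected keyword."""
    return sum(gaussian_loglik(d, *DURATION_MODELS[ph])
               for ph, d in segments) / len(segments)

def accept(segments, threshold=-4.0):
    # keep the detection only if the durations look like the keyword's phonemes
    return duration_score(segments) >= threshold

plausible = [("k", 8), ("ae", 11), ("t", 6)]     # near the modeled means
implausible = [("k", 1), ("ae", 30), ("t", 20)]  # what a false alarm might produce
```

Detections with phoneme durations far from the modeled means score poorly and are rejected, which is how the post-processing lowers the false acceptance rate.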

  16. Bacteriophage Procurement for Therapeutic Purposes.

    PubMed

    Weber-Dąbrowska, Beata; Jończyk-Matysiak, Ewa; Żaczek, Maciej; Łobocka, Małgorzata; Łusiak-Szelachowska, Marzanna; Górski, Andrzej

    2016-01-01

    Bacteriophages (phages), discovered 100 years ago, are able to infect and destroy only bacterial cells. In the current crisis of antibiotic efficacy, phage therapy is considered as a supplementary or even alternative therapeutic approach. Evolution of multidrug-resistant and pandrug-resistant bacterial strains poses a real threat, so it is extremely important to have the possibility to isolate new phages for therapeutic purposes. Our phage laboratory and therapy center has extensive experience with phage isolation, characterization, and therapeutic application. In this article we present current progress in bacteriophage isolation and use for therapeutic purposes, our experience in this field, and its practical implications for phage therapy. We attempt to summarize the state of the art: properties of phages, the methods for their isolation, criteria of phage selection for therapeutic purposes and limitations of their use. Perspectives for the use of genetically engineered phages to specifically target bacterial virulence-associated genes are also briefly presented. PMID:27570518

  17. Bacteriophage Procurement for Therapeutic Purposes

    PubMed Central

    Weber-Dąbrowska, Beata; Jończyk-Matysiak, Ewa; Żaczek, Maciej; Łobocka, Małgorzata; Łusiak-Szelachowska, Marzanna; Górski, Andrzej

    2016-01-01

    Bacteriophages (phages), discovered 100 years ago, are able to infect and destroy only bacterial cells. In the current crisis of antibiotic efficacy, phage therapy is considered as a supplementary or even alternative therapeutic approach. Evolution of multidrug-resistant and pandrug-resistant bacterial strains poses a real threat, so it is extremely important to have the possibility to isolate new phages for therapeutic purposes. Our phage laboratory and therapy center has extensive experience with phage isolation, characterization, and therapeutic application. In this article we present current progress in bacteriophage isolation and use for therapeutic purposes, our experience in this field, and its practical implications for phage therapy. We attempt to summarize the state of the art: properties of phages, the methods for their isolation, criteria of phage selection for therapeutic purposes and limitations of their use. Perspectives for the use of genetically engineered phages to specifically target bacterial virulence-associated genes are also briefly presented. PMID:27570518

  18. Transdermal delivery of therapeutic agent

    NASA Technical Reports Server (NTRS)

    Kwiatkowski, Krzysztof C. (Inventor); Hayes, Ryan T. (Inventor); Magnuson, James W. (Inventor); Giletto, Anthony (Inventor)

    2008-01-01

    A device for the transdermal delivery of a therapeutic agent to a biological subject that includes a first electrode comprising a first array of electrically conductive microprojections for providing electrical communication through a skin portion of the subject to a second electrode comprising a second array of electrically conductive microprojections. Additionally, a reservoir for holding the therapeutic agent surrounding the first electrode and a pulse generator for providing an exponential decay pulse between the first and second electrodes may be provided. A method includes the steps of piercing a stratum corneum layer of skin with two arrays of conductive microprojections, encapsulating the therapeutic agent into biocompatible charged carriers, surrounding the conductive microprojections with the therapeutic agent, generating an exponential decay pulse between the two arrays of conductive microprojections to create a non-uniform electrical field and electrokinetically driving the therapeutic agent through the stratum corneum layer of skin.

  19. Therapeutic cloning: promises and issues

    PubMed Central

    Kfoury, Charlotte

    2007-01-01

    Advances in biotechnology necessitate both an understanding of scientific principles and ethical implications to be clinically applicable in medicine. In this regard, therapeutic cloning offers significant potential in regenerative medicine by circumventing immunorejection, and in the cure of genetic disorders when used in conjunction with gene therapy. Therapeutic cloning in the context of cell replacement therapy holds a huge potential for de novo organogenesis and the permanent treatment of Parkinson’s disease, Duchenne muscular dystrophy, and diabetes mellitus as shown by in vivo studies. Scientific roadblocks impeding advancement in therapeutic cloning are tumorigenicity, epigenetic reprogramming, mitochondrial heteroplasmy, interspecies pathogen transfer, and low oocyte availability. Therapeutic cloning is also often tied to ethical considerations concerning the source, destruction and moral status of IVF embryos based on the argument of potential. Legislative and funding issues are also addressed. Future considerations would include a distinction between therapeutic and reproductive cloning in legislative formulations. PMID:18523539

  20. Metrics for antibody therapeutics development.

    PubMed

    Reichert, Janice M

    2010-01-01

    A wide variety of full-size monoclonal antibodies (mAbs) and therapeutics derived from alternative antibody formats can be produced through genetic and biological engineering techniques. These molecules are now filling the preclinical and clinical pipelines of every major pharmaceutical company and many biotechnology firms. Metrics for the development of antibody therapeutics, including averages for the number of candidates entering clinical study and development phase lengths for mAbs approved in the United States, were derived from analysis of a dataset of over 600 therapeutic mAbs that entered clinical study sponsored, at least in part, by commercial firms. The results presented provide an overview of the field and context for the evaluation of on-going and prospective mAb development programs. The expansion of therapeutic antibody use through supplemental marketing approvals and the increase in the study of therapeutics derived from alternative antibody formats are discussed. PMID:20930555

  1. Model-Based Radiation Dose Correction for Yttrium-90 Microsphere Treatment of Liver Tumors With Central Necrosis

    SciTech Connect

    Liu, Ching-Sheng; Lin, Ko-Han; Lee, Rheun-Chuan; Tseng, Hsiou-Shan; Wang, Ling-Wei; Huang, Pin-I; Chao, Liung-Sheau; Chang, Cheng-Yen; Yen, Sang-Hue; Tung, Chuan-Jong; Wang, Syh-Jen; Oliver Wong, Ching-yee

    2011-11-01

    Purpose: The objectives of this study were to model and calculate the absorbed fraction φ of energy emitted from yttrium-90 (⁹⁰Y) microsphere treatment of necrotic liver tumors. Methods and Materials: The tumor necrosis model was proposed for the calculation of φ over the spherical shell region. Two approaches, the semianalytic method and the probabilistic method, were adopted. In the former, the range-energy relationship and the sampling of electron paths were applied to calculate the energy deposition within the target region, using the straight-ahead and continuous-slowing-down approximation (CSDA) method. In the latter, the Monte Carlo PENELOPE code was used to verify results from the first method. Results: The fraction of energy φ absorbed from ⁹⁰Y by a 1-cm-thick tumor shell from the microsphere distribution, computed by CSDA with the complete beta spectrum, was 0.832 ± 0.001 and 0.833 ± 0.001 for smaller (r_T = 5 cm) and larger (r_T = 10 cm) tumors, where r_T and r_N denote the radii of the tumor and the necrotic core. The absorbed fraction depended mainly on the thickness of the tumor shell in the necrosis configuration, rather than on the size of the tumor necrosis. The maximal absorbed fraction φ, which occurred in tumors without central necrosis, differed with tumor size: 0.950 ± 0.000 and 0.975 ± 0.000 for smaller (r_T = 5 cm) and larger (r_T = 10 cm) tumors, respectively (p < 0.0001). Conclusions: The tumor necrosis model was developed for dose calculation of ⁹⁰Y microsphere treatment of hepatic tumors with central necrosis. With this model, important information is provided regarding the absorbed fraction applicable to clinical ⁹⁰Y microsphere treatment.

  2. When Does Model-Based Control Pay Off?

    PubMed Central

    2016-01-01

    Many accounts of decision making and reinforcement learning posit the existence of two distinct systems that control choice: a fast, automatic system and a slow, deliberative system. Recent research formalizes this distinction by mapping these systems to “model-free” and “model-based” strategies in reinforcement learning. Model-free strategies are computationally cheap, but sometimes inaccurate, because action values can be accessed by inspecting a look-up table constructed through trial-and-error. In contrast, model-based strategies compute action values through planning in a causal model of the environment, which is more accurate but also more cognitively demanding. It is assumed that this trade-off between accuracy and computational demand plays an important role in the arbitration between the two strategies, but we show that the hallmark task for dissociating model-free and model-based strategies, as well as several related variants, does not embody such a trade-off. We describe five factors that reduce the effectiveness of the model-based strategy on these tasks by reducing its accuracy in estimating reward outcomes and decreasing the importance of its choices. Based on these observations, we describe a version of the task that formally and empirically obtains an accuracy-demand trade-off between model-free and model-based strategies. Moreover, we show that human participants spontaneously increase their reliance on model-based control on this task, compared to the original paradigm. Our novel task and our computational analyses may prove important in subsequent empirical investigations of how humans balance accuracy and demand. PMID:27564094
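    The accuracy side of the trade-off is easy to see in a toy reward-revaluation setting: a model-based agent replans from the updated model immediately, while a model-free look-up table still reflects stale trial-and-error values. A minimal sketch (the one-step task and all parameters are invented for illustration; the paradigm discussed in the record is a richer two-step task):

```python
# One-step task: each action deterministically leads to an outcome state.
TRANSITIONS = {"a0": "o0", "a1": "o1"}

def model_based_q(rewards):
    """Plan in the causal model: look up each action's outcome value."""
    return {a: rewards[o] for a, o in TRANSITIONS.items()}

def train_model_free(rewards, trials=200, alpha=0.1):
    """Build a look-up table by incremental trial-and-error updates."""
    q = {a: 0.0 for a in TRANSITIONS}
    for _ in range(trials):
        for a, o in TRANSITIONS.items():
            q[a] += alpha * (rewards[o] - q[a])
    return q

old_rewards = {"o0": 1.0, "o1": 0.0}
new_rewards = {"o0": 0.0, "o1": 1.0}   # revaluation: outcome values flip

q_mf = train_model_free(old_rewards)   # learned before the revaluation
q_mb = model_based_q(new_rewards)      # replanned after the revaluation

def best(q):
    return max(q, key=q.get)
```

After the flip the model-based values immediately favor a1, while the model-free table still prefers the now-worthless a0, illustrating why accuracy favors model-based control when outcome values change.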

  3. Model-based Roentgen stereophotogrammetry of orthopaedic implants.

    PubMed

    Valstar, E R; de Jong, F W; Vrooman, H A; Rozing, P M; Reiber, J H

    2001-06-01

    Attaching tantalum markers to prostheses for Roentgen stereophotogrammetry (RSA) may be difficult and is sometimes even impossible. In this study, a model-based RSA method that avoids the attachment of markers to prostheses is presented and validated. This model-based RSA method uses a triangulated surface model of the implant. A projected contour of this model is calculated and this calculated model contour is matched onto the detected contour of the actual implant in the RSA radiograph. The difference between the two contours is minimized by variation of the position and orientation of the model. When a minimal difference between the contours is found, an optimal position and orientation of the model has been obtained. The method was validated by means of a phantom experiment. Three prosthesis components were used in this experiment: the femoral and tibial components of an Interax total knee prosthesis (Stryker Howmedica Osteonics Corp., Rutherford, USA) and the femoral component of a Profix total knee prosthesis (Smith & Nephew, Memphis, USA). For the prosthesis components used in this study, the accuracy of the model-based method is lower than the accuracy of traditional RSA. For the Interax femoral and tibial components, significant dimensional tolerances were found that were probably caused by the casting process and manual polishing of the component surfaces. The largest standard deviation for any translation was 0.19 mm and for any rotation it was 0.52 degrees. For the Profix femoral component, which had no large dimensional tolerances, the largest standard deviation for any translation was 0.22 mm and for any rotation it was 0.22 degrees. From this study we may conclude that the accuracy of the current model-based RSA method is sensitive to dimensional tolerances of the implant. Research is now being conducted to make model-based RSA less sensitive to dimensional tolerances and thereby improve its accuracy. PMID:11470108

  4. Hybrid OPC technique using model based and rule-based flows

    NASA Astrophysics Data System (ADS)

    Harb, Mohammed; Abdelghany, Hesham

    2013-04-01

    To transfer an electronic circuit from design to silicon, many stages are involved. As technology evolves, design shapes are drawn ever closer together. Since the lithography wavelength has not improved beyond 193nm, optical interference is a problem that must be accounted for by Optical Proximity Correction (OPC) algorithms. In earlier technologies, simple OPC was applied to the design based on spatial rules. This is no longer sufficient in recent technologies, since the intensive scaling down of designs has increased optical interference. Model-based OPC is now the better solution for producing accurate results, but this comes at the cost of increased run time. Electronic Design Automation (EDA) companies compete to offer tools that provide both accuracy and run-time efficiency. In this paper, we show that optimum usage of some of these tools can ensure OPC accuracy with better run time. The hybrid OPC technique uses classic rule-based OPC in a modern fashion, considering optical parameters instead of spatial metrics only. Combined with conventional model-based OPC, the whole flow shows better results in terms of both accuracy and run time.
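    The hybrid idea can be sketched as a rule-based pre-bias keyed on local spacing, followed by a model-based loop that iterates edge moves against a simulated contour. Everything below is a stand-in: the one-line "printed width" model, the bias table, and the numbers are invented for illustration.

```python
# Toy 1-D "printed width" model: features next to narrow spaces print narrow.
def printed_width(drawn, space, k=20.0):
    return drawn - k / space

# Rule-based pre-bias keyed on local spacing; deliberately coarse table (nm).
RULE_TABLE = [(50.0, 8.0), (100.0, 4.0), (float("inf"), 1.0)]  # (max space, bias)

def rule_based_bias(space):
    for max_space, bias in RULE_TABLE:
        if space <= max_space:
            return bias

def model_based_correct(target, space, iters=10):
    """Start from the rule-based guess, then iterate edge moves against the model."""
    drawn = target + rule_based_bias(space)
    for _ in range(iters):
        drawn += target - printed_width(drawn, space)  # move edge by the error
    return drawn

corrected = model_based_correct(90.0, 40.0)
```

The rule table gives a cheap starting point; the model-based iterations then remove the residual error the coarse rules leave behind, which is the accuracy/run-time split the hybrid flow exploits.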

  5. GOLD: Integration of model-based control systems with artificial intelligence and workstations

    SciTech Connect

    Lee, M.; Clearwater, S.

    1987-08-01

    Our experience with model-based accelerator control started at SPEAR. Since that time nearly all accelerator beam lines have been controlled using model-based application programs, for example, PEP and SLC at SLAC. In order to take advantage of state-of-the-art hardware and software technology, the design and implementation of the accelerator control programs have undergone radical change with time. Consequently, SPEAR, PEP, and SLC all use different control programs. Since many of these application programs are embedded deep into the control system, they had to be rewritten each time. Each time this rewriting has occurred a great deal of time and effort has been spent on training physicists and programmers to do the job. Now, we have developed these application programs for a fourth time. This time, however, the programs we are developing are generic so that we will not have to do it again. We have developed an integrated system called GOLD (Generic Orbit and Lattice Debugger) for debugging and correcting trajectory errors in accelerator lattices. The system consists of a lattice modeling program (COMFORT), a beam simulator (PLUS), a graphical workstation environment (micro-VAX) and an expert system (ABLE). This paper will describe some of the features and applications of our integrated system with emphasis on the automation offered by expert systems. 5 refs.

  7. Adaptive model-based control systems and methods for controlling a gas turbine

    NASA Technical Reports Server (NTRS)

    Brunell, Brent Jerome (Inventor); Mathews, Jr., Harry Kirk (Inventor); Kumar, Aditya (Inventor)

    2004-01-01

    Adaptive model-based control systems and methods are described so that performance and/or operability of a gas turbine in an aircraft engine, power plant, marine propulsion, or industrial application can be optimized under normal, deteriorated, faulted, failed and/or damaged operation. First, a model of each relevant system or component is created, and the model is adapted to the engine. Then, if/when deterioration, a fault, a failure or some kind of damage to an engine component or system is detected, that information is input to the model-based control as changes to the model, constraints, objective function, or other control parameters. With all of the information about the engine condition and state, and directives on the control goals in terms of an objective function and constraints, the control then solves an optimization problem so that the optimal control action can be determined and taken. This model and control may be updated in real time to account for engine-to-engine variation, deterioration, damage, faults and/or failures using optimal corrective control action command(s).
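
    The key idea above, a detected fault entering the controller simply as a changed constraint on the same optimization, can be sketched minimally. Everything here is a toy assumption (linear thrust/temperature models, invented numbers, grid search instead of a real optimizer), not the patented control law.

```python
def optimize_fuel(demand, temp_limit):
    """Grid-search the fuel flow minimizing squared thrust error subject to a
    turbine temperature constraint (a stand-in for the model-based optimization)."""
    best, best_cost = None, float("inf")
    for i in range(1001):
        fuel = i / 1000.0                 # candidate fuel flow, 0..1
        thrust = 100.0 * fuel             # toy engine model
        temp = 600.0 + 900.0 * fuel       # toy temperature model
        if temp > temp_limit:
            continue                      # infeasible under current constraints
        cost = (thrust - demand) ** 2
        if cost < best_cost:
            best, best_cost = fuel, cost
    return best

normal = optimize_fuel(demand=80.0, temp_limit=1500.0)   # healthy engine
faulted = optimize_fuel(demand=80.0, temp_limit=1200.0)  # fault tightens the limit
print(normal, faulted)  # the faulted engine backs off to the constraint boundary
```

    No controller logic changes between the two calls; only the constraint does, which is what makes the approach adaptive to deterioration and damage.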

  8. StarPlan: A model-based diagnostic system for spacecraft

    NASA Technical Reports Server (NTRS)

    Heher, Dennis; Pownall, Paul

    1990-01-01

    The Sunnyvale Division of Ford Aerospace created a model-based reasoning capability for diagnosing faults in space systems. The approach employs reasoning about a model of the domain (as it is designed to operate) to explain differences between expected and actual telemetry; i.e., to identify the root cause of the discrepancy (at an appropriate level of detail) and determine necessary corrective action. A development environment, named Paragon, was implemented to support both model-building and reasoning. The major benefit of the model-based approach is the capability for the intelligent system to handle faults that were not anticipated by a human expert. The feasibility of this approach for diagnosing problems in a spacecraft was demonstrated in a prototype system, named StarPlan. Reasoning modules within StarPlan detect anomalous telemetry, establish goals for returning the telemetry to nominal values, and create a command plan for attaining the goals. Before commands are implemented, their effects are simulated to assure convergence toward the goal. After the commands are issued, the telemetry is monitored to assure that the plan is successful. These features of StarPlan, along with associated concerns, issues and future directions, are discussed.
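
    The model-based diagnosis step described here, explaining a telemetry discrepancy by reasoning over a model of the design, can be illustrated with a deliberately tiny, hypothetical component model (not the actual Paragon representation): simulate the design under each assumed fault and keep the faults that reproduce the observation.

```python
def simulate(failed=None):
    """Toy spacecraft power chain: solar array -> bus -> heater.
    Returns the telemetry the design model predicts under an assumed fault."""
    array = 0.0 if failed == "array" else 10.0
    bus = 0.0 if failed == "bus" else array
    heater_on = bus > 5.0 and failed != "heater"
    return {"bus_volts": bus, "heater_on": heater_on}

def diagnose(observed, components=("array", "bus", "heater")):
    """Model-based diagnosis: keep fault candidates whose predicted telemetry
    reproduces the observed discrepancy."""
    return [c for c in components if simulate(failed=c) == observed]

telemetry = {"bus_volts": 10.0, "heater_on": False}  # anomaly: heater should be on
print(diagnose(telemetry))  # → ['heater']
```

    Because the candidates come from the model rather than from enumerated fault rules, faults no human expert anticipated can still be isolated, which is the benefit the abstract emphasizes.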

  9. [Crohn's disease. Clinical and therapeutic considerations].

    PubMed

    Paşalega, M; Calotă, F; Paraliov, T; Meşină, C; Vîlcea, D; Tomescu, P; Pănuş, A; Tenea, T; Mirea, C; Traşcă, E; Ene, D; Vasile, I

    2005-01-01

    Crohn's disease is a chronic granulomatous inflammatory condition of the intestinal tract of unknown etiology. Most commonly the disease affects the small bowel, the colon and the rectum. The acute and aggressive forms can evolve rapidly, mimicking an acute surgical illness and requiring emergency surgical intervention. The surgical therapeutic option in this condition must be determined strictly by establishing a correct intraoperative diagnosis, through macroscopic features and histologic evidence. Because it is an incurable disease with a variable evolution, marked by recurrences that involve repeated surgical interventions, the surgical treatment (often resection) must be as conservative of the small bowel as possible. We present 3 cases of emergency surgical intervention (bowel obstruction caused by a phytobezoar, obstructing colonic tumor of the splenic flexure, infected urachal tumor). In these cases the diagnosis was established intraoperatively and the surgical intervention was adapted to the particular case. PMID:16372678

  10. When not to trust therapeutic drug monitoring

    PubMed Central

    Westergreen-Thorne, Mathew; Lee, Sook Yan; Shah, Nilesh; Dodd, Alan

    2016-01-01

    Therapeutic drug monitoring (TDM) is the measurement of serum or plasma drug concentration to allow the individualization of dosing. We describe the case of a patient who was prescribed inappropriately large doses of vancomycin due to inaccurate TDM. Specifically, our laboratory reported progressively lower vancomycin concentrations despite dose increases. Eventually, when duplicate samples were sent to a different laboratory, vancomycin concentrations were found to be in the toxic range. We hypothesize this was due to the patient generating immunoglobulin antibodies against her infection that interfered with the original TDM immunoassay. Immunogenic TDM interference has been known to rarely occur in patients with immune-related comorbidities; however, if we are correct, this is a unique case as this patient did not have such a background. This case illustrates the importance of using clinical judgement when interpreting TDM as, in this case, substantial harm to the patient was likely only narrowly avoided. PMID:27606069

  12. [Lithiasis and ectopic pelvic kidney. Therapeutic aspects].

    PubMed

    Aboutaieb, R; Rabii, R; el Moussaoui, A; Joual, A; Sarf, I; el Mrini, M; Benjelloun, S

    1996-01-01

    A kidney in an ectopic position is dysplastic and associated with other malformations. The development of a lithiasis under these conditions raises questions about therapeutic options. We report on five observations of pelvic ectopic kidney with urinary lithiasis. Patients were aged from 16 to 42 years. The kidney was nonfunctional in two cases; in the others it had a normal appearance and measured 10 to 12 cm. We performed total nephrectomy in two cases and pyelolithotomy in the other cases. The surgical approach was subperitoneal via an iliac route. A dismembered pyeloplasty was associated in one case. All patients did well. Radiologic control at 6 and 12 months showed no recurrence in a well-functioning kidney. Surgical lithotomy is advocated as a treatment for urinary lithiasis affecting an ectopic kidney. It is a straightforward procedure which permits correction of other associated malformations. PMID:9833030

  13. Assessing the correctional orientation of corrections officers in South Korea.

    PubMed

    Moon, Byongook; Maxwell, Sheila Royo

    2004-12-01

    The correctional goal in South Korea has recently changed from the straightforward punishment of inmates to rehabilitation. Currently, emphases are being placed on education, counseling, and other treatment programs. These changes have consequently begun to also change the corrections officers' roles from a purely custodial role to a human service role, in which officers are expected to manage rehabilitation and treatment programs. Despite these changes, few studies have examined the attitudes of corrections officers toward rehabilitation programming. This is an important dimension to examine in rehabilitation programming, as corrections officers play a major role in the delivery of institutional programs. This study examines the attitudes of South Korean corrections officers toward rehabilitation programs. Approximately 430 corrections officers were sampled. Results show that correctional attitudes are largely influenced by not only officers' own motivations for joining corrections but also by institutional factors such as job stress. Policy implications are discussed. PMID:15538029

  14. Therapeutic postprostatectomy irradiation.

    PubMed

    Youssef, Emad; Forman, Jeffrey D; Tekyi-Mensah, Samuel; Bolton, Susan; Hart, Kim

    2002-06-01

    involvement, pathological stage, surgical margin, and perineural invasion. Upon multivariate analysis, only preradiation therapy PSA (P < 0.001) and the PSA trend during radiation therapy (P < 0.001) were significant factors. The results of therapeutic radiation for patients with elevated postprostatectomy PSA levels are sufficiently poor that other strategies should be explored as alternatives, including early adjuvant postprostatectomy irradiation or the use of combined hormonal and radiation therapy in the salvage situation. PMID:15046710

  15. Timebias corrections to predictions

    NASA Technical Reports Server (NTRS)

    Wood, Roger; Gibbs, Philip

    1993-01-01

    The importance of an accurate knowledge of the time bias corrections to predicted orbits to a satellite laser ranging (SLR) observer, especially for low satellites, is highlighted. Sources of time bias values and the optimum strategy for extrapolation are discussed from the viewpoint of the observer wishing to maximize the chances of getting returns from the next pass. What is said may be seen as a commercial encouraging wider and speedier use of existing data centers for mutually beneficial exchange of time bias data.
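
    One simple extrapolation strategy for a time-bias series is an ordinary least-squares linear fit over recent passes. This is a hedged sketch under that assumption; the abstract calls for an optimum strategy but does not prescribe this one, and the measurements below are hypothetical.

```python
def linear_fit(ts, ys):
    """Ordinary least squares for y = a + b*t; returns (a, b)."""
    n = len(ts)
    tbar = sum(ts) / n
    ybar = sum(ys) / n
    b = sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys)) / \
        sum((t - tbar) ** 2 for t in ts)
    return ybar - b * tbar, b

# Hypothetical time-bias measurements (ms) on successive days.
days = [0.0, 1.0, 2.0, 3.0]
bias = [10.0, 12.1, 13.9, 16.0]

a, b = linear_fit(days, bias)
predicted = a + b * 4.0  # extrapolated bias for the next pass (day 4)
print(round(predicted, 2))
```

    For low satellites, where the bias drifts quickly, refitting on the freshest shared data is what maximizes the chance of returns on the next pass.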

  16. The Aberration Corrected SEM

    SciTech Connect

    Joy, David C.

    2005-09-09

    The performance of the conventional low-energy CD-SEM is limited by the aberrations inherent in the probe forming lens. Multi-pole correctors are now available which can reduce or eliminate these aberrations. An SEM equipped with such a corrector offers higher spatial resolution and more probe current from a given electron source, and other aspects of the optical performance are also improved, but the much higher numerical aperture associated with an aberration corrected lens results in a reduction in imaging depth of field.

  17. Non-linear control logics for vibrations suppression: a comparison between model-based and non-model-based techniques

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Orsini, Lorenzo; Resta, Ferruccio

    2015-04-01

    Non-linear behavior is present in the operating conditions of many mechanical systems. In these cases, a common engineering practice is to linearize the equation of motion around a particular operating point and to design a linear controller. The main disadvantage is that the stability properties and validity of the controller are only local. In order to improve controller performance, non-linear control techniques represent a very attractive solution for many smart structures. The aim of this paper is to compare non-linear model-based and non-model-based control techniques. In particular, the model-based sliding-mode control (SMC) technique is considered because of its easy implementation and the strong robustness of the controller even under heavy model uncertainties. Among the non-model-based control techniques, fuzzy control (FC), which allows the controller to be designed according to if-then rules, has been considered. It defines the controller without a reference model of the system, offering many advantages such as intrinsic robustness. These techniques have been tested on a non-linear pendulum system.
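
    A minimal sliding-mode controller on a damped pendulum illustrates the model-based side of the comparison. The gains, boundary layer, and plant parameters below are illustrative assumptions, not the paper's setup; the tanh term is the usual smoothed replacement for the discontinuous switching law.

```python
import math

def simulate_smc(theta0, dt=0.001, steps=5000):
    """Euler simulation of a damped pendulum under sliding-mode control.
    Gains (lam, K) and the tanh boundary layer are illustrative choices."""
    g_over_l, damping = 9.81, 0.1
    lam, K = 5.0, 20.0
    theta, omega = theta0, 0.0
    for _ in range(steps):
        s = omega + lam * theta                        # sliding variable
        u = (g_over_l * math.sin(theta) + damping * omega
             - lam * omega - K * math.tanh(s / 0.05))  # smoothed switching term
        alpha = -g_over_l * math.sin(theta) - damping * omega + u
        omega += alpha * dt
        theta += omega * dt
    return theta

print(abs(simulate_smc(0.5)))  # angle driven close to zero despite the non-linearity
```

    Once the state reaches the surface s = 0, the dynamics reduce to the chosen first-order decay, which is where SMC's robustness to model uncertainty comes from.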

  18. Using Online Annotations to Support Error Correction and Corrective Feedback

    ERIC Educational Resources Information Center

    Yeh, Shiou-Wen; Lo, Jia-Jiunn

    2009-01-01

    Giving feedback on second language (L2) writing is a challenging task. This research proposed an interactive environment for error correction and corrective feedback. First, we developed an online corrective feedback and error analysis system called "Online Annotator for EFL Writing". The system consisted of five facilities: Document Maker,…

  19. [Corrected transposition of the great arteries].

    PubMed

    Alva-Espinosa, Carlos

    2016-01-01

    Corrected transposition of the great arteries is one of the most fascinating entities in congenital heart disease. The apparent corrected condition is only temporal. Over time, most patients develop systemic heart failure, even in the absence of associated lesions. With current imaging studies, precise visualization is achieved in each case though the treatment strategy remains unresolved. In asymptomatic patients or cases without associated lesions, focused follow-up to assess systemic ventricular function and the degree of tricuspid valve regurgitation is important. In cases with normal ventricular function and mild tricuspid failure, it seems unreasonable to intervene surgically. In patients with significant associated lesions, surgery is indicated. In the long term, the traditional approach may not help tricuspid regurgitation and systemic ventricular failure. Anatomical correction is the proposed alternative to ease the right ventricle overload and to restore the systemic left ventricular function. However, this is a prolonged operation and not without risks and long-term complications. In this review the clinical, diagnostic, and therapeutic aspects are reviewed in the light of the most significant and recent literature. PMID:27335197

  20. Development of novel activin-targeted therapeutics.

    PubMed

    Chen, Justin L; Walton, Kelly L; Al-Musawi, Sara L; Kelly, Emily K; Qian, Hongwei; La, Mylinh; Lu, Louis; Lovrecz, George; Ziemann, Mark; Lazarus, Ross; El-Osta, Assam; Gregorevic, Paul; Harrison, Craig A

    2015-03-01

    Soluble activin type II receptors (ActRIIA/ActRIIB), via binding to diverse TGF-β proteins, can increase muscle and bone mass, correct anemia or protect against diet-induced obesity. While exciting, these multiple actions of soluble ActRIIA/IIB limit their therapeutic potential and highlight the need for new reagents that target specific ActRIIA/IIB ligands. Here, we modified the activin A and activin B prodomains, regions required for mature growth factor synthesis, to generate specific activin antagonists. Initially, the prodomains were fused to the Fc region of mouse IgG2A antibody and, subsequently, "fastener" residues (Lys(45), Tyr(96), His(97), and Ala(98); activin A numbering) that confer latency to other TGF-β proteins were incorporated. For the activin A prodomain, these modifications generated a reagent that potently (IC(50) 5 nmol/l) and specifically inhibited activin A signaling in vitro, and activin A-induced muscle wasting in vivo. Interestingly, the modified activin B prodomain inhibited both activin A and B signaling in vitro (IC(50) ~2 nmol/l) and in vivo, suggesting it could serve as a general activin antagonist. Importantly, unlike soluble ActRIIA/IIB, the modified prodomains did not inhibit myostatin or GDF-11 activity. To underscore the therapeutic utility of specifically antagonising activin signaling, we demonstrate that the modified activin prodomains promote significant increases in muscle mass. PMID:25399825

  1. Clinical, epidemiological, and therapeutic profile of dermatophytosis*

    PubMed Central

    Pires, Carla Andréa Avelar; da Cruz, Natasha Ferreira Santos; Lobato, Amanda Monteiro; de Sousa, Priscila Oliveira; Carneiro, Francisca Regina Oliveira; Mendes, Alena Margareth Darwich

    2014-01-01

    BACKGROUND The cutaneous mycoses, mainly caused by dermatophyte fungi, are among the most common fungal infections worldwide. It is estimated that 10% to 15% of the population will be infected by a dermatophyte at some point in their lives, thus making this a group of diseases with great public health importance. OBJECTIVE To analyze the clinical, epidemiological, and therapeutic profile of dermatophytosis in patients enrolled at the Dermatology service of Universidade do Estado do Pará, Brazil, from July 2010 to September 2012. METHOD A total of 145 medical records of patients diagnosed with dermatophytosis were surveyed. Data were collected and subsequently recorded according to a protocol developed by the researchers. This protocol consisted of information regarding epidemiological and clinical aspects of the disease and the therapy employed. RESULTS The main clinical form of dermatophyte infection was onychomycosis, followed by tinea corporis, tinea pedis, and tinea capitis. Furthermore, the female population and the age group of 51 to 60 years were the most affected. Regarding therapy, there was a preference for treatments that combine topical and systemic drugs, and the most widely used drugs were fluconazole (systemic) and ciclopirox olamine (topical). CONCLUSION This study showed the importance of recurrent analysis of the epidemiological profile of dermatophytosis to enable correct therapeutic and preventive management of these conditions, which have significant clinical consequences, with chronic, difficult-to-treat lesions that can decrease patient quality of life and cause disfigurement. PMID:24770502

  2. Therapeutic approaches for spinal cord injury

    PubMed Central

    Cristante, Alexandre Fogaça; de Barros Filho, Tarcísio Eloy Pessoa; Marcon, Raphael Martus; Letaif, Olavo Biraghi; da Rocha, Ivan Dias

    2012-01-01

    This study reviews the literature concerning possible therapeutic approaches for spinal cord injury. Spinal cord injury is a disabling and irreversible condition that has high economic and social costs. There are both primary and secondary mechanisms of damage to the spinal cord. The primary lesion is the mechanical injury itself. The secondary lesion results from one or more biochemical and cellular processes that are triggered by the primary lesion. The frustration of health professionals in treating a severe spinal cord injury was described in 1700 BC in an Egyptian surgical papyrus that was translated by Edwin Smith; the papyrus reported spinal fractures as a “disease that should not be treated.” Over the last two decades, several studies have been performed to obtain more effective treatments for spinal cord injury. Most of these studies approach a patient with acute spinal cord injury in one of four manners: corrective surgery or a physical, biological or pharmacological treatment method. Science is unraveling the mechanisms of cell protection and neuroregeneration, but clinically, we only provide supportive care for patients with spinal cord injuries. By combining these treatments, researchers attempt to enhance the functional recovery of patients with spinal cord injuries. Advances in the last decade have allowed us to encourage the development of experimental studies in the field of spinal cord regeneration. The combination of several therapeutic strategies should, at minimum, allow for partial functional recoveries for these patients, which could improve their quality of life. PMID:23070351

  3. Nucleic acids as therapeutic agents.

    PubMed

    Alvarez-Salas, Luis M

    2008-01-01

    Therapeutic nucleic acids (TNAs) and their precursors are applied to treat several pathologies and infections. TNA-based therapy has different rationales and mechanisms and can be classified into three main groups: 1) therapeutic nucleotides and nucleosides; 2) therapeutic oligonucleotides; and 3) therapeutic polynucleotides. This review will focus on those TNAs that have reached clinical trials with anticancer and antiviral protocols, the two most common applications of TNAs. Although therapeutic nucleotides and nucleosides that interfere with nucleic acid metabolism and DNA polymerization have been successfully used as anticancer and antiviral drugs, they often produce toxic secondary effects related to dosage and continuous use. The use of oligonucleotides such as ribozymes and antisense oligodeoxynucleotides (AS-ODNs) showed promise as therapeutic moieties but faced several issues such as nuclease sensitivity, off-target effects and efficient delivery. Nevertheless, immunostimulatory oligodeoxynucleotides and AS-ODNs represent the most successful group of therapeutic oligonucleotides in the clinic. A newer group of therapeutic oligonucleotides, the aptamers, is rapidly advancing towards early detection and treatment alternatives that have reached commercial interest. Despite the very high in vitro efficiency of small interfering RNAs (siRNAs), they present issues with intracellular target accessibility, specificity and delivery. DNA vaccines showed great promise, but they resulted in very poor responses in the clinic and further development is uncertain. Despite their many issues, the exquisite specificity and versatility of therapeutic oligonucleotides attract a great deal of research and resources that will certainly make them the TNAs of choice for treating cancer and viral diseases in the near future. PMID:18991725

  4. OCT Motion Correction

    NASA Astrophysics Data System (ADS)

    Kraus, Martin F.; Hornegger, Joachim

    From the introduction of time domain OCT [1] up to recent swept source systems, motion continues to be an issue in OCT imaging. In contrast to normal photography, an OCT image does not represent a single point in time. Instead, conventional OCT devices sequentially acquire one-dimensional data over a period of several seconds, capturing one beam of light at a time and recording both the intensity and delay of reflections along its path through an object. In combination with unavoidable object motion which occurs in many imaging contexts, the problem of motion artifacts lies in the very nature of OCT imaging. Motion artifacts degrade image quality and make quantitative measurements less reliable. Therefore, it is desirable to come up with techniques to measure and/or correct object motion during OCT acquisition. In this chapter, we describe the effect of motion on OCT data sets and give an overview on the state of the art in the field of retinal OCT motion correction.

  5. Worldwide radiosonde temperature corrections

    SciTech Connect

    Luers, J.; Eskridge, R.

    1997-11-01

    Detailed heat transfer analyses have been performed on ten of the world's most commonly used radiosondes from 1960 to present. These radiosondes are the USA VIZ and Space Data, the Vaisala RS-80, RS-185/21, and RS12/15, the Japanese RS2-80, Russian MARS, RKZ, and A22, and the Chinese GZZ. The temperature error of each radiosonde has been calculated as a function of altitude and of the sonde and environmental parameters that influence its magnitude. Computer models have been developed that allow the correction of temperature data from each sonde as a function of these parameters. Recommendations are made concerning the use of data from each of the radiosondes for climate studies. For some radiosondes, nighttime data requires no corrections; others require corrections to both day and night data. For some sondes, correction of daytime data is not feasible because parameters of significance, such as balloon rise rate, are not retrievable. The results from this study provide essential information for anyone attempting to perform climate studies using radiosonde data. 6 refs., 1 tab.

  6. Turbulence compressibility corrections

    NASA Technical Reports Server (NTRS)

    Coakley, T. J.; Horstman, C. C.; Marvin, J. G.; Viegas, J. R.; Bardina, J. E.; Huang, P. G.; Kussoy, M. I.

    1994-01-01

    The basic objective of this research was to identify, develop and recommend turbulence models which could be incorporated into CFD codes used in the design of the National AeroSpace Plane vehicles. To accomplish this goal, a combined effort consisting of experimental and theoretical phases was undertaken. The experimental phase consisted of a literature survey to collect and assess a database of well documented experimental flows, with emphasis on high speed or hypersonic flows, which could be used to validate turbulence models. Since it was anticipated that this database would be incomplete and would need supplementing, additional experiments in the NASA Ames 3.5-Foot Hypersonic Wind Tunnel (HWT) were also undertaken. The theoretical phase consisted of identifying promising turbulence models through applications to simple flows, and then investigating more promising models in applications to complex flows. The complex flows were selected from the database developed in the first phase of the study. For these flows it was anticipated that model performance would not be entirely satisfactory, so that model improvements or corrections would be required. The primary goals of the investigation were essentially achieved. A large database of flows was collected and assessed, a number of additional hypersonic experiments were conducted in the Ames HWT, and two turbulence models (kappa-epsilon and kappa-omega models with corrections) were determined which gave superior performance for most of the flows studied and are now recommended for NASP applications.

  7. Smooth eigenvalue correction

    NASA Astrophysics Data System (ADS)

    Hendrikse, Anne; Veldhuis, Raymond; Spreeuwers, Luuk

    2013-12-01

    Second-order statistics play an important role in data modeling. Nowadays, there is a tendency toward measuring more signals with higher resolution (e.g., high-resolution video), causing a rapid increase in the dimensionality of the measured samples, while the number of samples remains more or less the same. As a result the eigenvalue estimates are significantly biased, as described by the Marčenko-Pastur equation in the limit of both the number of samples and their dimensionality going to infinity. By introducing a smoothness factor, we show that the Marčenko-Pastur equation can be used in practical situations where both the number of samples and their dimensionality remain finite. Based on this result we derive methods, one already known and one new to our knowledge, to estimate the sample eigenvalues when the population eigenvalues are known. However, usually the sample eigenvalues are known and the population eigenvalues are required. We therefore applied one of these methods in a feedback loop, resulting in an eigenvalue bias correction method. We compare this eigenvalue correction method with the state-of-the-art methods and show that our method outperforms other methods particularly in real-life situations often encountered in biometrics: underdetermined configurations, high-dimensional configurations, and configurations where the eigenvalues are exponentially distributed.
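
    The bias being corrected can be demonstrated directly. For white Gaussian data the population eigenvalues are all 1, yet the sample-covariance eigenvalues spread out with variance approaching p/n, the effect the Marčenko-Pastur law describes. The sketch below (an illustration of the phenomenon, not the authors' correction method) reads the spread off trace identities, so no eigensolver is needed.

```python
import random

def eigenvalue_spread(p, n, seed=0):
    """Mean and variance of the sample-covariance eigenvalues for white
    Gaussian data, via trace identities: mean = tr(S)/p, and since S is
    symmetric, tr(S^2)/p = (sum of squared entries of S)/p."""
    rng = random.Random(seed)
    X = [[rng.gauss(0.0, 1.0) for _ in range(p)] for _ in range(n)]
    S = [[sum(X[k][i] * X[k][j] for k in range(n)) / n for j in range(p)]
         for i in range(p)]
    mean = sum(S[i][i] for i in range(p)) / p
    second = sum(S[i][j] ** 2 for i in range(p) for j in range(p)) / p
    return mean, second - mean ** 2

mean, var = eigenvalue_spread(p=40, n=80)
print(round(mean, 2), round(var, 2))  # mean near 1, variance near p/n = 0.5
```

    The mean eigenvalue is unbiased; it is the spread that is inflated, which is why large eigenvalues are overestimated and small ones underestimated in high-dimensional configurations.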

  8. Contact Lenses for Vision Correction

    MedlinePlus

    Written by: Kierstan Boyd; Reviewed by: Brenda ... on the surface of the eye. They correct vision like eyeglasses do and are safe when used ...

  9. Thermal corrections to Electroweak Decays

    NASA Astrophysics Data System (ADS)

    Masood, Samina

    2016-03-01

    We study electroweak processes at finite temperature, including the decay rates of electroweak gauge bosons and beta decays. Major thermal corrections come from QED-type radiative corrections. The heavy masses of the electroweak gauge bosons suppress the radiative corrections due to electroweak gauge-boson loops; therefore, the dominant thermal corrections are due to photon loops. We also discuss the relevance of our results to astrophysics and cosmology.

  10. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has been known for a long time, evolving from Wald's work in the late 1940's, followed by Middleton's classic exposition in the 1960's, and coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. This development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
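
    The classic instance of the sequential detectors reviewed here is Wald's sequential probability ratio test (SPRT): accumulate the log-likelihood ratio sample by sample and stop as soon as it crosses a threshold. The sketch below is a textbook Gaussian mean-shift SPRT with illustrative parameters, not the chapter's model-based processor.

```python
import math

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test for a Gaussian mean shift.
    Thresholds use Wald's approximations from the target error rates."""
    upper = math.log((1 - beta) / alpha)   # decide H1 at or above this
    lower = math.log(beta / (1 - alpha))   # decide H0 at or below this
    llr = 0.0
    for k, x in enumerate(samples, start=1):
        # Log-likelihood-ratio increment for N(mu1, sigma) vs N(mu0, sigma).
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", k
        if llr <= lower:
            return "H0", k
    return "undecided", len(samples)

print(sprt([2.0] * 20), sprt([-1.0] * 20))  # → ('H1', 4) ('H0', 4)
```

    The model-based generalization replaces the i.i.d. likelihood increments with innovations from a Gauss-Markov state-space filter, but the stopping logic is unchanged.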

  12. Model based document and report generation for systems engineering

    NASA Astrophysics Data System (ADS)

    Delp, C.; Lam, D.; Fosse, E.; Lee, Cin-Young

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration are enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  13. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  14. Model-based hierarchical reinforcement learning and human action control

    PubMed Central

    Botvinick, Matthew; Weinstein, Ari

    2014-01-01

    Recent work has reawakened interest in goal-directed or ‘model-based’ choice, where decisions are based on prospective evaluation of potential action outcomes. Concurrently, there has been growing attention to the role of hierarchy in decision-making and action control. We focus here on the intersection between these two areas of interest, considering the topic of hierarchical model-based control. To characterize this form of action control, we draw on the computational framework of hierarchical reinforcement learning, using this to interpret recent empirical findings. The resulting picture reveals how hierarchical model-based mechanisms might play a special and pivotal role in human decision-making, dramatically extending the scope and complexity of human behaviour. PMID:25267822

  15. Model-based decision support in diabetes care.

    PubMed

    Salzsieder, E; Vogt, L; Kohnert, K-D; Heinke, P; Augstein, P

    2011-05-01

    The model-based Karlsburg Diabetes Management System (KADIS®) has been developed as a patient-focused decision-support tool to provide evidence-based advice for physicians in their daily efforts to optimize metabolic control in the diabetes care of their patients on an individualized basis. For this purpose, KADIS® was established as a personalized, interactive in silico simulation procedure, implemented in a problem-related diabetes health care network and evaluated under different conditions by conducting open-label monocentric and polycentric trials and a case-control study and, last but not least, by application in routine diabetes outpatient care. The trial outcomes clearly show that the recommendations provided to the physicians by KADIS® lead to significant improvement of metabolic control. This model-based decision-support system provides an excellent tool to effectively guide physicians in personalized decision-making to achieve optimal metabolic control for their patients. PMID:20621384

  16. Adaptive, Model-Based Monitoring and Threat Detection

    NASA Astrophysics Data System (ADS)

    Valdes, Alfonso; Skinner, Keith

    2002-09-01

    We explore the suitability of model-based probabilistic techniques, such as Bayes networks, to the field of intrusion detection and alert report correlation. We describe a network intrusion detection system (IDS) using Bayes inference, wherein the knowledge base is encoded not as rules but as conditional probability relations between observables and hypotheses of normal and malicious usage. The same high-performance Bayes inference library was employed in a component of the Mission-Based Correlation effort, using an initial knowledge base that adaptively learns the security administrator's preference for alert priority and rank. Another major effort demonstrated probabilistic techniques in heterogeneous sensor correlation. We provide results for simulated attack data, live traffic, and the CyberPanel Grand Challenge Problem. Our results establish that model-based probabilistic techniques are an important complementary capability to signature-based methods in detection and correlation.
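The knowledge base described above is encoded not as rules but as conditional probabilities linking observables to hypotheses of normal and malicious usage. A minimal naive-Bayes update illustrates the idea; the feature names and probability values below are invented for illustration, not taken from the system.

```python
def posterior(priors, likelihoods, observations):
    """Naive-Bayes posterior over hypotheses given observed features.
    priors: {hyp: P(hyp)}; likelihoods: {hyp: {feature: P(feature|hyp)}}.
    Features are treated as conditionally independent given the hypothesis."""
    unnorm = {}
    for hyp, p in priors.items():
        for obs in observations:
            p *= likelihoods[hyp][obs]
        unnorm[hyp] = p
    z = sum(unnorm.values())
    return {hyp: p / z for hyp, p in unnorm.items()}
```

Even with a strong prior toward normal traffic, a few jointly unlikely observables can push the posterior sharply toward the malicious hypothesis, which is the behavior an alert-prioritization layer exploits.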

  17. Hierarchical model-based interferometric synthetic aperture radar image registration

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Huang, Haifeng; Dong, Zhen; Wu, Manqing

    2014-01-01

    With the rapid development of spaceborne interferometric synthetic aperture radar technology, classical image registration methods can no longer process the resulting masses of real data with high efficiency and high accuracy. Based on this fact, we propose a new method. This method consists of two steps: coarse registration, realized by a cross-correlation algorithm, and fine registration, realized by a hierarchical model-based algorithm. The hierarchical model-based algorithm is a high-efficiency optimization algorithm. The key features of this algorithm are a global model that constrains the overall structure of the estimated motion, a local model that is used in the estimation process, and a coarse-to-fine refinement strategy. Experimental results from different kinds of simulated and real data have confirmed that the proposed method is very fast and has high accuracy. Compared with a conventional cross-correlation method, the proposed method provides markedly improved performance.
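The coarse-registration step by cross-correlation can be sketched as an FFT-based peak search for the integer pixel offset between master and slave images. This is the standard technique; the implementation details below are not taken from the paper.

```python
import numpy as np

def coarse_register(master, slave):
    """Estimate the integer (row, col) offset of `slave` relative to
    `master` by locating the peak of their circular cross-correlation,
    computed in the frequency domain."""
    f = np.conj(np.fft.fft2(master)) * np.fft.fft2(slave)
    corr = np.abs(np.fft.ifft2(f))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap offsets larger than half the image size to negative shifts
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, corr.shape))
```

Fine registration would then refine this integer estimate to sub-pixel accuracy, which is the role the hierarchical model-based optimization plays in the paper.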

  18. Model Based Document and Report Generation for Systems Engineering

    NASA Technical Reports Server (NTRS)

    Delp, Christopher; Lam, Doris; Fosse, Elyse; Lee, Cin-Young

    2013-01-01

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.

  19. 3-D model-based tracking for UAV indoor localization.

    PubMed

    Teulière, Céline; Marchand, Eric; Eck, Laurent

    2015-05-01

    This paper proposes a novel model-based tracking approach for 3-D localization. One main difficulty of standard model-based approaches lies in the presence of low-level ambiguities between different edges. In this paper, given a 3-D model of the edges of the environment, we derive a multiple hypotheses tracker which retrieves the potential poses of the camera from the observations in the image. We also show how these candidate poses can be integrated into a particle filtering framework to guide the particle set toward the peaks of the distribution. Motivated by the UAV indoor localization problem, where the GPS signal is not available, we validate the algorithm on real image sequences from UAV flights. PMID:25099967

  20. Model-based inversion for a shallow ocean application

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1994-03-01

    A model-based approach to invert or estimate the sound speed profile (SSP) from noisy pressure-field measurements is discussed. The resulting model-based processor (MBP) is based on the state-space representation of the normal-mode propagation model. Using data obtained from the well-known Hudson Canyon experiment, a noisy shallow water ocean environment, the processor is designed and the results compared to those predicted using various propagation models and data. It is shown that the MBP not only predicts the sound speed quite well, but also is able to simultaneously provide enhanced estimates of both modal and pressure-field measurements which are useful for localization and rapid ocean environmental characterization.
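Model-based processors of this kind are typically built around a recursive state-space estimator. The linear Kalman-filter predict/update cycle sketched below is illustrative only: the paper's MBP embeds a normal-mode propagation model in the state-space matrices rather than this generic linear example.

```python
import numpy as np

def kalman_step(x, P, z, A, C, Q, R):
    """One predict/update cycle of a linear Kalman filter, the core
    recursion of a state-space model-based processor.
    x, P: prior state estimate and covariance; z: new measurement;
    A, C: state-transition and measurement matrices; Q, R: noise covariances."""
    # predict the state forward through the model
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # innovation (measurement residual) and its covariance
    y = z - C @ x_pred
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)   # Kalman gain
    # correct the prediction with the measurement
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new
```

The innovation sequence y is also what enables the enhanced modal and pressure-field estimates mentioned in the abstract: a well-tuned processor whitens it, and departures from whiteness flag model mismatch.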

  1. Automated extraction of knowledge for model-based diagnostics

    NASA Technical Reports Server (NTRS)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) design databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques as well as an internal database of component descriptions to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  2. Identifying Model-Based Reconfiguration Goals through Functional Deficiencies

    NASA Technical Reports Server (NTRS)

    Benazera, Emmanuel; Trave-Massuyes, Louise

    2004-01-01

    Model-based diagnosis is now advanced to the point where autonomous systems can face uncertain and faulty situations with success. The next step toward greater autonomy is to have the system recover itself after faults occur, a process known as model-based reconfiguration. Given a prediction of the nominal behavior of the system and the result of the diagnosis operation, this paper details how to automatically determine the functional deficiencies of the system after faults occur. These deficiencies are characterized in the case of uncertain state estimates. A methodology is then presented to determine the reconfiguration goals based on the deficiencies. Finally, a recovery process interleaves planning and model predictive control to restore the functionalities in prioritized order.

  3. Fuzzy model-based observers for fault detection in CSTR.

    PubMed

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

    Among the vast variety of fuzzy model-based observers reported in the literature, which would be the proper one to use for fault detection in a class of chemical reactors? In this study, four fuzzy model-based observers for sensor fault detection of a Continuous Stirred Tank Reactor were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) an Utkin fuzzy observer. A negative fault signal, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions. PMID:26521723

  4. Model based control of a rehabilitation robot for lower extremities.

    PubMed

    Xie, Xiao-Liang; Hou, Zeng-Guang; Li, Peng-Feng; Ji, Cheng; Zhang, Feng; Tan, Min; Wang, Hongbo; Hu, Guoqing

    2010-01-01

    This paper mainly focuses on the trajectory tracking control of a lower extremity rehabilitation robot during passive training process of patients. Firstly, a mathematical model of the rehabilitation robot is introduced by using Lagrangian analysis. Then, a model based computed-torque control scheme is designed to control the constrained four-link robot (with patient's foot fixed on robot's end-effector) to track a predefined trajectory. Simulation results are provided to illustrate the effectiveness of the proposed model based computed-torque algorithm. In the simulation, a multi-body dynamics and motion software named ADAMS is used. The combined simulation of ADAMS and MATLAB is able to produce more realistic results of this complex integrated system. PMID:21097222
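The computed-torque scheme follows the standard feedback-linearization law: cancel the rigid-body dynamics M(q)q̈ + C(q, q̇)q̇ + g(q) = τ and impose PD error dynamics on the tracking error. A minimal sketch follows; the dynamics callbacks and gain matrices are placeholders, not the paper's four-link rehabilitation-robot model.

```python
import numpy as np

def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, g, Kp, Kd):
    """Computed-torque law for a manipulator with dynamics
    M(q) qdd + C(q, qd) qd + g(q) = tau.
    M, C, g are callables returning the inertia matrix, Coriolis
    matrix, and gravity vector; Kp, Kd are PD gain matrices."""
    e = q_des - q            # joint position tracking error
    ed = qd_des - qd         # joint velocity tracking error
    # virtual acceleration command: feedforward plus PD correction
    v = qdd_des + Kd @ ed + Kp @ e
    # inverse dynamics turns the commanded acceleration into torque
    return M(q) @ v + C(q, qd) @ qd + g(q)
```

When the model matches the plant, the closed-loop error obeys ë + Kd ė + Kp e = 0, so the predefined passive-training trajectory is tracked with gains chosen freely per joint.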

  5. REAL-TIME MODEL-BASED ELECTRICAL POWERED WHEELCHAIR CONTROL

    PubMed Central

    Wang, Hongwu; Salatin, Benjamin; Grindle, Garrett G.; Ding, Dan; Cooper, Rory A.

    2009-01-01

    The purpose of this study was to evaluate the effects of three different control methods on driving speed variation and wheel-slip of an electric-powered wheelchair (EPW). A kinematic model as well as 3-D dynamic model was developed to control the velocity and traction of the wheelchair. A smart wheelchair platform was designed and built with a computerized controller and encoders to record wheel speeds and to detect the slip. A model based, a proportional-integral-derivative (PID) and an open-loop controller were applied with the EPW driving on four different surfaces at three specified speeds. The speed errors, variation, rise time, settling time and slip coefficient were calculated and compared for a speed step-response input. Experimental results showed that model based control performed best on all surfaces across the speeds. PMID:19733494
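The PID baseline in the comparison above can be sketched as a standard discrete controller; the gains, time step, and the pure-integrator test plant below are illustrative choices, not values from the study.

```python
class PID:
    """Discrete PID speed controller of the kind compared against the
    model-based wheelchair controller (gains here are illustrative)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = None

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        # derivative term is zero on the first call (no previous error yet)
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

Driving an idealized integrator speed plant (v' = u) toward a 1 m/s setpoint, the integral term removes steady-state error; what it cannot do, and what motivates the model-based controller in the study, is anticipate surface-dependent slip.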

  6. MTK: An AI tool for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  7. Radiation camera motion correction system

    DOEpatents

    Hoffer, P.B.

    1973-12-18

    The device determines the ratio of the intensity of radiation received by a radiation camera from two separate portions of the object. A correction signal is developed to maintain this ratio at a substantially constant value and this correction signal is combined with the camera signal to correct for object motion. (Official Gazette)

  8. Yearbook of Correctional Education 1989.

    ERIC Educational Resources Information Center

    Duguid, Stephen, Ed.

    This yearbook contains conference papers, commissioned papers, reprints of earlier works, and research-in-progress. They offer a retrospective view as well as address the mission and perspective of correctional education, its international dimension, correctional education in action, and current research. Papers include "Correctional Education and…

  9. Job Satisfaction in Correctional Officers.

    ERIC Educational Resources Information Center

    Diehl, Ron J.

    For more than a decade, correctional leaders throughout the country have attempted to come to grips with the basic issues involved in ascertaining and meeting the needs of correctional institutions. This study investigated job satisfaction in 122 correctional officers employed in both rural and urban prison locations for the State of Kansas…

  10. Hypercondylia: problems in diagnosis and therapeutic indications.

    PubMed

    Cervelli, Valerio; Bottini, Davide Johan; Arpino, Alessia; Trimarco, Anna; Cervelli, Giulio; Mugnaini, Francesco

    2008-03-01

    Condylar hyperplasia is a rare disease that alters the anatomy of one of the two condyles of the mandible as a consequence of the abnormal growth of the affected condyle, resulting in facial asymmetry and functional problems at puberty. Both sexes are affected. Two cases were analyzed, including an active and an inactive nucleus. Histopathologically, two types are distinguished: active and inactive hypercondylia. Diagnosis is usually achieved through a clinical and radiographic approach and is critically important for determining the consequent therapeutic approach. Thanks to the development of new diagnostic techniques, it is possible to quantify in advance the risk of skeletal relapse in the case of a hyperactive nucleus. This study points out the need to identify the type of hyperplasia to be treated as active or inactive. Surgery should be based on scintigraphy results to evaluate active centers of ossification. Clinical manifestations depend on the date of onset of this pathology and the speed of growth of the anomaly. We underline the importance of a precise clinical and instrumental diagnosis to plan the correct treatment, achieved through a correct anamnesis and a standardized diagnostic protocol, supported by the indispensable scintigraphy. PMID:18362718

  11. Model-Based Detection in a Shallow Water Ocean Environment

    SciTech Connect

    Candy, J V

    2001-07-30

    A model-based detector is developed to process shallow water ocean acoustic data. The function of the detector is to adaptively monitor the environment and decide whether or not a change from normal has occurred. Here we develop a processor incorporating both a normal-mode ocean acoustic model and a vertical hydrophone array. The detector is applied to data acquired from the Hudson Canyon experiments at various ranges and its performance is evaluated.

  12. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    System engineering practices for complex systems and networks now require that requirement, architecture, and concept-of-operations product development teams simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, since the networks are geographically distributed, and so are their subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from their top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

  13. Model-based rational strategy for chromatographic resin selection.

    PubMed

    Nfor, Beckley K; Zuluaga, Diego S; Verheijen, Peter J T; Verhaert, Peter D E M; van der Wielen, Luuk A M; Ottens, Marcel

    2011-01-01

    A model-based rational strategy for the selection of chromatographic resins is presented. The main question being addressed is that of selecting the optimal chromatographic resin from a few promising alternatives. The methodology starts with chromatographic modeling, parameter acquisition, and model validation, followed by model-based optimization of the chromatographic separation for the resins of interest. Finally, the resins are rationally evaluated based on their optimized operating conditions and performance metrics such as product purity, yield, concentration, throughput, productivity, and cost. Resin evaluation proceeds by two main approaches. In the first approach, Pareto frontiers from multi-objective optimization of conflicting objectives are overlaid for different resins, enabling direct visualization and comparison of resin performances based on the feasible solution space. The second approach involves the transformation of the resin performances into weighted resin scores, enabling the simultaneous consideration of multiple performance metrics and the setting of priorities. The proposed model-based resin selection strategy was illustrated by evaluating three mixed-mode adsorbents (ADH, PPA, and HEA) for the separation of a ternary mixture of bovine serum albumin, ovalbumin, and amyloglucosidase. In order of decreasing weighted resin score or performance, the top three resins for this separation were ADH > PPA > HEA. The proposed model-based approach could be a suitable alternative to column scouting during process development, the main strengths being that minimal experimentation is required and resins are evaluated under their ideal working conditions, enabling a fair comparison. This work also demonstrates the application of column modeling and optimization to mixed-mode chromatography. PMID:22238769
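The weighted-resin-score route of the second evaluation approach can be sketched as min-max normalization of each performance metric across the candidate resins, followed by a priority-weighted sum. The metric names, values, and weights below are illustrative, not numbers from the paper.

```python
def resin_score(metrics, weights):
    """Weighted resin score: normalize each performance metric to [0, 1]
    across the candidate resins (higher taken as better), then combine
    with user-set priority weights summing to 1.
    metrics: {resin: {metric: value}}; weights: {metric: weight}."""
    names = list(weights)
    lo = {m: min(r[m] for r in metrics.values()) for m in names}
    hi = {m: max(r[m] for r in metrics.values()) for m in names}
    scores = {}
    for resin, vals in metrics.items():
        s = 0.0
        for m in names:
            span = hi[m] - lo[m]
            norm = (vals[m] - lo[m]) / span if span else 1.0
            s += weights[m] * norm
        scores[resin] = s
    return scores
```

Adjusting the weights encodes the priority-setting the abstract mentions: a purity-driven process and a throughput-driven one can legitimately rank the same resins differently.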

  14. On Environmental Model-Based Visual Perception for Humanoids

    NASA Astrophysics Data System (ADS)

    Gonzalez-Aguirre, D.; Wieland, S.; Asfour, T.; Dillmann, R.

    In this article, an autonomous visual perception framework for humanoids is presented. This model-based framework exploits the available knowledge and the context acquired during global localization in order to overcome the limitations of pure data-driven approaches. The reasoning for perception and the proprioceptive components are the key elements to solve complex visual assertion queries with proficient performance. Experimental evaluation with the humanoid robot ARMAR-IIIa is presented.

  15. Model based control of dynamic atomic force microscope

    SciTech Connect

    Lee, Chibum; Salapaka, Srinivasa M.

    2015-04-15

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement when compared to conventional proportional-integral designs, as verified by experiments.

  16. Model based control of dynamic atomic force microscope.

    PubMed

    Lee, Chibum; Salapaka, Srinivasa M

    2015-04-01

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement when compared to conventional proportional-integral designs, as verified by experiments. PMID:25933864

  17. Multiple Damage Progression Paths in Model-Based Prognostics

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Goebel, Kai Frank

    2011-01-01

    Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component wear is driven by several different degradation phenomena, each resulting in its own damage progression path, overlapping to contribute to the overall degradation of the component. We develop a model-based prognostics methodology using particle filters, in which the problem of characterizing multiple damage progression paths is cast as a joint state-parameter estimation problem. The estimate is represented as a probability distribution, allowing the prediction of end of life and remaining useful life within a probabilistic framework that supports uncertainty management. We also develop a novel variance control mechanism that maintains an uncertainty bound around the hidden parameters to limit the amount of estimation uncertainty and, consequently, reduce prediction uncertainty. We construct a detailed physics-based model of a centrifugal pump, to which we apply our model-based prognostics algorithms. We illustrate the operation of the prognostic solution with a number of simulation-based experiments and demonstrate the performance of the chosen approach when multiple damage mechanisms are active.
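The joint state-parameter formulation can be sketched with a scalar wear model in which each particle carries both a damage state and an unknown wear-rate parameter, so estimating both is a single filtering problem. The scalar model, noise levels, and parameter-roughening scheme below are illustrative stand-ins for the paper's centrifugal-pump physics.

```python
import numpy as np

def pf_prognostics(observations, n_particles=2000, dt=1.0, rng=None):
    """Joint state-parameter particle filter. Each particle holds a
    damage state x and a wear-rate parameter w; the damage model is
    x <- x + w*dt plus process noise, observed with Gaussian noise."""
    if rng is None:
        rng = np.random.default_rng(0)
    x = np.zeros(n_particles)                       # damage state
    w = rng.uniform(0.0, 0.3, n_particles)          # unknown wear rate
    for z in observations:
        # propagate each particle through the damage-progression model
        x = x + w * dt + rng.normal(0.0, 0.01, n_particles)
        w = w + rng.normal(0.0, 0.002, n_particles)  # parameter roughening
        # weight by the Gaussian measurement likelihood, then resample
        weights = np.exp(-0.5 * ((z - x) / 0.05) ** 2)
        weights /= weights.sum()
        idx = rng.choice(n_particles, n_particles, p=weights)
        x, w = x[idx], w[idx]
    return x.mean(), w.mean()
```

Because the posterior is a particle set rather than a point estimate, end-of-life predictions inherit a full distribution, which is what supports the uncertainty management the abstract emphasizes.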

  18. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Goebel, Kai

    2011-01-01

    Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction; therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.

  19. A Model-Based Expert System For Digital Systems Design

    NASA Astrophysics Data System (ADS)

    Wu, J. G.; Ho, W. P. C.; Hu, Y. H.; Yun, D. Y. Y.; Parng, T. M.

    1987-05-01

    In this paper, we present a model-based expert system for automatic digital systems design. The goal of digital systems design is to generate a workable and efficient design from high level specifications. The formalization of the design process is a necessity for building an efficient automatic CAD system. Our approach combines model-based, heuristic best-first search, and meta-planning techniques from AI to facilitate the design process. The design process is decomposed into three subprocesses. First, the high-level behavioral specifications are translated into sequences of primitive behavioral operations. Next, primitive operations are grouped to form intermediate-level behavioral functions. Finally, structural function modules are selected to implement these functions. Using model-based reasoning on the primitive behavioral operations level extends the solution space considered in design and provides more opportunity for minimization. Heuristic best-first search and meta-planning techniques control the decision-making in the latter two subprocesses to optimize the final design. They also facilitate system maintenance by separating design strategy from design knowledge.

  20. A cloud model-based approach for water quality assessment.

    PubMed

    Wang, Dong; Liu, Dengfeng; Ding, Hao; Singh, Vijay P; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun

    2016-07-01

    Water quality assessment entails essentially a multi-criteria decision-making process accounting for qualitative and quantitative uncertainties and their transformation. Considering uncertainties of randomness and fuzziness in water quality evaluation, a cloud model-based assessment approach is proposed. The cognitive cloud model, derived from information science, can realize the transformation between qualitative concepts and quantitative data, based on probability and statistics and fuzzy set theory. When applying the cloud model to practical assessment, three technical issues are considered before the development of a complete cloud model-based approach: (1) a bilateral boundary formula with nonlinear boundary regression for parameter estimation, (2) a hybrid entropy-analytic hierarchy process technique for calculation of weights, and (3) the mean of repeated simulations for determining the degree of final certainty. The cloud model-based approach is tested by evaluating the eutrophication status of 12 typical lakes and reservoirs in China and comparing it with four other methods: the Scoring Index method, the Variable Fuzzy Sets method, the Hybrid Fuzzy and Optimal model, and the Neural Networks method. The proposed approach yields information concerning membership for each water quality status, which leads to the final status. The approach is found to be representative of other alternative methods and accurate. PMID:26995351
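The normal cloud model underlying this approach is characterized by the triple (Ex, En, He): expectation, entropy, and hyper-entropy. Its forward generator, which turns that qualitative triple into quantitative "drops" with certainty degrees, can be sketched as follows; this is the standard construction, and the parameter values in the usage test are illustrative.

```python
import math
import random

def cloud_drops(ex, en, he, n, seed=0):
    """Forward normal cloud generator. For each of n drops: sample a
    perturbed entropy En' ~ N(En, He), then a value x ~ N(Ex, En'),
    and report the certainty degree mu(x) = exp(-(x-Ex)^2 / (2 En'^2))."""
    rnd = random.Random(seed)
    drops = []
    for _ in range(n):
        en_p = abs(rnd.gauss(en, he)) or en   # perturbed entropy, kept nonzero
        x = rnd.gauss(ex, en_p)
        mu = math.exp(-((x - ex) ** 2) / (2.0 * en_p ** 2))
        drops.append((x, mu))
    return drops
```

The second-stage randomness in En' is what distinguishes a cloud from a plain fuzzy membership function: drops at the same x can carry different certainty degrees, jointly modeling fuzziness and randomness.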

  1. Mesenchymal Stem Cells as Therapeutics

    PubMed Central

    Parekkadan, Biju; Milwid, Jack M.

    2013-01-01

    Mesenchymal stem cells (MSCs) are multipotent cells that are being clinically explored as a new therapeutic for treating a variety of immune-mediated diseases. First heralded as a regenerative therapy for skeletal tissue repair, MSCs have recently been shown to modulate endogenous tissue and immune cells. Preclinical studies of the mechanism of action suggest that the therapeutic effects afforded by MSC transplantation are short-lived and related to dynamic, paracrine interactions between MSCs and host cells. Therefore, representations of MSCs as drug-loaded particles may allow for pharmacokinetic models to predict the therapeutic activity of MSC transplants as a function of drug delivery mode. By integrating principles of MSC biology, therapy, and engineering, the field is armed to usher in the next generation of stem cell therapeutics. PMID:20415588

  2. Relativistic quantum corrections to laser wakefield acceleration.

    PubMed

    Zhu, Jun; Ji, Peiyong

    2010-03-01

    The influence of quantum effects on the interaction of intense laser fields with plasmas is investigated by using a hydrodynamic model based on the framework of relativistic quantum theory. Starting from the covariant Wigner function and Dirac equation, the hydrodynamic equations for relativistic quantum plasmas are derived. Based on the relativistic quantum hydrodynamic equations and Poisson equation, the perturbations of electron number densities and the electric field of the laser wakefield containing quantum effects are deduced. It is found that the corrections generated by the quantum effects to the perturbations of electron number densities and the accelerating field of the laser wakefield cannot be neglected. Quantum effects will suppress laser wakefields, which is a classical manifestation of quantum decoherence effects; however, the contribution of quantum effects to the laser wakefield correction will be partially counteracted by the relativistic effects. The analysis also reveals that quantum effects enlarge the effective frequencies of plasmas, and the quantum behavior appears as a screening effect for plasma electrons. PMID:20365881

  3. Relativistic quantum corrections to laser wakefield acceleration

    SciTech Connect

    Zhu Jun; Ji Peiyong

    2010-03-15

    The influence of quantum effects on the interaction of intense laser fields with plasmas is investigated by using a hydrodynamic model based on the framework of the relativistic quantum theory. Starting from the covariant Wigner function and Dirac equation, the hydrodynamic equations for relativistic quantum plasmas are derived. Based on the relativistic quantum hydrodynamic equations and the Poisson equation, the perturbations of electron number densities and the electric field of the laser wakefield containing quantum effects are deduced. It is found that the corrections generated by the quantum effects to the perturbations of electron number densities and the accelerating field of the laser wakefield cannot be neglected. Quantum effects suppress laser wakefields, which is a classical manifestation of quantum decoherence effects; however, the contribution of quantum effects to the laser wakefield correction is partially counteracted by the relativistic effects. The analysis also reveals that quantum effects enlarge the effective frequencies of plasmas, and the quantum behavior acts as a screening effect for plasma electrons.

  4. EDITORIAL: Politically correct physics?

    NASA Astrophysics Data System (ADS)

    Pople Deputy Editor, Stephen

    1997-03-01

    If you were a caring, thinking, liberally minded person in the 1960s, you marched against the bomb, against the Vietnam war, and for civil rights. By the 1980s, your voice was raised about the destruction of the rainforests and the threat to our whole planetary environment. At the same time, you opposed discrimination against any group because of race, sex or sexual orientation. You reasoned that people who spoke or acted in a discriminatory manner should be discriminated against. In other words, you became politically correct. Despite its oft-quoted excesses, the political correctness movement sprang from well-founded concerns about injustices in our society. So, on balance, I am all for it. Or, at least, I was until it started to invade science. Biologists were the first to feel the impact. No longer could they refer to 'higher' and 'lower' orders, or 'primitive' forms of life. To the list of undesirable 'isms' - sexism, racism, ageism - had been added a new one: speciesism. Chemists remained immune to the PC invasion, but what else could you expect from a group of people so steeped in tradition that their principal unit, the mole, requires the use of the thoroughly unreconstructed gram? Now it is the turn of the physicists. This time, the offenders are not those who talk disparagingly about other people or animals, but those who refer to 'forms of energy' and 'heat'. Political correctness has evolved into physical correctness. I was always rather fond of the various forms of energy: potential, kinetic, chemical, electrical, sound and so on. My students might merge heat and internal energy into a single, fuzzy concept loosely associated with moving molecules. They might be a little confused at a whole new crop of energies - hydroelectric, solar, wind, geothermal and tidal - but they could tell me what devices turned chemical energy into electrical energy, even if they couldn't quite appreciate that turning tidal energy into geothermal energy wasn't part of the

  5. Temperature Corrected Bootstrap Algorithm

    NASA Technical Reports Server (NTRS)

    Comiso, Joey C.; Zwally, H. Jay

    1997-01-01

    A temperature corrected Bootstrap Algorithm has been developed using Nimbus-7 Scanning Multichannel Microwave Radiometer data in preparation for the upcoming AMSR instrument aboard ADEOS and EOS-PM. The procedure first calculates the effective surface emissivity using emissivities of ice and water at 6 GHz and a mixing formulation that utilizes ice concentrations derived with the current Bootstrap algorithm, but using brightness temperatures from the 6 GHz and 37 GHz channels. These effective emissivities are then used to calculate surface ice temperatures, which in turn are used to convert the 18 GHz and 37 GHz brightness temperatures to emissivities. Ice concentrations are then derived using the same technique as the Bootstrap algorithm, but using emissivities instead of brightness temperatures. The results show significant improvement in areas where ice temperature is expected to vary considerably, such as near the continental areas in the Antarctic, where the ice temperature is colder than average, and in marginal ice zones.
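
    The emissivity mixing and temperature-correction steps described above can be sketched in a few lines; the emissivity values, ice concentration, and brightness temperatures below are illustrative placeholders, not calibrated SMMR coefficients:

```python
def effective_emissivity(ice_conc, e_ice, e_water):
    """Linear mixing of ice and open-water emissivities, weighted by
    an ice concentration (0..1) from the current Bootstrap algorithm."""
    return ice_conc * e_ice + (1.0 - ice_conc) * e_water

def brightness_to_emissivity(tb_kelvin, t_surface_kelvin):
    """Convert a brightness temperature to an emissivity once a surface
    (ice) temperature estimate is available."""
    return tb_kelvin / t_surface_kelvin

# Illustrative numbers only:
e_eff = effective_emissivity(0.8, e_ice=0.92, e_water=0.45)   # ~0.826
t_ice = 215.0 / e_eff      # surface temperature inferred from a 6 GHz Tb
e_37 = brightness_to_emissivity(210.0, t_ice)   # 37 GHz emissivity
```

    Ice concentration would then be retrieved from `e_37` (and an 18 GHz counterpart) with the same cluster-based scheme the Bootstrap algorithm applies to brightness temperatures.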

  6. Electronic measurement correction devices

    SciTech Connect

    Mahns, R.R.

    1984-04-01

    The electronics semi-conductor revolution has touched every industry and home in the nation. The gas industry is no exception. Sophisticated gas measurement instrumentation has been with us for several decades now, but only in the last 10 years or so has it really begun to boom. First marketed were the flow computers dedicated to orifice meter measurement; but with steadily decreasing manufacturing costs, electronic instrumentation is now moving into the area of base volume, pressure and temperature correction previously handled almost solely by mechanical integrating instruments. This paper takes a brief look at some of the features of the newcomers on the market and how they stack up against the old standby mechanical base volume/pressure/temperature correctors.

  7. Therapeutic Vaccines for Chronic Infections

    NASA Astrophysics Data System (ADS)

    Autran, Brigitte; Carcelain, Guislaine; Combadiere, Béhazine; Debre, Patrice

    2004-07-01

    Therapeutic vaccines aim to prevent severe complications of a chronic infection by reinforcing host defenses when some immune control, albeit insufficient, can already be demonstrated and when a conventional antimicrobial therapy either is not available or has limited efficacy. We focus on the rationale and challenges behind this still controversial strategy and provide examples from three major chronic infectious diseases-human immunodeficiency virus, hepatitis B virus, and human papillomavirus-for which the efficacy of therapeutic vaccines is currently being evaluated.

  8. [Therapeutic touch and anorexia nervosa].

    PubMed

    Satori, Nadine

    2016-01-01

    An innovative practice, therapeutic touch has been used for around ten years in the treatment of eating disorders. Delivered by nurse clinicians having received specific training, this approach is based on nursing diagnoses which identify the major symptoms of this pathology. The support is built around the body and its perceptions. Through the helping relationship, it mobilises the patient's resources to favour a relationship of trust, a letting-go, physical, psychological and emotional relaxation, and improves the therapeutic alliance. PMID:27615696

  9. Using rule-based shot dose assignment in model-based MPC applications

    NASA Astrophysics Data System (ADS)

    Bork, Ingo; Buck, Peter; Wang, Lin; Müller, Uwe

    2014-10-01

    Shrinking feature sizes and the need for tighter CD (Critical Dimension) control require the introduction of new technologies in mask making processes. One of those methods is the dose assignment of individual shots on VSB (Variable Shaped Beam) mask writers to compensate for CD non-linearity effects and improve dose edge slope. By using increased dose levels only for the most critical features, generally only for the smallest CDs on a mask, the change in mask write time is minimal while the increase in image quality can be significant. This paper describes a method combining rule-based shot dose assignment with model-based shot size correction. This combination proves to be very efficient in correcting mask linearity errors while also improving the dose edge slope of small features. Shot dose assignment is based on tables assigning certain dose levels to a range of feature sizes. The dose-to-feature-size assignment is derived from mask measurements in such a way that shape corrections are kept to a minimum. For example, if a 50nm drawn line on mask results in a 45nm chrome line using nominal dose, a dose level is chosen which is closest to getting the line back on target. Since CD non-linearity is different for lines, line-ends and contacts, different tables are generated for the different shape categories. The actual dose assignment is done via DRC rules in a pre-processing step before executing the shape correction in the MPC engine. Dose assignment to line ends can be restricted to critical line/space dimensions since it might not be required for all line ends. In addition, adding dose assignment to a wide range of line ends might increase shot count, which is undesirable. The dose assignment algorithm is very flexible and can be adjusted based on the type of layer and the best balance between accuracy and shot count. These methods can be optimized for the number of dose levels available for specific mask writers. The MPC engine now needs to be able to handle different dose
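
    The table-driven dose assignment can be illustrated with a small sketch; the CD ranges and dose factors below are hypothetical, not values from an actual mask process:

```python
# Hypothetical table: (min_cd_nm, max_cd_nm) -> relative dose factor.
# In practice these entries are derived from mask CD measurements so
# that shape corrections stay minimal; separate tables would exist
# for lines, line-ends, and contacts.
DOSE_TABLE_LINES = [
    ((0, 60), 1.20),    # smallest lines receive the largest dose boost
    ((60, 80), 1.10),
    ((80, 120), 1.05),
]

def assign_dose(cd_nm, table, default=1.0):
    """Return the dose factor for a drawn feature size in nanometres."""
    for (lo, hi), dose in table:
        if lo <= cd_nm < hi:
            return dose
    return default          # features outside the table get nominal dose

assign_dose(50, DOSE_TABLE_LINES)    # -> 1.2
assign_dose(200, DOSE_TABLE_LINES)   # -> 1.0
```

    In the flow described above, this lookup would run as a DRC-rule pre-processing pass, with the MPC engine applying the model-based shape correction afterwards.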

  10. Elastic therapeutic tape: do they have the same material properties?

    PubMed Central

    Boonkerd, Chuanpis; Limroongreungrat, Weerawat

    2016-01-01

    [Purpose] Elastic therapeutic tape has been widely used for rehabilitation and treatment of sports injuries. Tapes with different elastic properties serve different treatment purposes, and inappropriate tension reduces tape effectiveness. Many tapes are available on the market, but studies on tape properties are limited. The aim of this study was to examine the material properties of elastic therapeutic tape. [Subjects and Methods] Brands of elastic therapeutic tape included KinesioTex®, ATex, Mueller, 3M, and ThaiTape. The Material Testing System Insight® 1 Electromechanical Testing Systems was used to apply a tensile force on elastic therapeutic tape. Ten specimens of each brand were tested. Stress, load, and Young’s modulus at 25%, 50%, 75%, 100%, and maximum point were collected. One-way analysis of variance with post hoc testing was used to analyze tape parameters. [Results] Maximum elongation and Young’s modulus at all percentages were significantly different between brands. There were no differences in maximum load and maximum stress. [Conclusion] Mechanical properties are different for commercial elastic therapeutic tapes. Physiotherapists and other clinicians should be aware of mechanical tape properties to correctly apply kinesio tape. PMID:27190472

  11. Elastic therapeutic tape: do they have the same material properties?

    PubMed

    Boonkerd, Chuanpis; Limroongreungrat, Weerawat

    2016-04-01

    [Purpose] Elastic therapeutic tape has been widely used for rehabilitation and treatment of sports injuries. Tapes with different elastic properties serve different treatment purposes, and inappropriate tension reduces tape effectiveness. Many tapes are available on the market, but studies on tape properties are limited. The aim of this study was to examine the material properties of elastic therapeutic tape. [Subjects and Methods] Brands of elastic therapeutic tape included KinesioTex(®), ATex, Mueller, 3M, and ThaiTape. The Material Testing System Insight(®) 1 Electromechanical Testing Systems was used to apply a tensile force on elastic therapeutic tape. Ten specimens of each brand were tested. Stress, load, and Young's modulus at 25%, 50%, 75%, 100%, and maximum point were collected. One-way analysis of variance with post hoc testing was used to analyze tape parameters. [Results] Maximum elongation and Young's modulus at all percentages were significantly different between brands. There were no differences in maximum load and maximum stress. [Conclusion] Mechanical properties are different for commercial elastic therapeutic tapes. Physiotherapists and other clinicians should be aware of mechanical tape properties to correctly apply kinesio tape. PMID:27190472

  12. Algebraic Flux Correction II

    NASA Astrophysics Data System (ADS)

    Kuzmin, Dmitri; Möller, Matthias; Gurris, Marcel

    Flux limiting for hyperbolic systems requires a careful generalization of the design principles and algorithms introduced in the context of scalar conservation laws. In this chapter, we develop FCT-like algebraic flux correction schemes for the Euler equations of gas dynamics. In particular, we discuss the construction of artificial viscosity operators, the choice of variables to be limited, and the transformation of antidiffusive fluxes. An a posteriori control mechanism is implemented to make the limiter failsafe. The numerical treatment of initial and boundary conditions is discussed in some detail. The initialization is performed using an FCT-constrained L2 projection. The characteristic boundary conditions are imposed in a weak sense, and an approximate Riemann solver is used to evaluate the fluxes on the boundary. We also present an unconditionally stable semi-implicit time-stepping scheme and an iterative solver for the fully discrete problem. The results of a numerical study indicate that the nonlinearity and non-differentiability of the flux limiter do not inhibit steady state convergence even in the case of strongly varying Mach numbers. Moreover, the convergence rates improve as the pseudo-time step is increased.

  13. Thermodynamics of Error Correction

    NASA Astrophysics Data System (ADS)

    Sartori, Pablo; Pigolotti, Simone

    2015-10-01

    Information processing at the molecular scale is limited by thermal fluctuations. This can cause undesired consequences in copying information since thermal noise can lead to errors that can compromise the functionality of the copy. For example, a high error rate during DNA duplication can lead to cell death. Given the importance of accurate copying at the molecular scale, it is fundamental to understand its thermodynamic features. In this paper, we derive a universal expression for the copy error as a function of entropy production and work dissipated by the system during wrong incorporations. Its derivation is based on the second law of thermodynamics; hence, its validity is independent of the details of the molecular machinery, be it any polymerase or artificial copying device. Using this expression, we find that information can be copied in three different regimes. In two of them, work is dissipated to either increase or decrease the error. In the third regime, the protocol extracts work while correcting errors, reminiscent of a Maxwell demon. As a case study, we apply our framework to study a copy protocol assisted by kinetic proofreading, and show that it can operate in any of these three regimes. We finally show that, for any effective proofreading scheme, error reduction is limited by the chemical driving of the proofreading reaction.

  14. Labyrinth walking in corrections.

    PubMed

    Zucker, Donna M; Sharma, Amy

    2012-02-01

    A 6-week labyrinth walking program was pilot tested in a correctional setting, with goals to: 1) determine the feasibility of a labyrinth walking curriculum; 2) pilot test measures of health-related quality of life (QOL) (pre- and post-surveys) and blood pressure; and 3) examine the influence of relationship-centered teaching on subject satisfaction. Relational communication was used as a framework for this study, emphasizing concepts of trust, competency and similarity in the teacher. A pretest/posttest descriptive design was used. The sample was 14 offenders at a Massachusetts county jail. The intervention included six 90-minute sessions, composed of a lecture, a labyrinth walk, and journal writing. Measures included a demographic survey; pre- and post-session walk blood pressures; pre- and post-program QOL measures; and a post-program measure of satisfaction. The sample was 57% Caucasian, 36% Hispanic, and 7% African American, with an average age of 34, mostly high school educated and single. Drug of choice was alcohol, with age of first use at 12 and a half years. Seventy-nine percent were previously incarcerated more than twice. QOL data were unchanged from pre to post. BP data trended in a healthy direction from weeks 1 to 6. Satisfaction with the teacher and the program was high. The labyrinth walking pilot program proved feasible, low cost and satisfying for the participants. Recommendations for future studies are discussed. PMID:22468660

  15. Model based control of polymer composite manufacturing processes

    NASA Astrophysics Data System (ADS)

    Potaraju, Sairam

    2000-10-01

    The objective of this research is to develop tools that help process engineers design, analyze and control polymeric composite manufacturing processes to achieve higher productivity and cost reduction. Current techniques for process design and control of composite manufacturing suffer from the paucity of good process models that can accurately represent these non-linear systems. Existing models developed by researchers in the past are designed to be process and operation specific, hence generating new simulation models is time consuming and requires significant effort. To address this issue, an Object Oriented Design (OOD) approach is used to develop a component-based model building framework. Process models for two commonly used industrial processes (Injected Pultrusion and Autoclave Curing) are developed using this framework to demonstrate the flexibility. Steady state and dynamic validation of this simulator is performed using a bench scale injected pultrusion process. This simulator could not be implemented online for control due to computational constraints. Models that are fast enough for online implementation, with nearly the same degree of accuracy, are developed using a two-tier scheme. First, lower dimensional models that capture the essential resin flow, heat transfer and cure kinetics important from a process monitoring and control standpoint are formulated. The second step is to reduce these low dimensional models to Reduced Order Models (ROM) suited for online model based estimation, control and optimization. Model reduction is carried out using the Proper Orthogonal Decomposition (POD) technique in conjunction with a Galerkin formulation procedure. Subsequently, a nonlinear model-based estimation and inferential control scheme based on the ROM is implemented. In particular, this research work contributes in the following general areas: (1) Design and implementation of versatile frameworks for modeling and simulation of manufacturing processes using object

  16. Broadband model-based processing for shallow ocean environments

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1998-07-01

    Most acoustic sources found in the ocean environment are spatially complex and broadband. In the case of shallow water propagation, these source characteristics complicate the analysis of received acoustic data considerably. A common approach to the broadband problem is to decompose the received signal into a set of narrow-band lines. This then allows the problem to be treated as a multiplicity of narrow-band problems. Here a model-based approach is developed for the processing of data received on a vertical array from a broadband source where it is assumed that the propagation is governed by the normal-mode model. The goal of the processor is to provide an enhanced (filtered) version of the pressure at the array and the modal functions. Thus a pre-processor is actually developed, since one could think of several applications for these enhanced quantities such as localization, modal estimation, etc. It is well-known that in normal-mode theory a different modal structure evolves for each temporal frequency; thus it is not surprising that the model-based solution to this problem results in a scheme that requires a "bank" of narrow-band model-based processors, each with its own underlying modal structure for the narrow frequency band it operates over. The "optimal" Bayesian solution to the broadband pressure field enhancement and modal function extraction problem is developed. It is shown how this broadband processor can be implemented (using a suboptimal scheme) in pseudo real time due to its inherent parallel structure. A set of noisy broadband data is synthesized to demonstrate how to construct the processor and achieve a minimum variance (optimal Bayesian) design. It is shown that both broadband pressure-field and modal function estimates can be extracted, illustrating the feasibility of this approach. © 1998 Acoustical Society of America.

  17. The Design of Model-Based Training Programs

    NASA Technical Reports Server (NTRS)

    Polson, Peter; Sherry, Lance; Feary, Michael; Palmer, Everett; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper proposes a model-based training program for the skills necessary to operate advanced avionics systems that incorporate advanced autopilots and flight management systems. The training model is based on a formalism, the operational procedure model, that represents the mission model, the rules, and the functions of a modern avionics system. This formalism has been defined such that it can be understood and shared by pilots, the avionics software, and design engineers. Each element of the software is defined in terms of its intent (What?), the rationale (Why?), and the resulting behavior (How?). The Advanced Computer Tutoring project at Carnegie Mellon University has developed a type of model-based, computer aided instructional technology called cognitive tutors. They summarize numerous studies showing that training to a specified level of competence can be achieved in one third the time of conventional classroom instruction. We are developing a similar model-based training program for the skills necessary to operate the avionics. The model that underlies the instructional program and simulates the effects of pilot entries and the behavior of the avionics is based on the operational procedure model. Pilots are given a series of vertical flightpath management problems. Entries that result in violations, such as failure to make a crossing restriction or violating the speed limits, result in error messages with instruction. At any time, the flightcrew can request suggestions on the appropriate set of actions. A similar and successful training program for basic skills for the FMS on the Boeing 737-300 was developed and evaluated. The results strongly support the claim that the training methodology can be adapted to the cockpit.

  18. A model-based multisensor data fusion knowledge management approach

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-06-01

    A variety of approaches exist for combining data from multiple sensors. The model-based approach combines data based on its support for or refutation of elements of the model which in turn can be used to evaluate an experimental thesis. This paper presents a collection of algorithms for mapping various types of sensor data onto a thesis-based model and evaluating the truth or falsity of the thesis, based on the model. The use of this approach for autonomously arriving at findings and for prioritizing data are considered. Techniques for updating the model (instead of arriving at a true/false assertion) are also discussed.
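
    A minimal sketch of the support/refutation aggregation described above; the score convention and decision thresholds are assumptions for illustration, not taken from the paper:

```python
def evaluate_thesis(supports):
    """Aggregate per-sensor evidence scores (+1 fully supports the
    thesis, -1 fully refutes it, values in between for partial
    evidence) into a True/False/None verdict, where None means
    'inconclusive -- prioritize collecting more data'."""
    if not supports:
        return None
    score = sum(supports) / len(supports)
    if score > 0.25:
        return True
    if score < -0.25:
        return False
    return None

evaluate_thesis([0.9, 0.7, -0.1])   # -> True  (net support)
evaluate_thesis([0.2, -0.3])        # -> None  (inconclusive)
```

    Updating the model instead of issuing a verdict, as the paper also discusses, would amount to feeding the same scores back into the model elements rather than thresholding them.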

  19. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. A statistical significance of the modeling error signal is calculated to provide an error significance, and a persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
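
    The residual-significance-persistence pipeline can be sketched as follows; the noise level and both thresholds are assumed constants for illustration, not values from the patent:

```python
class AnomalyDetector:
    """Model-based residual monitor: flag a structural anomaly only when
    the modeling error stays statistically significant for longer than a
    persistence threshold (so isolated noisy samples do not trigger it)."""

    def __init__(self, sigma=1.0, z_crit=3.0, persistence=5):
        self.sigma = sigma            # expected residual standard deviation
        self.z_crit = z_crit          # significance threshold (z-score)
        self.persistence = persistence
        self.run = 0                  # consecutive significant samples

    def update(self, measured, predicted):
        z = abs(measured - predicted) / self.sigma   # modeling error signal
        self.run = self.run + 1 if z > self.z_crit else 0
        return self.run > self.persistence           # True -> anomaly

det = AnomalyDetector()
flags = [det.update(m, 0.0) for m in [0.1] * 3 + [5.0] * 10]
# brief excursions are tolerated; only the sustained residual flags
```
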

  20. Model-Based Information Extraction From Synthetic Aperture Radar Signals

    NASA Astrophysics Data System (ADS)

    Matzner, Shari A.

    2011-07-01

    Synthetic aperture radar (SAR) is a remote sensing technology for imaging areas of the earth's surface. SAR has been successfully used for monitoring characteristics of the natural environment such as land cover type and tree density. With the advent of higher resolution sensors, it is now theoretically possible to extract information about individual structures such as buildings from SAR imagery. This information could be used for disaster response and security-related intelligence. SAR has an advantage over other remote sensing technologies for these applications because SAR data can be collected during the night and in rainy or cloudy conditions. This research presents a model-based method for extracting information about a building -- its height and roof slope -- from a single SAR image. Other methods require multiple images or ancillary data from specialized sensors, making them less practical. The model-based method uses simulation to match a hypothesized building to an observed SAR image. The degree to which a simulation matches the observed data is measured by mutual information. The success of this method depends on the accuracy of the simulation and on the reliability of the mutual information similarity measure. Electromagnetic theory was applied to relate a building's physical characteristics to the features present in a SAR image. This understanding was used to quantify the precision of building information contained in SAR data, and to identify the inputs needed for accurate simulation. A new SAR simulation technique was developed to meet the accuracy and efficiency requirements of model-based information extraction. Mutual information, a concept from information theory, has become a standard for measuring the similarity between medical images. Its performance in the context of matching a simulation image to a SAR image was evaluated in this research, and it was found to perform well under certain conditions. The factors that affect its performance
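
    The mutual-information similarity measure used to match a simulation to an observed SAR image can be sketched with a generic joint-histogram estimator (the standard estimator, not necessarily the author's exact implementation):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Estimate mutual information between two equally sized images
    from their joint grey-level histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                 # joint probability
    px = pxy.sum(axis=1, keepdims=True)       # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of img_b
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
observed = rng.random((64, 64))
# An image matches itself far better than an unrelated one:
mi_self = mutual_information(observed, observed)
mi_other = mutual_information(observed, rng.random((64, 64)))
```

    In the matching loop, the hypothesized building parameters would be varied and the simulation maximizing this measure against the observed image retained.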

  1. Temporal and contextual knowledge in model-based expert systems

    NASA Technical Reports Server (NTRS)

    Toth-Fejel, Tihamer; Heher, Dennis

    1987-01-01

    A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP which, when executed, verifies that the domain expert did not make any mistakes. The Achilles' heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.

  2. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software. PMID:21142522

  3. A parametric vocal fold model based on magnetic resonance imaging.

    PubMed

    Wu, Liang; Zhang, Zhaoyan

    2016-08-01

    This paper introduces a parametric three-dimensional body-cover vocal fold model based on magnetic resonance imaging (MRI) of the human larynx. Major geometric features that are observed in the MRI images but missing in current vocal fold models are discussed, and their influence on vocal fold vibration is evaluated using eigenmode analysis. Proper boundary conditions for the model are also discussed. Based on control parameters corresponding to anatomic landmarks that can be easily measured, this model can be adapted toward a subject-specific vocal fold model for voice production research and clinical applications. PMID:27586774

  4. A model-based executive for commanding robot teams

    NASA Technical Reports Server (NTRS)

    Barrett, Anthony

    2005-01-01

    The paper presents a way to robustly command a system of systems as a single entity. Instead of modeling each component system in isolation and then manually crafting interaction protocols, this approach starts with a model of the collective population as a single system. By compiling the model into separate elements for each component system and utilizing a teamwork model for coordination, it circumvents the complexities of manually crafting robust interaction protocols. The resulting systems are both globally responsive by virtue of a team oriented interaction model and locally responsive by virtue of a distributed approach to model-based fault detection, isolation, and recovery.

  5. Spring-Model-Based Wireless Localization in Cooperative User Environments

    NASA Astrophysics Data System (ADS)

    Ke, Wei; Wu, Lenan; Qi, Chenhao

    To overcome the shortcomings of conventional cellular positioning, a novel cooperative location algorithm that uses the available peer-to-peer communication between the mobile terminals (MTs) is proposed. The main idea behind the proposed approach is to incorporate the long- and short-range location information to improve the estimation of the MT's coordinates. Since short-range communications among MTs are characterized by high line-of-sight (LOS) probability, an improved spring-model-based cooperative location method can be exploited to provide low-cost improvement for cellular-based location in the non-line-of-sight (NLOS) environments.
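
    A toy version of the spring relaxation: virtual springs between the position estimate and each neighbor pull the estimate toward the measured ranges. The anchor positions here stand in for cooperating terminals with known coordinates, and the step size and iteration count are arbitrary choices:

```python
import math

def spring_localize(anchors, dists, start=(0.0, 0.0), step=0.1, iters=500):
    """Estimate a 2-D position by relaxing virtual springs whose rest
    lengths are the measured ranges to known positions."""
    x, y = start
    for _ in range(iters):
        fx = fy = 0.0
        for (ax, ay), d in zip(anchors, dists):
            dx, dy = x - ax, y - ay
            r = math.hypot(dx, dy) or 1e-9      # current distance
            # Hooke-like force: stretched springs (r > d) pull inward,
            # compressed springs (r < d) push outward.
            fx -= (r - d) * dx / r
            fy -= (r - d) * dy / r
        x += step * fx
        y += step * fy
    return x, y

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
dists = [math.hypot(true_pos[0] - ax, true_pos[1] - ay) for ax, ay in anchors]
est = spring_localize(anchors, dists, start=(5.0, 5.0))   # ~ (3.0, 4.0)
```

    With noisy NLOS-biased cellular ranges and LOS-dominated peer-to-peer ranges, the short-range springs would simply be given larger weights in the force sum.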

  6. Evaluating model accuracy for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Roden, Joseph

    1992-01-01

    Described here is an approach to automatically assessing the accuracy of various components of a model. In this approach, actual data from the operation of a target system is used to drive statistical measures to evaluate the prediction accuracy of various portions of the model. We describe how these statistical measures of model accuracy can be used in model-based reasoning for monitoring and design. We then describe the application of these techniques to the monitoring and design of the water recovery system of the Environmental Control and Life Support System (ECLSS) of Space Station Freedom.
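
    A minimal sketch of such a per-component accuracy measure, using root-mean-square prediction error; the component names and data are invented for illustration:

```python
import math

def component_accuracy(predicted, observed):
    """Root-mean-square prediction error for each model component,
    computed from operational data of the target system."""
    return {
        name: math.sqrt(
            sum((p - o) ** 2 for p, o in zip(values, observed[name]))
            / len(values)
        )
        for name, values in predicted.items()
    }

pred = {"pump_flow": [1.0, 1.1, 0.9], "heater_temp": [40.0, 41.0, 40.0]}
obs = {"pump_flow": [1.0, 1.0, 1.0], "heater_temp": [42.0, 41.0, 42.0]}
component_accuracy(pred, obs)
# pump_flow RMSE ~ 0.082, heater_temp RMSE ~ 1.633
```

    A monitoring or design procedure would then weight each model component's predictions by these accuracy scores.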

  7. A Cyber-Attack Detection Model Based on Multivariate Analyses

    NASA Astrophysics Data System (ADS)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model based on two multivariate-analysis methods applied to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via quantification method IV, and group similar audit event sequences together based on the cluster analysis. It is shown in simulation experiments that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.

  8. Gravitational correction to vacuum polarization

    NASA Astrophysics Data System (ADS)

    Jentschura, U. D.

    2015-02-01

    We consider the gravitational correction to (electronic) vacuum polarization in the presence of a gravitational background field. The Dirac propagators for the virtual fermions are modified to include the leading gravitational correction (potential term), which corresponds to a coordinate-dependent fermion mass. The mass term is assumed to be uniform over a length scale commensurate with the virtual electron-positron pair. The on-mass-shell renormalization condition ensures that the gravitational correction vanishes on the mass shell of the photon, i.e., the speed of light is unaffected by the quantum field theoretical loop correction, in full agreement with the equivalence principle. Nontrivial corrections are obtained for off-shell, virtual photons. We compare our findings to other works on generalized Lorentz transformations and combined quantum-electrodynamic gravitational corrections to the speed of light that have recently appeared in the literature.

  9. [Therapeutic potential of optogenetic neuromodulation].

    PubMed

    Vandecasteele, Marie; Senova, Yann-Suhan; Palfi, Stéphane; Dugué, Guillaume P

    2015-04-01

    Optogenetic neuromodulation techniques, which have emerged during the last 15 years, have considerably enhanced our ability to probe the functioning of neural circuits by allowing the excitation and inhibition of genetically-defined neuronal populations using light. Having gained tremendous popularity in the field of fundamental neuroscience, these techniques are now opening new therapeutic avenues. Optogenetic neuromodulation is a method of choice for studying the pathophysiology of neurological and neuropsychiatric disorders in a range of animal models, and could accelerate the discovery of new therapeutic strategies. New therapeutic protocols employing optogenetic neuromodulation may also emerge in the near future, offering promising alternative approaches for disorders which lack appropriate treatments, such as pharmacoresistant epilepsy and inherited retinal degeneration. PMID:25958759

  10. Therapeutic cloning and reproductive liberty.

    PubMed

    Sparrow, Robert

    2009-04-01

    Concern for "reproductive liberty" suggests that decisions about embryos should normally be made by the persons who would be the genetic parents of the child that would be brought into existence if the embryo were brought to term. Therapeutic cloning would involve creating and destroying an embryo, which, if brought to term, would be the offspring of the genetic parents of the person undergoing therapy. I argue that central arguments in debates about parenthood and genetics therefore suggest that therapeutic cloning would be prima facie unethical unless it occurred with the consent of the parents of the person being cloned. Alternatively, if therapeutic cloning is thought to be legitimate, this undermines the case for some uses of reproductive cloning by implying that the genetic relation it establishes between clones and DNA donors does not carry the same moral weight as it does in cases of normal reproduction. PMID:19240247

  11. Oligonucleotide conjugates for therapeutic applications

    PubMed Central

    Winkler, Johannes

    2013-01-01

    Insufficient pharmacokinetic properties and poor cellular uptake are the main hurdles for successful therapeutic development of oligonucleotide agents. The covalent attachment of various ligands designed to influence the biodistribution and cellular uptake or for targeting specific tissues is an attractive possibility to advance therapeutic applications and to expand development options. In contrast to advanced formulations, which often consist of multiple reagents and are sensitive to a variety of preparation conditions, oligonucleotide conjugates are defined molecules, enabling structure-based analytics and quality control techniques. This review gives an overview of current developments of oligonucleotide conjugates for therapeutic applications. Attached ligands comprise peptides, proteins, carbohydrates, aptamers and small molecules, including cholesterol, tocopherol and folic acid. Important linkage types and conjugation methods are summarized. The distinct ligands directly influence biochemical parameters, uptake mechanisms and pharmacokinetic properties. PMID:23883124

  12. Proximity corrected accurate in-die registration metrology

    NASA Astrophysics Data System (ADS)

    Daneshpanah, M.; Laske, F.; Wagner, M.; Roeth, K.-D.; Czerkas, S.; Yamaguchi, H.; Fujii, N.; Yoshikawa, S.; Kanno, K.; Takamizawa, H.

    2014-07-01

    193nm immersion lithography is the mainstream production technology for the 20nm and 14nm logic nodes. Multi-patterning of an increasing number of critical layers puts extreme pressure on wafer intra-field overlay, to which mask registration error is a major contributor [1]. The International Technology Roadmap for Semiconductors (ITRS [2]) requests a registration error below 4 nm for each mask of a multi-patterning set forming one layer on the wafer. For mask metrology at the 20nm and 14nm logic nodes, maintaining a precision-to-tolerance (P/T) ratio below 0.25 will be very challenging. Full characterization of mask registration errors in the active area of the die will become mandatory. It is well-known that differences in pattern density and asymmetries in the immediate neighborhood of a feature give rise to apparent shifts in position when measured by optical metrology systems, so-called optical proximity effects. These effects can easily be similar in magnitude to real mask placement errors, and, if uncorrected, can result in mis-qualification of the mask. Metrology results from KLA-Tencor's next-generation mask metrology system are reported, applying a model-based algorithm [3] which includes corrections for proximity errors. The proximity-corrected, model-based measurements are compared to standard measurements, and a methodology is presented that verifies the correction performance of the new algorithm.

  13. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of a program to transform natural language specifications into formal notation, specifically to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, the learning curve for specification languages and associated tools is high, and increased schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulating correctness properties for system models can be a difficult problem. This is relevant to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technological transfer potential, and next steps.
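    Template-based translation from restricted English to LTL can be illustrated with a toy pattern matcher. The patterns, the LTL syntax chosen, and the function `to_ltl` are invented for illustration; the presentation's actual translation approach is not specified here:

```python
import re

# Hypothetical mini-translator: a few English temporal patterns and
# the LTL templates they instantiate (\1, \2 are the captured atoms).
PATTERNS = [
    (re.compile(r"^(\w+) is always followed by (\w+)$"), r"G(\1 -> F \2)"),
    (re.compile(r"^(\w+) never occurs$"), r"G(!\1)"),
    (re.compile(r"^(\w+) eventually occurs$"), r"F(\1)"),
]

def to_ltl(sentence):
    """Map a natural-language temporal requirement to an LTL property
    by matching it against known sentence templates."""
    s = sentence.strip().lower()
    for pat, tmpl in PATTERNS:
        m = pat.match(s)
        if m:
            return m.expand(tmpl)
    raise ValueError("no template matches: " + sentence)
```

    For example, "request is always followed by grant" instantiates the response template G(request -> F grant); sentences outside the template set are rejected rather than guessed at.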

  14. The Earned-Time System: A Performance-Based Correctional Management Model.

    ERIC Educational Resources Information Center

    Nosin, Jerome Alan

    Utilizing a social learning approach, the Georgia Department of Offender Rehabilitation has implemented a performance-based correctional management model based on the assumption that only self-rehabilitation is viable. The Earned Time System (ETS) provides resources and motivational opportunities for inmates to assume personal responsibility for their…

  15. Therapeutic target database update 2014: a resource for targeted therapeutics

    PubMed Central

    Qin, Chu; Zhang, Cheng; Zhu, Feng; Xu, Feng; Chen, Shang Ying; Zhang, Peng; Li, Ying Hong; Yang, Sheng Yong; Wei, Yu Quan; Tao, Lin; Chen, Yu Zong

    2014-01-01

    Here we describe an update of the Therapeutic Target Database (http://bidd.nus.edu.sg/group/ttd/ttd.asp) for better serving the bench-to-clinic communities and for enabling more convenient data access, processing and exchange. Extensive efforts from the research, industry, clinical, regulatory and management communities have been collectively directed at the discovery, investigation, application, monitoring and management of targeted therapeutics. Increasing efforts have been directed at the development of stratified and personalized medicines. These efforts may be facilitated by the knowledge of the efficacy targets and biomarkers of targeted therapeutics. Therefore, we added search tools for using the International Classification of Disease ICD-10-CM and ICD-9-CM codes to retrieve the target, biomarker and drug information (currently enabling the search of almost 900 targets, 1800 biomarkers and 6000 drugs related to 900 disease conditions). We added information on almost 1800 biomarkers for 300 disease conditions and 200 drug scaffolds for 700 drugs. We significantly expanded Therapeutic Target Database data contents to cover >2300 targets (388 successful and 461 clinical trial targets), 20 600 drugs (2003 approved and 3147 clinical trial drugs), 20 000 multitarget agents against almost 400 target-pairs and the activity data of 1400 agents against 300 cell lines. PMID:24265219

  16. QCD corrections to triboson production

    NASA Astrophysics Data System (ADS)

    Lazopoulos, Achilleas; Melnikov, Kirill; Petriello, Frank

    2007-07-01

    We present a computation of the next-to-leading order QCD corrections to the production of three Z bosons at the Large Hadron Collider. We calculate these corrections using a completely numerical method that combines sector decomposition to extract infrared singularities with contour deformation of the Feynman parameter integrals to avoid internal loop thresholds. The NLO QCD corrections to pp→ZZZ are approximately 50% and are badly underestimated by the leading order scale dependence. However, the kinematic dependence of the corrections is minimal in phase space regions accessible at leading order.

  17. Entropic Corrections to Coulomb's Law

    NASA Astrophysics Data System (ADS)

    Hendi, S. H.; Sheykhi, A.

    2012-04-01

    Two well-known quantum corrections to the area law have been introduced in the literature, namely logarithmic and power-law corrections. Logarithmic corrections arise from loop quantum gravity due to thermal equilibrium fluctuations and quantum fluctuations, while the power-law correction appears in dealing with the entanglement of quantum fields inside and outside the horizon. Inspired by Verlinde's argument on the entropic force, and assuming the quantum-corrected relation for the entropy, we propose an entropic origin for Coulomb's law in this note. We also investigate the Uehling potential as a radiative correction to the Coulomb potential at one-loop order and show that for some values of the distance the entropic corrections to Coulomb's law are compatible with the vacuum-polarization correction in QED. Thus, we derive the modified Coulomb's law as well as the entropy-corrected Poisson's equation governing the evolution of the scalar potential ϕ. Our study further supports the unification of gravity and electromagnetic interactions based on the holographic principle.

  18. In Situ Mosaic Brightness Correction

    NASA Technical Reports Server (NTRS)

    Deen, Robert G.; Lorre, Jean J.

    2012-01-01

    In situ missions typically have pointable, mast-mounted cameras capable of taking panoramic mosaics composed of many individual frames. While the mosaic software applies radiometric correction to the images, in many cases brightness/contrast seams still exist between frames, largely due to errors in the radiometric correction and the absence of correction for photometric effects in the mosaic processing chain. The software analyzes the overlaps between adjacent frames in the mosaic and determines correction factors for each image in an attempt to reduce or eliminate these brightness seams.
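    One common way to derive such per-image correction factors is to fit multiplicative gains from overlap statistics. The following sketch is not the JPL software's algorithm; it assumes, hypothetically, that only the mean brightness of each frame inside each overlap region is available, and solves for gains that pull overlapping means together by simple fixed-point iteration:

```python
def seam_gains(overlaps, n_frames, iters=100):
    """overlaps: list of (i, j, mean_i, mean_j) giving each frame's mean
    brightness inside the i/j overlap region. Returns per-frame gains
    minimizing the squared seam differences sum((g_i*m_i - g_j*m_j)^2),
    with frame 0 fixed as the brightness reference (gain 1)."""
    g = [1.0] * n_frames
    for _ in range(iters):
        num = [0.0] * n_frames
        den = [0.0] * n_frames
        for i, j, mi, mj in overlaps:
            # each frame should match its neighbor's corrected brightness
            num[i] += g[j] * mj * mi
            den[i] += mi * mi
            num[j] += g[i] * mi * mj
            den[j] += mj * mj
        for k in range(1, n_frames):  # frame 0 stays the reference
            if den[k] > 0:
                g[k] = num[k] / den[k]
    return g
```

    Multiplying each frame by its gain before blending removes the first-order brightness steps at the seams; residual photometric (view-angle-dependent) effects would still need a separate model.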

  19. Ocean acoustic signal processing: A model-based approach

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1992-12-01

    A model-based approach is proposed to solve the ocean acoustic signal processing problem that is based on a state-space representation of the normal-mode propagation model. It is shown that this representation can be utilized to spatially propagate both modal (depth) and range functions given the basic parameters (wave numbers, etc.) developed from the solution of the associated boundary value problem. This model is then generalized to the stochastic case where an approximate Gauss-Markov model evolves. The Gauss-Markov representation, in principle, allows the inclusion of stochastic phenomena such as noise and modeling errors in a consistent manner. Based on this framework, investigations are made of model-based solutions to the signal enhancement, detection and related parameter estimation problems. In particular, a modal/pressure field processor is designed that allows in situ recursive estimation of the sound velocity profile. Finally, it is shown that the associated residual or so-called innovation sequence that ensues from the recursive nature of this formulation can be employed to monitor the model's fit to the data and also form the basis of a sequential detector.
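    The recursive estimation and innovation-based monitoring idea can be illustrated with a scalar Gauss-Markov filter. This is a generic textbook sketch, not the paper's modal/pressure-field processor; all parameter values are illustrative:

```python
def kalman_with_innovations(zs, a=1.0, c=1.0, q=1e-3, r=0.1, x0=0.0, p0=1.0):
    """Scalar Kalman filter for the Gauss-Markov model
    x[t+1] = a*x[t] + w, z[t] = c*x[t] + v. Returns state estimates and
    the innovation (residual) sequence: for a well-matched model the
    innovations are zero-mean and white, so they monitor model fit."""
    x, p = x0, p0
    xs, innov = [], []
    for z in zs:
        # predict
        x = a * x
        p = a * p * a + q
        # innovation and its variance
        e = z - c * x
        s = c * p * c + r
        k = p * c / s
        # update
        x = x + k * e
        p = (1 - k * c) * p
        xs.append(x)
        innov.append(e)
    return xs, innov
```

    A sequential detector or model monitor would apply a whiteness test to the innovation sequence and flag the model (or a signal) when the residuals become biased or correlated.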

  20. Evaluation of Model-Based Training for Vertical Guidance Logic

    NASA Technical Reports Server (NTRS)

    Feary, Michael; Palmer, Everett; Sherry, Lance; Polson, Peter; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper summarizes the results of a study which introduces a structured, model-based approach to learning how the automated vertical guidance system works on a modern commercial air transport. The study proposes a framework to provide accurate and complete information in an attempt to eliminate confusion about "what the system is doing". This study examines a structured methodology for organizing the ideas on which the system was designed, communicating this information through the training material, and displaying it in the airplane. Previous research on model-based, computer-aided instructional technology has shown reductions in the amount of time to a specified level of competence. The lessons learned from the development of these technologies are well suited for use with the design methodology which was used to develop the vertical guidance logic for a large commercial air transport. The design methodology presents the model from which to derive the training material and the content of information to be displayed to the operator. The study consists of a 2 × 2 factorial experiment comparing a new method of training vertical guidance logic and a new type of display. The format of the material used to derive both the training and the display is provided by the Operational Procedure Methodology. The training condition compares current training material to the new structured format. The display condition involves reorganizing the content of the displayed information into pieces that agree with the concepts with which the system was designed.

  1. Model-based diagnosis of a carbon dioxide removal assembly

    NASA Astrophysics Data System (ADS)

    Throop, David R.; Scarl, Ethan A.

    1992-03-01

    Model-based diagnosis (MBD) has been applied to a variety of mechanisms, but few of these have been in fluid flow domains. Important mechanism variables in these domains are continuous, and the mechanisms commonly contain complex recycle patterns. These properties violate some of the common assumptions for MBD. The CO2 removal assembly (CDRA) for the cabin atmosphere aboard NASA's Space Station Freedom is such a mechanism. Early work on diagnosing similar mechanisms showed that purely associative diagnostic systems could not adequately handle these mechanisms' frequent reconfigurations. This suggested a model-based approach, and KATE was adapted to the domain. KATE is a constraint-based MBD shell. It has been successfully applied to liquid flow problems in handling liquid oxygen. However, that domain does not involve complex recycle streams, but the CDRA does. KATE had solved constraint sets by propagating parameter values through constraints; this method often fails on constraint sets which describe recycle systems. KATE was therefore extended to allow it to use external algebraic programs to solve its constraint sets. This paper describes the representational challenges involved in that extension, and describes adaptations which allowed KATE to work within the representational limitations imposed by those algebraic programs. It also presents preliminary results of the CDRA modeling.

  2. Model-Based Diagnostics for Propellant Loading Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Foygel, Michael; Smelyanskiy, Vadim N.

    2011-01-01

    The loading of spacecraft propellants is a complex, risky operation. Therefore, diagnostic solutions are necessary to quickly identify when a fault occurs, so that recovery actions can be taken or an abort procedure can be initiated. Model-based diagnosis solutions, established using an in-depth analysis and understanding of the underlying physical processes, offer the advanced capability to quickly detect and isolate faults, identify their severity, and predict their effects on system performance. We develop a physics-based model of a cryogenic propellant loading system, which describes the complex dynamics of liquid hydrogen filling from a storage tank to an external vehicle tank, as well as the influence of different faults on this process. The model takes into account the main physical processes such as highly nonequilibrium condensation and evaporation of the hydrogen vapor, pressurization, and also the dynamics of liquid hydrogen and vapor flows inside the system in the presence of helium gas. Since the model incorporates multiple faults in the system, it provides a suitable framework for model-based diagnostics and prognostics algorithms. Using this model, we analyze the effects of faults on the system, derive symbolic fault signatures for the purposes of fault isolation, and perform fault identification using a particle filter approach. We demonstrate the detection, isolation, and identification of a number of faults using simulation-based experiments.
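    The fault-identification step via a particle filter can be sketched in miniature. This is not the paper's cryogenic model: the one-parameter decay dynamics, the noise level, and the particle count below are stand-in assumptions used only to show sequential importance resampling over an unknown fault parameter:

```python
import math
import random

def estimate_fault(observations, dt_model, n=500, noise=0.05, seed=1):
    """SIR particle filter over an unknown fault parameter theta.
    dt_model(z, theta) propagates the state one step; observations are
    noisy measurements of the state, starting from z0 = observations[0]."""
    rng = random.Random(seed)
    particles = [(rng.uniform(0.0, 0.5), observations[0]) for _ in range(n)]
    for z_obs in observations[1:]:
        # propagate each particle with its own fault hypothesis
        moved = [(th, dt_model(z, th)) for th, z in particles]
        # weight by Gaussian measurement likelihood
        w = [math.exp(-((z - z_obs) ** 2) / (2 * noise ** 2)) for _, z in moved]
        total = sum(w) or 1e-300
        # systematic resampling: keep particles in proportion to weight
        step = total / n
        u = rng.uniform(0.0, step)
        particles, c, i = [], w[0], 0
        for _ in range(n):
            while u > c:
                i += 1
                c += w[i]
            particles.append(moved[i])
            u += step
    return sum(th for th, _ in particles) / n
```

    In a diagnosis setting the posterior over theta identifies fault severity, and its spread indicates how well the fault has been isolated by the data so far.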

  3. MODEL-BASED CLUSTERING OF LARGE NETWORKS

    PubMed Central

    Vu, Duy Q.; Hunter, David R.; Schweinberger, Michael

    2015-01-01

    We describe a network clustering framework, based on finite mixture models, that can be applied to discrete-valued networks with hundreds of thousands of nodes and billions of edge variables. Relative to other recent model-based clustering work for networks, we introduce a more flexible modeling framework, improve the variational-approximation estimation algorithm, discuss and implement standard error estimation via a parametric bootstrap approach, and apply these methods to much larger data sets than those seen elsewhere in the literature. The more flexible framework is achieved through introducing novel parameterizations of the model, giving varying degrees of parsimony, using exponential family models whose structure may be exploited in various theoretical and algorithmic ways. The algorithms are based on variational generalized EM algorithms, where the E-steps are augmented by a minorization-maximization (MM) idea. The bootstrapped standard error estimates are based on an efficient Monte Carlo network simulation idea. Last, we demonstrate the usefulness of the model-based clustering framework by applying it to a discrete-valued network with more than 131,000 nodes and 17 billion edge variables. PMID:26605002
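    The finite-mixture idea underlying model-based clustering can be shown in its simplest form. The paper's variational generalized EM with MM-augmented E-steps operates on network models; as a conceptual stand-in only, here is ordinary EM on a two-component 1-D Gaussian mixture with a shared variance (all names and values are illustrative):

```python
import math

def em_gmm2(xs, iters=100):
    """EM for a two-component 1-D Gaussian mixture with shared variance,
    the simplest instance of model-based clustering by finite mixtures."""
    mu1, mu2 = min(xs), max(xs)  # crude but deterministic initialization
    pi, var = 0.5, 1.0
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for x in xs:
            a = pi * math.exp(-((x - mu1) ** 2) / (2 * var))
            b = (1 - pi) * math.exp(-((x - mu2) ** 2) / (2 * var))
            r.append(a / (a + b))
        # M-step: update mixing weight, means, and shared variance
        n1 = sum(r)
        n2 = len(xs) - n1
        pi = n1 / len(xs)
        mu1 = sum(ri * x for ri, x in zip(r, xs)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n2
        var = sum(ri * (x - mu1) ** 2 + (1 - ri) * (x - mu2) ** 2
                  for ri, x in zip(r, xs)) / len(xs)
    return pi, mu1, mu2, var
```

    Cluster membership then falls out of the fitted responsibilities, which is the same E-step logic the network framework applies to edge variables at much larger scale.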

  4. Application of model based control to robotic manipulators

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1988-01-01

    A robot that can duplicate human motion capabilities in such activities as balancing, reaching, lifting, and moving has been built and tested. These capabilities are achieved through the use of real-time Model-Based Control (MBC) techniques which have recently been demonstrated. MBC accounts for all manipulator inertial forces and provides stable manipulator motion control even at high speeds. To effectively demonstrate the unique capabilities of MBC, an experimental robotic manipulator was constructed, which stands upright, balancing on a two-wheel base. The mathematical modeling of dynamics inherent in MBC permits the control system to perform functions that are impossible with conventional non-model-based methods. These capabilities include: (1) Stable control at all speeds of operation; (2) Operations requiring dynamic stability such as balancing; (3) Detection and monitoring of applied forces without the use of load sensors; (4) Manipulator safing via detection of abnormal loads. The full potential of MBC has yet to be realized. The experiments performed for this research are only an indication of the potential applications. MBC has no inherent stability limitations and its range of applicability is limited only by the attainable sampling rate, modeling accuracy, and sensor resolution. Manipulators could be designed to operate at the highest speed mechanically attainable without being limited by control inadequacies. Manipulators capable of operating many times faster than current machines would certainly increase productivity for many tasks.
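    The core MBC idea of using the manipulator model to cancel inertial and gravity terms can be sketched for a single link (computed-torque control). The balancing two-wheeled robot in the paper is far more complex; this single-link example, with invented gains and parameters, only illustrates the principle:

```python
import math

def simulate_computed_torque(theta_ref, t_end=2.0, dt=1e-3,
                             m=1.0, l=1.0, g=9.81, kp=100.0, kd=20.0):
    """Computed-torque control of a single link (point mass m at length l):
    the controller uses the model to cancel gravity and inertia, so the
    closed-loop tracking error obeys a stable linear second-order equation."""
    inertia = m * l * l
    th, om = 0.0, 0.0  # angle and angular rate
    for _ in range(int(t_end / dt)):
        e, edot = theta_ref - th, -om
        # model-based feedback linearization: tau = M*a_des + G(theta)
        a_des = kp * e + kd * edot
        tau = inertia * a_des + m * g * l * math.sin(th)
        # plant dynamics: inertia * th'' = tau - m*g*l*sin(theta)
        alpha = (tau - m * g * l * math.sin(th)) / inertia
        om += alpha * dt
        th += om * dt
    return th
```

    Because the gravity term is cancelled exactly by the model, the same gains give the same error dynamics at every operating point, which is the "stable at all speeds" property the abstract claims for MBC.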

  5. A probabilistic graphical model based stochastic input model construction

    SciTech Connect

    Wan, Jiang; Zabaras, Nicholas

    2014-09-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model-based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model-based stochastic input models. Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed into a Bayesian network structure learning problem. • Examples are given in flows in random media.

  6. Model-based approach to real-time target detection

    NASA Astrophysics Data System (ADS)

    Hackett, Jay K.; Gold, Ed V.; Long, Daniel T.; Cloud, Eugene L.; Duvoisin, Herbert A.

    1992-09-01

    Land mine detection and extraction from infra-red (IR) scenes using real-time parallel processing is of significant interest to ground-based infantry. The mine detection algorithms consist of several sub-processes to progress from raw input IR imagery to feature-based mine nominations. Image enhancement is first applied; this consists of noise and sensor artifact removal. Edge grouping is used to determine the boundary of the objects. The generalized Hough Transform tuned to the land mine signature acts as a model-based matched nomination filter. Once the object is found, the model is used to guide the labeling of each pixel as background, object, or object boundary. Using these labels to identify object regions, feature primitives are extracted in a high-speed parallel processor. A feature-based screener then compares each object's feature primitives to acceptable values and rejects all objects that do not resemble mines. This operation greatly reduces the number of objects that must be passed from a real-time parallel processor to the classifier. We will discuss details of this model-based approach, including results from actual IR field test imagery.
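    The generalized Hough transform nomination step can be sketched for the translation-only case. This is a simplified illustration, not the paper's implementation: the R-table below is built for a fixed orientation and scale, and the mine-signature tuning is not modeled:

```python
from collections import defaultdict

def ght_accumulate(template_pts, image_pts):
    """Generalized Hough transform, translation-only: every image edge
    point votes for all reference-point locations that would place some
    template point on top of it. The accumulator peak is the best
    supported template position. Returns ((x, y), votes)."""
    # R-table for a fixed orientation: offsets from the reference (0, 0)
    r_table = [(-tx, -ty) for tx, ty in template_pts]
    acc = defaultdict(int)
    for x, y in image_pts:
        for rx, ry in r_table:
            acc[(x + rx, y + ry)] += 1
    return max(acc.items(), key=lambda kv: kv[1])
```

    In a full detector the votes are binned over rotation and scale as well, and the peak location seeds the pixel-labeling stage described above.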

  7. Towards model-based control of Parkinson's disease

    PubMed Central

    Schiff, Steven J.

    2010-01-01

    Modern model-based control theory has led to transformative improvements in our ability to track the nonlinear dynamics of systems that we observe, and to engineer control systems of unprecedented efficacy. In parallel with these developments, our ability to build computational models to embody our expanding knowledge of the biophysics of neurons and their networks is maturing at a rapid rate. In the treatment of human dynamical disease, our employment of deep brain stimulators for the treatment of Parkinson’s disease is gaining increasing acceptance. Thus, the confluence of these three developments—control theory, computational neuroscience and deep brain stimulation—offers a unique opportunity to create novel approaches to the treatment of this disease. This paper explores the relevant state of the art of science, medicine and engineering, and proposes a strategy for model-based control of Parkinson’s disease. We present a set of preliminary calculations employing basal ganglia computational models, structured within an unscented Kalman filter for tracking observations and prescribing control. Based upon these findings, we will offer suggestions for future research and development. PMID:20368246

  8. Model-based reconstructive elasticity imaging of deep venous thrombosis.

    PubMed

    Aglyamov, Salavat; Skovoroda, Andrei R; Rubin, Jonathan M; O'Donnell, Matthew; Emelianov, Stanislav Y

    2004-05-01

    Deep venous thrombosis (DVT) and its sequela, pulmonary embolism, are a significant clinical problem. Once DVT is detected, treatment is based on the age of the clot. There are no good noninvasive methods, however, to determine clot age. Previously, we demonstrated that imaging internal mechanical strains can identify and possibly age thrombus in a deep vein. In this study the deformation geometry for DVT elasticity imaging and its effect on Young's modulus estimates is addressed. A model-based reconstruction method is presented to estimate elasticity in which the clot-containing vessel is modeled as a layered cylinder. Compared to an unconstrained approach in reconstructive elasticity imaging, the proposed model-based approach has several advantages: only one component of the strain tensor is used; the minimization procedure is very fast; the method is highly efficient because an analytic solution of the forward elastic problem is used; and the method is not very sensitive to the details of the external load pattern, a characteristic that is important for free-hand, external, surface-applied deformation. The approach was tested theoretically using a numerical model, and experimentally on both tissue-like phantoms and an animal model of DVT. Results suggest that elasticity reconstruction may prove to be a practical adjunct to triplex scanning to detect, diagnose, and stage DVT. PMID:15217230

  9. Qualitative-Modeling-Based Silicon Neurons and Their Networks

    PubMed Central

    Kohno, Takashi; Sekikawa, Munehisa; Li, Jing; Nanami, Takuya; Aihara, Kazuyuki

    2016-01-01

    The ionic conductance models of neuronal cells can finely reproduce a wide variety of complex neuronal activities. However, the complexity of these models has prompted the development of qualitative neuron models. They are described by differential equations with a reduced number of variables and their low-dimensional polynomials, which retain the core mathematical structures. Such simple models form the foundation of a bottom-up approach in computational and theoretical neuroscience. We proposed a qualitative-modeling-based approach for designing silicon neuron circuits, in which the mathematical structures in the polynomial-based qualitative models are reproduced by differential equations with silicon-native expressions. This approach can realize low-power-consuming circuits that can be configured to realize various classes of neuronal cells. In this article, our qualitative-modeling-based silicon neuron circuits for analog and digital implementations are quickly reviewed. One of our CMOS analog silicon neuron circuits can realize a variety of neuronal activities with a power consumption less than 72 nW. The square-wave bursting mode of this circuit is explained. Another circuit can realize Class I and II neuronal activities with about 3 nW. Our digital silicon neuron circuit can also realize these classes. An auto-associative memory realized on an all-to-all connected network of these silicon neurons is also reviewed, in which the neuron class plays important roles in its performance. PMID:27378842

  10. Model-based patterns in prostate cancer mortality worldwide

    PubMed Central

    Fontes, F; Severo, M; Castro, C; Lourenço, S; Gomes, S; Botelho, F; La Vecchia, C; Lunet, N

    2013-01-01

    Background: Prostate cancer mortality has been decreasing in several high-income countries, and previous studies analysed the trends mostly according to geographical criteria. We aimed to identify patterns in the time trends of prostate cancer mortality across countries using a model-based approach. Methods: Model-based clustering was used to identify patterns of variation in prostate cancer mortality (1980–2010) across 37 European countries, five non-European high-income countries and four leading emerging economies. We characterised the patterns observed regarding the geographical distribution and gross national income of the countries, as well as the trends observed in mortality/incidence ratios. Results: We identified three clusters of countries with similar variation in prostate cancer mortality: pattern 1 ('no mortality decline'), characterised by a continued increase throughout the whole period; patterns 2 ('later mortality decline') and 3 ('earlier mortality decline') depict mortality declines starting in the late and early 1990s, respectively. These clusters are also homogeneous regarding the variation in the prostate cancer mortality/incidence ratios, but are heterogeneous with respect to the geographical region of the countries and the distribution of gross national income. Conclusion: We provide a general model for the description and interpretation of the trends in prostate cancer mortality worldwide, based on three main patterns. PMID:23660943

  11. Model-based approach for human kinematics reconstruction from markerless and marker-based motion analysis systems.

    PubMed

    Sholukha, V; Bonnechere, B; Salvia, P; Moiseev, F; Rooze, M; Van Sint Jan, S

    2013-09-27

    Modeling tools related to the musculoskeletal system have been previously developed. However, the integration of the real underlying functional joint behavior is lacking and therefore available kinematic models do not reasonably replicate individual human motion. In order to improve our understanding of the relationships between muscle behavior, i.e. excursion and motion data, modeling tools must guarantee that the model of joint kinematics is correctly validated to ensure meaningful muscle behavior interpretation. This paper presents a model-based method that allows fusing accurate joint kinematic information with motion analysis data collected using either marker-based stereophotogrammetry (MBS) (i.e. bone displacement collected from reflective markers fixed on the subject's skin) or markerless single-camera (MLS) hardware. This paper describes a model-based approach (MBA) for human motion data reconstruction by a scalable registration method for combining joint physiological kinematics with limb segment poses. The presented results and kinematics analysis show that model-based MBS and MLS methods lead to physiologically-acceptable human kinematics. The proposed method is therefore available for further exploitation of the underlying model that can then be used for further modeling, the quality of which will depend on the underlying kinematic model. PMID:23972432

  12. Shadow-band correction for diffuse ultraviolet radiation measurements

    NASA Astrophysics Data System (ADS)

    Sánchez, G.; Serrano, A.; Cancillo, M. L.

    2013-05-01

    Although the correction of shadow-band measurements of total diffuse solar radiation has been extensively studied, the case of diffuse ultraviolet measurements has not been properly addressed. This study analyzes the correction factor to be applied to experimental measurements performed by adapting a shadow-band to a UV radiometer at a radiometric station in Badajoz (Spain). Three different models, based on approaches widely used for correcting total diffuse measurements, have been revised and adapted for the ultraviolet spectral range. Results reveal that some aspects of the correction proposed for total diffuse radiation are not suitable for ultraviolet diffuse radiation. The mathematical expressions are consequently modified to match the behavior in the ultraviolet range. Thus, three correction models particularized for ultraviolet diffuse measurements are proposed and validated against experimental data. The two models adapted from the original expressions proposed by Batlles et al. and Steven show the best performance, with rRMSE of 2.74% and 2.20% and rMBE of 1.53% and 0.46%, respectively.
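For reference, the validation metrics quoted in this abstract (rRMSE and rMBE) are computed here under their usual definitions, relative to the mean of the measured values; the abstract does not spell the definitions out, so this is an assumed convention:

```python
import numpy as np

def rrmse_rmbe(modeled, measured):
    """Relative RMSE and relative mean bias error, in percent of the mean measurement."""
    modeled = np.asarray(modeled, dtype=float)
    measured = np.asarray(measured, dtype=float)
    diff = modeled - measured
    mean_obs = measured.mean()
    rrmse = 100.0 * np.sqrt(np.mean(diff ** 2)) / mean_obs
    rmbe = 100.0 * diff.mean() / mean_obs
    return rrmse, rmbe
```

A positive rMBE indicates the correction model overestimates on average, while rRMSE captures overall scatter.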

  13. Therapy Talk: Analyzing Therapeutic Discourse

    ERIC Educational Resources Information Center

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  14. DYNAMIC EQUILIBRIUM IN THERAPEUTIC SITUATIONS.

    ERIC Educational Resources Information Center

    CARROLL, EDWARD J.

    The concept of dynamic equilibrium is used to examine the occurrence of change in a therapeutic interview and to propose a theory of therapy. By analyzing the workings of the psychosocial system through general systems theory, it is possible to see how change occurs in an individual, family, or community. Applied to a family interview, the model…

  15. Therapeutic role of dietary fibre.

    PubMed Central

    Hunt, R.; Fedorak, R.; Frohlich, J.; McLennan, C.; Pavilanis, A.

    1993-01-01

    The current status of dietary fibre and fibre supplements in health and disease is reported, and the components of dietary fibre and its respective mechanical and metabolic effects with emphasis on its therapeutic potential are reviewed. Practical management guidelines are provided to help physicians encourage patients identified as having fibre deficiency to increase dietary fibre intake to the recommended level. PMID:8388284

  16. Scenario Writing: A Therapeutic Application.

    ERIC Educational Resources Information Center

    Haddock, Billy D.

    1989-01-01

    Introduces scenario writing as useful therapeutic technique. Presents case study of woman in midst of divorce and custody fight to illustrate context in which technique was applied. Suggests additional applications. Concludes that good response is more likely for clients who possess good writing skills although other clients may use their own…

  17. Model-Based Signal Processing: Correlation Detection With Synthetic Seismograms

    SciTech Connect

    Rodgers, A; Harris, D; Pasyanos, M; Blair, S; Matt, R

    2006-08-30

    Recent applications of correlation methods to seismological problems illustrate the power of coherent signal processing applied to seismic waveforms. Examples of these applications include detection of low amplitude signals buried in ambient noise and cross-correlation of sets of waveforms to form event clusters and accurately measure delay times for event relocation and/or earth structure. These methods rely on the exploitation of the similarity of individual waveforms and have been successfully applied to large sets of empirical observations. However, in cases with little or no empirical event data, such as aseismic regions or exotic event types, correlation methods with observed seismograms will not be possible due to the lack of previously observed similar waveforms. This study uses model-based signals computed for three-dimensional (3D) Earth models to form the basis for correlation detection. Synthetic seismograms are computed for fully 3D models estimated from the Markov Chain Monte-Carlo (MCMC) method. MCMC uses stochastic sampling to fit multiple seismological data sets. Rather than estimate a single "optimal" model, MCMC results in a suite of models that sample the model space and incorporates uncertainty through variability of the models. The variability reflects our ignorance of Earth structure, due to limited resolution, data and modeling errors, and produces variability in the seismic waveform response. Model-based signals are combined using a subspace method where the synthetic signals are decomposed into an orthogonal basis by singular-value decomposition (SVD) and the observed waveforms are represented with a linear combination of a sub-set of eigenvectors (signals) associated with the most significant eigenvalues. We have demonstrated the method by modeling long-period (80-10 seconds) regional seismograms for a moderate (M≈5) earthquake near the China-North Korea border. Synthetic seismograms are computed with the Spectral Element Method.
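The SVD-based subspace construction described in this abstract can be sketched as follows. Function names and the simplified detection statistic (fraction of window energy captured by the subspace) are illustrative, not taken from the report:

```python
import numpy as np

def subspace_basis(synthetics, rank):
    """Orthonormal basis for the span of model-based synthetic waveforms.

    `synthetics` is an (n_signals, n_samples) array; rows are synthetic seismograms.
    """
    # Right singular vectors, ordered by singular value, span the waveform space.
    _, _, vt = np.linalg.svd(synthetics, full_matrices=False)
    return vt[:rank]                      # shape (rank, n_samples)

def detection_statistic(basis, window):
    """Fraction of the data window's energy captured by the subspace (0 to 1)."""
    coeffs = basis @ window               # projection coefficients onto the basis
    return float(coeffs @ coeffs) / float(window @ window)
```

A sliding data window whose statistic approaches 1 is well explained by a linear combination of the model-based signals, which is the detection criterion the subspace method exploits.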

  18. Fine-Tuning Corrective Feedback.

    ERIC Educational Resources Information Center

    Han, ZhaoHong

    2001-01-01

    Explores the notion of "fine-tuning" in connection with the corrective feedback process. Describes a longitudinal case study, conducted in the context of Norwegian as a second a language, that shows how fine-tuning and lack thereof in the provision of written corrective feedback differentially affects a second language learner's restructuring of…

  19. Barometric and Earth Tide Correction

    Energy Science and Technology Software Center (ESTSC)

    2005-11-10

    BETCO corrects for barometric and earth tide effects in long-term water level records. A regression deconvolution method is used to solve a series of linear equations to determine an impulse response function for the well pressure head. Using the response function, a pressure head correction is calculated and applied.
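A minimal sketch of the regression-deconvolution idea: build a design matrix of current and lagged barometric changes, fit the coefficients (the discrete impulse response) by least squares, then subtract the predicted barometric signal from the head record. The function names and the simplified setup are assumptions for illustration, not BETCO's actual implementation:

```python
import numpy as np

def impulse_response(head_change, baro_change, n_lags):
    """Estimate the well's impulse response to barometric loading by least squares."""
    n = len(head_change)
    # Each row holds the current and lagged barometric changes for one time step.
    X = np.array([[baro_change[i - j] for j in range(n_lags + 1)]
                  for i in range(n_lags, n)])
    y = np.asarray(head_change[n_lags:], dtype=float)
    irf, *_ = np.linalg.lstsq(X, y, rcond=None)
    return irf

def corrected_head(head, baro_change, irf):
    """Subtract the predicted barometric signal from the raw head record."""
    predicted = np.convolve(baro_change, irf)[: len(head)]
    return np.asarray(head, dtype=float) - predicted
```

The same lagged-regressor construction extends to earth-tide forcing by appending tide columns to the design matrix.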

  20. Correcting Slightly Less Simple Movements

    ERIC Educational Resources Information Center

    Aivar, M. P.; Brenner, E.; Smeets, J. B. J.

    2005-01-01

    Many studies have analysed how goal directed movements are corrected in response to changes in the properties of the target. However, only simple movements to single targets have been used in those studies, so little is known about movement corrections under more complex situations. Evidence from studies that ask for movements to several targets…

  1. Diamagnetic Corrections and Pascal's Constants

    ERIC Educational Resources Information Center

    Bain, Gordon A.; Berry, John F.

    2008-01-01

    Measured magnetic susceptibilities of paramagnetic substances must typically be corrected for their underlying diamagnetism. This correction is often accomplished by using tabulated values for the diamagnetism of atoms, ions, or whole molecules. These tabulated values can be problematic since many sources contain incomplete and conflicting data.…

  2. 75 FR 70951 - Notice, Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-19

    ... the Federal Register on November 12, 2010, in FR Doc. 2010-28647, on page 69473, correct Matters to... DISABILITY (NCD) Sunshine Act Meetings Notice, Correction Type: Quarterly Meeting. Summary: NCD published a Sunshine Act Meeting Notice in the Federal Register on November 12, 2010, notifying the public of...

  3. Shell corrections in stopping powers

    NASA Astrophysics Data System (ADS)

    Bichsel, H.

    2002-05-01

    One of the theories of the electronic stopping power S for fast light ions was derived by Bethe. The algorithm currently used for the calculation of S includes terms known as the mean excitation energy I, the shell correction, the Barkas correction, and the Bloch correction. These terms are described here. For the calculation of the shell corrections an atomic model is used, which is more realistic than the hydrogenic approximation used so far. A comparison is made with similar calculations in which the local plasma approximation is utilized. Close agreement with the experimental data for protons with energies from 0.3 to 10 MeV traversing Al and Si is found without the need for adjustable parameters for the shell corrections.
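For orientation, the terms listed in this abstract enter the Bethe stopping power through the stopping number; in the conventional (ICRU-style) notation, which is assumed here rather than quoted from the paper:

```latex
-\frac{dE}{dx} \;=\; \frac{4\pi e^{4} z^{2} N Z}{m_e v^{2}}\, L,
\qquad
L \;=\; \ln\!\frac{2 m_e v^{2}}{I} \;-\; \ln\!\left(1-\beta^{2}\right) \;-\; \beta^{2}
\;-\; \frac{C}{Z} \;+\; z L_1 \;+\; z^{2} L_2
```

Here $I$ is the mean excitation energy, $C/Z$ the shell correction, $z L_1$ the Barkas correction, and $z^2 L_2$ the Bloch correction; $z$ and $v$ are the projectile charge and speed, and $N Z$ the electron density of the target.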

  4. Model-based condition monitoring for lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Kim, Taesic; Wang, Yebin; Fang, Huazhen; Sahinoglu, Zafer; Wada, Toshihiro; Hara, Satoshi; Qiao, Wei

    2015-11-01

    Condition monitoring for batteries involves tracking changes in physical parameters and operational states such as state of health (SOH) and state of charge (SOC), and is fundamentally important for building high-performance and safety-critical battery systems. A model-based condition monitoring strategy is developed in this paper for lithium-ion batteries on the basis of an electrical circuit model incorporating the hysteresis effect. It systematically integrates 1) a fast upper-triangular and diagonal recursive least squares algorithm for parameter identification of the battery model, 2) a smooth variable structure filter for the SOC estimation, and 3) a recursive total least squares algorithm for estimating the maximum capacity, which indicates the SOH. The proposed solution enjoys advantages including high accuracy, low computational cost, and simple implementation, and therefore is suitable for deployment and use in real-time embedded battery management systems (BMSs). Simulations and experiments validate the effectiveness of the proposed strategy.
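The recursive least-squares ingredient can be sketched in its textbook form. The paper uses a faster upper-triangular/diagonal (UD-factorized) variant; this plain version, with illustrative names, only shows the core update:

```python
import numpy as np

class RecursiveLeastSquares:
    """Plain recursive least squares with a forgetting factor (textbook sketch)."""

    def __init__(self, n_params, forgetting=1.0, p0=1e3):
        self.theta = np.zeros(n_params)      # parameter estimate
        self.P = np.eye(n_params) * p0       # inverse-correlation matrix
        self.lam = forgetting                # forgetting factor (1.0 = none)

    def update(self, phi, y):
        """Fold in one regressor vector `phi` and measurement `y`."""
        phi = np.asarray(phi, dtype=float)
        Pphi = self.P @ phi
        gain = Pphi / (self.lam + phi @ Pphi)
        self.theta = self.theta + gain * (y - phi @ self.theta)
        self.P = (self.P - np.outer(gain, Pphi)) / self.lam
        return self.theta
```

In a battery context, `phi` would hold past voltages and currents of the circuit model and `theta` the circuit parameters being identified online.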

  5. Performability modeling based on real data: A case study

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.

    1987-01-01

    Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.

  6. Numerical analysis of modeling based on improved Elman neural network.

    PubMed

    Jie, Shao; Li, Wang; WeiSong, Zhao; YaQin, Zhong; Malekian, Reza

    2014-01-01

    A model based on the improved Elman neural network (IENN) is proposed to analyze nonlinear circuits with the memory effect. The hidden-layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions in this model. The error curves of the sum of squared error (SSE), varying with the number of hidden neurons and the iteration step, are studied to determine the number of hidden-layer neurons. Simulation results of the half-bridge class-D power amplifier (CDPA) with two-tone and broadband signals as input show that the proposed behavioral model can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with the Volterra-Laguerre (VL) model, the Chebyshev neural network (CNN) model, and the basic Elman neural network (BENN) model, the proposed model has better performance. PMID:25054172

  7. The algorithmic anatomy of model-based evaluation

    PubMed Central

    Daw, Nathaniel D.; Dayan, Peter

    2014-01-01

    Despite many debates in the first half of the twentieth century, it is now largely a truism that humans and other animals build models of their environments and use them for prediction and control. However, model-based (MB) reasoning presents severe computational challenges. Alternative, computationally simpler, model-free (MF) schemes have been suggested in the reinforcement learning literature, and have afforded influential accounts of behavioural and neural data. Here, we study the realization of MB calculations, and the ways that this might be woven together with MF values and evaluation methods. There are as yet mostly only hints in the literature as to the resulting tapestry, so we offer more preview than review. PMID:25267820

  8. Qualitative model-based diagnostics for rocket systems

    NASA Technical Reports Server (NTRS)

    Maul, William; Meyer, Claudia; Jankovsky, Amy; Fulton, Christopher

    1993-01-01

    A diagnostic software package is currently being developed at NASA LeRC that utilizes qualitative model-based reasoning techniques. These techniques can provide diagnostic information about the operational condition of the modeled rocket engine system or subsystem. The diagnostic package combines a qualitative model solver with a constraint suspension algorithm. The constraint suspension algorithm directs the solver's operation to provide valuable fault isolation information about the modeled system. A qualitative model of the Space Shuttle Main Engine's oxidizer supply components was generated. A diagnostic application based on this qualitative model was constructed to process four test cases: three numerical simulations and one actual test firing. The diagnostic tool's fault isolation output compared favorably with the input fault condition.

  9. CDMBE: A Case Description Model Based on Evidence.

    PubMed

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

    By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suited to the continental law system, is proposed to describe criminal cases. The model's logic adopts credibility-based logical reasoning and performs evidence-based reasoning quantitatively. To be consistent with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a diagram and that the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006

  10. Automated Decomposition of Model-based Learning Problems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor-rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  11. Model-based approach for elevator performance estimation

    NASA Astrophysics Data System (ADS)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated from the estimated car acceleration. The proposed procedure is evaluated experimentally by comparing the key performance indicators calculated from the estimated car acceleration with the values obtained from actual acceleration measurements on a test bench. Finally, the proposed procedure is compared with a sliding mode observer.
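A single predict/update cycle of the kind of linear Kalman observer described in this abstract can be sketched as follows. The matrices and dimensions are illustrative, not the paper's elevator model:

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman observer.

    x, P: prior state estimate and covariance; z: new measurement;
    F, H: state-transition and measurement matrices; Q, R: noise covariances.
    """
    # Predict: propagate state and covariance through the model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct with the measurement innovation.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

In the elevator setting, the measurement would be the encoder signal and the estimated state would include the car motion, from which acceleration-based ride-quality indicators are derived.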

  12. Enhancements to the KATE model-based reasoning system

    NASA Technical Reports Server (NTRS)

    Thomas, Stan J.

    1994-01-01

    KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. This report describes two software efforts which enhance the functionality and usability of KATE. The first addition, a flow solver, adds to KATE a tool for modeling the flow of liquid in a pipe system. The second addition adds support for editing KATE knowledge base files to the Emacs editor. The body of this report discusses design and implementation issues having to do with these two tools. It will be useful to anyone maintaining or extending either the flow solver or the editor enhancements.

  13. CDMBE: A Case Description Model Based on Evidence

    PubMed Central

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

    By combining the advantages of argument map and Bayesian network, a case description model based on evidence (CDMBE), which is suitable to continental law system, is proposed to describe the criminal cases. The logic of the model adopts the credibility logical reason and gets evidence-based reasoning quantitatively based on evidences. In order to consist with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidences. Experiments show that the model can get users' ideas into a figure and the results calculated from CDMBE are in line with those from Bayesian model. PMID:26421006

  14. A social discounting model based on Tsallis’ statistics

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2010-09-01

    Social decision making (e.g. social discounting and social preferences) has been attracting attention in economics, econophysics, social physics, behavioral psychology, and neuroeconomics. This paper proposes a novel social discounting model based on the deformed algebra developed in the Tsallis’ non-extensive thermostatistics. Furthermore, it is suggested that this model can be utilized to quantify the degree of consistency in social discounting in humans and analyze the relationships between behavioral tendencies in social discounting and other-regarding economic decision making under game-theoretic conditions. Future directions in the application of the model to studies in econophysics, neuroeconomics, and social physics, as well as real-world problems such as the supply of live organ donations, are discussed.
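The deformed algebra referred to here is built on the q-exponential of Tsallis statistics; a sketch of q-exponential social discounting under the usual definitions follows (parameter names are illustrative, and the exact functional form used in the paper is assumed, not quoted):

```python
import numpy as np

def q_exponential(x, q):
    """exp_q(x) = [1 + (1-q)x]^(1/(1-q)), reducing to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

def social_discount(value, k, distance, q):
    """Discounted value of a reward shared with someone at a given social distance."""
    return value / q_exponential(k * distance, q)
```

At q = 1 this recovers exponential (consistent) discounting, while q = 0 gives hyperbolic discounting, so the fitted q quantifies the degree of consistency in a subject's social discounting.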

  15. Active Shape Model-Based Gait Recognition Using Infrared Images

    NASA Astrophysics Data System (ADS)

    Kim, Daehee; Lee, Seungwon; Paik, Joonki

    We present a gait recognition system using infrared (IR) images. Since an IR camera is not affected by the intensity of illumination, it provides constant recognition performance regardless of the amount of illumination. Model-based object tracking algorithms enable robust tracking with partial occlusions or dynamic illumination. However, such algorithms often fail to track objects if a strong edge exists near the object. Replacing the input image with an IR image guarantees robust object region extraction because background edges do not affect the IR image. In conclusion, the proposed gait recognition algorithm improves accuracy in object extraction by using IR images, and these improvements in turn increase the gait recognition rate.

  16. On the Performance of Stochastic Model-Based Image Segmentation

    NASA Astrophysics Data System (ADS)

    Lei, Tianhu; Sewchand, Wilfred

    1989-11-01

    A new stochastic model-based image segmentation technique for X-ray CT images has been developed and has been extended to the more general nondiffraction CT images, which include MRI, SPECT, and certain types of ultrasound images [1,2]. The nondiffraction CT image is modeled by a finite normal mixture. The technique utilizes an information-theoretic criterion to detect the number of region images, uses the Expectation-Maximization algorithm to estimate the parameters of the image, and uses the Bayesian classifier to segment the observed image. How does this technique over/under-estimate the number of region images? What is the probability of error in the segmentation produced by this technique? This paper addresses these two problems and is a continuation of [1,2].
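The Expectation-Maximization step for a finite normal mixture can be sketched for a one-dimensional, two-component toy case (the paper works with image intensities and a data-driven number of components; this simplified version is for illustration only):

```python
import numpy as np

def em_two_gaussians(x, n_iter=50):
    """EM for a two-component 1-D Gaussian mixture; returns weights, means, std devs."""
    x = np.asarray(x, dtype=float)
    w = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()])          # crude initialization
    sd = np.array([x.std(), x.std()]) + 1e-6
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each sample.
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate the mixture parameters from the responsibilities.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-9
    return w, mu, sd
```

The Bayesian classifier mentioned in the abstract then assigns each pixel to the component with the highest posterior responsibility.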

  17. Model-Based Systems Engineering Pilot Program at NASA Langley

    NASA Technical Reports Server (NTRS)

    Vipavetz, Kevin G.; Murphy, Douglas G.; Infeld, Samantha I.

    2012-01-01

    NASA Langley Research Center conducted a pilot program to evaluate the benefits of using a Model-Based Systems Engineering (MBSE) approach during the early phase of the Materials International Space Station Experiment-X (MISSE-X) project. The goal of the pilot was to leverage MBSE tools and methods, including the Systems Modeling Language (SysML), to understand the net gain of utilizing this approach on a moderate size flight project. The System Requirements Review (SRR) success criteria were used to guide the work products desired from the pilot. This paper discusses the pilot project implementation, provides SysML model examples, identifies lessons learned, and describes plans for further use of MBSE on MISSE-X.

  18. Model-based needle control in prostate percutaneous procedures.

    PubMed

    Maghsoudi, Arash; Jahed, Mehran

    2013-01-01

    In percutaneous applications, needle insertion into soft tissue is considered a challenging procedure, and hence it has been the subject of many recent studies. This study considers a model-based dynamics equation to evaluate needle movement through prostate soft tissue. The proposed model estimates the force applied to the needle using tissue deformation data and a finite element model of the tissue. To address the role of the mechanical properties of the soft tissue, an inverse dynamics control method based on a sliding mode approach is used to demonstrate system performance in the presence of uncertainties. Furthermore, to deal with inaccurate estimation of the mechanical parameters of the soft tissue, an adaptive controller is developed. Moreover, a sensitivity analysis shows that uncertainty in the tissue mechanical parameters affects system performance. Our results indicate that the adaptive controller performs slightly better than the inverse dynamics method, at the expense of fine-tuning an additional gain parameter. PMID:23516956

  19. An Evolutionary Model Based on Bit-String with Intelligence

    NASA Astrophysics Data System (ADS)

    He, Mingfeng; Pan, Qiuhui; Yu, Binglin

    An evolutionary model based on bit-strings with intelligence is set up in this paper. In this model, the gene is divided into two parts, relating to health and intelligence. The accumulated intelligence influences the survival process through the effect of food and space restrictions; we modify the Verhulst factor to study this effect. Both asexual and sexual models are discussed in this paper. The results show that after many time steps stability is reached and the population self-organizes, just as in the standard Penna model. Intelligence raises the equilibrium population size in both the asexual and sexual models. Compared with the asexual model, the population size fluctuates more strongly in the sexual model.

  20. Error Correction: Report on a Study

    ERIC Educational Resources Information Center

    Dabaghi, Azizollah

    2006-01-01

    This article reports on a study which investigated the effects of correction of learners' grammatical errors on acquisition. Specifically, it compared the effects of timing of correction (immediate versus delayed correction) and manner of correction (explicit versus implicit correction). It also investigated the relative effects of correction of…

  1. Model-Based Engineering and Manufacturing CAD/CAM Benchmark.

    SciTech Connect

    Domm, T.C.; Underwood, R.S.

    1999-10-13

    The Benchmark Project was created from a desire to identify best practices and improve the overall efficiency and performance of the Y-12 Plant's systems and personnel supporting the manufacturing mission. The mission of the benchmark team was to search out industry leaders in manufacturing and evaluate their engineering practices and processes to determine direction and focus for Y-12 modernization efforts. The companies visited included several large established companies and a new, small, high-tech machining firm. As a result of this effort, changes are recommended that will enable Y-12 to become a more modern, responsive, cost-effective manufacturing facility capable of supporting the needs of the Nuclear Weapons Complex (NWC) into the 21st century. The benchmark team identified key areas of interest, both focused and general. The focus areas included Human Resources, Information Management, Manufacturing Software Tools, and Standards/Policies and Practices. Areas of general interest included Infrastructure, Computer Platforms and Networking, and Organizational Structure. The results of this benchmark showed that all companies are moving in the direction of model-based engineering and manufacturing. There was evidence that many companies are trying to grasp how to manage current and legacy data. In terms of engineering design software tools, the companies contacted were somewhere between 3-D solid modeling and surfaced wire-frame models. The manufacturing computer tools were varied, with most companies using more than one software product to generate machining data and none currently performing model-based manufacturing (MBM) from a common model. The majority of companies were closer to identifying or using a single computer-aided design (CAD) system than a single computer-aided manufacturing (CAM) system. All companies were looking to the Internet either to transport information more easily throughout the corporation or as a conduit for

  2. Neural mass model-based tracking of anesthetic brain states.

    PubMed

    Kuhlmann, Levin; Freestone, Dean R; Manton, Jonathan H; Heyse, Bjorn; Vereecke, Hugo E M; Lipping, Tarmo; Struys, Michel M R F; Liley, David T J

    2016-06-01

    Neural mass model-based tracking of brain states from electroencephalographic signals holds the promise of simultaneously tracking brain states while inferring underlying physiological changes in various neuroscientific and clinical applications. Here, neural mass model-based tracking of brain states using the unscented Kalman filter applied to estimate parameters of the Jansen-Rit cortical population model is evaluated through the application of propofol-based anesthetic state monitoring. In particular, 15 subjects underwent propofol anesthesia induction from awake to anesthetised while behavioral responsiveness was monitored and frontal electroencephalographic signals were recorded. The unscented Kalman filter Jansen-Rit model approach applied to frontal electroencephalography achieved reasonable testing performance for classification of the anesthetic brain state (sensitivity: 0.51; chance sensitivity: 0.17; nearest neighbor sensitivity 0.75) when compared to approaches based on linear (autoregressive moving average) modeling (sensitivity 0.58; nearest neighbor sensitivity: 0.91) and a high performing standard depth of anesthesia monitoring measure, Higuchi Fractal Dimension (sensitivity: 0.50; nearest neighbor sensitivity: 0.88). Moreover, it was found that the unscented Kalman filter based parameter estimates of the inhibitory postsynaptic potential amplitude varied in the physiologically expected direction with increases in propofol concentration, while the estimates of the inhibitory postsynaptic potential rate constant did not. These results, combined with analysis of monotonicity of parameter estimates, error analysis of parameter estimates, and observability analysis of the Jansen-Rit model, along with considerations of extensions of the Jansen-Rit model, suggest that the Jansen-Rit model combined with unscented Kalman filtering provides a valuable reference point for future real-time brain state tracking studies. This is especially true for studies of

  3. A new approach to account for the medium-dependent effect in model-based dose calculations for kilovoltage x-rays

    NASA Astrophysics Data System (ADS)

    Pawlowski, Jason M.; Ding, George X.

    2011-07-01

    This study presents a new approach to accurately account for the medium-dependent effect in model-based dose calculations for kilovoltage (kV) x-rays. This approach is based on the hypothesis that the correction factors needed to convert dose from model-based dose calculations to absorbed dose-to-medium depend on both the attenuation characteristics of the absorbing media and the changes to the energy spectrum of the incident x-rays as they traverse media with an effective atomic number different than that of water. Using Monte Carlo simulation techniques, we obtained empirical medium-dependent correction factors that take both effects into account. We found that the correction factors can be expressed as a function of a single quantity, called the effective bone depth, which is a measure of the amount of bone that an x-ray beam must penetrate to reach a voxel. Since the effective bone depth can be calculated from volumetric patient CT images, the medium-dependent correction factors can be obtained for model-based dose calculations based on patient CT images. We tested the accuracy of this new approach on 14 patients for the case of calculating imaging dose from kilovoltage cone-beam computed tomography used for patient setup in radiotherapy, and compared it with the Monte Carlo method, which is regarded as the 'gold standard'. For all patients studied, the new approach resulted in mean dose errors of less than 3%. This is in contrast to current available inhomogeneity corrected methods, which have been shown to result in mean errors of up to -103% for bone and 8% for soft tissue. Since there is a huge gain in the calculation speed relative to the Monte Carlo method (~two orders of magnitude) with an acceptable loss of accuracy, this approach provides an alternative accurate dose calculation method for kV x-rays.

  4. Systematic Outcomes Research for Corrections-Based Treatment: Implications from the Criminal Justice Kentucky Treatment Outcome Study

    ERIC Educational Resources Information Center

    Staton-Tindall, Michele; McNees, Erin; Leukefeld, Carl G.; Walker, Robert; Thompson, LaDonna; Pangburn, Kevin; Oser, Carrie B.

    2009-01-01

    Over the last four years, the Kentucky correctional system has expanded corrections-based modified therapeutic community treatment from 6 programs to 24 programs. To examine the effectiveness of these programs, the state initiated a systematic treatment outcome study known as the Criminal Justice Kentucky Treatment Outcome Study (CJKTOS). The…

  5. Therapeutic activity of modified U1 core spliceosomal particles

    PubMed Central

    Rogalska, Malgorzata Ewa; Tajnik, Mojca; Licastro, Danilo; Bussani, Erica; Camparini, Luca; Mattioli, Chiara; Pagani, Franco

    2016-01-01

    Modified U1 snRNAs bound to intronic sequences downstream of the 5′ splice site correct exon skipping caused by different types of mutations. Here we evaluate the therapeutic activity and structural requirements of these exon-specific U1 snRNA (ExSpeU1) particles. In a severe spinal muscular atrophy mouse model, ExSpeU1, introduced by germline transgenesis, increases SMN2 exon 7 inclusion and SMN protein production, and extends life span. In vitro, RNA mutant analysis and silencing experiments show that while U1A protein is dispensable, the 70K and stem loop IV elements mediate most of the splicing rescue activity through improvement of exon and intron definition. Our findings indicate that precise engineering of the U1 core spliceosomal RNA particle has therapeutic potential in pathologies associated with exon-skipping mutations. PMID:27041075

  6. Therapeutic activity of modified U1 core spliceosomal particles.

    PubMed

    Rogalska, Malgorzata Ewa; Tajnik, Mojca; Licastro, Danilo; Bussani, Erica; Camparini, Luca; Mattioli, Chiara; Pagani, Franco

    2016-01-01

    Modified U1 snRNAs bound to intronic sequences downstream of the 5' splice site correct exon skipping caused by different types of mutations. Here we evaluate the therapeutic activity and structural requirements of these exon-specific U1 snRNA (ExSpeU1) particles. In a severe spinal muscular atrophy mouse model, ExSpeU1, introduced by germline transgenesis, increases SMN2 exon 7 inclusion and SMN protein production, and extends life span. In vitro, RNA mutant analysis and silencing experiments show that while U1A protein is dispensable, the 70K and stem loop IV elements mediate most of the splicing rescue activity through improvement of exon and intron definition. Our findings indicate that precise engineering of the U1 core spliceosomal RNA particle has therapeutic potential in pathologies associated with exon-skipping mutations. PMID:27041075

  7. Diagnostic and therapeutic management of hepatocellular carcinoma

    PubMed Central

    Bellissimo, Francesco; Pinzone, Marilia Rita; Cacopardo, Bruno; Nunnari, Giuseppe

    2015-01-01

    Hepatocellular carcinoma (HCC) is an increasing health problem, representing the second cause of cancer-related mortality worldwide. The major risk factor for HCC is cirrhosis. In developing countries, viral hepatitis represents the major risk factor, whereas in developed countries, the epidemic of obesity, diabetes and nonalcoholic steatohepatitis contributes to the observed increase in HCC incidence. Cirrhotic patients are recommended to undergo HCC surveillance by abdominal ultrasound at 6-mo intervals. The current diagnostic algorithms for HCC rely on typical radiological hallmarks in dynamic contrast-enhanced imaging, while the use of α-fetoprotein as an independent tool for HCC surveillance is not recommended by current guidelines due to its low sensitivity and specificity. Early diagnosis is crucial for curative treatments. Surgical resection, radiofrequency ablation and liver transplantation are considered the cornerstones of curative therapy, while for patients with more advanced HCC recommended options include sorafenib and trans-arterial chemo-embolization. A multidisciplinary team, consisting of hepatologists, surgeons, radiologists, oncologists and pathologists, is fundamental for correct management. In this paper, we review the diagnostic and therapeutic management of HCC, with a focus on the most recent evidence and recommendations from guidelines. PMID:26576088

  8. Holographic thermalization with Weyl corrections

    NASA Astrophysics Data System (ADS)

    Dey, Anshuman; Mahapatra, Subhash; Sarkar, Tapobrata

    2016-01-01

    We consider holographic thermalization in the presence of a Weyl correction in five-dimensional AdS space. We first obtain the Weyl-corrected black brane solution perturbatively, up to first order in the coupling. The corresponding AdS-Vaidya-like solution is then constructed. This is then used to numerically analyze the time dependence of the two-point correlation functions and the expectation values of rectangular Wilson loops in the boundary field theory, and we discuss how the Weyl correction can modify the thermalization time scales in the dual field theory. In this context, the subtle interplay between the Weyl coupling constant and the chemical potential is studied in detail.

  9. Therapeutic approaches for celiac disease

    PubMed Central

    Plugis, Nicholas M.; Khosla, Chaitan

    2015-01-01

    Celiac disease is a common, lifelong autoimmune disorder for which dietary control is the only accepted form of therapy. A strict gluten-free diet is burdensome to patients and can be limited in efficacy, indicating there is an unmet need for novel therapeutic approaches to supplement or supplant dietary therapy. Many molecular events required for disease pathogenesis have been recently characterized and inspire most current and emerging drug-discovery efforts. Genome-wide association studies (GWAS) confirm the importance of human leukocyte antigen genes in our pathogenic model and identify a number of new risk loci in this complex disease. Here, we review the status of both emerging and potential therapeutic strategies in the context of disease pathophysiology. We conclude with a discussion of how genes identified during GWAS and follow-up studies that enhance susceptibility may offer insight into developing novel therapies. PMID:26060114

  10. Translating connexin biology into therapeutics.

    PubMed

    Becker, David L; Phillips, Anthony R; Duft, Bradford J; Kim, Yeri; Green, Colin R

    2016-02-01

    It is 45 years since gap junctions were first described. Universities face increasing commercial pressures and declining federal funding, with governments and funding foundations showing greater interest in gaining return on their investments. This review outlines approaches taken to translate gap junction research to clinical application and the challenges faced. The need for commercialisation is discussed and key concepts behind research patenting briefly described. Connexin channel roles in disease and injury are also discussed, as is identification of the connexin hemichannel as a therapeutic target, which appears to play a role in both the start and perpetuation of the inflammasome pathway. Furthermore, connexin hemichannel opening results in vascular dieback in acute injury and chronic disease. Translation to human indications is illustrated from the perspective of one connexin biotechnology company, CoDa Therapeutics, Inc. PMID:26688335

  11. [Therapeutic use of cannabis derivatives].

    PubMed

    Benyamina, Amine; Reynaud, Michel

    2014-02-01

    The therapeutic use of cannabis has generated a lot of interest in the past years, leading to a better understanding of its mechanisms of action. Countries like the United States and Canada have modified their laws in order to make cannabinoid use legal in the medical context. It is now also the case in France, where a recent decree was issued authorizing the prescription of medication containing "therapeutic cannabis" (decree no. 2013-473, June 5, 2013). Cannabinoids such as dronabinol, Sativex and nabilone have been tested for the treatment of acute and chronic pain. These agents are most promising to relieve chronic pain associated with cancer, with human immunodeficiency virus infection and with multiple sclerosis. However, longer-term studies are required to determine potential long-term adverse effects and risks of misuse and addiction. PMID:24701869

  12. [Concept of the therapeutic community].

    PubMed

    Eichhorn, H

    1983-08-01

    The historic development of therapeutic communities is discussed, and it is shown that the term has been neither conceptualized nor operationalized. Their unclear aims are considered to be utopian, and the author stresses that previous studies on such communities have been too superficial. The following problems have not hitherto received attention: 1. micro- and macrosocial relationships, 2. the role of the supervisor (authority problems), 3. norms and valuation systems, 4. discipline and sanctions, 5. the problem of roles, 6. questions of indicants and efficacy. The introduction of therapeutic communities is superfluous as a means of improving the socialist health services: it is sufficient to implement the principles of socialist democracy by means of appropriate training programmes. PMID:6635034

  13. Training assessors in Therapeutic Assessment.

    PubMed

    Haydel, Marianne E; Mercer, Barbara L; Rosenblatt, Erin

    2011-01-01

    This article focuses on the use of the comprehensive Therapeutic Assessment training model (Finn, 2007) with a child and his mother. The mother observed the child's testing sessions and was actively involved in a family intervention session as a way of translating assessment results into practice. One psychologist administered the psychological tests with the child, and 2 other clinicians worked with the mother throughout the process. We offer ideas about learning and training in the context of our case in Therapeutic Assessment. We investigate the parallel process between the way in which parents learn about their child's perspective and the way in which clinicians learn about the family's perspective. We discuss our discoveries in the context of planning case interventions. We explore the impact of trauma and ways of holding and containing this difficult work within our community and with each other. PMID:21184325

  14. Myc proteins as therapeutic targets

    PubMed Central

    Gustafson, WC; Weiss, WA

    2010-01-01

    Myc proteins (c-myc, Mycn and Mycl) target proliferative and apoptotic pathways vital for progression in cancer. Amplification of the MYCN gene has emerged as one of the clearest indicators of aggressive and chemotherapy-refractory disease in children with neuroblastoma, the most common extracranial solid tumor of childhood. Phosphorylation and ubiquitin-mediated modulation of Myc protein influence stability and represent potential targets for therapeutic intervention. Phosphorylation of Myc proteins is controlled in part by the receptor tyrosine kinase/phosphatidylinositol 3-kinase/Akt/mTOR signaling pathway, with additional contributions from Aurora A kinase. Myc proteins regulate apoptosis in part through interactions with the p53/Mdm2/Arf signaling pathway. Mutation in p53 is commonly observed in patients with relapsed neuroblastoma, contributing to both biology and therapeutic resistance. This review examines Myc function and regulation in neuroblastoma, and discusses emerging therapies that target Mycn. PMID:20101214

  15. Sinigrin and Its Therapeutic Benefits.

    PubMed

    Mazumder, Anisha; Dwivedi, Anupma; du Plessis, Jeanetta

    2016-01-01

    Sinigrin (allyl-glucosinolate or 2-propenyl-glucosinolate) is a natural aliphatic glucosinolate present in plants of the Brassicaceae family, such as broccoli and brussels sprouts, and in the seeds of Brassica nigra (mustard seeds), which contain high amounts of sinigrin. Since ancient times, mustard has been used by mankind for its culinary, as well as medicinal, properties. It has been systematically described and evaluated in the classical Ayurvedic texts. Studies conducted on the pharmacological activities of sinigrin have revealed anti-cancer, antibacterial, antifungal, antioxidant, anti-inflammatory and wound-healing properties, as well as biofumigation potential. This current review brings concise information about the known therapeutic activities of sinigrin. However, the information on known biological activities is very limited and, hence, further studies still need to be conducted and its molecular mechanisms also need to be explored. This review of the therapeutic benefits of sinigrin summarizes current knowledge about this unique phytocompound. PMID:27043505

  16. Therapeutic cloning and tissue engineering.

    PubMed

    Koh, Chester J; Atala, Anthony

    2004-01-01

    A severe shortage of donor organs available for transplantation in the United States leaves patients suffering from diseased and injured organs with few treatment options. Scientists in the field of tissue engineering apply the principles of cell transplantation, material science, and engineering to construct biological substitutes that will restore and maintain normal function in diseased and injured tissues. Therapeutic cloning, where the nucleus from a donor cell is transferred into an enucleated oocyte in order to extract pluripotent embryonic stem cells, offers a potentially limitless source of cells for tissue engineering applications. The present chapter reviews recent advances that have occurred in therapeutic cloning and tissue engineering and describes applications of these new technologies that may offer novel therapies for patients with end-stage organ failure. PMID:15094294

  17. Atmospheric refractivity corrections in satellite laser ranging

    NASA Technical Reports Server (NTRS)

    Abshire, J. B.; Gardner, C. S.

    1985-01-01

    Atmospheric refraction can cause significant errors in satellite laser ranging (SLR) systems. There are two techniques which can be used to correct for these errors. Atmospheric models based upon surface measurements of pressure, temperature, and relative humidity have been shown by ray tracing to be accurate to within a few centimeters at 20 deg elevation angle. The residual errors in the models are thought to be primarily caused by horizontal gradients in the refractivity. Although models have been developed to predict the gradient effects, initial studies show that they can be sensitive to local topographic effects. Atmospheric turbulence can introduce random fluctuations in the refractivity, but only introduces centimeter level errors at elevation angles below 10 deg. Pulsed two-color ranging systems can directly measure the atmospheric delay in satellite ranging. These systems require mode-locked multiple-frequency lasers and streak-camera-based receivers and currently appear capable of measuring the atmospheric delay with an accuracy of 0.5 cm or better.
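The dispersion principle behind pulsed two-color ranging can be sketched as follows: because air is dispersive, the atmospheric delay at one wavelength can be recovered by scaling the measured differential delay between the two colors. The group refractivity values and the 1 ps differential delay below are illustrative assumptions, not evaluations of a real dispersion formula.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def two_color_atmospheric_delay(tau_diff_s, ng_1, ng_2):
    """Recover the one-way atmospheric delay at wavelength 1 from the
    measured differential delay tau_diff = tau_1 - tau_2 between the two
    colors, using the dispersion factor (ng_1 - 1)/(ng_1 - ng_2)."""
    kappa = (ng_1 - 1.0) / (ng_1 - ng_2)   # dispersion amplification
    return kappa * tau_diff_s

# Illustrative numbers: ~300 ppm group refractivity for color 1,
# an 8 ppm dispersion between the colors, a 1 ps differential delay.
delay = two_color_atmospheric_delay(1e-12, 1.000300, 1.000292)
range_error_m = delay * C   # atmospheric contribution to the range
```

Because the dispersion factor is large (tens, here 37.5), timing noise in the differential measurement is amplified by the same factor, which is why sub-picosecond receivers such as the streak cameras mentioned above are required.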

  18. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

    The automated and cost-effective detection of buildings at ultra-high spatial resolution is of major importance for various engineering and smart-city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery acquired with low-cost imaging sensors. In particular, the developed approach computes, through advanced structure from motion, bundle adjustment and dense image matching, a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by significant geometric distortions and fish-eye effects. An unsupervised multi-region graph-cut segmentation and a rule-based classification deliver the initial multi-class classification map. The DTM is then calculated through an inpainting and mathematical morphology process. A data fusion step between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction, and the scene's buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating object-level detection rates of 88% for correctness and above 75% for completeness.
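The quoted object-level detection rates follow the standard correctness/completeness definitions used in building-extraction evaluation (precision and recall over detected building objects); a minimal sketch with hypothetical counts:

```python
def correctness(true_pos, false_pos):
    """Fraction of detected buildings that are real buildings (precision)."""
    return true_pos / (true_pos + false_pos)

def completeness(true_pos, false_neg):
    """Fraction of real buildings that were detected (recall)."""
    return true_pos / (true_pos + false_neg)

# e.g. 88 correct detections, 12 false alarms, and 25 missed buildings
# give correctness 0.88 and completeness 0.75.
```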

  19. Model-Based Diagnosis and Prognosis of a Water Recycling System

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Hafiychuk, Vasyl; Goebel, Kai Frank

    2013-01-01

    A water recycling system (WRS) deployed at NASA Ames Research Center's Sustainability Base (an energy efficient office building that integrates some novel technologies developed for space applications) will serve as a testbed for long duration testing of next generation spacecraft water recycling systems for future human spaceflight missions. This system cleans graywater (waste water collected from sinks and showers) and recycles it into clean water. Like all engineered systems, the WRS is prone to standard degradation due to regular use, as well as other faults. Diagnostic and prognostic applications will be deployed on the WRS to ensure its safe, efficient, and correct operation. The diagnostic and prognostic results can be used to enable condition-based maintenance to avoid unplanned outages, and perhaps extend the useful life of the WRS. Diagnosis involves detecting when a fault occurs, isolating the root cause of the fault, and identifying the extent of damage. Prognosis involves predicting when the system will reach its end of life irrespective of whether an abnormal condition is present or not. In this paper, first, we develop a physics model of both nominal and faulty system behavior of the WRS. Then, we apply an integrated model-based diagnosis and prognosis framework to the simulation model of the WRS for several different fault scenarios to detect, isolate, and identify faults, and predict the end of life in each fault scenario, and present the experimental results.
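A common building block of model-based fault detection of this kind is residual generation: compare sensor measurements against the nominal model's predictions and flag statistically significant deviations. A minimal sketch, in which the noise model and threshold are assumptions rather than the actual WRS design:

```python
import numpy as np

def fault_detected(measured, predicted, noise_sigma, z_threshold=3.0):
    """Flag samples whose residual against the nominal-model prediction
    exceeds z_threshold standard deviations of the sensor noise."""
    residual = np.abs(np.asarray(measured) - np.asarray(predicted))
    return residual > z_threshold * noise_sigma

# Second sample deviates far beyond the 3-sigma band -> fault flagged.
flags = fault_detected([1.02, 4.90], [1.00, 1.00], noise_sigma=0.05)
```

Isolation then proceeds by matching the pattern of flagged residuals across sensors against fault signatures predicted by the faulty-behavior models.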

  20. Quantitative Bayesian model-based analysis of amide proton transfer MRI.

    PubMed

    Chappell, Michael A; Donahue, Manus J; Tee, Yee Kai; Khrapitchev, Alexandre A; Sibson, Nicola R; Jezzard, Peter; Payne, Stephen J

    2013-08-01

    Amide Proton Transfer (APT) reports on contrast derived from the exchange of protons between amide groups and water. Commonly, APT contrast is quantified by asymmetry analysis, providing an ensemble contrast of both amide proton concentration and exchange rate. An alternative is to sample the off-resonant spectrum and fit an exchange model, permitting the APT effect to be quantified, correcting automatically for confounding effects of spillover, field inhomogeneity, and magnetization transfer. Additionally, it should permit amide concentration and exchange rate to be independently quantified. Here, a Bayesian method is applied to this problem allowing pertinent prior information to be specified. A three-pool model was used incorporating water protons, amide protons, and the magnetization transfer effect. The method is demonstrated in simulations, in creatine phantoms with varying pH, and in vivo (n = 7). The Bayesian model-based approach was able to quantify the APT effect accurately (root-mean-square error < 2%) even when subject to confounding field variation and magnetization transfer effect, unlike traditional asymmetry analysis. The in vivo results gave approximate APT concentration (relative to water) and exchange rate values of 3 × 10^-3 and 15 s^-1. A degree of correlation was observed between these parameters, making the latter difficult to quantify with absolute accuracy, suggesting that more optimal sampling strategies might be required. PMID:23008121
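For contrast, the conventional asymmetry analysis that the model-based approach improves upon is a simple subtraction of the normalized Z-spectrum about the water resonance. A minimal sketch on a synthetic spectrum; the Lorentzian line shapes and amplitudes are illustrative assumptions, not fitted in vivo values:

```python
import numpy as np

def mtr_asym(offsets_ppm, z, target_ppm=3.5):
    """MTR_asym = Z(-dw) - Z(+dw) for a normalized Z-spectrum sampled at
    ascending offsets; +3.5 ppm is the amide proton resonance."""
    return (np.interp(-target_ppm, offsets_ppm, z)
            - np.interp(+target_ppm, offsets_ppm, z))

# Synthetic Z-spectrum: symmetric water saturation plus a small amide dip.
dw = np.linspace(-6.0, 6.0, 25)
lorentz = lambda x, x0, w: w**2 / (w**2 + (x - x0)**2)
z = 1.0 - 0.6 * lorentz(dw, 0.0, 1.0) - 0.05 * lorentz(dw, 3.5, 0.5)
apt = mtr_asym(dw, z)   # ensemble amide contrast, ~0.05 here
```

The single number returned mixes concentration and exchange rate and is biased by any asymmetric magnetization transfer or field offset, which is what motivates fitting the full multi-pool exchange model instead.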

  1. Toward a Model-Based Approach to Flight System Fault Protection

    NASA Technical Reports Server (NTRS)

    Day, John; Murray, Alex; Meakin, Peter

    2012-01-01

    Fault Protection (FP) is a distinct and separate systems engineering sub-discipline that is concerned with the off-nominal behavior of a system. Flight system fault protection is an important part of the overall flight system systems engineering effort, with its own products and processes. As with other aspects of systems engineering, the FP domain is highly amenable to expression and management in models. However, while there are standards and guidelines for performing FP related analyses, there are no standards or guidelines for formally relating the FP analyses to each other or to the system hardware and software design. As a result, the material generated for these analyses effectively creates separate models that are only loosely related to the system being designed. Development of approaches that enable modeling of FP concerns in the same model as the system hardware and software design enables the establishment of formal relationships, which has great potential for improving the efficiency, correctness, and verification of the implementation of flight system FP. This paper begins with an overview of the FP domain, and then continues with a presentation of a SysML/UML model of the FP domain and the particular analyses that it contains, by way of showing a potential model-based approach to flight system fault protection, and an exposition of the use of the FP models in FSW engineering. The analyses are small examples, inspired by current real-project examples of FP analyses.

  2. An approach to model validation and model-based prediction -- polyurethane foam case study.

    SciTech Connect

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    hypothesis tests as a part of the validation step to provide feedback to analysts and modelers. Decisions on how to proceed in making model-based predictions are made based on these analyses together with the application requirements. Updating, modifying, and understanding the boundaries associated with the model are also assisted through this feedback. (4) We include a "model supplement term" when model problems are indicated. This term provides a (bias) correction to the model so that it will better match the experimental results and more accurately account for uncertainty. Presumably, as the models continue to develop and are used for future applications, the causes for these apparent biases will be identified and the need for this supplementary modeling will diminish. (5) We use a response-modeling approach for our predictions that allows for general types of prediction and for assessment of prediction uncertainty. This approach is demonstrated through a case study supporting the assessment of a weapon's response when subjected to a hydrocarbon fuel fire. The foam decomposition model provides an important element of the response of a weapon system in this abnormal thermal environment. Rigid foam is used to encapsulate critical components in the weapon system, providing the needed mechanical support as well as thermal isolation. Because the foam begins to decompose at temperatures above 250 °C, modeling the decomposition is critical to assessing a weapon's response. In the validation analysis it is indicated that the model tends to "exaggerate" the effect of temperature changes when compared to the experimental results. The data, however, are too few and too restricted in terms of experimental design to make confident statements regarding modeling problems. For illustration, we assume these indications are correct and compensate for this apparent bias by constructing a model supplement term for use in the model-based predictions.
Several hypothetical prediction problems are
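The "model supplement term" idea, fitting the model-experiment residuals and adding the fit back to future predictions, can be sketched generically. The data and the linear-in-temperature form below are hypothetical illustrations, not the foam decomposition model itself:

```python
import numpy as np

# Hypothetical validation data: model predictions vs. experiment.
temps = np.array([300.0, 400.0, 500.0, 600.0])   # temperature, K
model = np.array([0.10, 0.30, 0.62, 1.05])       # predicted response
exper = np.array([0.12, 0.28, 0.55, 0.90])       # measured response

# Model supplement term: a low-order polynomial fitted to the
# model-experiment residuals, added back to future predictions.
resid = exper - model
coef = np.polyfit(temps, resid, 1)               # linear bias in T

def supplemented(model_pred, temp):
    """Model prediction plus the fitted bias correction."""
    return model_pred + np.polyval(coef, temp)
```

In the validation spirit described above, the supplement term shrinks the systematic over-response to temperature; the spread of the remaining residuals would then feed the prediction-uncertainty assessment.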

  3. Antibody Engineering and Therapeutics Conference

    PubMed Central

    Almagro, Juan Carlos; Gilliland, Gary L; Scott, Jamie; Larrick, James W; Plückthun, Andreas; Veldman, Trudi; Adams, Gregory P; Parren, Paul WHI; Chester, Kerry A; Bradbury, Andrew; Reichert, Janice M; Huston, James S

    2013-01-01

    The Antibody Engineering and Therapeutics conference, which serves as the annual meeting of The Antibody Society, will be held in Huntington Beach, CA from Sunday December 8 through Thursday December 12, 2013. The scientific program will cover the full spectrum of challenges in antibody research and development, and provide updates on recent progress in areas from basic science through approval of antibody therapeutics. Keynote presentations will be given by Leroy Hood (Institute of System Biology), who will discuss a systems approach for studying disease that is enabled by emerging technology; Douglas Lauffenburger (Massachusetts Institute of Technology), who will discuss systems analysis of cell communication network dynamics for therapeutic biologics design; David Baker (University of Washington), who will describe computer-based design of smart protein therapeutics; and William Schief (The Scripps Research Institute), who will discuss epitope-focused immunogen design. In this preview of the conference, the workshop and session chairs share their thoughts on what conference participants may learn in sessions on: (1) three-dimensional structure antibody modeling; (2) identifying clonal lineages from next-generation data sets of expressed VH gene sequences; (3) antibodies in cardiometabolic medicine; (4) the effects of antibody gene variation and usage on the antibody response; (5) directed evolution; (6) antibody pharmacokinetics, distribution and off-target toxicity; (7) use of knowledge-based design to guide development of complementarity-determining regions and epitopes to engineer or elicit the desired antibody; (8) optimizing antibody formats for immunotherapy; (9) antibodies in a complex environment; (10) polyclonal, oligoclonal and bispecific antibodies; (11) antibodies to watch in 2014; and (12) polyreactive antibodies and polyspecificity.

  4. Targeting telomerase for cancer therapeutics

    PubMed Central

    Shay, J W; Keith, W N

    2008-01-01

    One of the hallmarks of advanced malignancies is continuous cell growth and this almost universally correlates with the reactivation of telomerase. Although there is still much we do not understand about the regulation of telomerase, it remains a very attractive and novel target for cancer therapeutics. Several clinical trials have been initiated, and in this review we highlight some of the most promising approaches and conclude by speculating on the role of telomerase in cancer stem cells. PMID:18231105

  5. Bioengineering Beige Adipose Tissue Therapeutics.

    PubMed

    Tharp, Kevin M; Stahl, Andreas

    2015-01-01

    Unlocking the therapeutic potential of brown/beige adipose tissue requires technological advancements that enable the controlled expansion of this uniquely thermogenic tissue. Transplantation of brown fat in small animal model systems has confirmed the expectation that brown fat expansion could possibly provide a novel therapeutic to combat obesity and related disorders. Expansion and/or stimulation of uncoupling protein-1 (UCP1)-positive adipose tissues have repeatedly demonstrated physiologically beneficial reductions in circulating glucose and lipids. The recent discovery that brown adipose tissue (BAT)-derived secreted factors positively alter whole body metabolism further expands potential benefits of brown or beige/brite adipose expansion. Unfortunately, there are no sources of transplantable BATs for human therapeutic purposes at this time. Recent developments in bioengineering, including novel hyaluronic acid-based hydrogels, have enabled non-immunogenic, functional tissue allografts that can be used to generate large quantities of UCP1-positive adipose tissue. These sophisticated tissue-engineering systems have provided the methodology to develop metabolically active brown or beige/brite adipose tissue implants with the potential to be used as a metabolic therapy. Unlike the pharmacological browning of white adipose depots, implantation of bioengineered UCP1-positive adipose tissues offers a spatially controlled therapeutic. Moving forward, new insights into the mechanisms by which extracellular cues govern stem-cell differentiation and progenitor cell recruitment may enable cell-free matrix implant approaches, which generate a niche sufficient to recruit white adipose tissue-derived stem cells and support their differentiation into functional beige/brite adipose tissues. 
This review summarizes clinically relevant discoveries in tissue-engineering and biology leading toward the recent development of biomaterial supported beige adipose tissue implants and

  6. Bioengineering Beige Adipose Tissue Therapeutics

    PubMed Central

    Tharp, Kevin M.; Stahl, Andreas

    2015-01-01

    Unlocking the therapeutic potential of brown/beige adipose tissue requires technological advancements that enable the controlled expansion of this uniquely thermogenic tissue. Transplantation of brown fat in small animal model systems has confirmed the expectation that brown fat expansion could possibly provide a novel therapeutic to combat obesity and related disorders. Expansion and/or stimulation of uncoupling protein-1 (UCP1)-positive adipose tissues have repeatedly demonstrated physiologically beneficial reductions in circulating glucose and lipids. The recent discovery that brown adipose tissue (BAT)-derived secreted factors positively alter whole body metabolism further expands potential benefits of brown or beige/brite adipose expansion. Unfortunately, there are no sources of transplantable BATs for human therapeutic purposes at this time. Recent developments in bioengineering, including novel hyaluronic acid-based hydrogels, have enabled non-immunogenic, functional tissue allografts that can be used to generate large quantities of UCP1-positive adipose tissue. These sophisticated tissue-engineering systems have provided the methodology to develop metabolically active brown or beige/brite adipose tissue implants with the potential to be used as a metabolic therapy. Unlike the pharmacological browning of white adipose depots, implantation of bioengineered UCP1-positive adipose tissues offers a spatially controlled therapeutic. Moving forward, new insights into the mechanisms by which extracellular cues govern stem-cell differentiation and progenitor cell recruitment may enable cell-free matrix implant approaches, which generate a niche sufficient to recruit white adipose tissue-derived stem cells and support their differentiation into functional beige/brite adipose tissues. 
This review summarizes clinically relevant discoveries in tissue-engineering and biology leading toward the recent development of biomaterial supported beige adipose tissue implants and

  7. Yessotoxin, a Promising Therapeutic Tool

    PubMed Central

    Alfonso, Amparo; Vieytes, Mercedes R.; Botana, Luis M.

    2016-01-01

    Yessotoxin (YTX) is a polyether compound produced by dinoflagellates and accumulated in filter-feeding shellfish. No records of human intoxications induced by this compound have been published; however, it is considered a toxin. Modifications in second messenger levels, protein levels, immune cells, and the cytoskeleton, as well as activation of different cellular death types, have been reported as consequences of YTX exposure. This review summarizes the main intracellular pathways modulated by YTX and their pharmacological and therapeutic implications. PMID:26828502

  8. Functional Gene Correction for Cystic Fibrosis in Lung Epithelial Cells Generated from Patient iPSCs.

    PubMed

    Firth, Amy L; Menon, Tushar; Parker, Gregory S; Qualls, Susan J; Lewis, Benjamin M; Ke, Eugene; Dargitz, Carl T; Wright, Rebecca; Khanna, Ajai; Gage, Fred H; Verma, Inder M

    2015-09-01

    Lung disease is a major cause of death in the United States, with current therapeutic approaches serving only to manage symptoms. The most common chronic and life-threatening genetic disease of the lung is cystic fibrosis (CF) caused by mutations in the cystic fibrosis transmembrane regulator (CFTR). We have generated induced pluripotent stem cells (iPSCs) from CF patients carrying a homozygous deletion of F508 in the CFTR gene, which results in defective processing of CFTR to the cell membrane. This mutation was precisely corrected using CRISPR to target corrective sequences to the endogenous CFTR genomic locus, in combination with a completely excisable selection system, which significantly improved the efficiency of this correction. The corrected iPSCs were subsequently differentiated to mature airway epithelial cells where recovery of normal CFTR expression and function was demonstrated. This isogenic iPSC-based model system for CF could be adapted for the development of new therapeutic approaches. PMID:26299960

  9. Therapeutic Strategies in Huntington's Disease

    PubMed Central

    2006-01-01

This article provides an overview of the therapeutic strategies, from ordinary classical drugs to modern molecular strategies at the experimental level, for Huntington's disease. The disease is characterized by choreic movements, psychiatric disorders, striatal atrophy with selective small neuronal loss, and autosomal dominant inheritance. The genetic abnormality is a CAG expansion in the huntingtin gene. Mutant huntingtin with an abnormally long glutamine stretch aggregates and forms intranuclear inclusions. In this review, I summarize the results of previous trials from the following aspects: 1. symptomatic/palliative therapies including drugs, stereotaxic surgery and repetitive transcranial magnetic stimulation, 2. anti-degenerative therapies including anti-excitotoxicity, reversal of mitochondrial dysfunction and anti-apoptosis, 3. restorative/reparative therapies including neural trophic factors and tissue or stem cell transplantation, and 4. molecular targets in specific and radical therapies including inhibition of truncation of huntingtin, inhibition of aggregate formation, normalization of transcriptional dysregulation, enhancement of autophagic clearance of mutant huntingtin, and specific inhibition of huntingtin expression by siRNA. Although the strategies mentioned in the latter two categories are mostly at the laboratory level at present, we are pleased that one can discuss such "therapeutic strategies" at all, a matter absolutely impossible before the causal gene of Huntington's disease was identified more than 10 years ago. It is also true, however, that some of the "therapeutic strategies" mentioned here may prove difficult to implement and be abandoned in the future. PMID:20396523

  10. Avian Diagnostic and Therapeutic Antibodies

    SciTech Connect

    Bradley, David Sherman

    2012-12-31

A number of infectious agents have the potential to cause significant clinical symptoms and even death, but despite this, their incidence remains below the level that supports producing a vaccine. Therapeutic antibodies provide a viable treatment option for many of these diseases. We proposed that antibodies derived from West Nile Virus (WNV)-immunized geese would be able to treat WNV infection in mammals, and potentially humans. We demonstrated that WNV-specific goose antibodies are indeed successful in treating WNV infection both prophylactically and therapeutically in a golden hamster model. We demonstrated that the goose-derived antibodies are non-reactogenic, i.e., they do not cause an inflammatory response with multiple exposures in mammals. We also developed both a specific-pathogen-free facility to house the geese during the antibody production phase and a patent-pending purification process to purify the antibodies to greater than 99% purity. Therefore, the success of these studies will allow a cost-effective, rapidly producible therapeutic to advance toward clinical testing, with the necessary infrastructure and processes developed and in place.

  11. Therapeutic Applications of Carbon Monoxide

    PubMed Central

    Knauert, Melissa; Vangala, Sandeep; Haslip, Maria; Lee, Patty J.

    2013-01-01

    Heme oxygenase-1 (HO-1) is a regulated enzyme induced in multiple stress states. Carbon monoxide (CO) is a product of HO catalysis of heme. In many circumstances, CO appears to functionally replace HO-1, and CO is known to have endogenous anti-inflammatory, anti-apoptotic, and antiproliferative effects. CO is well studied in anoxia-reoxygenation and ischemia-reperfusion models and has advanced to phase II trials for treatment of several clinical entities. In alternative injury models, laboratories have used sepsis, acute lung injury, and systemic inflammatory challenges to assess the ability of CO to rescue cells, organs, and organisms. Hopefully, the research supporting the protective effects of CO in animal models will translate into therapeutic benefits for patients. Preclinical studies of CO are now moving towards more complex damage models that reflect polymicrobial sepsis or two-step injuries, such as sepsis complicated by acute respiratory distress syndrome. Furthermore, co-treatment and post-treatment with CO are being explored in which the insult occurs before there is an opportunity to intervene therapeutically. The aim of this review is to discuss the potential therapeutic implications of CO with a focus on lung injury and sepsis-related models. PMID:24648866

  12. Conotoxins that confer therapeutic possibilities.

    PubMed

    Essack, Magbubah; Bajic, Vladimir B; Archer, John A C

    2012-06-01

    Cone snails produce a distinctive repertoire of venom peptides that are used both as a defense mechanism and also to facilitate the immobilization and digestion of prey. These peptides target a wide variety of voltage- and ligand-gated ion channels, which make them an invaluable resource for studying the properties of these ion channels in normal and diseased states, as well as being a collection of compounds of potential pharmacological use in their own right. Examples include the United States Food and Drug Administration (FDA) approved pharmaceutical drug, Ziconotide (Prialt(®); Elan Pharmaceuticals, Inc.) that is the synthetic equivalent of the naturally occurring ω-conotoxin MVIIA, whilst several other conotoxins are currently being used as standard research tools and screened as potential therapeutic drugs in pre-clinical or clinical trials. These developments highlight the importance of driving conotoxin-related research. A PubMed query from 1 January 2007 to 31 August 2011 combined with hand-curation of the retrieved articles allowed for the collation of 98 recently identified conotoxins with therapeutic potential which are selectively discussed in this review. Protein sequence similarity analysis tentatively assigned uncharacterized conotoxins to predicted functional classes. Furthermore, conotoxin therapeutic potential for neurodegenerative disorders (NDD) was also inferred. PMID:22822370

  13. Model-based cartilage thickness measurement in the submillimeter range

    SciTech Connect

    Streekstra, G. J.; Strackee, S. D.; Maas, M.; Wee, R. ter; Venema, H. W.

    2007-09-15

Current methods of image-based thickness measurement in thin sheet structures utilize second derivative zero crossings to locate the layer boundaries. It is generally acknowledged that the nonzero width of the point spread function (PSF) limits the accuracy of this measurement procedure. We propose a model-based method that strongly reduces PSF-induced bias by incorporating the PSF into the thickness estimation method. We estimated the bias in thickness measurements in simulated thin sheet images as obtained from second derivative zero crossings. To gain insight into the range of sheet thickness where our method is expected to yield improved results, sheet thickness was varied between 0.15 and 1.2 mm with an assumed PSF as present in the high-resolution modes of current computed tomography (CT) scanners [full width at half maximum (FWHM) 0.5-0.8 mm]. Our model-based method was evaluated in practice by measuring layer thickness from CT images of a phantom mimicking two parallel cartilage layers in an arthrography procedure. CT arthrography images of cadaver wrists were also evaluated, and thickness estimates were compared to those obtained from high-resolution anatomical sections that served as a reference. The thickness estimates from the simulated images reveal that the method based on second derivative zero crossings shows considerable bias for layers in the submillimeter range. This bias is negligible for sheet thickness larger than 1 mm, where the size of the sheet is more than twice the FWHM of the PSF, but can be as large as 0.2 mm for a 0.5 mm sheet. The results of the phantom experiments show that the bias is effectively reduced by our method. The deviations from the true thickness, due to random fluctuations induced by quantum noise in the CT images, are of the order of 3% for a standard wrist imaging protocol. In the wrist, the submillimeter thickness estimates from the CT arthrography images correspond within 10% to those estimated from the anatomical sections.
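The bias mechanism described in this abstract can be reproduced numerically: blur an ideal sheet profile with a Gaussian PSF and read the thickness from the second-derivative zero crossings, which coincide with the extrema of the first derivative. The sketch below uses illustrative parameters, not the authors' scanner model:

```python
import numpy as np

def apparent_thickness(true_mm, fwhm_mm, dx=0.001):
    """Thickness read from the second-derivative zero crossings of a
    sheet profile blurred by a Gaussian PSF.  Those zero crossings are
    the extrema of the first derivative, which is how they are located."""
    x = np.arange(-3.0, 3.0, dx)
    sheet = (np.abs(x) <= true_mm / 2).astype(float)  # ideal sheet profile
    sigma = fwhm_mm / 2.3548                          # FWHM -> std deviation
    psf = np.exp(-0.5 * (x / sigma) ** 2)
    psf /= psf.sum()
    blurred = np.convolve(sheet, psf, mode="same")
    d1 = np.gradient(blurred, dx)
    # Left edge: positive gradient peak; right edge: negative gradient peak.
    return x[np.argmin(d1)] - x[np.argmax(d1)]

# A 1.2 mm sheet is measured almost without bias by a 0.6 mm FWHM PSF,
# while a 0.3 mm sheet is substantially overestimated.
print(apparent_thickness(1.2, 0.6), apparent_thickness(0.3, 0.6))
```

Consistent with the abstract, the zero-crossing estimate is nearly unbiased once the sheet is about twice the FWHM, but drifts toward a PSF-dominated value for submillimeter sheets.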

  14. Biased Randomized Algorithm for Fast Model-Based Diagnosis

    NASA Technical Reports Server (NTRS)

    Williams, Colin; Vartan, Farrokh

    2005-01-01

    A biased randomized algorithm has been developed to enable the rapid computational solution of a propositional- satisfiability (SAT) problem equivalent to a diagnosis problem. The closest competing methods of automated diagnosis are described in the preceding article "Fast Algorithms for Model-Based Diagnosis" and "Two Methods of Efficient Solution of the Hitting-Set Problem" (NPO-30584), which appears elsewhere in this issue. It is necessary to recapitulate some of the information from the cited articles as a prerequisite to a description of the present method. As used here, "diagnosis" signifies, more precisely, a type of model-based diagnosis in which one explores any logical inconsistencies between the observed and expected behaviors of an engineering system. The function of each component and the interconnections among all the components of the engineering system are represented as a logical system. Hence, the expected behavior of the engineering system is represented as a set of logical consequences. Faulty components lead to inconsistency between the observed and expected behaviors of the system, represented by logical inconsistencies. Diagnosis - the task of finding the faulty components - reduces to finding the components, the abnormalities of which could explain all the logical inconsistencies. One seeks a minimal set of faulty components (denoted a minimal diagnosis), because the trivial solution, in which all components are deemed to be faulty, always explains all inconsistencies. In the methods of the cited articles, the minimal-diagnosis problem is treated as equivalent to a minimal-hitting-set problem, which is translated from a combinatorial to a computational problem by mapping it onto the Boolean-satisfiability and integer-programming problems. The integer-programming approach taken in one of the prior methods is complete (in the sense that it is guaranteed to find a solution if one exists) and slow and yields a lower bound on the size of the
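The reduction from minimal diagnosis to a minimal hitting set can be illustrated with a brute-force sketch (component names hypothetical; the method described above instead maps the problem to Boolean satisfiability for speed):

```python
from itertools import combinations

def minimal_hitting_sets(conflicts):
    """Return all minimum-cardinality sets of components that intersect
    (hit) every conflict set; each such set is a minimal diagnosis."""
    universe = sorted(set().union(*conflicts))
    for size in range(1, len(universe) + 1):
        hits = [set(combo) for combo in combinations(universe, size)
                if all(set(combo) & conflict for conflict in conflicts)]
        if hits:
            return hits  # smallest size found; all candidates at that size
    return []

# Each conflict set lists components whose joint health is logically
# inconsistent with the observations.
conflicts = [{"A", "B"}, {"B", "C"}, {"A", "C"}]
print(minimal_hitting_sets(conflicts))
```

Every pair of components hits all three conflicts here, so there are three minimal diagnoses of size two; the exponential cost of this enumeration is what motivates the randomized SAT formulation.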

  15. Yoga school of thought and psychiatry: Therapeutic potential.

    PubMed

    Rao, Naren P; Varambally, Shivarama; Gangadhar, Bangalore N

    2013-01-01

Yoga is a traditional life-style practice used for spiritual reasons. However, the physical components such as the asanas and pranayamas have demonstrated physiological and therapeutic effects. There is evidence for yoga as a potent antidepressant that matches drugs in efficacy. In depressive disorder, yoga 'corrects' an underlying cognitive physiology. In schizophrenia patients, yoga has benefits as an add-on intervention in pharmacologically stabilized subjects. The effects are particularly notable on negative symptoms. Yoga also helps to correct social cognition. Yoga can be introduced early in the treatment of psychosis with some benefits. Elevation of oxytocin may be a mechanism of yoga's effects in schizophrenia. Certain components of yoga have demonstrated neurobiological effects similar to those of vagal stimulation, indicating this (indirect or autogenous vagal stimulation) as a possible mechanism of its action. It is time psychiatrists exploited the benefits of yoga for comprehensive care of their patients. PMID:23858245

  16. Coupling Correction Study at NSRRC

    SciTech Connect

    Safranek, James

    2003-07-29

    Emittance coupling between vertical and horizontal planes at TLS has been investigated. Using a set of skew quadrupoles, the coupling can be corrected to an acceptable value. The coupling sources are studied and possible errors are reduced.

  17. New Therapeutic Approaches to Modulate and Correct Cystic Fibrosis Transmembrane Conductance Regulator.

    PubMed

    Ong, Thida; Ramsey, Bonnie W

    2016-08-01

    Cystic fibrosis transmembrane conductance regulator (CFTR) modulators are clinically available personalized medicines approved for some individuals with cystic fibrosis (CF) to target the underlying defect of disease. This review summarizes strategies used to develop CFTR modulators as therapies that improve function and availability of CFTR protein. Lessons learned from dissemination of ivacaftor across the CF population responsive to this therapy and future approaches to predict and monitor treatment response of CFTR modulators are discussed. The goal remains to expand patient-centered and personalized therapy to all patients with CF, ultimately improving life expectancy and quality of life for this disease. PMID:27469186

  18. Correction of the crooked nose.

    PubMed

    Potter, Jason K

    2012-02-01

    Correction of the deviated nose is one of the most difficult tasks in rhinoplasty surgery and should be approached in a systematic manner to ensure a satisfied patient and surgeon. Correction of the deviated nose is unique in that the patient's complaints frequently include aesthetic and functional characteristics. Equal importance should be given to the preoperative, intraoperative, and postoperative aspects of the patient's treatment to ensure a favorable outcome. PMID:22284400

  19. Neonatal Encephalopathy: Update on Therapeutic Hypothermia and Other Novel Therapeutics.

    PubMed

    McAdams, Ryan M; Juul, Sandra E

    2016-09-01

    Neonatal encephalopathy (NE) is a major cause of neonatal mortality and morbidity. Therapeutic hypothermia (TH) is standard treatment for newborns at 36 weeks of gestation or greater with intrapartum hypoxia-related NE. Term and late preterm infants with moderate to severe encephalopathy show improved survival and neurodevelopmental outcomes at 18 months of age after TH. TH can increase survival without increasing major disability, rates of an IQ less than 70, or cerebral palsy. Neonates with severe NE remain at risk of death or severe neurodevelopmental impairment. This review discusses the evidence supporting TH for term or near term neonates with NE. PMID:27524449

  20. An Accurate Temperature Correction Model for Thermocouple Hygrometers

    PubMed Central

    Savage, Michael J.; Cass, Alfred; de Jager, James M.

    1982-01-01

    Numerous water relation studies have used thermocouple hygrometers routinely. However, the accurate temperature correction of hygrometer calibration curve slopes seems to have been largely neglected in both psychrometric and dewpoint techniques. In the case of thermocouple psychrometers, two temperature correction models are proposed, each based on measurement of the thermojunction radius and calculation of the theoretical voltage sensitivity to changes in water potential. The first model relies on calibration at a single temperature and the second at two temperatures. Both these models were more accurate than the temperature correction models currently in use for four psychrometers calibrated over a range of temperatures (15-38°C). The model based on calibration at two temperatures is superior to that based on only one calibration. The model proposed for dewpoint hygrometers is similar to that for psychrometers. It is based on the theoretical voltage sensitivity to changes in water potential. Comparison with empirical data from three dewpoint hygrometers calibrated at four different temperatures indicates that these instruments need only be calibrated at, e.g. 25°C, if the calibration slopes are corrected for temperature. PMID:16662241

  1. An accurate temperature correction model for thermocouple hygrometers.

    PubMed

    Savage, M J; Cass, A; de Jager, J M

    1982-02-01

Numerous water relation studies have used thermocouple hygrometers routinely. However, the accurate temperature correction of hygrometer calibration curve slopes seems to have been largely neglected in both psychrometric and dewpoint techniques. In the case of thermocouple psychrometers, two temperature correction models are proposed, each based on measurement of the thermojunction radius and calculation of the theoretical voltage sensitivity to changes in water potential. The first model relies on calibration at a single temperature and the second at two temperatures. Both these models were more accurate than the temperature correction models currently in use for four psychrometers calibrated over a range of temperatures (15-38 degrees C). The model based on calibration at two temperatures is superior to that based on only one calibration. The model proposed for dewpoint hygrometers is similar to that for psychrometers. It is based on the theoretical voltage sensitivity to changes in water potential. Comparison with empirical data from three dewpoint hygrometers calibrated at four different temperatures indicates that these instruments need only be calibrated at, e.g. 25 degrees C, if the calibration slopes are corrected for temperature. PMID:16662241
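In its simplest two-temperature form, the correction described above amounts to interpolating the calibration-curve slope linearly in temperature and dividing the measured thermocouple voltage by the corrected slope. A minimal sketch with hypothetical calibration values (the paper derives the slopes from thermojunction-radius theory):

```python
def slope_at(temp_c, cal_low, cal_high):
    """Linearly interpolate the calibration-curve slope (uV per MPa)
    between two calibration temperatures; each cal is (temp_C, slope)."""
    (t1, s1), (t2, s2) = cal_low, cal_high
    return s1 + (s2 - s1) * (temp_c - t1) / (t2 - t1)

def water_potential(reading_uv, temp_c, cal_low, cal_high):
    """Convert a psychrometer voltage reading to water potential
    using the temperature-corrected slope."""
    return reading_uv / slope_at(temp_c, cal_low, cal_high)

# Hypothetical calibrations at 15 C and 35 C; a reading taken at 25 C
# is converted with the interpolated slope of 5.0 uV per MPa.
print(water_potential(2.5, 25.0, (15.0, 4.0), (35.0, 6.0)))  # prints 0.5
```

The single-temperature variant in the paper replaces the second calibration with a theoretically computed slope dependence, trading a measurement for a model assumption.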

  2. Optimization of Raman-spectrum baseline correction in biological application.

    PubMed

    Guo, Shuxia; Bocklitz, Thomas; Popp, Jürgen

    2016-04-21

In the last decade Raman spectroscopy has become an invaluable tool for biomedical diagnostics. However, a manual rating of the subtle spectral differences between normal and abnormal disease states is not possible or practical. Thus it is necessary to combine Raman spectroscopy with chemometrics in order to build statistical models predicting the disease states directly without manual intervention. Within chemometrical analysis a number of corrections have to be applied to obtain robust models. Baseline correction is an important pre-processing step, which should remove spectral contributions of fluorescence effects and improve the performance and robustness of statistical models. However, selecting an optimal baseline correction method and its parameters for each new dataset is demanding, time-consuming, and dependent on expert knowledge. To circumvent this issue we proposed a genetic-algorithm-based method to automatically optimize the baseline correction. The investigation was carried out in three main steps. Firstly, a numerical quantitative marker was defined to evaluate the baseline estimation quality. Secondly, a genetic-algorithm-based methodology was established to search for the optimal baseline estimate with the defined quantitative marker as the evaluation function. Finally, classification models were utilized to benchmark the performance of the optimized baseline. For comparison, model-based baseline optimization was carried out applying the same classifiers. It was shown that our method can provide a semi-optimal and stable baseline estimate without requiring any chemical knowledge or additional spectral information. PMID:26907832
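The search the authors automate with a genetic algorithm can be caricatured with a brute-force version: score candidate baselines with a quality marker that punishes baselines rising above the signal, and keep the best. The iterative polynomial fit and the marker below are simplified stand-ins for the paper's method:

```python
import numpy as np

def poly_baseline(x, y, order, n_iter=100):
    """Iterative polynomial baseline (ModPoly-style): repeatedly fit a
    polynomial and clip the signal down to the fit, so that peaks stop
    pulling the baseline upward."""
    work = y.copy()
    for _ in range(n_iter):
        fit = np.polyval(np.polyfit(x, work, order), x)
        work = np.minimum(work, fit)
    return fit

def auto_baseline(x, y, orders=range(1, 8)):
    """Pick the polynomial order whose baseline stays below the spectrum
    while tracking it closely -- a brute-force stand-in for the paper's
    genetic-algorithm search over methods and parameters."""
    def score(b):
        r = y - b
        # Quality marker: heavily penalize baseline overshooting the signal.
        return np.mean(np.where(r < 0, 50.0 * r ** 2, r))
    baselines = [poly_baseline(x, y, k) for k in orders]
    return baselines[int(np.argmin([score(b) for b in baselines]))]
```

On a synthetic spectrum (quadratic fluorescence background plus a narrow Raman-like peak), the selected baseline recovers the background and leaves the peak intact after subtraction.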

  3. Tempest in a Therapeutic Community: Implementation and Evaluation Issues for Faith-Based Programming

    ERIC Educational Resources Information Center

    Scott, Diane L.; Crow, Matthew S.; Thompson, Carla J.

    2010-01-01

    The therapeutic community (TC) is an increasingly utilized intervention model in corrections settings. Rarely do these TCs include faith-based curriculum other than that included in Alcoholics Anonymous or Narcotics Anonymous programs as does the faith-based TC that serves as the basis for this article. Borrowing from the successful TC model, the…

  4. Development of land surface reflectance models based on multiscale simulation

    NASA Astrophysics Data System (ADS)

    Goodenough, Adam A.; Brown, Scott D.

    2015-05-01

    Modeling and simulation of Earth imaging sensors with large spatial coverage necessitates an understanding of how photons interact with individual land surface processes at an aggregate level. For example, the leaf angle distribution of a deciduous forest canopy has a significant impact on the path of a single photon as it is scattered among the leaves and, consequently, a significant impact on the observed bidirectional reflectance distribution function (BRDF) of the canopy as a whole. In particular, simulation of imagery of heterogeneous scenes for many multispectral/hyperspectral applications requires detailed modeling of regions of the spectrum where many orders of scattering are required due to both high reflectance and transmittance. Radiative transfer modeling based on ray tracing, hybrid Monte Carlo techniques and detailed geometric and optical models of land cover means that it is possible to build effective, aggregate optical models with parameters such as species, spatial distribution, and underlying terrain variation. This paper examines the capability of the Digital Image and Remote Sensing Image Generation (DIRSIG) model to generate BRDF data representing land surfaces at large scale from modeling at a much smaller scale. We describe robust methods for generating optical property models effectively in DIRSIG and present new tools for facilitating the process. The methods and results for forest canopies are described relative to the RAdiation transfer Model Intercomparison (RAMI) benchmark scenes, which also forms the basis for an evaluation of the approach. Additional applications and examples are presented, representing different types of land cover.

  5. Advanced electron crystallography through model-based imaging.

    PubMed

    Van Aert, Sandra; De Backer, Annick; Martinez, Gerardo T; den Dekker, Arnold J; Van Dyck, Dirk; Bals, Sara; Van Tendeloo, Gustaaf

    2016-01-01

The increasing need for precise determination of the atomic arrangement of non-periodic structures in materials design and the control of nanostructures explains the growing interest in quantitative transmission electron microscopy. The aim is to extract precise and accurate numbers for unknown structure parameters including atomic positions, chemical concentrations and atomic numbers. For this purpose, statistical parameter estimation theory has been shown to provide reliable results. In this theory, observations are considered purely as data planes, from which structure parameters have to be determined using a parametric model describing the images. As such, the positions of atom columns can be measured with a precision of the order of a few picometres, even though the resolution of the electron microscope is still one or two orders of magnitude larger. Moreover, small differences in average atomic number, which cannot be distinguished visually, can be quantified using high-angle annular dark-field scanning transmission electron microscopy images. In addition, this theory allows one to measure compositional changes at interfaces, to count atoms with single-atom sensitivity, and to reconstruct atomic structures in three dimensions. This feature article brings the reader up to date, summarizing the underlying theory and highlighting some of the recent applications of quantitative model-based transmission electron microscopy. PMID:26870383

  6. Automatic sensor placement for model-based robot vision.

    PubMed

    Chen, S Y; Li, Y F

    2004-02-01

This paper presents a method for automatic sensor placement for model-based robot vision. In such a vision system, the sensor often needs to be moved from one pose to another around the object to observe all features of interest. This allows multiple three-dimensional (3-D) images to be taken from different vantage viewpoints. The task involves determination of the optimal sensor placements and a shortest path through these viewpoints. During the sensor planning, object features are resampled as individual points attached with surface normals. The optimal sensor placement graph is achieved by a genetic algorithm in which a min-max criterion is used for the evaluation. A shortest path is determined by Christofides algorithm. A Viewpoint Planner is developed to generate the sensor placement plan. It includes many functions, such as 3-D animation of the object geometry, sensor specification, initialization of the viewpoint number and their distribution, viewpoint evolution, shortest path computation, scene simulation of a specific viewpoint, and parameter amendment. Experiments are also carried out on a real robot vision system to demonstrate the effectiveness of the proposed method. PMID:15369081

  7. Relativistic mean field model based on realistic nuclear forces

    SciTech Connect

    Hirose, S.; Serra, M.; Ring, P.; Otsuka, T.; Akaishi, Y.

    2007-02-15

In order to predict properties of asymmetric nuclear matter, we construct a relativistic mean field (RMF) model consisting of one-meson exchange (OME) terms and point coupling (PC) terms. In order to determine the density dependent parameters of this model, we use properties of isospin symmetric nuclear matter in combination with the information on nucleon-nucleon scattering data, which are given in the form of the density dependent G-matrix derived from Brueckner calculations based on the Tamagaki potential. We show that the medium- and long-range components of this G-matrix can be described reasonably well by our effective OME interaction. In order to take into account the short-range part of the nucleon-nucleon interaction, which cannot be described well in this manner, a point coupling term is added. Its analytical form is taken from a model based on chiral perturbation theory. It contains only one additional parameter, which does not depend on the density. It is, together with the parameters of the OME potentials, adjusted to the equation of state of symmetric nuclear matter. We apply this model for the investigation of asymmetric nuclear matter and find that the results for the symmetry energy as well as for the equation of state of pure neutron matter are in good agreement with either experimental data or with presently adopted theoretical predictions. In order to test the model at higher density, we use its equation of state for an investigation of properties of neutron stars.

  8. Propagating uncertainties in statistical model based shape prediction

    NASA Astrophysics Data System (ADS)

    Syrkina, Ekaterina; Blanc, Rémi; Székely, Gàbor

    2011-03-01

    This paper addresses the question of accuracy assessment and confidence regions estimation in statistical model based shape prediction. Shape prediction consists in estimating the shape of an organ based on a partial observation, due e.g. to a limited field of view or poorly contrasted images, and generally requires a statistical model. However, such predictions can be impaired by several sources of uncertainty, in particular the presence of noise in the observation, limited correlations between the predictors and the shape to predict, as well as limitations of the statistical shape model - in particular the number of training samples. We propose a framework which takes these into account and derives confidence regions around the predicted shape. Our method relies on the construction of two separate statistical shape models, for the predictors and for the unseen parts, and exploits the correlations between them assuming a joint Gaussian distribution. Limitations of the models are taken into account by jointly optimizing the prediction and minimizing the shape reconstruction error through cross-validation. An application to the prediction of the shape of the proximal part of the human tibia given the shape of the distal femur is proposed, as well as the evaluation of the reliability of the estimated confidence regions, using a database of 184 samples. Potential applications are reconstructive surgery, e.g. to assess whether an implant fits in a range of acceptable shapes, or functional neurosurgery when the target's position is not directly visible and needs to be inferred from nearby visible structures.
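The joint-Gaussian machinery underlying such a prediction reduces to conditioning: the unseen shape parameters are predicted from the observed predictors, and the conditional covariance supplies the confidence region. A minimal sketch (variable names hypothetical; the paper additionally corrects for observation noise and model limitations via cross-validation):

```python
import numpy as np

def predict_shape(x_obs, mu_x, mu_y, s_xx, s_yx, s_yy):
    """Conditional Gaussian prediction: estimate unseen shape
    parameters y from observed predictors x.  The returned covariance
    defines the confidence region around the predicted shape."""
    gain = s_yx @ np.linalg.inv(s_xx)          # regression of y on x
    mu = mu_y + gain @ (x_obs - mu_x)          # conditional mean
    cov = s_yy - gain @ s_yx.T                 # conditional covariance
    return mu, cov

# 1-D toy example with correlation 0.8 between predictor and target:
mu, cov = predict_shape(np.array([1.0]), np.zeros(1), np.zeros(1),
                        np.eye(1), np.array([[0.8]]), np.eye(1))
print(mu, cov)  # mu = [0.8], cov = [[0.36]]
```

Note how the conditional variance (0.36) is smaller than the prior variance (1.0): the stronger the correlation between predictors and unseen parts, the tighter the confidence region.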

  9. Lithium battery aging model based on Dakin's degradation approach

    NASA Astrophysics Data System (ADS)

    Baghdadi, Issam; Briat, Olivier; Delétage, Jean-Yves; Gyan, Philippe; Vinassa, Jean-Michel

    2016-09-01

This paper proposes and validates a calendar and power cycling aging model for two different lithium battery technologies. The model development is based on previous SIMCAL and SIMSTOCK project data. In these previous projects, the effect of the battery state of charge, temperature and current magnitude on aging was studied on a large panel of different battery chemistries. In this work, data are analyzed using Dakin's degradation approach. In fact, the logarithms of battery capacity fade and of the increase in resistance evolve linearly over aging. The slopes identified from the straight lines correspond to battery aging rates. Thus, a battery aging rate expression as a function of aging factors was deduced and found to be governed by Eyring's law. The proposed model simulates the capacity fade and resistance increase as functions of the influencing aging factors. Its expansion using Taylor series was consistent with semi-empirical models based on the square root of time, which are widely studied in the literature. Finally, the influence of the current magnitude and temperature on aging was simulated. Interestingly, the aging rate increases sharply both as temperature decreases from 25 °C to -5 °C and as it increases from 25 °C to 60 °C.
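Dakin's approach as summarized above has a simple computational core: the logarithm of the degrading property is linear in time, so its slope is the aging rate, and the rate's temperature dependence follows an Eyring/Arrhenius-type law. A sketch with illustrative parameters (the paper's rate law also includes state-of-charge and current-magnitude terms):

```python
import math

def aging_rate(times_h, capacities):
    """Dakin-style aging rate: ordinary-least-squares slope of
    log(capacity) versus time.  Negative, since capacity fades."""
    logs = [math.log(c) for c in capacities]
    n = len(times_h)
    t_mean = sum(times_h) / n
    l_mean = sum(logs) / n
    num = sum((t - t_mean) * (l - l_mean) for t, l in zip(times_h, logs))
    den = sum((t - t_mean) ** 2 for t in times_h)
    return num / den

def eyring_rate(temp_k, prefactor, ea_over_r):
    """Eyring/Arrhenius-type temperature dependence of the aging rate
    (illustrative two-parameter form)."""
    return prefactor * math.exp(-ea_over_r / temp_k)

# Synthetic capacity fade with rate 1e-4 per hour is recovered exactly.
times = [0.0, 1000.0, 2000.0, 5000.0]
caps = [100.0 * math.exp(-1e-4 * t) for t in times]
print(aging_rate(times, caps))
```

Fitting the rate at several temperatures and regressing log(rate) against 1/T would then identify the Eyring parameters, which is the essence of the model's extrapolation to unseen operating conditions.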

  10. Model based condition monitoring in lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Singh, Amardeep; Izadian, Afshin; Anwar, Sohel

    2014-12-01

In this paper, a model-based technique is developed for lithium-ion battery condition monitoring. A number of lithium-ion batteries are cycled using two separate overdischarge test regimes and the resulting shift in battery parameters is recorded. The battery models are constructed using the equivalent circuit methodology. The condition monitoring setup consists of a model bank representing different degrees of parameter shift due to overdischarge in the lithium-ion battery. Extended Kalman filters (EKF) are used to maintain increased robustness of the condition monitoring setup while estimating the terminal voltage of the battery cell. The information-carrying residuals are generated and the evaluation process is carried out in real time using the multiple model adaptive estimation (MMAE) methodology. The condition evaluation function is used to generate probabilities that indicate the presence of a particular operational condition. Using the test data, it is shown that the performance shift in lithium-ion batteries due to overdischarge can be accurately detected.
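The MMAE evaluation step described above weights each model's probability by the Gaussian likelihood of its filter residual and renormalizes. A minimal sketch (in practice the residual variances come from each EKF's innovation covariance; the values here are hypothetical):

```python
import math

def mmae_update(probs, residuals, variances):
    """One MMAE step: multiply each model's prior probability by the
    Gaussian likelihood of its EKF residual, then renormalize so the
    probabilities sum to one."""
    likes = [math.exp(-0.5 * r * r / v) / math.sqrt(2 * math.pi * v)
             for r, v in zip(residuals, variances)]
    post = [p * l for p, l in zip(probs, likes)]
    total = sum(post)
    return [p / total for p in post]

# Three hypothesized conditions (healthy, mild, severe overdischarge):
# the model with the smallest terminal-voltage residual gains probability.
print(mmae_update([1/3, 1/3, 1/3], [0.01, 0.5, 1.0], [0.04, 0.04, 0.04]))
```

Iterating this update over a residual stream makes the probability of the matching degradation model converge toward one, which is how the condition evaluation function flags an overdischarged cell.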

  11. Symbolic Processing Combined with Model-Based Reasoning

    NASA Technical Reports Server (NTRS)

    James, Mark

    2009-01-01

    A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.

  12. Femtosecond molecular dynamics of tautomerization in model base pairs

    NASA Astrophysics Data System (ADS)

    Douhal, A.; Kim, S. K.; Zewail, A. H.

    1995-11-01

    Hydrogen bonds commonly lend robustness and directionality to molecular recognition processes and supramolecular structures [1, 2]. In particular, the two or three hydrogen bonds in Watson-Crick base pairs bind the double-stranded DNA helix and determine the complementarity of the pairing. Watson and Crick pointed out [3], however, that the possible tautomers of base pairs, in which hydrogen atoms become attached to the donor atom of the hydrogen bond, might disturb the genetic code, as the tautomer is capable of pairing with different partners. But the dynamics of hydrogen bonds in general, and of this tautomerization process in particular, are not well understood. Here we report observations of the femtosecond dynamics of tautomerization in model base pairs (7-azaindole dimers) containing two hydrogen bonds. Because of the femtosecond resolution of proton motions, we are able to examine the cooperativity of formation of the tautomer (in which the protons on each base are shifted sequentially to the other base), and to determine the characteristic timescales of the motions in a solvent-free environment. We find that the first step occurs on a timescale of a few hundred femtoseconds, whereas the second step, to form the full tautomer, is much slower, taking place within several picoseconds; the timescales are changed significantly by replacing hydrogen with deuterium. These results establish the molecular basis of the dynamics and the role of quantum tunnelling.

  13. Model-Based Material Parameter Estimation for Terahertz Reflection Spectroscopy

    NASA Astrophysics Data System (ADS)

    Kniffin, Gabriel Paul

    Many materials such as drugs and explosives have characteristic spectral signatures in the terahertz (THz) band. These unique signatures imply great promise for spectral detection and classification using THz radiation. While such spectral features are most easily observed in transmission, real-life imaging systems will need to identify materials of interest from reflection measurements, often in non-ideal geometries. One important, yet commonly overlooked source of signal corruption is the etalon effect -- interference phenomena caused by multiple reflections from dielectric layers of packaging and clothing likely to be concealing materials of interest in real-life scenarios. This thesis focuses on the development and implementation of a model-based material parameter estimation technique, primarily for use in reflection spectroscopy, that takes the influence of the etalon effect into account. The technique is adapted from techniques developed for transmission spectroscopy of thin samples and is demonstrated using measured data taken at the Northwest Electromagnetic Research Laboratory (NEAR-Lab) at Portland State University. Further tests are conducted, demonstrating the technique's robustness against measurement noise and common sources of error.

  14. Tyre pressure monitoring using a dynamical model-based estimator

    NASA Astrophysics Data System (ADS)

    Reina, Giulio; Gentile, Angelo; Messina, Arcangelo

    2015-04-01

    In the last few years, various control systems have been investigated in the automotive field with the aim of increasing the level of safety and stability, avoiding roll-over, and customising handling characteristics. One critical issue connected with their integration is the lack of state and parameter information. As an example, vehicle handling depends to a large extent on tyre inflation pressure. When inflation pressure drops, handling and comfort performance generally deteriorate. In addition, it results in increased fuel consumption and reduced tyre lifetime. Therefore, it is important to keep tyres within the normal inflation pressure range. This paper introduces a model-based approach to estimate tyre inflation pressure online. First, basic vertical dynamic modelling of the vehicle is discussed. Then, a parameter estimation framework for dynamic analysis is presented. Several important vehicle parameters, including tyre inflation pressure, can be obtained from the estimated states. The method is designed to work during normal driving using information from standard sensors only. On the one hand, the driver is informed about the inflation pressure and warned of sudden changes. On the other hand, accurate estimation of the vehicle states is available as possible input to onboard control systems.

  15. Model-based sound synthesis of the guqin.

    PubMed

    Penttinen, Henri; Pakarinen, Jyri; Välimäki, Vesa; Laurson, Mikael; Li, Henbing; Leman, Marc

    2006-12-01

    This paper presents a model-based sound synthesis algorithm for the Chinese plucked string instrument called the guqin. The instrument is fretless, which enables smooth pitch glides from one note to another. A version of the digital waveguide synthesis approach is used, where the string length is time-varying and its energy is scaled properly. A body model filter is placed in cascade with the string model. Flageolet tones are synthesized with the so-called ripple filter structure, which is an FIR comb filter in the delay line of a digital waveguide model. In addition, signal analysis of recorded guqin tones is presented. Friction noise produced by gliding the finger across the soundboard has a harmonic structure and is proportional to the gliding speed. For pressed tones, one end of a vibrating string is terminated either by the nail of the thumb or a fingertip. The tones terminated with a fingertip decay faster than those terminated with the thumb. Guqin tones are slightly inharmonic and they exhibit phantom partials. The synthesis model takes into account these characteristic features of the instrument and is able to reproduce them. The synthesis model will be used for rule-based synthesis of guqin music. PMID:17225431
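
    The string model at the heart of digital waveguide synthesis can be illustrated with a basic Karplus-Strong loop: a delay line whose output is fed back through a damping lowpass filter. This is only the generic fixed-length core; the paper's time-varying string length, energy scaling, body filter and ripple filter are omitted, and all constants below are arbitrary:

```python
import random

def karplus_strong(delay_len, n_samples, decay=0.996):
    """Minimal plucked-string sketch: a delay line with a two-point
    averaging (lowpass) loop filter, the core of digital waveguide
    string synthesis."""
    # The pluck excites the string with white noise filling the delay line.
    line = [random.uniform(-1.0, 1.0) for _ in range(delay_len)]
    out = []
    for _ in range(n_samples):
        out.append(line[0])
        # Shift the line and feed back the damped average of two samples.
        line = line[1:] + [decay * 0.5 * (line[0] + line[1])]
    return out

random.seed(0)
tone = karplus_strong(delay_len=100, n_samples=4000)
# Energy decays as the loop filter damps the circulating waveform.
early = sum(x * x for x in tone[:1000])
late = sum(x * x for x in tone[-1000:])
print(late < early)  # True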

  16. Advanced electron crystallography through model-based imaging

    PubMed Central

    Van Aert, Sandra; De Backer, Annick; Martinez, Gerardo T.; den Dekker, Arnold J.; Van Dyck, Dirk; Bals, Sara; Van Tendeloo, Gustaaf

    2016-01-01

    The increasing need for precise determination of the atomic arrangement of non-periodic structures in materials design and the control of nanostructures explains the growing interest in quantitative transmission electron microscopy. The aim is to extract precise and accurate numbers for unknown structure parameters including atomic positions, chemical concentrations and atomic numbers. For this purpose, statistical parameter estimation theory has been shown to provide reliable results. In this theory, observations are considered purely as data planes, from which structure parameters have to be determined using a parametric model describing the images. As such, the positions of atom columns can be measured with a precision of the order of a few picometres, even though the resolution of the electron microscope is still one or two orders of magnitude larger. Moreover, small differences in average atomic number, which cannot be distinguished visually, can be quantified using high-angle annular dark-field scanning transmission electron microscopy images. In addition, this theory allows one to measure compositional changes at interfaces, to count atoms with single-atom sensitivity, and to reconstruct atomic structures in three dimensions. This feature article brings the reader up to date, summarizing the underlying theory and highlighting some of the recent applications of quantitative model-based transmission electron microscopy. PMID:26870383

  17. Model-Based Recursive Partitioning for Subgroup Analyses.

    PubMed

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-05-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulates challenges for the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by predictive factors. The method starts with a model for the overall treatment effect, as defined for the primary analysis in the study protocol, and uses measures for detecting parameter instabilities in this treatment effect. The procedure produces a segmented model with differential treatment parameters corresponding to each patient subgroup. The subgroups are linked to predictive factors by means of a decision tree. The method is applied to the search for subgroups of patients suffering from amyotrophic lateral sclerosis that differ with respect to their response to Riluzole, the only drug currently approved for this disease. PMID:27227717
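
    The splitting idea can be illustrated with a toy version: estimate the treatment effect within the subgroups induced by a candidate factor and split when the effects diverge. Note that real model-based recursive partitioning uses score-based parameter-instability tests; the simple effect-gap threshold below is an assumed stand-in for illustration only:

```python
def treatment_effect(records):
    """Mean outcome difference between treated and control patients."""
    treated = [r["y"] for r in records if r["treated"]]
    control = [r["y"] for r in records if not r["treated"]]
    return sum(treated) / len(treated) - sum(control) / len(control)

def partition(records, factor, min_gap=1.0):
    """Split on `factor` only if the treatment effect differs markedly
    between the two resulting subgroups (a crude stand-in for the
    parameter-instability tests used in model-based recursive
    partitioning)."""
    left = [r for r in records if not r[factor]]
    right = [r for r in records if r[factor]]
    gap = abs(treatment_effect(left) - treatment_effect(right))
    if gap >= min_gap:
        return {"split_on": factor,
                "effects": (treatment_effect(left), treatment_effect(right))}
    return {"split_on": None, "effects": (treatment_effect(records),)}

# Toy trial: the drug only helps patients who carry the predictive factor.
records = [{"treated": t, "factor": f, "y": 4.0 if (t and f) else 1.0}
           for f in (False, True) for t in (False, True) for _ in range(5)]
result = partition(records, "factor")
print(result["split_on"], result["effects"])  # factor (0.0, 3.0)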

  18. Improved knowledge diffusion model based on the collaboration hypernetwork

    NASA Astrophysics Data System (ADS)

    Wang, Jiang-Pan; Guo, Qiang; Yang, Guang-Yong; Liu, Jian-Guo

    2015-06-01

    The process of absorbing knowledge has become an essential element of innovation in firms and of adapting to changes in the competitive environment. In this paper, we present an improved knowledge diffusion hypernetwork (IKDH) model based on the idea that knowledge spreads from the target node to all its neighbors in terms of the hyperedge and knowledge stock. We apply the average knowledge stock V(t), the variance σ²(t), and the variance coefficient c(t) to evaluate the performance of knowledge diffusion. By analyzing different ways of diffusing knowledge, ways of selecting highly knowledgeable nodes, hypernetwork sizes and hypernetwork structures, we find that the diffusion speed of the IKDH model is 3.64 times faster than that of the traditional knowledge diffusion (TKDH) model. Besides, knowledge diffuses three times faster when "expert" nodes are selected randomly than when large-hyperdegree nodes are selected as "expert" nodes. Furthermore, a closer network structure or a smaller network size results in faster knowledge diffusion.
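
    The three evaluation quantities are straightforward to compute from the nodes' knowledge stocks. The sketch below assumes the usual definitions, with c(t) taken as the coefficient of variation σ(t)/V(t); the stock values are invented:

```python
import math

def diffusion_metrics(stocks):
    """Evaluation metrics from the abstract: average knowledge stock V(t),
    variance sigma^2(t), and variance coefficient c(t) = sigma(t) / V(t)
    (the latter definition is an assumption)."""
    n = len(stocks)
    v = sum(stocks) / n
    var = sum((s - v) ** 2 for s in stocks) / n
    c = math.sqrt(var) / v
    return v, var, c

# Hypothetical knowledge stocks of five nodes at some time step t.
v, var, c = diffusion_metrics([2.0, 3.0, 3.0, 4.0, 8.0])
print(round(v, 2), round(var, 2), round(c, 3))  # 4.0 4.4 0.524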

  19. A model based on crowdsourcing for detecting natural hazards

    NASA Astrophysics Data System (ADS)

    Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.

    2015-12-01

    Remote sensing technology provides a new method for the detection, early warning, mitigation and relief of natural hazards. Given the suddenness and the unpredictability of the location of natural hazards, as well as the practical demands of hazard response work, this article proposes an evaluation model for remote sensing detection of natural hazards based on crowdsourcing. Firstly, using the crowdsourcing model and drawing on the Internet and the power of hundreds of millions of Internet users, the model collects visual interpretations of high-resolution remote sensing images of the hazard area, gathering massive amounts of valuable disaster data; secondly, it adopts a dynamic-voting consistency strategy to evaluate the disaster data provided by the crowdsourcing workers; thirdly, it pre-estimates the disaster severity with a disaster pre-evaluation model based on regional buffers; lastly, it triggers the corresponding expert system according to the forecast results. This model breaks the boundary between geographic information professionals and the public, realizes public participation and citizen science, and improves the accuracy and timeliness of hazard assessment results.

  20. Electrochemical model based charge optimization for lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Pramanik, Sourav; Anwar, Sohel

    2016-05-01

    In this paper, we propose a novel optimal strategy for charging lithium-ion batteries, based on an electrochemical battery model, that is aimed at improved performance. A performance index is defined that minimizes the charging effort along with deviations from the rated maximum thresholds for cell temperature and charging current. The proposed method aims to achieve a faster charging rate while maintaining safe limits for various battery parameters. Safe operation of the battery is achieved by including the battery bulk temperature, which is of critical importance for electric vehicles, as a component in the performance index. Another important aspect of the proposed performance objective is that it allows higher charging rates without compromising the internal electrochemical kinetics of the battery, preventing abusive conditions and thereby improving long-term durability. A more realistic model based on battery electrochemistry has been used for the design of the optimal algorithm, as opposed to conventional equivalent-circuit models. To solve the optimization problem, Pontryagin's principle has been used, which is very effective for constrained optimization problems with both state and input constraints. Simulation results show that the proposed optimal charging algorithm is capable of shortening the charging time of a lithium-ion cell while maintaining the temperature constraint, when compared with standard constant-current charging. The designed method also keeps the internal states within limits that avoid abusive operating conditions.
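
    The flavour of such a performance index can be illustrated with a toy cost that penalises charging effort and bulk-temperature overshoot over a charging profile. The lumped thermal model and every coefficient below are invented for illustration, and no Pontryagin optimisation is performed; the sketch only evaluates the index for two fixed current profiles that deliver the same charge:

```python
def charge_cost(currents, dt=1.0, q_max=3600.0, t_amb=25.0,
                i_ref=2.0, t_max=40.0, w_u=1.0, w_t=10.0):
    """Toy performance index in the spirit of the abstract: penalise
    charging effort (deviation from a rated current) and bulk-temperature
    overshoot, using a crude lumped thermal model (all coefficients are
    hypothetical)."""
    soc, temp, cost = 0.0, t_amb, 0.0
    for i in currents:
        soc = min(1.0, soc + i * dt / q_max)          # coulomb counting
        temp += 0.01 * i * i - 0.02 * (temp - t_amb)  # Joule heating vs. cooling
        cost += w_u * (i - i_ref) ** 2 + w_t * max(0.0, temp - t_max) ** 2
    return soc, cost

gentle = [2.0] * 600                      # rated current throughout
aggressive = [6.0] * 200 + [0.0] * 400    # fast burst, then rest
soc_g, cost_g = charge_cost(gentle)
soc_a, cost_a = charge_cost(aggressive)
print(cost_g < cost_a)  # True: same charge delivered, lower penalty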

  1. Model-Based Systems Engineering in Concurrent Engineering Centers

    NASA Technical Reports Server (NTRS)

    Iwata, Curtis; Infeld, Samantha; Bracken, Jennifer Medlin; McGuire, Melissa; McQuirk, Christina; Kisdi, Aron; Murphy, Jonathan; Cole, Bjorn; Zarifian, Pezhman

    2015-01-01

    Concurrent Engineering Centers (CECs) are specialized facilities with a goal of generating and maturing engineering designs by enabling rapid design iterations. This is accomplished by co-locating a team of experts (either physically or virtually) in a room with a focused design goal and a limited timeline of a week or less. The systems engineer uses a model of the system to capture the relevant interfaces and manage the overall architecture. A single model that integrates other design information and modeling allows the entire team to visualize the concurrent activity and identify conflicts more efficiently, potentially resulting in a systems model that will continue to be used throughout the project lifecycle. Performing systems engineering using such a system model is the definition of model-based systems engineering (MBSE); therefore, CECs evolving their approach to incorporate advances in MBSE are more successful in reducing time and cost needed to meet study goals. This paper surveys space mission CECs that are in the middle of this evolution, and the authors share their experiences in order to promote discussion within the community.

  3. Measuring neuronal branching patterns using model-based approach.

    PubMed

    Luczak, Artur

    2010-01-01

    Neurons have complex branching systems which allow them to communicate with thousands of other neurons. Understanding neuronal geometry is therefore important for determining connectivity within the network and how this shapes neuronal function. One of the difficulties in uncovering relationships between neuronal shape and function is the problem of quantifying complex neuronal geometry. Even using multiple measures such as dendritic length, distribution of segments and direction of branches, a description of three-dimensional neuronal embedding remains incomplete. To help alleviate this problem, here we propose a new measure, the shape diffusiveness index (SDI), to quantify spatial relations between branches at the local and global scale. It has been shown that the growth of neuronal trees can be modeled using a diffusion-limited aggregation (DLA) process. By measuring how easily the analyzed shape can be reproduced by the DLA algorithm, one can measure how "diffusive" that shape is. Intuitively, "diffusiveness" measures how tree-like a given shape is; for example, shapes like an oak tree will have high SDI values. This measure captures an important feature of dendritic tree geometry which is difficult to assess with other measures. The approach also presents a paradigm shift from well-defined deterministic measures to model-based measures, which estimate how well a model with specific properties can account for features of the analyzed shape. PMID:21079752

  4. PARALLELISATION OF THE MODEL-BASED ITERATIVE RECONSTRUCTION ALGORITHM DIRA.

    PubMed

    Örtenberg, A; Magnusson, M; Sandborg, M; Alm Carlsson, G; Malusek, A

    2016-06-01

    New paradigms for parallel programming have been devised to simplify software development on multi-core processors and many-core graphical processing units (GPU). Despite their obvious benefits, the parallelisation of existing computer programs is not an easy task. In this work, the use of the Open Multiprocessing (OpenMP) and Open Computing Language (OpenCL) frameworks is considered for the parallelisation of the model-based iterative reconstruction algorithm DIRA with the aim of significantly shortening the code's execution time. Selected routines were parallelised using OpenMP and OpenCL libraries; some routines were converted from MATLAB to C and optimised. Parallelisation of the code with OpenMP was easy and resulted in an overall speedup of 15 on a 16-core computer. Parallelisation with OpenCL was more difficult owing to differences between the central processing unit and GPU architectures. The resulting speedup was substantially lower than the theoretical peak performance of the GPU; the cause was explained. PMID:26454270

  5. Connectionist model-based stereo vision for telerobotics

    NASA Technical Reports Server (NTRS)

    Hoff, William; Mathis, Donald

    1989-01-01

    Autonomous stereo vision for range measurement could greatly enhance the performance of telerobotic systems. Stereo vision could be a key component for autonomous object recognition and localization, thus enabling the system to perform low-level tasks, and allowing a human operator to perform a supervisory role. The central difficulty in stereo vision is the ambiguity in matching corresponding points in the left and right images. However, if one has a priori knowledge of the characteristics of the objects in the scene, as is often the case in telerobotics, a model-based approach can be taken. Researchers describe how matching ambiguities can be resolved by ensuring that the resulting three-dimensional points are consistent with surface models of the expected objects. A four-layer neural network hierarchy is used in which surface models of increasing complexity are represented in successive layers. These models are represented using a connectionist scheme called parameter networks, in which a parametrized object (for example, a planar patch p = f(h, m_x, m_y)) is represented by a collection of processing units, each of which corresponds to a distinct combination of parameter values. The activity level of each unit in the parameter network can be thought of as representing the confidence with which the hypothesis represented by that unit is believed. Weights in the network are set so as to implement gradient descent in an energy function.
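
    A parameter network can be approximated by a brute-force voting scheme: one "unit" per (h, m_x, m_y) combination, with activity equal to the number of 3-D points consistent with that plane hypothesis. This replaces the paper's gradient descent in an energy function with simple vote counting, so it is only an illustrative sketch with invented data:

```python
from collections import defaultdict
from itertools import product

def plane_votes(points, h_vals, mx_vals, my_vals, tol=0.05):
    """Parameter-network sketch: each unit is one (h, m_x, m_y) combination
    for the planar patch z = h + m_x * x + m_y * y; its 'activity' is the
    number of 3-D points consistent with that hypothesis."""
    votes = defaultdict(int)
    for h, mx, my in product(h_vals, mx_vals, my_vals):
        for x, y, z in points:
            if abs(z - (h + mx * x + my * y)) < tol:
                votes[(h, mx, my)] += 1
    return votes

# Points sampled from the plane z = 1 + 0.5*x (so m_y = 0), as a toy test.
pts = [(x * 0.1, y * 0.1, 1.0 + 0.05 * x) for x in range(5) for y in range(5)]
grid = [0.0, 0.5, 1.0]
votes = plane_votes(pts, h_vals=grid, mx_vals=grid, my_vals=grid)
best = max(votes, key=votes.get)  # the most active unit wins
print(best)  # (1.0, 0.5, 0.0)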

  6. Model-based engineering for laser weapons systems

    NASA Astrophysics Data System (ADS)

    Panthaki, Malcolm; Coy, Steve

    2011-10-01

    The Comet Performance Engineering Workspace is an environment that enables integrated, multidisciplinary modeling and design/simulation process automation. One of the many multi-disciplinary applications of the Comet Workspace is for the integrated Structural, Thermal, Optical Performance (STOP) analysis of complex, multi-disciplinary space systems containing Electro-Optical (EO) sensors such as those which are designed and developed by and for NASA and the Department of Defense. The Comet™ software is currently able to integrate performance simulation data and processes from a wide range of 3-D CAD and analysis software programs including CODE V™ from Optical Research Associates and SigFit™ from Sigmadyne Inc., which are used to simulate the optics performance of EO sensor systems in space-borne applications. Over the past year, Comet Solutions has been working with MZA Associates of Albuquerque, NM, under a contract with the Air Force Research Laboratories. This funded effort is a "risk reduction effort", to help determine whether the combination of Comet and WaveTrain™, a wave optics systems engineering analysis environment developed and maintained by MZA Associates and used by the Air Force Research Laboratory, will result in an effective Model-Based Engineering (MBE) environment for the analysis and design of laser weapons systems. This paper will review the results of this effort and future steps.

  7. PACS model based on digital watermarking and its core algorithms

    NASA Astrophysics Data System (ADS)

    Que, Dashun; Wen, Xianlin; Chen, Bi

    2009-10-01

    A PACS model based on digital watermarking is proposed by analyzing medical image features and PACS requirements from the point of view of information security; its core is a digital watermarking server and the corresponding processing module. Two kinds of digital watermarking algorithms are studied: one is a non-region-of-interest (NROI) digital watermarking algorithm based on the wavelet domain and block means; the other is a reversible watermarking algorithm based on extended difference and a pseudo-random matrix. The former is a robust lossy watermarking scheme, in which embedding in the NROI via the wavelet transform provides a good way to protect the region of interest (ROI) of images, and the block-mean approach enhances the anti-attack capability; the latter is a fragile lossless watermarking scheme, which is simple to implement, can localize tampering effectively, and uses the pseudo-random matrix to strengthen the correlation and security between pixels. Extensive experimental work has been completed, including the realization of the digital watermarking PACS model, the watermarking processing module and its anti-attack experiments, the digital watermarking server, and network transmission simulation experiments with medical images. Theoretical analysis and experimental results show that the designed PACS model can effectively ensure the confidentiality, authenticity, integrity and security of medical image information.

  8. Discrete-Time ARMAv Model-Based Optimal Sensor Placement

    SciTech Connect

    Song Wei; Dyke, Shirley J.

    2008-07-08

    This paper concentrates on the optimal sensor placement problem in ambient vibration based structural health monitoring. More specifically, the paper examines the covariance of estimated parameters during system identification using the auto-regressive and moving average vector (ARMAv) model. By utilizing the discrete-time steady state Kalman filter, this paper realizes the structure's finite element (FE) model under broad-band white noise excitations using an ARMAv model. Based on the asymptotic distribution of the parameter estimates of the ARMAv model, both a theoretical closed form and a numerical estimate form of the covariance of the estimates are obtained. Introducing the information entropy (differential entropy) measure, as well as various matrix norms, this paper attempts to find a reasonable measure of the uncertainties embedded in the ARMAv model estimates. Thus, it is possible to select the optimal sensor placement that would lead to the smallest uncertainties during the ARMAv identification process. Two numerical examples are provided to demonstrate the methodology and compare the sensor placement results under various measures.
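
    The information-entropy criterion can be sketched for Gaussian parameter estimates, whose differential entropy is H = 0.5 ln((2πe)^k det(P)); the best placement is the one with the smallest H. The covariance matrices below are invented stand-ins for ARMAv estimation results:

```python
import math

def gaussian_entropy(cov):
    """Differential entropy of a 2-D Gaussian parameter estimate:
    H = 0.5 * ln((2*pi*e)^k * det(P)); smaller H means less uncertainty."""
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    k = 2
    return 0.5 * math.log((2 * math.pi * math.e) ** k * det)

# Hypothetical parameter covariances from ARMAv identification under
# two candidate sensor placements.
placements = {
    "A": [[0.04, 0.00], [0.00, 0.09]],
    "B": [[0.01, 0.00], [0.00, 0.02]],
}
best = min(placements, key=lambda p: gaussian_entropy(placements[p]))
print(best)  # placement with the smallest estimation uncertainty: B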

  9. Model-Based Reasoning in Upper-division Lab Courses

    NASA Astrophysics Data System (ADS)

    Lewandowski, Heather

    2015-05-01

    Modeling, which includes developing, testing, and refining models, is a central activity in physics. Well-known examples from AMO physics include everything from the Bohr model of the hydrogen atom to the Bose-Hubbard model of interacting bosons in a lattice. Modeling, while typically considered a theoretical activity, is most fully represented in the laboratory where measurements of real phenomena intersect with theoretical models, leading to refinement of models and experimental apparatus. However, experimental physicists use models in complex ways and the process is often not made explicit in physics laboratory courses. We have developed a framework to describe the modeling process in physics laboratory activities. The framework attempts to abstract and simplify the complex modeling process undertaken by expert experimentalists. The framework can be applied to understand typical processes such as the modeling of measurement tools, modeling "black boxes," and signal processing. We demonstrate that the framework captures several important features of model-based reasoning in a way that can reveal common student difficulties in the lab and guide the development of curricula that emphasize modeling in the laboratory. We also use the framework to examine troubleshooting in the lab and guide students to effective methods and strategies.

  10. An Opinion Interactive Model Based on Individual Persuasiveness

    PubMed Central

    Zhou, Xin; Chen, Bin; Liu, Liang; Ma, Liang; Qiu, Xiaogang

    2015-01-01

    In order to study the formation of group opinion in real life, we put forward a new opinion interaction model based on the Deffuant model and its improved variants, because current models of opinion dynamics do not consider individual persuasiveness. Our model has the following advantages: firstly, persuasiveness is added to each individual's attributes, reflecting its importance and making every individual different from the others; secondly, probability is introduced into the interaction to simulate its uncertainty. In Monte Carlo simulation experiments, a sensitivity analysis covering the influence of randomness, the initial persuasiveness distribution, and the number of individuals is performed first; next, the range of the common opinion is predicted from the initial persuasiveness distribution. Simulation results show that when the initial values of the agents are fixed, the common opinion converges to a certain point no matter how many times the experiment is independently replicated, although the number of iterations is not always the same; the range of the common opinion can be predicted when the initial distributions of opinion and persuasiveness are given. As a result, this model can reflect and interpret some phenomena of opinion interaction in a realistic society. PMID:26508911
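
    A minimal simulation of a Deffuant-style interaction with heterogeneous persuasiveness might look as follows. The exact way persuasiveness enters the update rule here (each agent moves toward the other in proportion to the other's persuasiveness) is an assumption for illustration, not the paper's formula, and all parameter values are invented:

```python
import random

def deffuant_step(opinions, persuasiveness, threshold=0.5):
    """One pairwise interaction: if two agents' opinions are close enough,
    each moves toward the other in proportion to the OTHER agent's
    persuasiveness (an assumed way to fold persuasiveness into the
    Deffuant bounded-confidence update)."""
    i, j = random.sample(range(len(opinions)), 2)
    if abs(opinions[i] - opinions[j]) < threshold:
        diff = opinions[j] - opinions[i]
        opinions[i] += persuasiveness[j] * diff
        opinions[j] -= persuasiveness[i] * diff
    return opinions

random.seed(1)
ops = [random.random() for _ in range(50)]            # initial opinions in [0, 1]
pers = [random.uniform(0.1, 0.4) for _ in range(50)]  # fixed persuasiveness
for _ in range(20000):
    ops = deffuant_step(ops, pers)
spread = max(ops) - min(ops)
print(round(spread, 3))  # shrinks as a common opinion forms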

  11. Complement involvement in periodontitis: molecular mechanisms and rational therapeutic approaches

    PubMed Central

    Hajishengallis, George; Maekawa, Tomoki; Abe, Toshiharu; Hajishengallis, Evlambia; Lambris, John D.

    2015-01-01

    The complement system is a network of interacting fluid-phase and cell surface-associated molecules that trigger, amplify, and regulate immune and inflammatory signaling pathways. Dysregulation of this finely balanced network can destabilize host-microbe homeostasis and cause inflammatory tissue damage. Evidence from clinical and animal model-based studies suggests that complement is implicated in the pathogenesis of periodontitis, a polymicrobial community-induced chronic inflammatory disease that destroys the tooth-supporting tissues. This review discusses molecular mechanisms of complement involvement in the dysbiotic transformation of the periodontal microbiome and the resulting destructive inflammation, culminating in loss of periodontal bone support. These mechanistic studies have additionally identified potential therapeutic targets. In this regard, interventional studies in preclinical models have provided proof-of-concept for using complement inhibitors for the treatment of human periodontitis. PMID:26306443

  12. 42 CFR 460.194 - Corrective action.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Corrective action. 460.194 Section 460.194 Public...) Federal/State Monitoring § 460.194 Corrective action. (a) A PACE organization must take action to correct... corrective actions. (c) Failure to correct deficiencies may result in sanctions or termination, as...

  13. Railway faults spreading model based on dynamics of complex network

    NASA Astrophysics Data System (ADS)

    Zhou, Jin; Xu, Weixiang; Guo, Xin; Ma, Xin

    2015-12-01

    In this paper, we propose a railway fault spreading model which improves the SIR model and makes it suitable for analyzing the dynamic process of fault spreading. To apply our model to a real network, the accident causation network of the "7.23" China Yongwen high-speed railway accident is employed. This network is converted into a directed network, which more clearly reflects the causal relationships among the accident factors and supports our study. Simulation results quantitatively show that the influence of failures can be diminished by choosing appropriate initial recovery factors, reducing the time for a failure to be detected, decreasing the transmission rate of faults and increasing the propagation rate of corrected information. The model is useful for simulating railway fault spreading and quantitatively analyzing the influence of failures.
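
    An SIR-style spreading step on a directed causation network can be sketched as below. The toy network, the transmission rate β and the correction rate γ are all invented, and the paper's initial recovery factors and corrected-information dynamics are not modelled:

```python
import random

def spread_step(state, edges, beta, gamma):
    """One synchronous step of SIR-style fault propagation on a directed
    causation network: infected ('I') nodes pass the fault to susceptible
    ('S') successors with probability beta, and are corrected ('R', i.e.
    recover) with probability gamma."""
    new_state = dict(state)
    for u, v in edges:
        if state[u] == "I" and state[v] == "S" and random.random() < beta:
            new_state[v] = "I"
    for n, s in state.items():
        if s == "I" and random.random() < gamma:
            new_state[n] = "R"
    return new_state

random.seed(42)
# Toy directed causation network: node 0 is the root fault factor.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (1, 3), (0, 2)]
state = {n: "S" for n in range(5)}
state[0] = "I"  # initial fault
for _ in range(30):
    state = spread_step(state, edges, beta=0.4, gamma=0.3)
print(sorted(state.values()))  # final mix of susceptible / corrected nodes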

  14. Novel anticancer therapeutics targeting telomerase.

    PubMed

    Ruden, Maria; Puri, Neelu

    2013-08-01

    Telomeres are protective caps at the ends of human chromosomes. Telomeres shorten with each successive cell division in normal human cells whereas, in tumors, they are continuously elongated by human telomerase reverse transcriptase (hTERT). Telomerase is overexpressed in 80-95% of cancers and is present in very low levels or is almost undetectable in normal cells. Because telomerase plays a pivotal role in cancer cell growth it may serve as an ideal target for anticancer therapeutics. Inhibition of telomerase may lead to a decrease of telomere length resulting in cell senescence and apoptosis in telomerase-positive tumors. Several strategies of telomerase inhibition are reviewed, including small molecule inhibitors, antisense oligonucleotides, immunotherapies and gene therapies, targeting the hTERT or the ribonucleoprotein subunit hTER. G-quadruplex stabilizers, tankyrase and HSP90 inhibitors targeting telomere and telomerase assembly, and the T-oligo approach are also covered. Based on this review, the most promising current telomerase targeting therapeutics are the antisense oligonucleotide inhibitor GRN163L and immunotherapies that use dendritic cells (GRVAC1), hTERT peptide (GV1001) or cryptic peptides (Vx-001). Most of these agents have entered phase I and II clinical trials in patients with various tumors, and have shown good response rates as evidenced by a reduction in tumor cell growth, increased overall disease survival, disease stabilization in advanced staged tumors and complete/partial responses. Most therapeutics have been shown to be more effective when used in combination with standard therapies, resulting in concomitant telomere shortening and tumor mass shrinkage, as well as preventing tumor relapse and resistance to single agent therapy. PMID:22841437

  15. [Therapeutic update in cystic fibrosis].

    PubMed

    Durupt, S; Nove Josserand, R; Durieu, I

    2014-06-01

    We present recent therapeutic advances in cystic fibrosis care. These concern improvements in symptomatic treatment, with the development of dry-powder inhaled antibiotics that improve quality of life, and innovative treatments, namely modulators of the cystic fibrosis transmembrane conductance regulator (CFTR), molecules that act specifically on the defective mechanisms underlying the disease. The life expectancy of cystic fibrosis patients born after 2000 is now estimated at about 50 years. This improvement in survival was obtained through the organization of care within specialized cystic fibrosis centers (Centres de ressources et de compétences de la mucoviscidose) and still rests on burdensome symptomatic treatments. Dry-powder inhaled antibiotics save significant time for patients whose daily care can take up to two hours. Since 2012, CFTR modulators, molecules enabling a pharmacological approach targeted to the type of mutation, have allowed a more specific approach to the disease. Ivacaftor (Kalydeco(®)), which potentiates the function of the CFTR protein expressed at the cell surface, is now available for patients with the G551D mutation. Lumacaftor is to be tested in combination with ivacaftor in patients with the F508del mutation, which is present in at least 75% of patients. Ataluren, which allows the production of a functional CFTR protein in patients with a nonsense mutation, is the third representative of this new therapeutic class. Numerous symptomatic treatments are now available for cystic fibrosis care. The development of CFTR modulators, today available to the restricted number of patients treated with ivacaftor, represents a very promising therapeutic avenue. It will probably be the first step toward treatment personalized according to CFTR genotype. PMID:24309546

  16. Milestones in Parkinson's disease therapeutics.

    PubMed

    Rascol, Olivier; Lozano, Andres; Stern, Matthew; Poewe, Werner

    2011-05-01

    In the mid-1980s, the treatment of Parkinson's disease was almost exclusively centered on levodopa-based therapy and focused on dopamine systems and motor symptoms. A few dopamine agonists and a monoamine oxidase B inhibitor (selegiline) were used as adjuncts in advanced Parkinson's disease. In the early 2010s, levodopa remains the gold standard. New insights into the organization of the basal ganglia paved the way for deep brain stimulation, especially of the subthalamic nucleus, providing spectacular improvement of drug-refractory levodopa-induced motor complications. Novel dopamine agonists (pramipexole, ropinirole, rotigotine), catechol-O-methyltransferase inhibitors (entacapone), and monoamine oxidase B inhibitors (rasagiline) have also been developed to provide more continuous oral delivery of dopaminergic stimulation in order to improve motor outcomes. Using dopamine agonists early, before levodopa, proved to delay the onset of dyskinesia, although this is achieved at the price of potentially disabling daytime somnolence or impulse control disorders. The demonstration of an antidyskinetic effect of the glutamate antagonist amantadine opened the door to novel nondopaminergic approaches to Parkinson's disease therapy. More recently, nonmotor symptoms (depression, dementia, and psychosis) have been the focus of the first randomized controlled trials in this field. Despite therapeutic advances, Parkinson's disease continues to be a relentlessly progressive disorder leading to severe disability. Neuroprotective interventions able to modify the progression of Parkinson's disease have stood out as a failed therapeutic goal over the last two decades, despite potentially encouraging results with compounds like rasagiline. Newer molecular targets, new animal models, novel clinical trial designs, and biomarkers to assess disease modification have created hope for future therapeutic interventions. PMID:21626552

  17. Glycan analysis of therapeutic glycoproteins

    PubMed Central

    Zhang, Lei; Luo, Shen; Zhang, Baolin

    2016-01-01

    ABSTRACT Therapeutic monoclonal antibodies (mAbs) are glycoproteins produced by living cell systems. The glycan moieties attached to the proteins can directly affect protein stability, bioactivity, and immunogenicity. Therefore, glycan variants of a glycoprotein product must be adequately analyzed and controlled to ensure product quality. However, the inherent complexity of protein glycosylation poses a daunting analytical challenge. This review provides an update of recent advances in glycan analysis, including the potential utility of lectin-based microarray for high throughput glycan profiling. Emphasis is placed on comparison of the major types of analytics for use in determining unique glycan features such as glycosylation site, glycan structure, and content. PMID:26599345

  18. Therapeutic embolization: enhanced radiolabeled monitoring.

    PubMed

    duCret, R P; Adkins, M C; Hunter, D W; Yedlicka, J W; Engeler, C M; Castaneda-Zuniga, W R; Amplatz, K; Sirr, S A; Boudreau, R J; Kuni, C C

    1990-11-01

    Radiolabeling of Ivalon (polyvinyl alcohol sponge) particles permits localization of injected particles during embolization through the use of a portable gamma camera and provides a means to prevent potentially fatal complications such as pulmonary embolization. A more efficient technique of labeling Ivalon particles with technetium-99m sulfur colloid was developed. An increase in labeling efficiency allowed more accurate determination of the distribution of injected Ivalon particles. Scanning electron microscopy demonstrated the stability of the Ivalon particles during this new labeling process. Two patients with arteriovenous malformations underwent therapeutic embolization with radiolabeled Ivalon particles; gamma camera imaging of the lesion and chest was performed throughout the procedure. PMID:2217800

  19. Approaches for Therapeutic Temperature Management.

    PubMed

    Olson, DaiWai M; Hoffman, Jo

    2016-01-01

    In concert with an evolution toward an increased awareness of the need to tightly manage temperature, the methods used to monitor and manipulate temperature have evolved from mercury-filled glass thermometers, alcohol baths, and ice packs into a high technology-driven multidisciplinary activity. The purpose of this article is to provide a brief overview of the historical development of temperature management and the primary tenets of each of the 3 phases (induction, maintenance, and rewarming), which are now recognized as crucial steps to ensure the safe practice of therapeutic temperature management. PMID:26714116

  20. Progress in immunoconjugate cancer therapeutics.

    PubMed

    Payne, Gillian

    2003-03-01

    Advances in immunoconjugate technology have revitalized the "magic bullet" concept of immunotherapeutics for the treatment of cancer. The growing availability of "human" antibodies, the increased epitope repertoire due to genomics and proteomics efforts, and advances in the means of identification and production of tumor-specific antibodies have greatly increased the potential for cancer therapeutic opportunities. Furthermore, the realization that effector molecule potency must be sufficiently high to be effective at concentrations that might realistically be delivered to the tumor site on an antibody carrier has greatly spurred the fields of medicinal chemistry and radionuclide chelate chemistry to produce such molecules. PMID:12676579

  1. The therapeutic monoclonal antibody market

    PubMed Central

    Ecker, Dawn M; Jones, Susan Dana; Levine, Howard L

    2015-01-01

    Since the commercialization of the first therapeutic monoclonal antibody product in 1986, this class of biopharmaceutical products has grown significantly so that, as of November 10, 2014, forty-seven monoclonal antibody products have been approved in the US or Europe for the treatment of a variety of diseases, and many of these products have also been approved for other global markets. At the current approval rate of ∼ four new products per year, ∼70 monoclonal antibody products will be on the market by 2020, and combined world-wide sales will be nearly $125 billion. PMID:25529996

  2. Fibromyalgia syndrome: novel therapeutic targets.

    PubMed

    Ablin, Jacob N; Häuser, Winfried

    2016-05-01

    Fibromyalgia syndrome (FMS) is a chronic disorder characterized by widespread pain and tenderness, accompanied by disturbed sleep, chronic fatigue and multiple additional functional symptoms. FMS continues to pose an unmet need regarding pharmacological treatment and many patients fail to achieve sufficient relief from existing treatments. As FMS is considered to be a condition in which pain amplification occurs within the CNS, therapeutic interventions, both pharmacological and otherwise, have revolved around attempts to influence pain processing in the CNS. In the current review, we present an update on novel targets in the search for effective treatment of FMS. PMID:27296699

  3. Enactments in Psychoanalysis: Therapeutic Benefits.

    PubMed

    Stern, Stanley

    2016-01-01

    The therapeutic benefits of enactments are addressed. Relevant literature reveals disparate conceptions about the nature and use of enactments. Clarification of the term is discussed. This analyst's theoretical and technical evolution is addressed; it is inextricably related to using enactments. How can it not be? A taxonomy of enactments is presented. The article considers that enactments may be fundamental in the evolution from orthodox to contemporary analytic technique. Assumptions underlying enactments are explored, as are guidelines for using enactments. Finally, the article posits that enactments have widened the scope of analysis and contributed to its vitality. PMID:27200466

  4. Bioengineering Lantibiotics for Therapeutic Success

    PubMed Central

    Field, Des; Cotter, Paul D.; Hill, Colin; Ross, R. P.

    2015-01-01

    Several examples of highly modified antimicrobial peptides have been described. While many such peptides are non-ribosomally synthesized, ribosomally synthesized equivalents are being discovered with increased frequency. Of the latter group, the lantibiotics continue to attract most attention. In the present review, we discuss the implementation of in vivo and in vitro engineering systems to alter, and even enhance, the antimicrobial activity, antibacterial spectrum and physico-chemical properties, including heat stability, solubility, diffusion and protease resistance, of these compounds. Additionally, we discuss the potential applications of these lantibiotics for use as therapeutics. PMID:26640466

  5. Therapeutic Applications of Ionizing Radiations

    NASA Astrophysics Data System (ADS)

    Sánchez-Santos, María Elena

    The aim of radiation therapy is to deliver a precisely measured dose of radiation to a defined tumour volume with minimal damage to the surrounding healthy tissue, resulting in the eradication of the tumour, a higher quality of life with palliation of symptoms of the disease, and the prolongation of survival at a competitive cost. Together with surgery and pharmacology, radiotherapy is presently one of the most important therapeutic weapons against cancer. This chapter provides an overview of the clinical use of radiation, with emphasis on the optimisation of treatment planning and delivery, and a top-level summary of state-of-the-art techniques in radiation therapy.

  6. EXETRA Perspectives: Concepts in Therapeutic Recreation.

    ERIC Educational Resources Information Center

    Neal, Larry L.; Edginton, Christopher R.

    Fifteen papers address issues in therapeutic recreation for disabled persons from the perspectives of practitioners, educators, and students. The following papers are presented. "Therapeutic Recreation Service: The Past and Challenging Present" (H. Sessoms); "Therapeutic Recreation in an Era of Limits: A Crisis...A Challenge... An Opportunity"…

  7. Model-Based Engineering and Manufacturing CAD/CAM Benchmark

    SciTech Connect

    Domm, T.D.; Underwood, R.S.

    1999-04-26

    The Benchmark Project was created from a desire to identify best practices and improve the overall efficiency and performance of the Y-12 Plant's systems and personnel supporting the manufacturing mission. The mission of the benchmark team was to search out industry leaders in manufacturing and evaluate their engineering practices and processes to determine direction and focus for Y-12 modernization efforts. The companies visited included several large established companies and a new, small, high-tech machining firm. As a result of this effort, changes are recommended that will enable Y-12 to become a more responsive, cost-effective manufacturing facility capable of supporting the needs of the Nuclear Weapons Complex (NWC) and Work For Others into the 21st century. The benchmark team identified key areas of interest, both focused and general. The focus areas included Human Resources, Information Management, Manufacturing Software Tools, and Standards/Policies and Practices. Areas of general interest included Infrastructure, Computer Platforms and Networking, and Organizational Structure. The method for obtaining the desired information in these areas centered on the creation of a benchmark questionnaire. The questionnaire was used throughout each of the visits as the basis for information gathering. The results of this benchmark showed that all companies are moving in the direction of model-based engineering and manufacturing. There was evidence that many companies are trying to grasp how to manage current and legacy data. In terms of engineering design software tools, the companies contacted were using both 3-D solid modeling and surfaced wire-frame models. The manufacturing computer tools were varied, with most companies using more than one software product to generate machining data and none currently performing model-based manufacturing (MBM) from a common model. The majority of companies were closer to identifying or using a single computer-aided design (CAD) system than a

  8. Probabilistic model-based approach for heart beat detection.

    PubMed

    Chen, Hugh; Erol, Yusuf; Shen, Eric; Russell, Stuart

    2016-09-01

    Nowadays, hospitals are ubiquitous and integral to modern society. Patients flow in and out of a veritable whirlwind of paperwork, consultations, and potential inpatient admissions, through an abstracted system that is not without flaws. One of the biggest flaws in the medical system is perhaps an unexpected one: the patient alarm system. One longitudinal study reported an 88.8% rate of false alarms, with other studies reporting numbers of similar magnitudes. These false alarm rates lead to deleterious effects that manifest in a lower standard of care across clinics. This paper discusses a model-based probabilistic inference approach to estimate physiological variables at a detection level. We design a generative model that complies with a layman's understanding of human physiology and perform approximate Bayesian inference. One primary goal of this paper is to justify a Bayesian modeling approach to increasing robustness in a physiological domain. In order to evaluate our algorithm we look at the application of heart beat detection using four datasets provided by PhysioNet, a research resource for complex physiological signals, in the form of the PhysioNet 2014 Challenge set-p1 and set-p2, the MIT-BIH Polysomnographic Database, and the MGH/MF Waveform Database. On these data sets our algorithm performs on par with the other top six submissions to the PhysioNet 2014 challenge. The overall evaluation scores in terms of sensitivity and positive predictivity values obtained were as follows: set-p1 (99.72%), set-p2 (93.51%), MIT-BIH (99.66%), and MGH/MF (95.53%). These scores are based on the averaging of gross sensitivity, gross positive predictivity, average sensitivity, and average positive predictivity. PMID:27480267
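    The evaluation described above averages gross and per-record sensitivity and positive predictivity. A sketch of that scoring scheme follows; the greedy beat-matching and the tolerance window are simplifying assumptions, not the PhysioNet challenge's exact matching rules.

```python
def match_beats(ref, det, tol=0.15):
    """Greedily match detected beat times to reference beats within ±tol seconds.
    Returns (true_positives, false_negatives, false_positives)."""
    det = sorted(det)
    used = [False] * len(det)
    tp = 0
    for r in sorted(ref):
        for i, d in enumerate(det):
            if not used[i] and abs(d - r) <= tol:
                used[i] = True
                tp += 1
                break
    return tp, len(ref) - tp, len(det) - tp

def challenge_score(records):
    """records: list of (ref_times, det_times) per recording.
    Averages gross and per-record sensitivity and positive predictivity."""
    tps = fns = fps = 0
    per_se, per_ppv = [], []
    for ref, det in records:
        tp, fn, fp = match_beats(ref, det)
        tps, fns, fps = tps + tp, fns + fn, fps + fp
        per_se.append(tp / max(tp + fn, 1))     # per-record sensitivity
        per_ppv.append(tp / max(tp + fp, 1))    # per-record positive predictivity
    gross_se = tps / max(tps + fns, 1)
    gross_ppv = tps / max(tps + fps, 1)
    avg_se = sum(per_se) / len(per_se)
    avg_ppv = sum(per_ppv) / len(per_ppv)
    return (gross_se + gross_ppv + avg_se + avg_ppv) / 4
```

    Gross statistics pool all beats across records, while the averaged statistics weight each record equally, so the combined score penalizes both occasional gross failures and consistently weak records.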

  9. Medical Device Integration Model Based on the Internet of Things

    PubMed Central

    Hao, Aiyu; Wang, Ling

    2015-01-01

    At present, hospitals in our country have basically established the HIS system, which manages patient registration, treatment, and charges, among many other functions. During treatment, patients need to use medical devices repeatedly to acquire all sorts of inspection data. Currently, the output data of medical devices are often manually entered into the information system, which is error prone and can cause mismatches between inspection reports and patients. In some small hospitals where information-system construction is still relatively weak, the information generated by the devices is still presented in the form of paper reports. When doctors or patients want to access the data at a later time, they can only consult the paper files. Data integration between medical devices has long been a difficult problem for medical information systems, because the data from medical devices lack mandatory unified global standards and the devices themselves are markedly heterogeneous. In order to protect their own interests, manufacturers use proprietary protocols, thus leaving medical devices as "lonely islands" within the hospital information system. Moreover, unfocused use of the data prevents a reasonable distribution of medical resources. With the deepening of IT construction in hospitals, medical information systems are bound to develop towards mobile applications, intelligent analysis, and interconnection and interworking, on the premise that an effective medical device integration (MDI) technology exists. To this end, this paper presents an MDI model based on the Internet of Things (IoT). Through abstract classification, this model is able to extract the common characteristics of the devices, resolve the heterogeneous differences between them, and employ a unified protocol to integrate data between devices. And by means of IoT technology, it realizes an interconnection network of devices and conducts associate matching
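    The abstract-classification idea — extracting common device characteristics and mapping vendor-specific protocols onto one unified record format — might be sketched with an adapter pattern. Every class, field name, and unit conversion below is hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class UnifiedReading:
    """Unified record emitted by every device adapter (hypothetical schema)."""
    patient_id: str
    device_type: str
    metric: str
    value: float
    unit: str

class DeviceAdapter:
    """Common interface: each vendor protocol gets its own adapter subclass."""
    device_type = "generic"
    def parse(self, raw: dict) -> UnifiedReading:
        raise NotImplementedError

class VendorAThermometer(DeviceAdapter):
    device_type = "thermometer"
    def parse(self, raw):
        # Vendor A reports temperature in tenths of a degree Celsius
        return UnifiedReading(raw["pid"], self.device_type,
                              "body_temperature", raw["t10"] / 10.0, "degC")

class VendorBThermometer(DeviceAdapter):
    device_type = "thermometer"
    def parse(self, raw):
        # Vendor B reports degrees Fahrenheit under a different field name
        celsius = (raw["temp_f"] - 32.0) * 5.0 / 9.0
        return UnifiedReading(raw["patient"], self.device_type,
                              "body_temperature", round(celsius, 1), "degC")
```

    Downstream systems then consume only `UnifiedReading`, so adding a new device means writing one adapter rather than touching every consumer — the "resolve heterogeneous differences, employ a unified protocol" step in miniature.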

  10. An integrated model-based neurosurgical guidance system

    NASA Astrophysics Data System (ADS)

    Ji, Songbai; Fan, Xiaoyao; Fontaine, Kathryn; Hartov, Alex; Roberts, David; Paulsen, Keith

    2010-02-01

    Maximal tumor resection without damaging healthy tissue in open cranial surgeries is critical to the prognosis for patients with brain cancers. Preoperative images (e.g., preoperative magnetic resonance images (pMR)) are typically used for surgical planning as well as for intraoperative image-guidance. However, brain shift, even at the start of surgery, significantly compromises the accuracy of neuronavigation if the deformation is not compensated for. Compensating for brain shift during surgery is, therefore, critical for improving the accuracy of image-guidance and ultimately, the accuracy of surgery. To this end, we have developed an integrated neurosurgical guidance system that incorporates intraoperative three-dimensional (3D) tracking, acquisition of volumetric true 3D ultrasound (iUS), stereovision (iSV) and computational modeling to efficiently generate model-updated MR image volumes for neurosurgical guidance. The system is implemented with real-time LabVIEW to provide high efficiency in data acquisition as well as with MATLAB to offer computational convenience in data processing and development of graphical user interfaces related to computational modeling. In a typical patient case, the patient in the operating room (OR) is first registered to the pMR image volume. Sparse displacement data extracted from coregistered intraoperative US and/or stereovision images are employed to guide a computational model that is based on consolidation theory. Computed whole-brain deformation is then used to generate a model-updated MR image volume for subsequent surgical guidance. In this paper, we present the key modular components of our integrated, model-based neurosurgical guidance system.

  11. A Kp forecast model based on neural network

    NASA Astrophysics Data System (ADS)

    Gong, J.; Liu, Y.; Luo, B.; Liu, S.

    2013-12-01

    As an important global geomagnetic disturbance index, Kp is difficult to predict, especially when Kp reaches 5, which means the disturbance has reached the scale of a geomagnetic storm and can cause spacecraft and power system anomalies. Statistical results show a high correlation between solar wind-magnetosphere coupling functions and the Kp index, and a linear combination of two solar wind-magnetosphere coupling terms, a merging term and a viscous term, has proved effective in predicting the Kp index. In this study, using the upstream solar wind parameters measured by the ACE satellite since 1998 and the two derived coupling terms mentioned above, a Kp forecast model based on an artificial neural network is developed. To meet the operational need of predicting geomagnetic disturbances as early as possible, we construct the solar wind inputs and develop the model in an innovative way. For each Kp value at time t (the universal times of the 8 Kp values in each day are noted as t=3, 6, 9, ..., 18, 21, 24), the model gives 6 predicted values every half hour, at t-3.5, t-3.0, t-2.5, t-2.0, t-1.5, and t-1.0, based on half-hour-averaged model inputs (solar wind parameters and the derived solar wind-magnetosphere coupling terms). The last predicted value, at t-1.0, provides the final prediction. Evaluated with the test set data from the years 1998, 2002, and 2006, the model yields a linear correlation coefficient (LC) of 0.88 and a root mean square error (RMSE) of 0.65 between the modeled and observed Kp values. Furthermore, if the nowcast Kp is available and included in the model input, the model improves to an LC of 0.90 and an RMSE of 0.62.
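    The observation that a linear combination of the merging and viscous coupling terms predicts Kp well can be illustrated with an ordinary least-squares fit (the neural network in the paper generalizes this). The data here are synthetic stand-ins for the ACE-derived inputs, and the coefficients are hypothetical.

```python
import numpy as np

def fit_kp_model(merging, viscous, kp):
    """Least-squares fit of Kp ≈ a*merging + b*viscous + c."""
    X = np.column_stack([merging, viscous, np.ones_like(merging)])
    coeffs, *_ = np.linalg.lstsq(X, kp, rcond=None)
    return coeffs

def predict_kp(coeffs, merging, viscous):
    """Apply the fitted linear combination to new coupling-term inputs."""
    a, b, c = coeffs
    return a * merging + b * viscous + c
```

    In practice one would fit on half-hour-averaged coupling terms and score the result with the linear correlation coefficient and RMSE, as the paper does.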

  12. Model-based development of neuroprosthesis for paraplegic patients.

    PubMed Central

    Riener, R

    1999-01-01

    In paraplegic patients with upper motor neuron lesions the signal path from the central nervous system to the muscles is interrupted. Functional electrical stimulation applied to the lower motor neurons can replace the lacking signals. A so-called neuroprosthesis may be used to restore motor function in paraplegic patients on the basis of functional electrical stimulation. However, the control of multiple joints is difficult due to the complexity, nonlinearity, and time-variance of the system involved. Furthermore, effects such as muscle fatigue, spasticity, and limited force in the stimulated muscle further complicate the control task. Mathematical models of the human musculoskeletal system can support the development of neuroprostheses. In this article a detailed overview of the existing work in the literature is given and two examples developed by the author are presented that give an insight into model-based development of neuroprostheses for paraplegic patients. It is shown that modelling the musculoskeletal system can provide better understanding of muscular force production and movement coordination principles. Models can also be used to design and test stimulation patterns and feedback control strategies. Additionally, model components can be implemented in a controller to improve control performance. Eventually, the use of musculoskeletal models for neuroprosthesis design may help to avoid internal disturbances such as fatigue and optimize muscular force output. Furthermore, better controller quality can be obtained than in previous empirical approaches. In addition, the number of experimental tests to be performed with human subjects can be reduced. It is concluded that mathematical models play an increasing role in the development of reliable closed-loop controlled, lower extremity neuroprostheses. PMID:10382222

  13. Model-based damage evaluation of layered CFRP structures

    NASA Astrophysics Data System (ADS)

    Munoz, Rafael; Bochud, Nicolas; Rus, Guillermo; Peralta, Laura; Melchor, Juan; Chiachío, Juan; Chiachío, Manuel; Bond, Leonard J.

    2015-03-01

    An ultrasonic evaluation technique for damage identification of layered CFRP structures is presented. This approach relies on a model-based estimation procedure that combines experimental data and simulation of ultrasonic damage-propagation interactions. The CFRP structure, a [0/90]4s lay-up, has been tested in an immersion through-transmission experiment, where a scan has been performed on a damaged specimen. Most ultrasonic techniques in industrial practice consider only a few features of the received signals, namely, time of flight, amplitude, attenuation, frequency content, and so forth. In this case, once signals are captured, an algorithm is used to reconstruct the complete signal waveform and extract the unknown damage parameters by means of modeling procedures. A linear version of the data processing has been performed, where only Young's modulus has been monitored, and in a second, nonlinear version, the first-order nonlinear coefficient β was incorporated to test the possibility of detecting early damage. The aforementioned physical simulation models are solved by the Transfer Matrix formalism, which has been extended from the linear case to the nonlinear harmonic-generation technique. The damage-parameter search strategy is based on minimizing the mismatch between the captured and simulated signals in the time domain in an automated way using genetic algorithms. By processing all scanned locations, a C-scan of the parameter of each layer can be reconstructed, providing information describing the state of each layer and each interface. Damage can be located and quantified in terms of changes in the selected parameter with a measurable extension. In the case of the first-order nonlinear coefficient, evidence is provided of higher sensitivity to damage than imaging the linearly estimated Young's modulus.
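    The parameter-search strategy — minimizing the time-domain mismatch between captured and simulated signals with a genetic algorithm — can be sketched as below. A toy phase-shift waveform stands in for the Transfer Matrix forward model, and the population size, selection, crossover, and mutation settings are illustrative assumptions.

```python
import math
import random

def forward_model(E, times):
    """Toy stand-in for the Transfer Matrix simulation: the received waveform's
    arrival phase depends on a stiffness parameter E (wave speed ~ sqrt(E))."""
    c = math.sqrt(E)
    return [math.sin(2 * math.pi * (t - 1.0 / c)) for t in times]

def mismatch(E, times, measured):
    """Time-domain sum-of-squares mismatch between simulated and captured signal."""
    sim = forward_model(E, times)
    return sum((s - m) ** 2 for s, m in zip(sim, measured))

def ga_search(times, measured, lo=50.0, hi=150.0, pop=30, gens=40, seed=1):
    """Minimal genetic algorithm over a single damage parameter E."""
    rng = random.Random(seed)
    population = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda E: mismatch(E, times, measured))
        elite = population[: pop // 3]               # selection: keep the best third
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            child = 0.5 * (a + b)                    # crossover: blend two parents
            child += rng.gauss(0.0, (hi - lo) * 0.02)  # mutation
            children.append(min(hi, max(lo, child)))
        population = elite + children
    return min(population, key=lambda E: mismatch(E, times, measured))
```

    Repeating this fit at every scanned location yields the per-location parameter estimates from which a C-scan image of each layer can be assembled.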

  14. Model Based Systems Engineering on the Europa Mission Concept Study

    NASA Technical Reports Server (NTRS)

    Bayer, Todd J.; Chung, Seung; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Chris; Gontijo, I.; Lewis, Kari; Moshir, Mehrdad; Rasmussen, Robert; Wagner, Dave

    2012-01-01

    At the start of 2011, the proposed Jupiter Europa Orbiter (JEO) mission was staffing up in expectation of becoming an official project later in the year for a launch in 2020. A unique aspect of the pre-project work was a strong emphasis and investment on the foundations of Model-Based Systems Engineering (MBSE). As so often happens in this business, plans changed: NASA's budget and science priorities were released and together fundamentally changed the course of JEO. As a result, it returned to being a study task whose objective is to propose more affordable ways to accomplish the science. As part of this transition, the question arose as to whether it could continue to afford the investment in MBSE. In short, the MBSE infusion has survived and is providing clear value to the study effort. By leveraging the existing infrastructure and a modest additional investment, striking advances in the capture and analysis of designs using MBSE were achieved. In the process, the need to remain relevant in the new environment has brought about a wave of innovation and progress. The effort has reaffirmed the importance of architecting. It has successfully harnessed the synergistic relationship of architecting to system modeling. We have found that MBSE can provide greater agility than traditional methods. We have also found that a diverse 'ecosystem' of modeling tools and languages (SysML, Mathematica, even Excel) is not only viable, but an important enabler of agility and adaptability. This paper will describe the successful application of MBSE in the dynamic environment of early mission formulation, the significant results produced and lessons learned in the process.

  15. Research on infrared imaging illumination model based on materials

    NASA Astrophysics Data System (ADS)

    Hu, Hai-he; Feng, Chao-yin; Guo, Chang-geng; Zheng, Hai-jing; Han, Qiang; Hu, Hai-yan

    2013-09-01

    To effectively simulate the infrared features of a scene, including infrared highlight phenomena, infrared imaging illumination models are proposed for different materials, building on visible-light illumination models and the optical properties of each material type in the scene. For smooth materials with specular characteristics, an infrared imaging illumination model based on the Blinn-Phong reflection model is adopted, with a self-emission term introduced. For ordinary materials that behave like near-blackbodies without highlight features, the specular reflection computation is omitted, and the model simply calculates the material's self-emission and its reflection of the surroundings. The radiant energy at zero visual range can be obtained from these two models. OpenGL rendering is used to construct an infrared scene simulation system that also simulates an infrared electro-optical imaging system, producing synthetic infrared images from any viewing angle of the 3D scenes. To validate the infrared imaging illumination models, two typical 3D scenes were built, and their computed infrared images were compared and contrasted with real infrared images collected by a long-wave infrared imaging camera. The experimental results support two major points: firstly, the infrared imaging illumination models are capable of producing infrared images very similar to those received by a thermal infrared camera; secondly, the models can simulate the infrared specular features of the relevant materials and the common infrared features of general materials, which shows the validity of the models. Quantitative analysis shows that the simulated images are similar to the collected images in their main features, but their histogram distributions do not match very well, the
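    The two material-dependent models might be sketched as one radiance function: smooth materials get a Blinn-Phong specular term plus self-emission, while near-blackbody materials keep only self-emission and diffuse reflection of the surroundings. The material dictionary, its fields, and the arbitrary radiance units are hypothetical.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ir_radiance(material, normal, to_source, to_viewer, source_radiance):
    """Per-pixel infrared radiance (arbitrary units, hypothetical material dict).

    Smooth (specular) materials get a Blinn-Phong term plus self-emission;
    near-blackbody materials keep only self-emission and diffuse reflection,
    mirroring the two models described in the abstract."""
    n = normalize(normal)
    l = normalize(to_source)
    v = normalize(to_viewer)
    emission = material["emissivity"] * material["self_radiance"]
    diffuse = material["kd"] * max(dot(n, l), 0.0) * source_radiance
    if material.get("specular"):
        h = normalize(tuple(a + b for a, b in zip(l, v)))   # Blinn half-vector
        spec = material["ks"] * max(dot(n, h), 0.0) ** material["shininess"]
        return emission + diffuse + spec * source_radiance
    return emission + diffuse
```

    Evaluating this per fragment (e.g., in an OpenGL shader) and mapping radiance to grey level yields a synthetic long-wave image; for a blackbody-like material the specular branch never fires, which is exactly the simplification the second model makes.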

  16. Model-based lamotrigine clearance changes during pregnancy: clinical implication

    PubMed Central

    Polepally, Akshanth R; Pennell, Page B; Brundage, Richard C; Stowe, Zachary N; Newport, Donald J; Viguera, Adele C; Ritchie, James C; Birnbaum, Angela K

    2014-01-01

    Objective The objective of the study was to characterize changes in the oral clearance (CL/F) of lamotrigine (LTG) over the course of pregnancy and the postpartum period through a model-based approach incorporating clinical characteristics that may influence CL/F, in support of developing clinical management guidelines. Methods Women receiving LTG therapy who were pregnant or planning pregnancy were enrolled. Maternal blood samples were collected at each visit. A pharmacokinetic analysis was performed using a population-based, nonlinear, mixed-effects model. Results A total of 600 LTG concentrations from 60 women (64 pregnancies) were included. The baseline LTG CL/F was 2.16 L/h with a between-subject variability of 40.6%. The influence of pregnancy on CL/F was described by gestational week. Two subpopulations of women emerged based on the rate of increase in LTG CL/F during pregnancy. The gestational age-associated increase in CL/F displayed a 10-fold higher rate in 77% of the women (0.118 L/h per week) compared to 23% (0.0115 L/h per week). The between-subject variability in these slopes was 43.0%. The increased CL/F at delivery declined to baseline values with a half-life of 0.55 weeks. Interpretation The majority of women had a substantial increase in CL/F from 2.16 to 6.88 L/h by the end of pregnancy, whereas 23% of women had a minimal increase. An increase in CL/F may correspond to decreases in LTG blood concentrations necessitating the need for more frequent dosage adjustments and closer monitoring in some pregnant women with epilepsy. Postpartum doses should be tapered to preconception dose ranges within 3 weeks of delivery. PMID:24883336
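    The reported estimates can be assembled into a simple clearance sketch. The constants are the paper's own (baseline 2.16 L/h, subgroup slopes 0.118 and 0.0115 L/h per gestational week, postpartum half-life 0.55 weeks), but the piecewise linear-then-exponential form is an illustrative assumption, not the published mixed-effects model.

```python
BASELINE = 2.16      # L/h, reported population baseline CL/F
FAST_SLOPE = 0.118   # L/h per gestational week (77% of women)
SLOW_SLOPE = 0.0115  # L/h per gestational week (23% of women)
HALF_LIFE = 0.55     # weeks, postpartum return toward baseline

def clearance(gest_week, weeks_postpartum=0.0, slope=FAST_SLOPE):
    """Lamotrigine CL/F (L/h) during pregnancy and its postpartum decline.

    During pregnancy CL/F rises linearly with gestational week; after
    delivery the excess over baseline decays exponentially with the
    reported half-life. The piecewise form is an illustrative assumption."""
    cl_delivery = BASELINE + slope * gest_week
    if weeks_postpartum <= 0.0:
        return cl_delivery
    excess = cl_delivery - BASELINE
    return BASELINE + excess * 0.5 ** (weeks_postpartum / HALF_LIFE)
```

    At 40 gestational weeks the fast subgroup reaches 2.16 + 0.118 × 40 = 6.88 L/h, matching the reported end-of-pregnancy value; three weeks after delivery the excess has halved more than five times, consistent with tapering to preconception doses within about 3 weeks.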

  17. Medical Device Integration Model Based on the Internet of Things.

    PubMed

    Hao, Aiyu; Wang, Ling

    2015-01-01

    At present, hospitals in our country have basically established the HIS system, which manages patient registration, treatment, and charges, among many other functions. During treatment, patients use medical devices repeatedly to acquire all sorts of inspection data. Currently, the output data of these devices are often entered into the information system manually, which is error-prone and can cause mismatches between inspection reports and patients. In some small hospitals where information-system construction is still relatively weak, the information generated by the devices is still presented in the form of paper reports; when doctors or patients want to access the data again at a later time, they can only consult the paper files. Data integration between medical devices has long been a difficult problem for medical information systems, because the data from medical devices lack mandatory unified global standards and the devices themselves are highly heterogeneous. To protect their own interests, manufacturers use proprietary protocols, leaving medical devices as isolated "islands" within the hospital information system. Moreover, unfocused application of the data prevents a reasonable distribution of medical resources. With the deepening of IT construction in hospitals, medical information systems are bound to develop towards mobile applications, intelligent analysis, and interconnection and interworking, provided that an effective medical device integration (MDI) technology exists. To this end, this paper presents an MDI model based on the Internet of Things (IoT). Through abstract classification, this model extracts the common characteristics of the devices, resolves the heterogeneous differences between them, and employs a unified protocol to integrate data between devices. Through IoT technology, it realizes an interconnected network of devices and conducts associate matching
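
    The abstract-classification idea — one adapter per proprietary device protocol, all mapping into a single unified record format — can be sketched as follows; both vendor formats and all field names are hypothetical, not from the paper:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class UnifiedRecord:
    """Device-neutral observation tied to a patient: the common target format
    that resolves heterogeneity between devices."""
    patient_id: str
    metric: str
    value: float
    unit: str

class DeviceAdapter(ABC):
    """Abstract classification layer: one adapter per proprietary protocol."""
    @abstractmethod
    def to_unified(self, raw: str) -> UnifiedRecord: ...

class CsvMonitorAdapter(DeviceAdapter):
    """Hypothetical vendor emitting 'patient,metric,value,unit' lines."""
    def to_unified(self, raw: str) -> UnifiedRecord:
        pid, metric, value, unit = raw.strip().split(",")
        return UnifiedRecord(pid, metric, float(value), unit)

class KeyValueAdapter(DeviceAdapter):
    """Hypothetical vendor emitting 'pid=..;metric=..;value=..;unit=..'."""
    def to_unified(self, raw: str) -> UnifiedRecord:
        fields = dict(p.split("=") for p in raw.strip().split(";"))
        return UnifiedRecord(fields["pid"], fields["metric"],
                             float(fields["value"]), fields["unit"])
```

    Once both proprietary payloads map to the same `UnifiedRecord`, downstream matching of readings to patients no longer depends on any single vendor's format.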

  18. A turbulent inflow model based on velocity modulation

    NASA Astrophysics Data System (ADS)

    Huyer, Stephen A.; Beal, David

    2007-11-01

    This article presents a novel turbulent inflow model based on modulation of the velocity field for use with time-domain propulsor calculations. Given an experimental mean and rms turbulent inflow, a model can be constructed by modulating the velocity field over a range of frequencies. Assuming the turbulence is homogeneous, the inflow can be constructed as a Fourier series where the frequencies can also be modulated to smooth the broadband output. To demonstrate the effectiveness of the model, experimental inflow velocity data were acquired for an upstream stator, downstream rotor configuration mounted on an undersea vehicle afterbody. Two main sources of turbulence originated from the vorticity shed from the stator wakes and the boundary layer vorticity produced on the hull body. Three-dimensional, unsteady velocity data were acquired using hot-wire anemometry and reduced to provide mean and rms velocity values. Time-series data were processed to provide velocity power spectra used to calibrate the model. Simulations were performed using a modified version of the propulsor unsteady flow code capable of computing fully turbulent inflows. This solver models the propulsor blade as a vortex lattice and sheds the vorticity into the wake to solve the unsteady potential flow. The no-flux boundary conditions are satisfied at the lattice control points and the resulting unsteady circulation is a function of the instantaneous inflow velocity field over the blade. Vorticity is shed into the wake to account for the full time history of the inflow velocity field. To demonstrate the full effectiveness of the model, computed surface pressure data were exported to a code to compute the far-field radiated noise (both tonal and broadband). Simulated data were compared with experimentally obtained noise data with favorable results. Applications of this methodology in the incompressible flow domain include broadband analysis of propulsor-radiated noise on undersea vehicles and
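
    A minimal sketch of the velocity-modulation idea — a Fourier series with random phases, amplitudes scaled to a target rms, and slightly jittered (modulated) frequencies to smooth the broadband output — might look like the following; the frequency set, jitter level, and equal-amplitude assumption are illustrative, not from the article:

```python
import numpy as np

def synth_inflow(u_mean, u_rms, freqs_hz, t, jitter=0.02, seed=0):
    """Construct a synthetic turbulent inflow velocity time series as a
    Fourier series of cosine modes with random phases. Frequencies are
    perturbed slightly (modulated) to smooth the broadband spectrum."""
    rng = np.random.default_rng(seed)
    nf = len(freqs_hz)
    phases = rng.uniform(0.0, 2.0 * np.pi, nf)
    f = np.asarray(freqs_hz) * (1.0 + jitter * rng.standard_normal(nf))
    modes = np.cos(2.0 * np.pi * np.outer(t, f) + phases)  # shape (nt, nf)
    # equal-amplitude modes: N modes of amplitude a give rms ~ a * sqrt(N/2)
    a = u_rms / np.sqrt(nf / 2.0)
    return u_mean + a * modes.sum(axis=1)
```

    Scaling the common amplitude to the measured rms is what ties the synthetic signal back to the hot-wire data; richer calibrations would instead distribute amplitudes to match the measured power spectrum.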

  19. Model-based adhesive shrinkage compensation for increased bonding repeatability

    NASA Astrophysics Data System (ADS)

    Müller, Tobias; Schlette, Christian; Lakshmanan, Shunmuganathan; Haag, Sebastian; Zontar, Daniel; Sauer, Sebastian; Wenzel, Christian; Brecher, Christian; Roßmann, Jürgen

    2016-03-01

    The assembly process of optical components consists of two phases: alignment and bonding. Precision, or rather process repeatability, is limited by the latter. Alignment precision is limited by the measurement equipment and the manipulation technology applied. Today's micromanipulators, in combination with beam-imaging setups, allow alignment in the range of well below 100 nm. However, once precisely aligned, the optics need to be fixed in position. The state of the art in optics bonding for laser systems is adhesive bonding with UV-curing adhesives. Adhesive bonding is a multi-factorial process and thus subject to statistical process deviations. In particular, UV-curing adhesives exhibit shrinkage during their curing process, making offsets for shrinkage compensation mandatory. Enhancing the process control of the adhesive bonding process is the major goal of the activities described in this paper. To improve the precision of shrinkage compensation, a dynamic shrinkage prediction is envisioned by Fraunhofer IPT. Intense research activities are being conducted to gain a deeper understanding of the parameters influencing adhesive shrinkage behavior. These parameters are of different natures: the raw adhesive material itself and its condition, the bonding geometry, environmental parameters such as ambient temperature, and process parameters such as curing properties. Understanding the major parameters and linking them in a model-based shrinkage-prediction environment is the basis for improved process control. Results are being deployed by Fraunhofer in prototyping as well as volume-production solutions for laser systems.
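
    A model-based shrinkage offset of this kind can be sketched minimally as below; the linear model form and both coefficients are purely illustrative assumptions (real values would come from the calibration experiments described), not numbers from the paper:

```python
def predicted_shrinkage_um(bond_gap_um, temp_c, coeffs=(0.03, 0.002)):
    """Hypothetical shrinkage model: a fraction of the bond-line gap plus a
    small term for deviation from a 20 C reference temperature. In a dynamic
    prediction scheme these coefficients would be fitted per adhesive batch."""
    frac, temp_sens = coeffs
    return frac * bond_gap_um + temp_sens * (temp_c - 20.0)

def compensated_target(desired_pos_um, bond_gap_um, temp_c):
    """Offset the alignment target opposite to the predicted cure shrinkage,
    so that the optic lands at the desired position after the adhesive cures."""
    return desired_pos_um + predicted_shrinkage_um(bond_gap_um, temp_c)
```

    The point of a dynamic (rather than fixed) offset is that the prediction is re-evaluated from the current bond geometry and environment before each curing step.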

  20. Novel model-based dosing guidelines for gentamicin and tobramycin in preterm and term neonates

    PubMed Central

    Valitalo, Pyry A. J.; van den Anker, John N.; Allegaert, Karel; de Cock, Roosmarijn F. W.; de Hoog, Matthijs; Simons, Sinno H. P.; Mouton, Johan W.; Knibbe, Catherijne A. J.

    2015-01-01

    Objectives In the heterogeneous group of preterm and term neonates, gentamicin and tobramycin are mainly dosed according to empirical guidelines, after which therapeutic drug monitoring and subsequent dose adaptation are applied. In view of the variety of neonatal guidelines available, the purpose of this study was to evaluate target concentration attainment of these guidelines, and to propose a new model-based dosing guideline for these drugs in neonates. Methods Demographic characteristics of 1854 neonates (birth weight 390–5200 g, post-natal age 0–27 days) were extracted from earlier studies and sampled to obtain a test dataset of 5000 virtual patients. Monte Carlo simulations on the basis of validated models were undertaken to evaluate the attainment of target peak (5–12 mg/L) and trough (<0.5 mg/L) concentrations, and cumulative AUC, with the existing and proposed guidelines. Results Across the entire neonatal age and weight range, the Dutch National Formulary for Children, the British National Formulary for Children, Neofax and the Red Book resulted in adequate peak but elevated trough concentrations (63%–90% above target). The proposed dosing guideline (4.5 mg/kg gentamicin or 5.5 mg/kg tobramycin) with a dosing interval based on birth weight and post-natal age leads to adequate peak concentrations with only 33%–38% of the trough concentrations above target, and a constant AUC across weight and post-natal age. Conclusions The proposed neonatal dosing guideline for gentamicin and tobramycin results in improved attainment of target concentrations and should be prospectively evaluated in clinical studies to evaluate the efficacy and safety of this treatment. PMID:25766737
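
    The Monte Carlo evaluation can be illustrated with a one-compartment IV-bolus sketch; the clearance and volume values, their variabilities, and the 36 h dosing interval below are placeholder assumptions, not the validated neonatal model parameters used in the study:

```python
import numpy as np

def simulate_attainment(n=5000, dose_mg_per_kg=4.5, weight_kg=3.0,
                        interval_h=36.0, seed=0):
    """Monte Carlo target-attainment sketch for an aminoglycoside using a
    one-compartment IV-bolus model with log-normal between-subject
    variability on clearance (CL) and volume of distribution (V)."""
    rng = np.random.default_rng(seed)
    cl = 0.05 * weight_kg * np.exp(0.4 * rng.standard_normal(n))  # L/h
    v = 0.45 * weight_kg * np.exp(0.2 * rng.standard_normal(n))   # L
    dose = dose_mg_per_kg * weight_kg
    c_peak = dose / v                                 # mg/L just after bolus
    c_trough = c_peak * np.exp(-(cl / v) * interval_h)
    peak_in_target = np.mean((c_peak >= 5.0) & (c_peak <= 12.0))
    trough_above_target = np.mean(c_trough >= 0.5)
    return peak_in_target, trough_above_target
```

    Reporting the fraction of virtual patients with peaks in 5-12 mg/L and troughs below 0.5 mg/L is the same attainment metric the study uses to compare the existing formulary guidelines against the proposed one.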