Sample records for source coding dsc

  1. D-DSC: Decoding Delay-based Distributed Source Coding for Internet of Sensing Things

    PubMed Central

    Akan, Ozgur B.

    2018-01-01

    Spatial correlation between densely deployed sensor nodes in a wireless sensor network (WSN) can be exploited to reduce the power consumption through a proper source coding mechanism such as distributed source coding (DSC). In this paper, we propose the Decoding Delay-based Distributed Source Coding (D-DSC) to improve the energy efficiency of the classical DSC by employing the decoding delay concept which enables the use of the maximum correlated portion of sensor samples during the event estimation. In D-DSC, network is partitioned into clusters, where the clusterheads communicate their uncompressed samples carrying the side information, and the cluster members send their compressed samples. Sink performs joint decoding of the compressed and uncompressed samples and then reconstructs the event signal using the decoded sensor readings. Based on the observed degree of the correlation among sensor samples, the sink dynamically updates and broadcasts the varying compression rates back to the sensor nodes. Simulation results for the performance evaluation reveal that D-DSC can achieve reliable and energy-efficient event communication and estimation for practical signal detection/estimation applications having massive number of sensors towards the realization of Internet of Sensing Things (IoST). PMID:29538405

  2. D-DSC: Decoding Delay-based Distributed Source Coding for Internet of Sensing Things.

    PubMed

    Aktas, Metin; Kuscu, Murat; Dinc, Ergin; Akan, Ozgur B

    2018-01-01

    Spatial correlation between densely deployed sensor nodes in a wireless sensor network (WSN) can be exploited to reduce the power consumption through a proper source coding mechanism such as distributed source coding (DSC). In this paper, we propose the Decoding Delay-based Distributed Source Coding (D-DSC) to improve the energy efficiency of the classical DSC by employing the decoding delay concept which enables the use of the maximum correlated portion of sensor samples during the event estimation. In D-DSC, network is partitioned into clusters, where the clusterheads communicate their uncompressed samples carrying the side information, and the cluster members send their compressed samples. Sink performs joint decoding of the compressed and uncompressed samples and then reconstructs the event signal using the decoded sensor readings. Based on the observed degree of the correlation among sensor samples, the sink dynamically updates and broadcasts the varying compression rates back to the sensor nodes. Simulation results for the performance evaluation reveal that D-DSC can achieve reliable and energy-efficient event communication and estimation for practical signal detection/estimation applications having massive number of sensors towards the realization of Internet of Sensing Things (IoST).

  3. Multispectral Image Compression Based on DSC Combined with CCSDS-IDC

    PubMed Central

    Li, Jin; Xing, Fei; Sun, Ting; You, Zheng

    2014-01-01

    Remote sensing multispectral image compression encoder requires low complexity, high robust, and high performance because it usually works on the satellite where the resources, such as power, memory, and processing capacity, are limited. For multispectral images, the compression algorithms based on 3D transform (like 3D DWT, 3D DCT) are too complex to be implemented in space mission. In this paper, we proposed a compression algorithm based on distributed source coding (DSC) combined with image data compression (IDC) approach recommended by CCSDS for multispectral images, which has low complexity, high robust, and high performance. First, each band is sparsely represented by DWT to obtain wavelet coefficients. Then, the wavelet coefficients are encoded by bit plane encoder (BPE). Finally, the BPE is merged to the DSC strategy of Slepian-Wolf (SW) based on QC-LDPC by deep coupling way to remove the residual redundancy between the adjacent bands. A series of multispectral images is used to test our algorithm. Experimental results show that the proposed DSC combined with the CCSDS-IDC (DSC-CCSDS)-based algorithm has better compression performance than the traditional compression approaches. PMID:25110741

  4. Multispectral image compression based on DSC combined with CCSDS-IDC.

    PubMed

    Li, Jin; Xing, Fei; Sun, Ting; You, Zheng

    2014-01-01

    Remote sensing multispectral image compression encoder requires low complexity, high robust, and high performance because it usually works on the satellite where the resources, such as power, memory, and processing capacity, are limited. For multispectral images, the compression algorithms based on 3D transform (like 3D DWT, 3D DCT) are too complex to be implemented in space mission. In this paper, we proposed a compression algorithm based on distributed source coding (DSC) combined with image data compression (IDC) approach recommended by CCSDS for multispectral images, which has low complexity, high robust, and high performance. First, each band is sparsely represented by DWT to obtain wavelet coefficients. Then, the wavelet coefficients are encoded by bit plane encoder (BPE). Finally, the BPE is merged to the DSC strategy of Slepian-Wolf (SW) based on QC-LDPC by deep coupling way to remove the residual redundancy between the adjacent bands. A series of multispectral images is used to test our algorithm. Experimental results show that the proposed DSC combined with the CCSDS-IDC (DSC-CCSDS)-based algorithm has better compression performance than the traditional compression approaches.

  5. Development and validation of an open source quantification tool for DSC-MRI studies.

    PubMed

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms to allow external developers to implement their own quantification methods easily and without the need of paying for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that the addition of new methods can be done without breaking any of the existing functionalities. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold-standard was obtained (R(2)>0.8 and values are within 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated using a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. The Multidimensional Influence of Acculturation on Digit Symbol-Coding and Wisconsin Card Sorting Test in Hispanics.

    PubMed

    Krch, Denise; Lequerica, Anthony; Arango-Lasprilla, Juan Carlos; Rogers, Heather L; DeLuca, John; Chiaravalloti, Nancy D

    2015-01-01

    The purpose of the current study was to evaluate the relative contribution of acculturation to two tests of nonverbal test performance in Hispanics. This study compared 40 Hispanic and 20 non-Hispanic whites on Digit Symbol-Coding (DSC) and the Wisconsin Card Sorting Test (WCST) and evaluated the relative contribution of the various acculturation components to cognitive test performance in the Hispanic group. Hispanics performed significantly worse on DSC and WCST relative to non-Hispanic whites. Multiple regressions conducted within the Hispanic group revealed that language use uniquely accounted for 11.0% of the variance on the DSC, 18.8% of the variance on WCST categories completed, and 13.0% of the variance in perseverative errors on the WCST. Additionally, years of education in the United States uniquely accounted for 14.9% of the variance in DSC. The significant impact of acculturation on DSC and WCST lends support that nonverbal cognitive tests are not necessarily culture free. The differential contribution of acculturation proxies highlights the importance of considering these separate components when interpreting performance on neuropsychological tests in clinical and research settings. Factors, such as the country where education was received, may in fact be more meaningful information than the years of education of education attained. Thus, acculturation should be considered an important factor in any cognitive evaluation of culturally diverse individuals.

  7. Dye-sensitized solar cells for efficient power generation under ambient lighting

    NASA Astrophysics Data System (ADS)

    Freitag, Marina; Teuscher, Joël; Saygili, Yasemin; Zhang, Xiaoyu; Giordano, Fabrizio; Liska, Paul; Hua, Jianli; Zakeeruddin, Shaik M.; Moser, Jacques-E.; Grätzel, Michael; Hagfeldt, Anders

    2017-06-01

    Solar cells that operate efficiently under indoor lighting are of great practical interest as they can serve as electric power sources for portable electronics and devices for wireless sensor networks or the Internet of Things. Here, we demonstrate a dye-sensitized solar cell (DSC) that achieves very high power-conversion efficiencies (PCEs) under ambient light conditions. Our photosystem combines two judiciously designed sensitizers, coded D35 and XY1, with the copper complex Cu(II/I)(tmby) as a redox shuttle (tmby, 4,4‧,6,6‧-tetramethyl-2,2‧-bipyridine), and features a high open-circuit photovoltage of 1.1 V. The DSC achieves an external quantum efficiency for photocurrent generation that exceeds 90% across the whole visible domain from 400 to 650 nm, and achieves power outputs of 15.6 and 88.5 μW cm-2 at 200 and 1,000 lux, respectively, under illumination from a model Osram 930 warm-white fluorescent light tube. This translates into a PCE of 28.9%.

  8. Separation of Evans and Hiro currents in VDE of tokamak plasma

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Svidzinski, V. A.; Zakharov, L. E.

    2014-10-01

    Progress on the Disruption Simulation Code (DSC-3D) development and benchmarking will be presented. The DSC-3D is one-fluid nonlinear time-dependent MHD code, which utilizes fully 3D toroidal geometry for the first wall, pure vacuum and plasma itself, with adaptation to the moving plasma boundary and accurate resolution of the plasma surface current. Suppression of fast magnetosonic scale by the plasma inertia neglecting will be demonstrated. Due to code adaptive nature, self-consistent plasma surface current modeling during non-linear dynamics of the Vertical Displacement Event (VDE) is accurately provided. Separation of the plasma surface current on Evans and Hiro currents during simulation of fully developed VDE, then the plasma touches in-vessel tiles, will be discussed. Work is supported by the US DOE SBIR Grant # DE-SC0004487.

  9. Progress on the DPASS project

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Bogatu, I. N.; Svidzinski, V. A.

    2015-11-01

    A novel project to develop Disruption Prediction And Simulation Suite (DPASS) of comprehensive computational tools to predict, model, and analyze disruption events in tokamaks has been recently started at FAR-TECH Inc. DPASS will eventually address the following aspects of the disruption problem: MHD, plasma edge dynamics, plasma-wall interaction, generation and losses of runaway electrons. DPASS uses the 3-D Disruption Simulation Code (DSC-3D) as a core tool and will have a modular structure. DSC is a one fluid non-linear, time-dependent 3D MHD code to simulate dynamics of tokamak plasma surrounded by pure vacuum B-field in the real geometry of a conducting tokamak vessel. DSC utilizes the adaptive meshless technique with adaptation to the moving plasma boundary, with accurate magnetic flux conservation and resolution of the plasma surface current. DSC has also an option to neglect the plasma inertia to eliminate fast magnetosonic scale. This option can be turned on/off as needed. During Phase I of the project, two modules will be developed: the computational module for modeling the massive gas injection and main plasma respond; and the module for nanoparticle plasma jet injection as an innovative disruption mitigation scheme. We will report on this development progress. Work is supported by the US DOE SBIR grant # DE-SC0013727.

  10. Wall-touching kink mode calculations with the M3D code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Breslau, J. A., E-mail: jbreslau@pppl.gov; Bhattacharjee, A.

    This paper seeks to address a controversy regarding the applicability of the 3D nonlinear extended MHD code M3D [W. Park et al., Phys. Plasmas 6, 1796 (1999)] and similar codes to calculations of the electromagnetic interaction of a disrupting tokamak plasma with the surrounding vessel structures. M3D is applied to a simple test problem involving an external kink mode in an ideal cylindrical plasma, used also by the Disruption Simulation Code (DSC) as a model case for illustrating the nature of transient vessel currents during a major disruption. While comparison of the results with those of the DSC is complicatedmore » by effects arising from the higher dimensionality and complexity of M3D, we verify that M3D is capable of reproducing both the correct saturation behavior of the free boundary kink and the “Hiro” currents arising when the kink interacts with a conducting tile surface interior to the ideal wall.« less

  11. 76 FR 44977 - Shipping Coordinating Committee; Notice of Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-27

    ... packing of cargo transport units. --Consideration for the efficacy of Container Inspection Programme... Dangerous Goods, Solid Cargoes and Containers (DSC 16) to be held at IMO Headquarters, London, United... Solid Bulk Cargoes Code (IMSBC Code) including evaluation of properties of solid bulk cargos. --Casualty...

  12. Functional characterization of the Dsc E3 ligase complex in the citrus postharvest pathogen Penicillium digitatum.

    PubMed

    Ruan, Ruoxin; Chung, Kuang-Ren; Li, Hongye

    2017-12-01

    Sterol regulatory element binding proteins (SREBPs) are required for sterol homeostasis in eukaryotes. Activation of SREBPs is regulated by the Dsc E3 ligase complex in Schizosaccharomyces pombe and Aspergillus spp. Previous studies indicated that an SREBP-coding gene PdsreA is required for fungicide resistance and ergosterol biosynthesis in the citrus postharvest pathogen Penicillium digitatum. In this study, five genes, designated PddscA, PddscB, PddscC, PddscD, and PddscE encoding the Dsc E3 ligase complex were characterized to be required for fungicide resistance, ergosterol biosynthesis and CoCl 2 tolerance in P. digitatum. Each of the dsc genes was inactivated by target gene disruption and the resulted phenotypes were analyzed and compared. Genetic analysis reveals that, of five Dsc complex components, PddscB is the core subunit gene in P. digitatum. Although the resultant dsc mutants were able to infect citrus fruit and induce maceration lesions as the wild-type, the mutants rarely produced aerial mycelia on affected citrus fruit peels. P. digitatum Dsc proteins regulated not only the expression of genes involved in ergosterol biosynthesis but also that of PdsreA. Yeast two-hybrid assays revealed a direct interaction between the PdSreA protein and the Dsc proteins. Ectopic expression of the PdSreA N-terminus restored fungicide resistance in the dsc mutants. Our results provide important evidence to understand the mechanisms underlying SREBP activation and regulation of ergosterol biosynthesis in plant pathogenic fungi. Copyright © 2017 Elsevier GmbH. All rights reserved.

  13. The Optimal Employment of a Deep Seaweb Acoustic Network for Submarine Communications at Speed and Depth Using a Defender-Attacker-Defender Model

    DTIC Science & Technology

    2013-09-01

    Figure 17. Reliable acoustic paths from a deep source to shallow receivers (From Urick 1983... Urick 1983). ..................................................................28 Figure 19. Computer generated ray diagram of the DSC for a source...near the axis. Reflected rays are omitted (From Urick 1983). .........................................29 Figure 20. Worldwide DSC axis depths in

  14. Quantitative measurement of indomethacin crystallinity in indomethacin-silica gel binary system using differential scanning calorimetry and X-ray powder diffractometry.

    PubMed

    Pan, Xiaohong; Julian, Thomas; Augsburger, Larry

    2006-02-10

    Differential scanning calorimetry (DSC) and X-ray powder diffractometry (XRPD) methods were developed for the quantitative analysis of the crystallinity of indomethacin (IMC) in IMC and silica gel (SG) binary system. The DSC calibration curve exhibited better linearity than that of XRPD. No phase transformation occurred in the IMC-SG mixtures during DSC measurement. The major sources of error in DSC measurements were inhomogeneous mixing and sampling. Analyzing the amount of IMC in the mixtures using high-performance liquid chromatography (HPLC) could reduce the sampling error. DSC demonstrated greater sensitivity and had less variation in measurement than XRPD in quantifying crystalline IMC in the IMC-SG binary system.

  15. Probing the heat sources during thermal runaway process by thermal analysis of different battery chemistries

    NASA Astrophysics Data System (ADS)

    Zheng, Siqi; Wang, Li; Feng, Xuning; He, Xiangming

    2018-02-01

    Safety issue is very important for the lithium ion battery used in electric vehicle or other applications. This paper probes the heat sources in the thermal runaway processes of lithium ion batteries composed of different chemistries using accelerating rate calorimetry (ARC) and differential scanning calorimetry (DSC). The adiabatic thermal runaway features for the 4 types of commercial lithium ion batteries are tested using ARC, whereas the reaction characteristics of the component materials, including the cathode, the anode and the separator, inside the 4 types of batteries are measured using DSC. The peaks and valleys of the critical component reactions measured by DSC can match the fluctuations in the temperature rise rate measured by ARC, therefore the relevance between the DSC curves and the ARC curves is utilized to probe the heat source in the thermal runaway process and reveal the thermal runaway mechanisms. The results and analysis indicate that internal short circuit is not the only way to thermal runaway, but can lead to extra electrical heat, which is comparable with the heat released by chemical reactions. The analytical approach of the thermal runaway mechanisms in this paper can guide the safety design of commercial lithium ion batteries.

  16. Forensic characterization of HDPE pipes by DSC.

    PubMed

    Sajwan, Madhuri; Aggarwal, Saroj; Singh, R B

    2008-03-05

    The melting behavior of 28 high density polyethylene (HDPE) pipe samples manufactured and supplied by 13 different manufacturers in India was examined by 'differential scanning calorimetry (DSC)' to find out if this parameter could be used in differentiating between these HDPE pipe samples which are chemically the same and being manufactured by different manufacturer. The results indicate that the melting temperature may serve as the useful criteria for differentiating HDPE (i) pipe samples from different sources and (ii) samples of different diameter from the same source.

  17. Designing single- and multiple-shell sampling schemes for diffusion MRI using spherical code.

    PubMed

    Cheng, Jian; Shen, Dinggang; Yap, Pew-Thian

    2014-01-01

    In diffusion MRI (dMRI), determining an appropriate sampling scheme is crucial for acquiring the maximal amount of information for data reconstruction and analysis using the minimal amount of time. For single-shell acquisition, uniform sampling without directional preference is usually favored. To achieve this, a commonly used approach is the Electrostatic Energy Minimization (EEM) method introduced in dMRI by Jones et al. However, the electrostatic energy formulation in EEM is not directly related to the goal of optimal sampling-scheme design, i.e., achieving large angular separation between sampling points. A mathematically more natural approach is to consider the Spherical Code (SC) formulation, which aims to achieve uniform sampling by maximizing the minimal angular difference between sampling points on the unit sphere. Although SC is well studied in the mathematical literature, its current formulation is limited to a single shell and is not applicable to multiple shells. Moreover, SC, or more precisely continuous SC (CSC), currently can only be applied on the continuous unit sphere and hence cannot be used in situations where one or several subsets of sampling points need to be determined from an existing sampling scheme. In this case, discrete SC (DSC) is required. In this paper, we propose novel DSC and CSC methods for designing uniform single-/multi-shell sampling schemes. The DSC and CSC formulations are solved respectively by Mixed Integer Linear Programming (MILP) and a gradient descent approach. A fast greedy incremental solution is also provided for both DSC and CSC. To our knowledge, this is the first work to use SC formulation for designing sampling schemes in dMRI. Experimental results indicate that our methods obtain larger angular separation and better rotational invariance than the generalized EEM (gEEM) method currently used in the Human Connectome Project (HCP).

  18. 77 FR 47491 - Shipping Coordinating Committee; Notice of Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-08

    ... Session of the International Maritime Organization's (IMO) Sub-Committee on Dangerous Goods, Solid Cargoes and Containers (DSC 17) to be held at the IMO Headquarters, United Kingdom, September 17-21. The... to the International Maritime Solid Bulk Cargoes (IMSBC) Code and supplements --Amendments to SOLAS...

  19. Identification and quantitation of semi-crystalline microplastics using image analysis and differential scanning calorimetry.

    PubMed

    Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura

    2018-06-01

    There are several techniques used to analyze microplastics. These are often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers with environmental concern: low and high density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) dependence of particle size on DSC signal, and (iv) quantitation of microplastics mass based on DSC signal. We describe the potential and limitations of these techniques to increase reliability for microplastic analysis. Particle size demonstrated to have particular incidence in the qualitative and quantitative performance of DSC signals. Both, identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) showed to be affected by particle size. As a result, a proper sample treatment which includes sieving of suspended particles is particularly required for this analytical approach.

  20. 75 FR 66828 - Eleventh Meeting: RTCA Special Committee 214: Working Group 78: Standards for Air Traffic Data...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    .... Now TORS(OCL/DSC, update ED 154/Doxxx, * * *). Outcome FRAC/consultation DO306/ED 122 and Publication... TORs and Work Plan. Review of Position Papers and Contributions. 13:30-17:00: Plenary Session.... Robert L. Bostiga, RTCA Advisory Committee. [FR Doc. 2010-27260 Filed 10-28-10; 8:45 am] BILLING CODE...

  1. Joint reconstruction of dynamic PET activity and kinetic parametric images using total variation constrained dictionary sparse coding

    NASA Astrophysics Data System (ADS)

    Yu, Haiqing; Chen, Shuhang; Chen, Yunmei; Liu, Huafeng

    2017-05-01

    Dynamic positron emission tomography (PET) is capable of providing both spatial and temporal information of radio tracers in vivo. In this paper, we present a novel joint estimation framework to reconstruct temporal sequences of dynamic PET images and the coefficients characterizing the system impulse response function, from which the associated parametric images of the system macro parameters for tracer kinetics can be estimated. The proposed algorithm, which combines statistical data measurement and tracer kinetic models, integrates a dictionary sparse coding (DSC) into a total variational minimization based algorithm for simultaneous reconstruction of the activity distribution and parametric map from measured emission sinograms. DSC, based on the compartmental theory, provides biologically meaningful regularization, and total variation regularization is incorporated to provide edge-preserving guidance. We rely on techniques from minimization algorithms (the alternating direction method of multipliers) to first generate the estimated activity distributions with sub-optimal kinetic parameter estimates, and then recover the parametric maps given these activity estimates. These coupled iterative steps are repeated as necessary until convergence. Experiments with synthetic, Monte Carlo generated data, and real patient data have been conducted, and the results are very promising.

  2. Characterization of melt-quenched and milled amorphous solids of gatifloxacin.

    PubMed

    Hattori, Yusuke; Suzuki, Ayumi; Otsuka, Makoto

    2016-11-01

    The objectives of this study were to characterize and investigate the differences in amorphous states of gatifloxacin. We prepared two types of gatifloxacin amorphous solids coded as M and MQ using milling and melt-quenching methods, respectively. The amorphous solids were characterized via X-ray diffraction (XRD), nonisothermal differential scanning calorimetry (DSC) and time-resolved near-infrared (NIR) spectroscopy. Both the solids displayed halo XRD patterns, the characteristic of amorphous solids; however, in the non-isothermal DSC profiles, these amorphous solids were distinguished by their crystallization and melting temperatures. The Kissinger-Akahira-Sunose plots of non-isothermal crystallization temperatures at various heating rates indicated a lower activation energy of crystallization for the amorphous solid M than that of MQ. These results support the differentiation between two amorphous states with different physical and chemical properties.

  3. Dynamic susceptibility contrast-enhanced perfusion MR imaging at 1.5 T predicts final infarct size in a rat stroke model.

    PubMed

    Chen, Feng; Suzuki, Yasuhiro; Nagai, Nobuo; Peeters, Ronald; Marchal, Guy; Ni, Yicheng

    2005-01-30

    The purpose of the present animal experiment was to determine whether source images from dynamic susceptibility contrast-enhanced perfusion weighted imaging (DSC-PWI) at a 1.5T MR scanner, performed early after photochemically induced thrombosis (PIT) of cerebral middle artery (MCA), is feasible to predict final cerebral infarct size in a rat stroke model. Fifteen rats were subjected to PIT of proximal MCA. T2 weighted imaging (T2WI), diffusion-weighted imaging (DWI), and contrast-enhanced PWI were obtained at 1 h and 24 h after MCA occlusion. The relative lesion size (RLS) was defined as lesion volume/brain volume x 100% and measured for MR images, and compared with the final RLS on the gold standard triphenyl tetrazolium chloride (TTC) staining at 24 h. One hour after MCA occlusion, the RLS with DSC-PWI was 24.9 +/- 6.3%, which was significantly larger than 17.6 +/- 4.8% with DWI (P < 0.01). At 24 h, the final RLS on TTC was 24.3 +/- 4.8%, which was comparable to 25.1 +/- 3.5%, 24.6 +/- 3.6% and 27.9 +/- 6.8% with T2WI, DWI and DSC-PWI respectively (P > 0.05). The fact that at 1 h after MCA occlusion only the displayed perfusion deficit was similar to the final infarct size on TTC (P > 0.05) suggests that early source images from DSC-PWI at 1.5T MR scanner is feasible to noninvasively predict the final infarct size in rat models of stroke.

  4. Differential Scanning Calorimetry Techniques: Applications in Biology and Nanoscience

    PubMed Central

    Gill, Pooria; Moghadam, Tahereh Tohidi; Ranjbar, Bijan

    2010-01-01

    This paper reviews the best-known differential scanning calorimetries (DSCs), such as conventional DSC, microelectromechanical systems-DSC, infrared-heated DSC, modulated-temperature DSC, gas flow-modulated DSC, parallel-nano DSC, pressure perturbation calorimetry, self-reference DSC, and high-performance DSC. Also, we describe here the most extensive applications of DSC in biology and nanoscience. PMID:21119929

  5. Coupling distributed stormwater collection and managed aquifer recharge: Field application and implications.

    PubMed

    Beganskas, S; Fisher, A T

    2017-09-15

    Groundwater is increasingly important for satisfying California's growing fresh water demand. Strategies like managed aquifer recharge (MAR) can improve groundwater supplies, mitigating the negative consequences of persistent groundwater overdraft. Distributed stormwater collection (DSC)-MAR projects collect and infiltrate excess hillslope runoff before it reaches a stream, focusing on 40-400 ha drainage areas (100-1000 ac). We present results from six years of DSC-MAR operation-including high resolution analyses of precipitation, runoff generation, infiltration, and sediment transport-and discuss their implications for regional resource management. This project generated significant water supply benefit over six years, including an extended regional drought, collecting and infiltrating 5.3 × 10 5  m 3 (426 ac-ft). Runoff generation was highly sensitive to sub-daily storm frequency, duration, and intensity, and a single intense storm often accounted for a large fraction of annual runoff. Observed infiltration rates varied widely in space and time. The basin-average infiltration rate during storms was 1-3 m/d, with point-specific rates up to 8 m/d. Despite efforts to limit sediment load, 8.2 × 10 5  kg of fine-grained sediment accumulated in the infiltration basin over three years, likely reducing soil infiltration capacity. Periodic removal of accumulated material, better source control, and/or improved sediment detention could mitigate this effect in the future. Regional soil analyses can maximize DSC-MAR benefits by identifying high-infiltration capacity features and characterizing upland sediment sources. A regional network of DSC-MAR projects could increase groundwater supplies while contributing to improved groundwater quality, flood mitigation, and stakeholder engagement. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Comparison of the LLNL ALE3D and AKTS Thermal Safety Computer Codes for Calculating Times to Explosion in ODTX and STEX Thermal Cookoff Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wemhoff, A P; Burnham, A K

    2006-04-05

    Cross-comparison of the results of two computer codes for the same problem provides a mutual validation of their computational methods. This cross-validation exercise was performed for LLNL's ALE3D code and AKTS's Thermal Safety code, using the thermal ignition of HMX in two standard LLNL cookoff experiments: the One-Dimensional Time to Explosion (ODTX) test and the Scaled Thermal Explosion (STEX) test. The chemical kinetics model used in both codes was the extended Prout-Tompkins model, a relatively new addition to ALE3D. This model was applied using ALE3D's new pseudospecies feature. In addition, an advanced isoconversional kinetic approach was used in the AKTSmore » code. The mathematical constants in the Prout-Tompkins code were calibrated using DSC data from hermetically sealed vessels and the LLNL optimization code Kinetics05. The isoconversional kinetic parameters were optimized using the AKTS Thermokinetics code. We found that the Prout-Tompkins model calculations agree fairly well between the two codes, and the isoconversional kinetic model gives very similar results as the Prout-Tompkins model. We also found that an autocatalytic approach in the beta-delta phase transition model does affect the times to explosion for some conditions, especially STEX-like simulations at ramp rates above 100 C/hr, and further exploration of that effect is warranted.« less

  7. Remote Evaluation of Rotational Velocity Using a Quadrant Photo-Detector and a DSC Algorithm

    PubMed Central

    Zeng, Xiangkai; Zhu, Zhixiong; Chen, Yang

    2016-01-01

    This paper presents an approach to remotely evaluate the rotational velocity of a measured object by using a quadrant photo-detector and a differential subtraction correlation (DSC) algorithm. The rotational velocity of a rotating object is determined by two temporal-delay numbers at the minima of two DSCs that are derived from the four output signals of the quadrant photo-detector, and the sign of the calculated rotational velocity directly represents the rotational direction. The DSC algorithm does not require any multiplication operations. Experimental calculations were performed to confirm the proposed evaluation method. The calculated rotational velocity, including its amplitude and direction, showed good agreement with the given one, which had an amplitude error of ~0.3%, and had over 1100 times the efficiency of the traditional cross-correlation method in the case of data number N > 4800. The confirmations have shown that the remote evaluation of rotational velocity can be done without any circular division disk, and that it has much fewer error sources, making it simple, accurate and effective for remotely evaluating rotational velocity. PMID:27120607

  8. Advanced imaging techniques in brain tumors

    PubMed Central

    2009-01-01

    Abstract Perfusion, permeability and magnetic resonance spectroscopy (MRS) are now widely used in the research and clinical settings. In the clinical setting, qualitative, semi-quantitative and quantitative approaches such as review of color-coded maps to region of interest analysis and analysis of signal intensity curves are being applied in practice. There are several pitfalls with all of these approaches. Some of these shortcomings are reviewed, such as the relative low sensitivity of metabolite ratios from MRS and the effect of leakage on the appearance of color-coded maps from dynamic susceptibility contrast (DSC) magnetic resonance (MR) perfusion imaging and what correction and normalization methods can be applied. Combining and applying these different imaging techniques in a multi-parametric algorithmic fashion in the clinical setting can be shown to increase diagnostic specificity and confidence. PMID:19965287

  9. Dynamics of tokamak plasma surface current in 3D ideal MHD model

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Svidzinski, V. A.; Zakharov, L. E.

    2013-10-01

    Interest in the surface current which can arise on perturbed sharp plasma vacuum interface in tokamaks was recently generated by a few papers (see and references therein). In dangerous disruption events with plasma-touching-wall scenarios, the surface current can be shared with the wall leading to the strong, damaging forces acting on the wall A relatively simple analytic definition of δ-function surface current proportional to a jump of tangential component of magnetic field nevertheless leads to a complex computational problem on the moving plasma-vacuum interface, requiring the incorporation of non-linear 3D plasma dynamics even in one-fluid ideal MHD. The Disruption Simulation Code (DSC), which had recently been developed in a fully 3D toroidal geometry with adaptation to the moving plasma boundary, is an appropriate tool for accurate self-consistent δfunction surface current calculation. Progress on the DSC-3D development will be presented. Self-consistent surface current calculation under non-linear dynamics of low m kink mode and VDE will be discussed. Work is supported by the US DOE SBIR grant #DE-SC0004487.

  10. Glass transition behavior of polystyrene/silica nanocomposites.

    NASA Astrophysics Data System (ADS)

    Xie, Yuping; Sen, Sudeepto; Kumar, Sanat; Bansal, Amitabh

    2006-03-01

    The change in thermomechanical properties of nano-filled polymers is of considerable scientific and technological interest. The interaction between the nanofillers and the matrix polymer controls the nanocomposite properties. We will present the results from recent and ongoing DSC experiments on polystyrene/silica nanocomposites. Polystyrene of different molecular weights (and from different sources) and silica nanoparticles 10-15 nm in diameter (both as received from Nissan and surface modified by grafted or physisorbed polystyrene) are being used to process the nanocomposites. We are studying trends in the glass transition behavior by changing the matrix molecular weights and the silica weight fractions. Recent data indicate that the glass transition temperature can both decrease and increase depending on the polymer-nanofiller combination as well as the thermal treatment of the nanocomposites prior to the DSC runs.

  11. Study of the recrystallization in coated pellets - effect of coating on API crystallinity.

    PubMed

    Nikowitz, Krisztina; Pintye-Hódi, Klára; Regdon, Géza

    2013-02-14

    Coated diltiazem hydrochloride-containing pellets were prepared using the solution layering technique. Unusual thermal behavior was detected with differential scanning calorimetry (DSC) and its source was determined using thermogravimetry (TG), X-ray powder diffraction (XRPD) and hot-stage microscopy. The coated pellets contained diltiazem hydrochloride both in crystalline and amorphous form. Crystallization occurs on heat treatment causing an exothermic peak on the DSC curves that only appears in pellets containing both diltiazem hydrochloride and the coating. Results indicate that the amorphous fraction is situated in the coating layer. The migration of drugs into the coating layer can cause changes in its degree of crystallinity. Polymeric coating materials should therefore be investigated as possible crystallization inhibitors. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Biodiesel: Characterization by DSC and P-DSC

    NASA Astrophysics Data System (ADS)

    Chiriac, Rodica; Toche, François; Brylinski, Christian

    Thermal analytical methods such as differential scanning calorimetry (DSC) have been successfully applied to neat petrodiesel and engine oils in the last 25 years. This chapter shows how DSC and P-DSC (pressurized DSC) techniques can be used to compare, characterize, and predict some properties of alternative non-petroleum fuels, such as cold flow behavior and oxidative stability. These two properties are extremely important with respect to the operability, transport, and long-term storage of biodiesel fuel. It is shown that the quantity of unsaturated fatty acids in the fuel composition has an important impact on both properties. In addition, it is shown that the impact of fuel additives on the oxidative stability or the cold flow behavior of biodiesel can be studied by means of DSC and P-DSC techniques. Thermomicroscopy can also be used to study the cold flow behavior of biodiesel, giving information on the size and the morphology of crystals formed at low temperature.

  13. Loss of Desmocollin 3 in Skin Tumor Development and Progression

    PubMed Central

    Chen, Jiangli; O’Shea, Charlene; Fitzpatrick, James E.; Koster, Maranke I.; Koch, Peter J.

    2011-01-01

    Desmocollin 3 (DSC3) is a desmosomal cadherin that is required for maintaining cell adhesion in the epidermis as demonstrated by the intra-epidermal blistering observed in Dsc3 null skin. Recently, it has been suggested that deregulated expression of DSC3 occurs in certain human tumor types. It is not clear whether DSC3 plays a role in the development or progression of cancers arising in stratified epithelia such as the epidermis. To address this issue, we generated a mouse model in which Dsc3 expression is ablated in K-Ras oncogene-induced skin tumors. Our results demonstrate that loss of Dsc3 leads to an increase in K-Ras induced skin tumors. We hypothesize that acantholysis-induced epidermal hyperplasia in the Dsc3 null epidermis facilitates Ras-induced tumor development. Further, we demonstrate that spontaneous loss of DSC3 expression is a common occurrence during human and mouse skin tumor progression. This loss occurs in tumor cells invading the dermis. Interestingly, other desmosomal proteins are still expressed in tumor cells that lack DSC3, suggesting a specific function of DSC3 loss in tumor progression. While loss of DSC3 on the skin surface leads to epidermal blistering, it does not appear to induce loss of cell-cell adhesion in tumor cells invading the dermis, most likely due to a protection of these cells within the dermis from mechanical stress. We thus hypothesize that DSC3 can contribute to the progression of tumors both by cell adhesion-dependent (skin surface) and likely by cell adhesion-independent (invading tumor cells) mechanisms. PMID:21681825

  14. Desmocollin 2 is a new immunohistochemical marker indicative of squamous differentiation in urothelial carcinoma.

    PubMed

    Hayashi, Tetsutaro; Sentani, Kazuhiro; Oue, Naohide; Anami, Katsuhiro; Sakamoto, Naoya; Ohara, Shinya; Teishima, Jun; Noguchi, Tsuyoshi; Nakayama, Hirofumi; Taniyama, Kiyomi; Matsubara, Akio; Yasui, Wataru

    2011-10-01

    Urothelial carcinoma (UC) with squamous differentiation tends to present at higher stages than pure UC. To distinguish UC with squamous differentiation from pure UC, a sensitive and specific marker is needed. Desmocollin 2 (DSC2) is a protein localized in desmosomal junctions of stratified epithelium, but little is known about its biological significance in bladder cancer. We examined the utility of DSC2 as a diagnostic marker. We analysed the immunohistochemical characteristics of DSC2, and studied the relationship of DSC2 expression with the expression of the known markers uroplakin III (UPIII), cytokeratin (CK)7, CK20, epidermal growth factor receptor (EGFR), and p53. DSC2 staining was detected in 24 of 25 (96%) cases of UC with squamous differentiation, but in none of 85 (0%) cases of pure UC. DSC2 staining was detected only in areas of squamous differentiation. DSC2 expression was mutually exclusive of UPIII expression, and was correlated with EGFR expression. Furthermore, DSC2 expression was correlated with higher stage (P = 0.0314) and poor prognosis (P = 0.0477). DSC2 staining offers high sensitivity (96%) and high specificity (100%) for the detection of squamous differentiation in UC. DSC2 is a useful immunohistochemical marker for separation of UC with squamous differentiation from pure UC. 2011 Blackwell Publishing Limited.

  15. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.

  16. Occurrence of β-lactamase genes among non-Typhi Salmonella enterica isolated from humans, food animals, and retail meats in the United States and Canada.

    PubMed

    Sjölund-Karlsson, Maria; Howie, Rebecca L; Blickenstaff, Karen; Boerlin, Patrick; Ball, Takiyah; Chalmers, Gabhan; Duval, Brea; Haro, Jovita; Rickert, Regan; Zhao, Shaohua; Fedorka-Cray, Paula J; Whichard, Jean M

    2013-06-01

    Non-Typhi Salmonella cause over 1.7 million cases of gastroenteritis in North America each year, and food-animal products are commonly implicated in human infections. For invasive infections, antimicrobial therapy is indicated. In North America, the antimicrobial susceptibility of Salmonella is monitored by the U.S. National Antimicrobial Resistance Monitoring System (NARMS) and The Canadian Integrated Program for Antimicrobial Resistance Surveillance (CIPARS). In this study, we determined the susceptibility to cephalosporins by broth microdilution among 5,041 non-Typhi Salmonella enterica isolated from food animals, retail meats, and humans. In the United States, 109 (4.6%) of isolates collected from humans, 77 (15.7%) from retail meat, and 140 (10.6%) from food animals displayed decreased susceptibility to cephalosporins (DSC). Among the Canadian retail meat and food animal isolates, 52 (13.0%) and 42 (9.4%) displayed DSC. All isolates displaying DSC were screened for β-lactamase genes (bla(TEM), bla(SHV), bla(CMY), bla(CTX-M), and bla(OXA-1)) by polymerase chain reaction. At least one β-lactamase gene was detected in 74/109 (67.9%) isolates collected from humans, and the bla(CMY) genes were most prevalent (69/109; 63.3%). Similarly, the bla(CMY) genes predominated among the β-lactamase-producing isolates collected from retail meats and food animals. Three isolates from humans harbored a bla(CTX-M-15) gene. No animal or retail meat isolates harbored a bla(CTX-M) or bla(OXA-1) gene. A bla(TEM) gene was found in 5 human, 9 retail meat, and 17 animal isolates. Although serotype distributions varied among human, retail meat, and animal sources, overlap in bla(CMY)-positive serotypes across sample sources supports meat and food-animal sources as reservoirs for human infection.

  17. Clinical application of plasma thermograms. Utility, practical approaches and considerations.

    PubMed

    Garbett, Nichola C; Mekmaysy, Chongkham S; DeLeeuw, Lynn; Chaires, Jonathan B

    2015-04-01

    Differential scanning calorimetry (DSC) studies of blood plasma are part of an emerging area of the clinical application of DSC to biofluid analysis. DSC analysis of plasma from healthy individuals and patients with various diseases has revealed changes in the thermal profiles of the major plasma proteins associated with the clinical status of the patient. The sensitivity of DSC to the concentration of proteins, their interactions with other proteins or ligands, or their covalent modification underlies the potential utility of DSC analysis. A growing body of literature has demonstrated the versatility and performance of clinical DSC analysis across a range of biofluids and in a number of disease settings. The principles, practice and challenges of DSC analysis of plasma are described in this article. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Clinical application of plasma thermograms. Utility, practical approaches and considerations

    PubMed Central

    Garbett, Nichola C.; Mekmaysy, Chongkham S.; DeLeeuw, Lynn; Chaires, Jonathan B.

    2014-01-01

    Differential scanning calorimetry (DSC) studies of blood plasma are part of an emerging area of the clinical application of DSC to biofluid analysis. DSC analysis of plasma from healthy individuals and patients with various diseases has revealed changes in the thermal profiles of the major plasma proteins associated with the clinical status of the patient. The sensitivity of DSC to the concentration of proteins, their interactions with other proteins or ligands, or their covalent modifications underlies the potential utility of DSC analysis. A growing body of literature has demonstrated the versatility and performance of clinical DSC analysis across a range of biofluids and in a number of disease settings. The principles, practice and challenges of DSC analysis of plasma are described in this article. PMID:25448297

  19. Direct Ca2+-dependent Heterophilic Interaction between Desmosomal Cadherins, Desmoglein and Desmocollin, Contributes to Cell–Cell Adhesion

    PubMed Central

    Chitaev, Nikolai A.; Troyanovsky, Sergey M.

    1997-01-01

    Human fibrosarcoma cells, HT-1080, feature extensive adherens junctions, lack mature desmosomes, and express a single known desmosomal protein, Desmoglein 2 (Dsg2). Transfection of these cells with bovine Desmocollin 1a (Dsc1a) caused dramatic changes in the subcellular distribution of endogenous Dsg2. Both cadherins clustered in the areas of the adherens junctions, whereas only a minor portion of Dsg2 was seen in these areas in the parental cells. Deletion mapping showed that intact extracellular cadherin-like repeats of Dsc1a (Arg1-Thr170) are required for the translocation of Dsg2. Deletion of the intracellular C-domain that mediates the interaction of Dsc1a with plakoglobin, or the CSI region that is involved in the binding to desmoplakin, had no effect. Coimmunoprecipitation experiments of cell lysates stably expressing Dsc1a with anti-Dsc or -Dsg antibodies demonstrate that the desmosomal cadherins, Dsg2 and Dsc1a, are involved in a direct Ca2+-dependent interaction. This conclusion was further supported by the results of solid phase binding experiments. These showed that the Dsc1a fragment containing cadherin-like repeats 1 and 2 binds directly to the extracellular portion of Dsg in a Ca2+-dependent manner. The contribution of the Dsg/ Dsc interaction to cell–cell adhesion was tested by coculturing HT-1080 cells expressing Dsc1a with HT-1080 cells lacking Dsc but expressing myc-tagged plakoglobin (MPg). In the latter cells, MPg and the endogenous Dsg form stable complexes. The observed specific coimmunoprecipitation of MPg by anti-Dsc antibodies in coculture indicates that an intercellular interaction between Dsc1 and Dsg is involved in cell–cell adhesion. PMID:9214392

  20. Distinct Roles of the DmNav and DSC1 Channels in the Action of DDT and Pyrethroids

    PubMed Central

    Rinkevich, Frank D.; Du, Yuzhe; Tolinski, Josh; Ueda, Atsushi; Wu, Chun-Fang; Zhorov, Boris S.; Dong, Ke

    2015-01-01

    Voltage-gated sodium channels (Nav channels) are critical for electrical signaling in the nervous system and are the primary targets of the insecticides DDT and pyrethroids. In Drosophila melanogaster, besides the canonical Nav channel, Para (also called DmNav), there is a sodium channel-like cation channel called DSC1 (Drosophila sodium channel 1). Temperature-sensitive paralytic mutations in DmNav (parats) confer resistance to DDT and pyrethroids, whereas DSC1 knockout flies exhibit enhanced sensitivity to pyrethroids. To further define the roles and interaction of DmNav and DSC1 channels in DDT and pyrethroid neurotoxicology, we generated a DmNav/DSC1 double mutant line by introducing a parats1 allele (carrying the I265N mutation) into a DSC1 knockout line. We confirmed that the I265N mutation reduced the sensitivity to two pyrethroids, permethrin and deltamethrin of a DmNav variant expressed in Xenopus oocytes. Computer modeling predicts that the I265N mutation confers pyrethroid resistance by allosterically altering the second pyrethroid receptor site on the DmNav channel. Furthermore, we found that I265N-mediated pyrethroid resistance in parats1 mutant flies was almost completely abolished in parats1;DSC1−/− double mutant flies. Unexpectedly, however, the DSC1 knockout flies were less sensitive to DDT, compared to the control flies (w1118A), and the parats1;DSC1−/− double mutant flies were even more resistant to DDT compared to the DSC1 knockout or parats1 mutant. Our findings revealed distinct roles of the DmNav and DSC1 channels in the neurotoxicology of DDT vs. pyrethroids and implicate the exciting possibility of using DSC1 channel blockers or modifiers in the management of pyrethroid resistance. PMID:25687544

  1. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... calling (DSC) equipment has been verified by actual communications or a test call; (ii) The portable... devices which do not have integral navigation receivers, including: VHF DSC, MF DSC, satellite EPIRB and HF DSC or INMARSAT SES. On a ship without integral or directly connected navigation receiver input to...

  2. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... calling (DSC) equipment has been verified by actual communications or a test call; (ii) The portable... devices which do not have integral navigation receivers, including: VHF DSC, MF DSC, satellite EPIRB and HF DSC or INMARSAT SES. On a ship without integral or directly connected navigation receiver input to...

  3. Synthesis and Characterization of the First Liquid Single Source Precursors for the Deposition of Ternary Chalcopyrite (CuInS2) Thin Film Materials

    NASA Technical Reports Server (NTRS)

    Banger, Kulbinder K.; Cowen, Jonathan; Hepp, Aloysius

    2002-01-01

    Molecular engineering of ternary single source precursors based on the [{PBu3}2Cu(SR')2In(SR')2] architecture have afforded the first liquid CIS ternary single source precursors (when R = Et, n-Pr), which are suitable for low temperature deposition (< 350 C). Thermogravimetric analyses (TGA) and modulated-differential scanning calorimetry (DSC) confirm their liquid phase and reduced stability. X-ray diffraction studies, energy dispersive analyzer (EDS), and scanning electron microscopy (SEM) support the formation of the single-phase chalcopyrite CuInS2 at low temperatures.

  4. 47 CFR 80.359 - Frequencies for digital selective calling (DSC).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false Frequencies for digital selective calling (DSC... for digital selective calling (DSC). (a) General purpose calling. The following table describes the calling frequencies for use by authorized ship and coast stations for general purpose DSC. There are three...

  5. Design and long-term monitoring of DSC/CIGS tandem solar module

    NASA Astrophysics Data System (ADS)

    Vildanova, M. F.; Nikolskaia, A. B.; Kozlov, S. S.; Shevaleevskiy, O. I.

    2015-11-01

    This paper describes the design and development of tandem dye-sensitized/Cu(In, Ga)Se (DSC/CIGS) PV modules. The tandem PV module comprised of the top DSC module and a bottom commercial 0,8 m2 CIGS module. The top DSC module was made of 10 DSC mini-modules with the field size of 20 × 20 cm2 each. Tandem DSC/CIGS PV modules were used for providing the long-term monitoring of energy yield and electrical parameters in comparison with standalone CIGS modules under outdoor conditions. The outdoor test facility, containing solar modules of both types and a measurement unit, was located on the roof of the Institute of Biochemical Physics in Moscow. The data obtained during monitoring within the 2014 year period has shown the advantages of the designed tandem DSC/CIGS PV-modules over the conventional CIGS modules, especially for cloudy weather and low-intensity irradiation conditions.

  6. Distinct roles of the DmNav and DSC1 channels in the action of DDT and pyrethroids.

    PubMed

    Rinkevich, Frank D; Du, Yuzhe; Tolinski, Josh; Ueda, Atsushi; Wu, Chun-Fang; Zhorov, Boris S; Dong, Ke

    2015-03-01

    Voltage-gated sodium channels (Nav channels) are critical for electrical signaling in the nervous system and are the primary targets of the insecticides DDT and pyrethroids. In Drosophila melanogaster, besides the canonical Nav channel, Para (also called DmNav), there is a sodium channel-like cation channel called DSC1 (Drosophila sodium channel 1). Temperature-sensitive paralytic mutations in DmNav (para(ts)) confer resistance to DDT and pyrethroids, whereas DSC1 knockout flies exhibit enhanced sensitivity to pyrethroids. To further define the roles and interaction of DmNav and DSC1 channels in DDT and pyrethroid neurotoxicology, we generated a DmNav/DSC1 double mutant line by introducing a para(ts1) allele (carrying the I265N mutation) into a DSC1 knockout line. We confirmed that the I265N mutation reduced the sensitivity to two pyrethroids, permethrin and deltamethrin, of a DmNav variant expressed in Xenopus oocytes. Computer modeling predicts that the I265N mutation confers pyrethroid resistance by allosterically altering the second pyrethroid receptor site on the DmNav channel. Furthermore, we found that I265N-mediated pyrethroid resistance in para(ts1) mutant flies was almost completely abolished in para(ts1);DSC1(-/-) double mutant flies. Unexpectedly, however, the DSC1 knockout flies were less sensitive to DDT, compared to the control flies (w(1118A)), and the para(ts1);DSC1(-/-) double mutant flies were even more resistant to DDT compared to the DSC1 knockout or para(ts1) mutant. Our findings revealed distinct roles of the DmNav and DSC1 channels in the neurotoxicology of DDT vs. pyrethroids and raise the exciting possibility of using DSC1 channel blockers or modifiers in the management of pyrethroid resistance. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Human decidual stromal cells secrete soluble pro-apoptotic factors during decidualization in a cAMP-dependent manner.

    PubMed

    Leno-Durán, E; Ruiz-Magaña, M J; Muñoz-Fernández, R; Requena, F; Olivares, E G; Ruiz-Ruiz, C

    2014-10-10

    Is there a relationship between decidualization and apoptosis of decidual stromal cells (DSC)? Decidualization triggers the secretion of soluble factors that induce apoptosis in DSC. The differentiation and apoptosis of DSC during decidualization of the receptive decidua are crucial processes for the controlled invasion of trophoblasts in normal pregnancy. Most DSC regress in a time-dependent manner, and their removal is important to provide space for the embryo to grow. However, the mechanism that controls DSC death is poorly understood. The apoptotic response of DSC was analyzed after exposure to different exogenous agents and during decidualization. The apoptotic potential of decidualized DSC supernatants and prolactin (PRL) was also evaluated. DSC lines were established from samples of decidua from first trimester pregnancies. Apoptosis was assayed by flow cytometry. PRL production, as a marker of decidualization, was determined by enzyme-linked immunosorbent assay. DSCs were resistant to a variety of apoptosis-inducing substances. Nevertheless, DSC underwent apoptosis during decidualization in culture, with cAMP being essential for both apoptosis and differentiation. In addition, culture supernatants from decidualized DSC induced apoptosis in undifferentiated DSC, although paradoxically these supernatants decreased the spontaneous apoptosis of decidual lymphocytes. Exogenously added PRL did not induce apoptosis in DSC and an antibody that neutralized the PRL receptor did not decrease the apoptosis induced by supernatants. Further studies are needed to examine the involvement of other soluble factors secreted by decidualized DSC in the induction of apoptosis. The present results indicate that apoptosis of DSC occurs in parallel to differentiation, in response to decidualization signals, with soluble factors secreted by decidualized DSC being responsible for triggering cell death. These studies are relevant in the understanding of how the regression of decidua, a crucial process for successful pregnancy, takes place. This work was supported by the Consejería de Economía, Innovación y Ciencia, Junta de Andalucía (Grant CTS-6183, Proyectos de Investigación de Excelencia 2010 to C.R.-R.) and the Instituto de Salud Carlos III, Ministerio de Economía y Competitividad, Spain (Grants PS09/00339 and PI12/01085 to E.G.O.). E.L.-D. was supported by fellowships from the Ministerio de Educación y Ciencia, Spain and the University of Granada. The authors have no conflict of interest. © The Author 2014. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. Syndrome source coding and its universal generalization

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1975-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A universal generalization of syndrome-source-coding is formulated which provides robustly effective, distortionless coding of source ensembles.
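    A minimal sketch of the scheme, assuming a (7,4) Hamming code purely for illustration (the paper's construction is more general): a sparse 7-bit source block is treated as an error pattern, its 3-bit syndrome is sent as the compressed data, and the decoder returns the minimum-weight block with that syndrome, which reproduces the source exactly whenever the block contains at most one 1.

        import numpy as np

        # Parity-check matrix of the (7,4) Hamming code; column j is the
        # binary representation of j (j = 1..7), most significant bit on top.
        H = np.array([[0, 0, 0, 1, 1, 1, 1],
                      [0, 1, 1, 0, 0, 1, 1],
                      [1, 0, 1, 0, 1, 0, 1]])

        def compress(block):
            # The compressed data is the syndrome: 3 bits from 7.
            return H @ block % 2

        def decompress(syndrome):
            # Minimum-weight block with the given syndrome: for a perfect
            # Hamming code, a nonzero syndrome equals column j of H, i.e. a
            # single 1 at position j; the zero syndrome decodes to all zeros.
            block = np.zeros(7, dtype=int)
            j = int("".join(map(str, syndrome)), 2)
            if j:
                block[j - 1] = 1
            return block

        rng = np.random.default_rng(0)
        x = (rng.random(7) < 0.05).astype(int)  # sparse binary source block
        s = compress(x)
        print(x, s, decompress(s))              # lossless iff weight(x) <= 1

    The rate of this toy code is 3/7 compressed digits per source digit; the paper's result is that longer codes push this ratio toward the source entropy while the residual distortion vanishes.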

  9. Characterization of organic matter of plants from lakes by thermal analysis in a N2 atmosphere

    NASA Astrophysics Data System (ADS)

    Guo, Fei; Wu, Fengchang; Mu, Yunsong; Hu, Yan; Zhao, Xiaoli; Meng, Wei; Giesy, John P.; Lin, Ying

    2016-03-01

    Organic matter (OM) has been characterized using thermal analysis in O2 atmospheres, but it is not clear if OM can be characterized using slow thermal degradation in N2 atmospheres (STDN). This article presents a new method to estimate the behavior of OM in anaerobic environments. Seventeen different plants from Tai Lake (Ch: Taihu), China were heated to 600 °C at a rate of 10 °C min⁻¹ in a N2 atmosphere and characterized by use of differential scanning calorimetry (DSC) and thermal gravimetric analysis (TGA). DSC thermograms were compared with those of 9 standard compounds. Seven peaks were observed in the DSC thermograms; two main peaks correlated strongly with biochemical indices, and one main peak represented a transitional stage. Energy absorbed by a peak at approximately 200 °C and total organic carbon were well correlated, while energy absorbed at approximately 460 °C was negatively correlated with lignin content. Presence of peaks at approximately 350 and 420 °C varied among plant biomass sources, providing potential evidence for biomass identification. The STDN methods reported here were rapid and accurate ways to quantitatively characterize OM, which may provide useful information for understanding the anaerobic behavior of natural organic matter.

  10. Changes of multi-scale structure during mimicked DSC heating reveal the nature of starch gelatinization

    NASA Astrophysics Data System (ADS)

    Wang, Shujun; Zhang, Xiu; Wang, Shuo; Copeland, Les

    2016-06-01

    A thorough understanding of starch gelatinization is extremely important for precise control of starch functional properties for food processing and human nutrition. Here we reveal the molecular mechanism of starch gelatinization by differential scanning calorimetry (DSC) in conjunction with a protocol using the rapid viscosity analyzer (RVA) to generate material for analysis under conditions that simulated the DSC heating profiles. The results from DSC, FTIR, Raman, X-ray diffraction and small angle X-ray scattering (SAXS) analyses all showed that residual structural order remained in starch that was heated to the DSC endotherm end temperature in starch:water mixtures of 0.5 to 4:1 (v/w). We conclude from this study that the DSC endotherm of starch at a water:starch ratio of 2 to 4 (v/w) does not represent complete starch gelatinization. The DSC endotherm of starch involves not only the water uptake and swelling of amorphous regions, but also the melting of starch crystallites.

  11. Changes of multi-scale structure during mimicked DSC heating reveal the nature of starch gelatinization

    PubMed Central

    Wang, Shujun; Zhang, Xiu; Wang, Shuo; Copeland, Les

    2016-01-01

    A thorough understanding of starch gelatinization is extremely important for precise control of starch functional properties for food processing and human nutrition. Here we reveal the molecular mechanism of starch gelatinization by differential scanning calorimetry (DSC) in conjunction with a protocol using the rapid viscosity analyzer (RVA) to generate material for analysis under conditions that simulated the DSC heating profiles. The results from DSC, FTIR, Raman, X-ray diffraction and small angle X-ray scattering (SAXS) analyses all showed that residual structural order remained in starch that was heated to the DSC endotherm end temperature in starch:water mixtures of 0.5 to 4:1 (v/w). We conclude from this study that the DSC endotherm of starch at a water:starch ratio of 2 to 4 (v/w) does not represent complete starch gelatinization. The DSC endotherm of starch involves not only the water uptake and swelling of amorphous regions, but also the melting of starch crystallites. PMID:27319782

  12. Comparative study of pulsed-continuous arterial spin labeling and dynamic susceptibility contrast imaging by histogram analysis in evaluation of glial tumors.

    PubMed

    Arisawa, Atsuko; Watanabe, Yoshiyuki; Tanaka, Hisashi; Takahashi, Hiroto; Matsuo, Chisato; Fujiwara, Takuya; Fujiwara, Masahiro; Fujimoto, Yasunori; Tomiyama, Noriyuki

    2018-06-01

    Arterial spin labeling (ASL) is a non-invasive perfusion technique that may be an alternative to dynamic susceptibility contrast magnetic resonance imaging (DSC-MRI) for assessment of brain tumors. To our knowledge, there have been no reports on histogram analysis of ASL. The purpose of this study was to determine whether ASL is comparable with DSC-MRI in terms of differentiating high-grade and low-grade gliomas by evaluating the histogram analysis of cerebral blood flow (CBF) in the entire tumor. Thirty-four patients with pathologically proven glioma underwent ASL and DSC-MRI. High-signal areas on contrast-enhanced T1-weighted images or high-intensity areas on fluid-attenuated inversion recovery images were designated as the volumes of interest (VOIs). ASL-CBF, DSC-CBF, and DSC-cerebral blood volume maps were constructed and co-registered to the VOI. Perfusion histogram analyses of the whole VOI and statistical analyses were performed to compare the ASL and DSC images. There was no significant difference in the mean values for any of the histogram metrics in either the low-grade gliomas (n = 15) or the high-grade gliomas (n = 19). Strong correlations were seen in the 75th percentile, mean, median, and standard deviation values between the ASL and DSC images. The area under the curve values tended to be greater for the DSC images than for the ASL images. DSC-MRI is superior to ASL for distinguishing high-grade from low-grade glioma. ASL could be an alternative evaluation method when DSC-MRI cannot be used, e.g., in patients with renal failure, those in whom repeated examination is required, and in children.

  13. Systematic review of the accuracy of dual-source cardiac CT for detection of arterial stenosis in difficult to image patient groups.

    PubMed

    Westwood, Marie E; Raatz, Heike D I; Misso, Kate; Burgers, Laura; Redekop, Ken; Lhachimi, Stefan K; Armstrong, Nigel; Kleijnen, Jos

    2013-05-01

    To assess the diagnostic performance of dual-source cardiac (DSC) computed tomography (CT) and other newer-generation CT instruments for identifying anatomically significant coronary artery disease (CAD) in patients who are difficult to image by using 64-section CT. A literature search comprised bibliographic databases (January 1, 2000, to March 22, 2011, with a pragmatic update on September 6, 2012), trial registries, and conference proceedings. Only studies using invasive coronary angiography as reference standard were included. Risk of bias was assessed (QUADAS-2). Results were stratified according to patient group on the basis of clinical characteristics. Summary estimates of sensitivity and specificity of DSC CT for detecting 50% or greater arterial stenosis were calculated by using a bivariate summary receiver operating characteristic or random-effects model. Twenty-five studies reported accuracy of DSC CT for diagnosing CAD in difficult-to-image patients; in 22 studies, one of two CT units of the same manufacturer (Somatom Definition or Somatom Definition Flash) was used, and in the remaining three, a different CT unit of another manufacturer (Aquilion One) was used. The pooled, per-patient estimates of sensitivity were 97.7% (95% confidence interval [CI]: 88.0%, 99.9%) and 97.7% (95% CI: 93.2%, 99.3%) for patients with arrhythmias and high heart rates, respectively. The corresponding pooled estimates of specificity were 81.7% (95% CI: 71.6%, 89.4%) and 86.3% (95% CI: 80.2%, 90.7%), respectively. All data were acquired by using Somatom Definition. In two studies with Somatom and one study with Aquilion One, sensitivity estimates of 90% or greater were reported in patients with previous stent implantations; specificities were 81.7% and 89.5% for Somatom and 81.0% for Aquilion One. In patients with high coronary calcium scores, previous bypass grafts, or obesity, only per-segment or per-artery data were available. Sensitivity estimates remained high (>90% in all but one study), and specificities ranged from 79.1% to 100%. All data were acquired by using Somatom Definition. DSC CT may be sufficiently accurate to diagnose clinically significant CAD in some or all difficult-to-image patients. http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.13121136/-/DC1. © RSNA, 2013.

  14. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    PubMed

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the inpatient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
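    A toy version of the coverage measurement described above, with invented codes and volumes (the real study used formulary masters and commercial DKB mapping tables): coverage is the share of prescription volume whose product code the DKB can map.

        # NDCs known to a hypothetical drug knowledge base (DKB).
        dkb_table = {"00071-0155", "00093-7214", "50580-506"}

        prescriptions = [            # (product code, prescription volume)
            ("00071-0155", 5000),
            ("00093-7214", 3000),
            ("LOCAL-XYZ", 150),      # invented local code, not mappable
            ("99999-0001", 400),     # NDC-format code absent from the DKB
        ]

        total = sum(v for _, v in prescriptions)
        covered = sum(v for code, v in prescriptions if code in dkb_table)
        print(f"DKB coverage: {100 * covered / total:.1f}% of volume")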

  15. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now contains over 340 codes and continues to grow; in 2011 it added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  16. Transforming Benzophenoxazine Laser Dyes into Chromophores for Dye-Sensitized Solar Cells: A Molecular Engineering Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schröder, Florian A. Y. N.; Cole, Jacqueline M.; Waddell, Paul G.

    2015-02-03

    The re-functionalization of a series of four well-known industrial laser dyes, based on benzophenoxazine, is explored with the prospect of molecularly engineering new chromophores for dye-sensitized solar cell (DSC) applications. Such engineering is important since a lack of suitable dyes is stifling the progress of DSC technology. The conceptual idea involves making laser dyes DSC-active by chemical modification, while maintaining their key property attributes that are attractive to DSC applications. This molecular engineering follows a step-wise approach. Firstly, molecular structures and optical absorption properties are determined for the parent laser dyes: Cresyl Violet (1); Oxazine 170 (2); Nile Blue A (3); Oxazine 750 (4). These reveal structure-property relationships which define the prerequisites for computational molecular design of DSC dyes; the nature of their molecular architecture (D-π-A) and intramolecular charge transfer. Secondly, new DSC dyes are computationally designed by the in silico addition of a carboxylic acid anchor at various chemical substitution points in the parent laser dyes. A comparison of the resulting frontier molecular orbital energy levels with the conduction band edge of a TiO2 DSC photoanode and the redox potential of two electrolyte options I-/I3- and Co(II/III)tris(bipyridyl) suggests promise for these computationally designed dyes as co-sensitizers for DSC applications.

  17. Media Coverage of FDA Drug Safety Communications about Zolpidem: A Quantitative and Qualitative Analysis.

    PubMed

    Woloshin, Steve; Schwartz, Lisa M; Dejene, Sara; Rausch, Paula; Dal Pan, Gerald J; Zhou, Esther H; Kesselheim, Aaron S

    2017-05-01

    FDA issues Drug Safety Communications (DSCs) to alert health care professionals and the public about emerging safety information affecting prescription and over-the-counter drugs. News media may amplify DSCs, but it is unclear how DSC messaging is transmitted through the media. We conducted a content analysis of the lay media coverage reaching the broadest audience to characterize the amount and content of media coverage of two zolpidem DSCs from 2013. After the first DSC, zolpidem news stories increased from 19 stories/week in the preceding 3 months to 153 following its release. Most (81%) appeared in the lay media, and 64% focused on the DSC content. After the second DSC, news stories increased from 24 stories/week in the preceding 3 months to 39 following. Among the 100 unique lay media news stories, at least half correctly reported three key DSC messages: next-day impairment and drowsiness as common safety hazards, lower doses for some but not all zolpidem products, and women's higher risk for impairment. Other DSC messages were reported in fewer than one-third of stories, such as the warning that impairment can happen even when people feel fully awake. The first, but not the second, zolpidem DSC generated high-profile news coverage. The finding that some messages were widely reported but others were not emphasizes the importance of ensuring translation of key DSC content.

  18. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  19. Center Variation and Outcomes Associated with Delayed Sternal Closure Following Stage 1 Palliation for Hypoplastic Left Heart Syndrome

    PubMed Central

    Johnson, Jason N.; Jaggers, James; Li, Shuang; O’Brien, Sean M.; Li, Jennifer S.; Jacobs, Jeffrey P.; Jacobs, Marshall L.; Welke, Karl F.; Peterson, Eric D.; Pasquali, Sara K.

    2009-01-01

    Objectives There is debate whether primary or delayed sternal closure (DSC) is the best strategy following Stage 1 palliation (S1P) for hypoplastic left heart syndrome (HLHS). We describe center variation in DSC following S1P and associated outcomes. Methods Society of Thoracic Surgeons Congenital Database participants performing S1P for HLHS from 2000–2007 were included. We examined center variation in DSC, and compared in-hospital mortality, prolonged length of stay (LOS > 6 wks), and postoperative infection in centers with low (≤25% of cases), middle (26%–74% of cases), and high (≥75% of cases) DSC utilization, adjusting for patient and center factors. Results There were 1283 patients (45 centers) included. Median age and weight at surgery were 6 d (IQR 4–9 d) and 3.2 kg (IQR 2.8–3.5 kg); 59% were male. DSC was used in 74% (range 3–100% of cases/center). In centers with high (n=23) and middle (n=17) vs. low (n=5) DSC utilization, there was a greater proportion of patients with prolonged LOS and infection, and a trend toward increased in-hospital mortality in unadjusted analysis. In multivariable analysis, there was no difference in mortality. Centers with high and middle DSC utilization had prolonged LOS [OR (95% CI): 2.83 (1.46–5.47), p=0.002 and 2.23 (1.17–4.26), p=0.02] and more infection [2.34 (1.20–4.57), p=0.01 and 2.37 (1.36–4.16), p=0.003]. Conclusions Utilization of DSC following S1P varies widely. These observational data suggest more frequent use of DSC is associated with longer LOS and higher postoperative infection rates. Further evaluation of the risks and benefits of DSC in the management of these complex infants is necessary. PMID:20167337

  20. Joint source-channel coding for motion-compensated DCT-based SNR scalable video.

    PubMed

    Kondi, Lisimachos P; Ishtiaq, Faisal; Katsaggelos, Aggelos K

    2002-01-01

    In this paper, we develop an approach toward joint source-channel coding for motion-compensated DCT-based scalable video coding and transmission. A framework for the optimal selection of the source and channel coding rates over all scalable layers is presented such that the overall distortion is minimized. The algorithm utilizes universal rate distortion characteristics which are obtained experimentally and show the sensitivity of the source encoder and decoder to channel errors. The proposed algorithm allocates the available bit rate between scalable layers and, within each layer, between source and channel coding. We present the results of this rate allocation algorithm for video transmission over a wireless channel using the H.263 Version 2 signal-to-noise ratio (SNR) scalable codec for source coding and rate-compatible punctured convolutional (RCPC) codes for channel coding. We discuss the performance of the algorithm with respect to the channel conditions, coding methodologies, layer rates, and number of layers.
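    The sketch below illustrates the kind of search such an allocation performs: brute force over per-layer (source rate, channel code rate) pairs under a total transmitted-rate budget. The rate sets and the distortion model are invented placeholders standing in for the experimentally measured rate-distortion characteristics, not the paper's values.

        from itertools import product

        SOURCE_RATES = [64, 128, 256]    # kbps per layer (assumed values)
        CHANNEL_RATES = [1/2, 2/3, 3/4]  # RCPC code rates (assumed values)
        pairs = [(rs, rc) for rs in SOURCE_RATES for rc in CHANNEL_RATES]

        def distortion(layer, rs, rc):
            # Toy model: distortion falls with source rate and rises with
            # weaker protection (higher rc); enhancement layers weigh less.
            return (1e4 / rs) * (1 + 5 * (rc - 0.5)) / (layer + 1)

        def allocate(num_layers, total_budget):
            # Minimize total distortion subject to the transmitted rate
            # sum(rs / rc) staying within the budget (kbps).
            best = (float("inf"), None)
            for choice in product(pairs, repeat=num_layers):
                transmitted = sum(rs / rc for rs, rc in choice)
                if transmitted > total_budget:
                    continue
                d = sum(distortion(k, rs, rc)
                        for k, (rs, rc) in enumerate(choice))
                if d < best[0]:
                    best = (d, choice)
            return best

        print(allocate(num_layers=2, total_budget=600))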

  1. Implementation of DSC model and application for analysis of field pile tests under cyclic loading

    NASA Astrophysics Data System (ADS)

    Shao, Changming; Desai, Chandra S.

    2000-05-01

    The disturbed state concept (DSC) model and a new, simplified procedure for unloading and reloading behavior are implemented in a nonlinear finite element procedure for the dynamic analysis of the coupled response of saturated porous materials. The DSC model is used to characterize the cyclic behavior of saturated clays and clay-steel interfaces. In the DSC, the relative intact (RI) behavior is characterized by using the hierarchical single surface (HISS) plasticity model, and the fully adjusted (FA) behavior is modeled by using the critical state concept. The DSC model is validated with respect to laboratory triaxial tests for clay and shear tests for clay-steel interfaces. The computer procedure is used to predict field behavior of an instrumented pile subjected to cyclic loading. The predictions provide very good correlation with the field data. They also yield improved results compared to those from a HISS model with anisotropic hardening, partly because the DSC model allows for degradation or softening and interface response.
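    For orientation, one widely used algebraic form of the DSC (stated here as an illustration; the parameter values below are invented, not fitted to the clays or interfaces above) blends the RI and FA responses through a disturbance function D of the accumulated deviatoric plastic strain ξ: σ_observed = (1 − D)·σ_RI + D·σ_FA, with D = D_u(1 − exp(−A·ξ^Z)).

        import numpy as np

        Du, A, Z = 0.9, 1.5, 1.2  # assumed disturbance parameters

        def disturbance(xi):
            # D grows from 0 toward Du as plastic strain accumulates.
            return Du * (1.0 - np.exp(-A * xi**Z))

        def observed_stress(sigma_ri, sigma_fa, xi):
            D = disturbance(xi)
            return (1.0 - D) * sigma_ri + D * sigma_fa

        xi = np.linspace(0.0, 2.0, 5)
        sigma_ri = 100.0 + 40.0 * xi        # stand-in RI (HISS-like) curve
        sigma_fa = 60.0 * np.ones_like(xi)  # stand-in FA (critical state)
        print(observed_stress(sigma_ri, sigma_fa, xi))

    As D grows, the response migrates from the intact curve toward the fully adjusted one, which is how the model captures the degradation and softening mentioned above.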

  2. Probing Protein Sequences as Sources for Encrypted Antimicrobial Peptides

    PubMed Central

    Brand, Guilherme D.; Magalhães, Mariana T. Q.; Tinoco, Maria L. P.; Aragão, Francisco J. L.; Nicoli, Jacques; Kelly, Sharon M.; Cooper, Alan; Bloch, Carlos

    2012-01-01

    Starting from the premise that a wealth of potentially biologically active peptides may lurk within proteins, we describe here a methodology to identify putative antimicrobial peptides encrypted in protein sequences. Candidate peptides were identified using a new screening procedure based on physicochemical criteria to reveal matching peptides within protein databases. Fifteen such peptides, along with a range of natural antimicrobial peptides, were examined using DSC and CD to characterize their interaction with phospholipid membranes. Principal component analysis of DSC data shows that the investigated peptides group according to their effects on the main phase transition of phospholipid vesicles, and that these effects correlate both to antimicrobial activity and to the changes in peptide secondary structure. Consequently, we have been able to identify novel antimicrobial peptides from larger proteins not hitherto associated with such activity, mimicking endogenous and/or exogenous microorganism enzymatic processing of parent proteins to smaller bioactive molecules. A biotechnological application for this methodology is explored. Soybean (Glycine max) plants, transformed to include a putative antimicrobial protein fragment encoded in its own genome were tested for tolerance against Phakopsora pachyrhizi, the causative agent of the Asian soybean rust. This procedure may represent an inventive alternative to the transgenic technology, since the genetic material to be used belongs to the host organism and not to exogenous sources. PMID:23029273
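    A minimal sketch of physicochemical screening in this spirit: slide a window along a protein sequence and flag cationic, moderately hydrophobic stretches. The window length, thresholds, and toy sequence are illustrative assumptions, not the authors' published criteria.

        # Kyte-Doolittle hydropathy values for the 20 amino acids.
        KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5,
              'Q': -3.5, 'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5,
              'L': 3.8, 'K': -3.9, 'M': 1.9, 'F': 2.8, 'P': -1.6,
              'S': -0.8, 'T': -0.7, 'W': -0.9, 'Y': -1.3, 'V': 4.2}

        def net_charge(pep):
            # Crude net charge at neutral pH: K/R positive, D/E negative.
            return sum(map(pep.count, "KR")) - sum(map(pep.count, "DE"))

        def scan(protein, window=15, min_charge=2, max_hydropathy=0.5):
            # Flag windows that are cationic but not strongly hydrophobic.
            hits = []
            for i in range(len(protein) - window + 1):
                pep = protein[i:i + window]
                h = sum(KD[a] for a in pep) / window
                if net_charge(pep) >= min_charge and h <= max_hydropathy:
                    hits.append((i, pep, net_charge(pep), round(h, 2)))
            return hits

        print(scan("MKKLLRAGIVALLASSVFA" * 2))  # toy sequence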

  3. The Astrophysics Source Code Library: An Update

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.

    2012-01-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now contains over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) added an average of 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of its new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL and examines the current state and benefits of the ASCL, the means of and requirements for including codes, and outlines its future plans.

  4. Gelatinisation kinetics of corn and chickpea starches using DSC, RVA, and dynamic rheometry

    USDA-ARS?s Scientific Manuscript database

    The gelatinisation kinetics (non-isothermal) of corn and chickpea starches at different heating rates were calculated using differential scanning calorimetry (DSC), rapid visco analyser (RVA), and oscillatory dynamic rheometry. The data obtained from the DSC thermogram and the RVA profiles were fitt...

  5. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  6. 47 CFR 80.225 - Requirements for selective calling equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... selective calling (DSC) equipment and selective calling equipment installed in ship and coast stations, and...-STD, “RTCM Recommended Minimum Standards for Digital Selective Calling (DSC) Equipment Providing... Class ‘D’ Digital Selective Calling (DSC)—Methods of testing and required test results,” March 2003. ITU...

  7. 47 CFR 80.359 - Frequencies for digital selective calling (DSC).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Frequencies for digital selective calling (DSC... for digital selective calling (DSC). (a) General purpose calling. The following table describes the... Digital Selective-Calling Equipment in the Maritime Mobile Service,” with Annexes 1 through 5, 2004, and...

  8. Schroedinger’s code: Source code availability and transparency in astrophysics

    NASA Astrophysics Data System (ADS)

    Ryan, P. W.; Allen, Alice; Teuben, Peter

    2018-01-01

    Astronomers use software for their research, but how many of the codes they use are available as source code? We examined a sample of 166 papers from 2015 for clearly identified software use, then searched for source code for the software packages mentioned in these research papers. We categorized the software to indicate whether source code is available for download and whether there are restrictions to accessing it, and if source code was not available, whether some other form of the software, such as a binary, was. Over 40% of the source code for the software used in our sample was not available for download. As URLs have often been used as proxy citations for software, we also extracted URLs from one journal’s 2015 research articles, removed those from certain long-term, reliable domains, and tested the remainder to determine what percentage of these URLs were still accessible in September and October, 2017.

  9. Patient and Physician Perceptions of Drug Safety Information for Sleep Aids: A Qualitative Study.

    PubMed

    Kesselheim, Aaron S; McGraw, Sarah A; Dejene, Sara Z; Rausch, Paula; Dal Pan, Gerald J; Lappin, Brian M; Zhou, Esther H; Avorn, Jerry; Campbell, Eric G

    2017-06-01

    The US Food and Drug Administration uses drug safety communications (DSCs) to release emerging information regarding post-market safety issues, but the extent to which patients and providers are aware of these communications and their specific recommendations is unclear. We conducted semi-structured interviews with patients and physicians to evaluate their awareness and understanding of emerging drug safety information related to two sleep aids: zolpidem and eszopiclone. We conducted interviews with 40 patients and ten physicians recruited from a combination of insurer claims databases and online sources. We evaluated (1) sources of drug safety information; (2) discussions between patients and physicians about the two medications; (3) their knowledge of the DSC; and (4) preferences for learning about future drug safety information. Interviews were transcribed and analyzed thematically. Patients cited their physicians, pharmacy inserts, and the Internet as sources of drug safety information. Physicians often referred to medical journals and online medical sources. Most patients reported being aware of information contained in the DSC summaries they were read. Almost all patients and physicians reported discussing side effects during patient-provider conversations, but almost no patients mentioned that physicians had communicated with them key messaging from the DSCs at issue: the risk of next-morning impairment with zolpidem and the lower recommended initial dose for women. Some risks of medications are effectively communicated to patients and physicians; however, there is still a noticeable gap between information issued by the Food and Drug Administration and patient and physician awareness of this knowledge, as well as patients' decisions to act on this information. Disseminators of emerging drug safety information should explore ways of providing user-friendly resources to patients and healthcare professionals that can update them on new risks in a timely manner.

  10. Comparison of the transformation temperatures of heat-activated Nickel-Titanium orthodontic archwires by two different techniques.

    PubMed

    Obaisi, Noor Aminah; Galang-Boquiren, Maria Therese S; Evans, Carla A; Tsay, Tzong Guang Peter; Viana, Grace; Berzins, David; Megremis, Spiro

    2016-07-01

    The purpose of this study was to investigate the suitability of the Bend and Free Recovery (BFR) method as a standard test method to determine the transformation temperatures of heat-activated Ni-Ti orthodontic archwires. This was done by determining the transformation temperatures of two brands of heat-activated Ni-Ti orthodontic archwires using both the BFR method and the standard method of Differential Scanning Calorimetry (DSC). The values obtained from the two methods were compared with each other and to the manufacturer-listed values. Forty heat-activated Ni-Ti archwires from both Rocky Mountain Orthodontics (RMO) and Opal Orthodontics (Opal) were tested using BFR and DSC. Round (0.016 inches) and rectangular (0.019×0.025 inches) archwires from each manufacturer were tested. The austenite start temperatures (As) and austenite finish temperatures (Af) were recorded. For four of the eight test groups, the BFR method resulted in lower standard deviations than the DSC method, and, overall, the average standard deviation for BFR testing was slightly lower than for DSC testing. Statistically significant differences were seen between the transformation temperatures obtained from the BFR and DSC test methods. However, the Af temperatures obtained from the two methods were remarkably similar with the mean differences ranging from 0.0 to 2.1°C: Af Opal round (BFR 26.7°C, DSC 27.6°C) and rectangular (BFR 27.6°C, DSC 28.6°C); Af RMO round (BFR 25.5°C, DSC 25.5°C) and rectangular (BFR 28.0°C, DSC 25.9°C). Significant differences were observed between the manufacturer-listed transformation temperatures and those obtained with BFR and DSC testing for both manufacturers. The results of this study suggest that the Bend and Free Recovery method is suitable as a standard method to evaluate the transformation temperatures of heat-activated Ni-Ti orthodontic archwires. Copyright © 2016 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  11. Assessment of accuracy and efficiency of atlas-based autosegmentation for prostate radiotherapy in a variety of clinical conditions.

    PubMed

    Simmat, I; Georg, P; Georg, D; Birkfellner, W; Goldner, G; Stock, M

    2012-09-01

    The goal of the current study was to evaluate the commercially available atlas-based autosegmentation software for clinical use in prostate radiotherapy. The accuracy was benchmarked against interobserver variability. A total of 20 planning computed tomography (CT) scans and 10 cone-beam CT (CBCT) scans were selected for prostate, rectum, and bladder delineation. The images varied with regard to individual (age, body mass index) and setup parameters (contrast agent, rectal balloon, implanted markers). Automatically created contours with ABAS® and iPlan® were compared to an expert's delineation by calculating the Dice similarity coefficient (DSC) and conformity index. The demo atlases of both systems showed different results for bladder (DSC(ABAS) 0.86 ± 0.17, DSC(iPlan) 0.51 ± 0.30) and prostate (DSC(ABAS) 0.71 ± 0.14, DSC(iPlan) 0.57 ± 0.19). Rectum delineation (DSC(ABAS) 0.78 ± 0.11, DSC(iPlan) 0.84 ± 0.08) demonstrated differences between the systems but better correlation of the automatically drawn volumes. ABAS® was closest to the interobserver benchmark. Autosegmentation with iPlan®, ABAS®, and manual segmentation took 0.5, 4, and 15–20 min, respectively. Automatic contouring on CBCT showed high dependence on image quality (DSC bladder 0.54, rectum 0.42, prostate 0.34). For clinical routine, efforts are still necessary to either redesign algorithms implemented in autosegmentation or to optimize image quality for CBCT to guarantee required accuracy and time savings for adaptive radiotherapy.
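    The DSC reported throughout this record is the standard overlap measure DSC = 2|A ∩ B| / (|A| + |B|), where A and B are the two segmentations. A minimal implementation for binary masks follows; the toy masks are invented for the example.

        import numpy as np

        def dice(a, b):
            # Dice similarity coefficient: 1.0 means perfect overlap.
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        auto = np.zeros((4, 4), dtype=int)
        auto[1:3, 1:3] = 1    # toy automatic contour (4 voxels)
        manual = np.zeros((4, 4), dtype=int)
        manual[1:3, 1:4] = 1  # toy expert contour (6 voxels, 4 shared)
        print(dice(auto, manual))  # 2*4 / (4+6) = 0.8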

  12. 7 CFR 1710.114 - TIER, DSC, OTIER and ODSC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false TIER, DSC, OTIER and ODSC requirements. 1710.114... AND GUARANTEES Loan Purposes and Basic Policies § 1710.114 TIER, DSC, OTIER and ODSC requirements. (a) General. Requirements for coverage ratios are set forth in the borrower's mortgage, loan contract, or...

  13. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  14. Applications of high pressure differential scanning calorimetry to aviation fuel thermal stability research

    NASA Technical Reports Server (NTRS)

    Neveu, M. C.; Stocker, D. P.

    1985-01-01

    High pressure differential scanning calorimetry (DSC) was studied as an alternate method for performing high temperature fuel thermal stability research. The DSC was used to measure the heat of reaction versus temperature of a fuel sample heated at a programmed rate in an oxygen pressurized cell. Pure hydrocarbons and model fuels were studied using typical DSC operating conditions of 600 psig of oxygen and a temperature range from ambient to 500 °C. The DSC oxidation onset temperature was determined and was used to rate the fuels on thermal stability. Kinetic rate constants were determined for the global initial oxidation reaction. Fuel deposit formation was measured, and the high-temperature volatility of some tetralin deposits was studied by thermogravimetric analysis. Gas chromatography and mass spectrometry were used to study the chemical composition of some DSC-stressed fuels.

  15. Combining Diffusion Tensor Metrics and DSC Perfusion Imaging: Can It Improve the Diagnostic Accuracy in Differentiating Tumefactive Demyelination from High-Grade Glioma?

    PubMed

    Hiremath, S B; Muraleedharan, A; Kumar, S; Nagesh, C; Kesavadas, C; Abraham, M; Kapilamoorthy, T R; Thomas, B

    2017-04-01

    Tumefactive demyelinating lesions with atypical features can mimic high-grade gliomas on conventional imaging sequences. The aim of this study was to assess the role of conventional imaging, DTI metrics (p:q tensor decomposition), and DSC perfusion in differentiating tumefactive demyelinating lesions and high-grade gliomas. Fourteen patients with tumefactive demyelinating lesions and 21 patients with high-grade gliomas underwent brain MR imaging with conventional, DTI, and DSC perfusion imaging. Imaging sequences were assessed for differentiation of the lesions. DTI metrics in the enhancing areas and perilesional hyperintensity were obtained by ROI analysis, and the relative CBV values in enhancing areas were calculated on DSC perfusion imaging. Conventional imaging sequences had a sensitivity of 80.9% and specificity of 57.1% in differentiating high-grade gliomas (P = .049) from tumefactive demyelinating lesions. DTI metrics (p:q tensor decomposition) and DSC perfusion demonstrated a statistically significant difference in the mean values of ADC, the isotropic component of the diffusion tensor, the anisotropic component of the diffusion tensor, the total magnitude of the diffusion tensor, and rCBV among enhancing portions in tumefactive demyelinating lesions and high-grade gliomas (P ≤ .02), with the highest specificity for ADC, the anisotropic component of the diffusion tensor, and relative CBV (92.9%). Mean fractional anisotropy values showed no significant statistical difference between tumefactive demyelinating lesions and high-grade gliomas. The combination of DTI and DSC parameters improved the diagnostic accuracy (area under the curve = 0.901). Addition of a heterogeneous enhancement pattern to DTI and DSC parameters improved it further (area under the curve = 0.966). The sensitivity increased from 71.4% to 85.7% after the addition of the enhancement pattern. DTI and DSC perfusion add profoundly to conventional imaging in differentiating tumefactive demyelinating lesions and high-grade gliomas. The combination of DTI metrics and DSC perfusion markedly improved diagnostic accuracy. © 2017 by American Journal of Neuroradiology.

  16. Theoretical Aspects of Differential Scanning Calorimetry as a Tool for the Studies of Equilibrium Thermodynamics in Pharmaceutical Solid Phase Transitions.

    PubMed

    Faroongsarng, Damrongsak

    2016-06-01

    Although differential scanning calorimetry (DSC) is a non-equilibrium technique, it has been used to gain energetic information that involves phase equilibria. DSC has been widely used to characterize the equilibrium melting parameters of small organic pharmaceutical compounds. An understanding of how DSC measures an equilibrium event could make for a better interpretation of the results. The aim of this mini-review was to provide a theoretical insight into the DSC measurement to obtain the equilibrium thermodynamics of a phase transition, especially the melting process. It was demonstrated that the heat quantity obtained from the DSC thermogram (ΔH) is related to the thermodynamic enthalpy of the phase transition (ΔH_P) via ΔH = ΔH_P/(1 + K⁻¹), where K is the equilibrium constant. In melting, the solid and liquefied phases presumably coexist, resulting in a null Gibbs free energy and hence an infinitely large K. Thus, ΔH can be interpreted as ΔH_P. Issues of DSC investigations on melting behavior of crystalline solids including polymorphism, degradation impurity due to heating in situ, and eutectic melting were discussed. In addition, DSC has been a tool for determination of the impurity based on an ideal solution of the melt that is one of the official methods used to establish the reference standard.
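    A quick numeric illustration of that relation, with ΔH_P chosen arbitrarily: as K grows (solid and melt coexisting at the melting point, ΔG → 0), the measured DSC heat converges to the transition enthalpy.

        # Evaluate dH = dH_P / (1 + 1/K) for increasing K; dH_P is an
        # arbitrary example value, not data from the review.
        dH_P = 25.0  # kJ/mol
        for K in (0.1, 1.0, 10.0, 1e3, 1e6):
            print(f"K = {K:>9g}:  dH = {dH_P / (1 + 1 / K):.3f} kJ/mol")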

  17. Operational rate-distortion performance for joint source and channel coding of images.

    PubMed

    Ruf, M J; Modestino, J W

    1999-01-01

    This paper describes a methodology for evaluating the operational rate-distortion behavior of combined source and channel coding schemes with particular application to images. In particular, we demonstrate use of the operational rate-distortion function to obtain the optimum tradeoff between source coding accuracy and channel error protection under the constraint of a fixed transmission bandwidth for the investigated transmission schemes. Furthermore, we develop information-theoretic bounds on performance for specific source and channel coding systems and demonstrate that our combined source-channel coding methodology applied to different schemes results in operational rate-distortion performance which closely approaches these theoretical limits. We concentrate specifically on a wavelet-based subband source coding scheme and the use of binary rate-compatible punctured convolutional (RCPC) codes for transmission over the additive white Gaussian noise (AWGN) channel. Explicit results for real-world images demonstrate the efficacy of this approach.
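    A sketch of the tradeoff under stated assumptions: a fixed transmitted rate, the Gaussian distortion-rate model D(R) = σ²·2^(−2R) for the source, and an invented residual-error curve standing in for the experimentally measured channel sensitivity. Sweeping the channel code rate exposes the optimum between coding accuracy and error protection.

        import numpy as np

        sigma2, D_err, R_total = 1.0, 10.0, 3.0  # assumed toy constants

        def residual_error_prob(r):
            # Invented stand-in: weaker codes (higher r) leave more errors.
            return 1e-6 * np.exp(20.0 * (r - 0.25))

        for r in (1/3, 1/2, 2/3, 4/5):
            Ds = sigma2 * 2.0 ** (-2.0 * r * R_total)  # source distortion
            D = Ds + residual_error_prob(r) * D_err    # end-to-end distortion
            print(f"code rate {r:.2f}:  distortion {D:.4f}")

    With these toy numbers the best operating point is an intermediate code rate: too much protection starves the source coder, too little lets channel errors dominate.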

  18. 47 CFR 80.1087 - Ship radio equipment-Sea area A1.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... an INMARSAT ship earth station capable of two way communication. (b) The VHF radio installation... which the ship is normally navigated, operating either: (1) On VHF using DSC; or (2) Through the polar... voyages within coverage of MF coast stations equipped with DSC; or (4) On HF using DSC; or (5) Through the...

  19. 47 CFR 80.1087 - Ship radio equipment-Sea area A1.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... an INMARSAT ship earth station capable of two way communication. (b) The VHF radio installation... which the ship is normally navigated, operating either: (1) On VHF using DSC; or (2) Through the polar... voyages within coverage of MF coast stations equipped with DSC; or (4) On HF using DSC; or (5) Through the...

  20. 47 CFR 80.1087 - Ship radio equipment-Sea area A1.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... an INMARSAT ship earth station capable of two way communication. (b) The VHF radio installation... which the ship is normally navigated, operating either: (1) On VHF using DSC; or (2) Through the polar... voyages within coverage of MF coast stations equipped with DSC; or (4) On HF using DSC; or (5) Through the...

  1. Continuation of research into language concepts for the mission support environment: Source code

    NASA Technical Reports Server (NTRS)

    Barton, Timothy J.; Ratner, Jeremiah M.

    1991-01-01

    Research into language concepts for the Mission Control Center is presented, together with its source code. The file contains the routines that allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory, and it places as much of the code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.

  2. Potential for differentiation of pseudoprogression from true tumor progression with dynamic susceptibility-weighted contrast-enhanced magnetic resonance imaging using ferumoxytol versus gadoteridol: A pilot study

    PubMed Central

    Gahramanov, Seymur; Raslan, Ahmed; Muldoon, Leslie L.; Hamilton, Bronwyn E.; Rooney, William D.; Varallyay, Csanad G.; Njus, Jeffrey M.; Haluska, Marianne; Neuwelt, Edward A.

    2010-01-01

    Purpose We evaluated dynamic susceptibility-weighted contrast-enhanced magnetic resonance imaging (DSC-MRI) using gadoteridol in comparison to the iron oxide nanoparticle blood pool agent ferumoxytol in patients with glioblastoma multiforme (GBM) who received standard radiochemotherapy (RCT). Methods and Materials Fourteen patients with GBM received standard RCT and underwent 19 MRI sessions that included DSC-MRI acquisitions with gadoteridol on day 1 and ferumoxytol on day 2. Relative cerebral blood volume (rCBV) values were calculated from DSC data obtained from each contrast agent. T1-weighted acquisition post-gadoteridol administration was used to identify enhancing regions. Results In 7 MRI sessions of clinically presumptive active tumor, gadoteridol-DSC showed low rCBV in 3 and high rCBV in 4, while ferumoxytol-DSC showed high rCBV in all 7 sessions (p=0.002). After RCT, 7 MRI sessions showed increased gadoteridol contrast enhancement on T1-weighted scans coupled with low rCBV without significant differences between contrast agents (p=0.9). Based on post-gadoteridol T1-weighted scans, DSC-MRI, and clinical presentation, four patterns of response to RCT were observed: 1) regression, 2) pseudoprogression, 3) true progression, and 4) mixed response. Conclusion We conclude that DSC-MRI with a blood-pool agent such as ferumoxytol may provide a better monitor of tumor rCBV than DSC-MRI with gadoteridol. Lesions demonstrating increased enhancement on T1-weighted MRI coupled with low ferumoxytol rCBV are likely exhibiting pseudoprogression, while high rCBV with ferumoxytol is a better marker than gadoteridol for determining active tumor. These interesting pilot observations suggest that ferumoxytol may differentiate tumor progression from pseudoprogression, and warrant further investigation. PMID:20395065
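    A sketch of the standard DSC-MRI computation behind the rCBV values above, with a synthetic signal curve and an assumed echo time (the study's actual pipeline is not described here): the bolus-induced signal drop is converted to the relaxation-rate change ΔR2*(t) = −ln(S(t)/S0)/TE, whose time integral is proportional to CBV; dividing by a reference-tissue value gives rCBV.

        import numpy as np

        TE = 0.030                    # echo time in seconds (assumed)
        t = np.linspace(0, 60, 121)   # time in seconds
        S0 = 1000.0                   # pre-bolus baseline signal
        # Synthetic bolus passage: a Gaussian dip in signal intensity.
        S = S0 * np.exp(-TE * 8.0 * np.exp(-((t - 20.0) / 5.0) ** 2))

        delta_R2s = -np.log(S / S0) / TE  # recover dR2*(t) from the signal
        # CBV is proportional to the area under dR2*(t) (trapezoidal rule).
        cbv = float(np.sum(0.5 * (delta_R2s[1:] + delta_R2s[:-1]) * np.diff(t)))
        cbv_ref = 0.6 * cbv               # stand-in reference-tissue value
        print("rCBV =", cbv / cbv_ref)    # = 1/0.6, about 1.67 here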

  3. Dye-sensitized solar cells consisting of dye-bilayer structure stained with two dyes for harvesting light of wide range of wavelength

    NASA Astrophysics Data System (ADS)

    Inakazu, Fumi; Noma, Yusuke; Ogomi, Yuhei; Hayase, Shuzi

    2008-09-01

    Dye-sensitized solar cells (DSCs) containing a dye-bilayer structure of black dye and NK3705 (3-carboxymethyl-5-[3-(4-sulfobutyl)-2(3H)-benzothiazolylidene]-2-thioxo-4-thiazolidinone, sodium salt) in one TiO2 layer (2-TiO-BD-NK) are reported. The 2-TiO-BD-NK structure was fabricated by staining one TiO2 layer with these two dyes, step by step, under pressurized CO2 conditions. The dye-bilayer structure was observed using a confocal laser scanning microscope. The short-circuit current (Jsc) and the incident photon-to-current efficiency of the cell (DSC-2-TiO-BD-NK) were almost the sums of those of the DSC stained with black dye only (DSC-1-TiO-BD) and the DSC stained with NK3705 only (DSC-1-TiO-NK).

  4. Measuring diagnoses: ICD code accuracy.

    PubMed

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-10-01

    To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.

  5. An elegant access to formation and vaporization enthalpies of ionic liquids by indirect DSC experiment and "in silico" calculations.

    PubMed

    Verevkin, Sergey P; Zaitsau, Dzmitry H; Emel'yanenko, Vladimir N; Schick, Christoph; Jayaraman, Saivenkataraman; Maginn, Edward J

    2012-07-14

    We used DSC to determine the reaction enthalpy of the synthesis of the ionic liquid [C4mim][Cl]. The combination of DSC and quantum chemical calculations presents a new, indirect way to study the thermodynamics of ionic liquids. The new procedure was validated against two direct experimental measurements and MD simulations.

  6. Schroedinger’s Code: A Preliminary Study on Research Source Code Availability and Link Persistence in Astrophysics

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley

    2018-05-01

    We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal’s 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best, 90% of them were available over our testing period.
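    A minimal link-liveness test in the spirit of the study's URL check; the URLs below are placeholders and requests is a third-party package, so this is a sketch rather than the authors' tooling.

        import requests

        urls = ["https://ascl.net", "http://example.com/defunct-code-page"]

        def is_accessible(url, timeout=10):
            try:
                r = requests.head(url, allow_redirects=True, timeout=timeout)
                if r.status_code >= 400:  # some servers reject HEAD; retry GET
                    r = requests.get(url, stream=True, timeout=timeout)
                return r.status_code < 400
            except requests.RequestException:
                return False

        live = sum(is_accessible(u) for u in urls)
        print(f"{live}/{len(urls)} URLs accessible")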

  7. Automated Concurrent Blackboard System Generation in C++

    NASA Technical Reports Server (NTRS)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX™ workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.
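    For readers unfamiliar with the pattern, a minimal sequential sketch of a blackboard loop follows. The generated systems described above run knowledge sources concurrently under PVM; this toy shows only the control flow, and every name in it is invented.

        class Blackboard(dict):
            # Shared data store that knowledge sources read and extend.
            pass

        def ks_tokenize(bb):
            if "text" in bb and "tokens" not in bb:
                bb["tokens"] = bb["text"].split()

        def ks_count(bb):
            if "tokens" in bb and "count" not in bb:
                bb["count"] = len(bb["tokens"])

        def run(bb, knowledge_sources):
            # Fire sources until a full pass contributes nothing new.
            changed = True
            while changed:
                before = dict(bb)
                for ks in knowledge_sources:
                    ks(bb)
                changed = bb != before
            return bb

        print(run(Blackboard(text="sharing source code pays off"),
                  [ks_count, ks_tokenize]))  # order does not matter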

  8. You've Written a Cool Astronomy Code! Now What Do You Do with It?

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Accomazzi, A.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P. J.; Wallin, J. F.

    2014-01-01

    Now that you've written a useful astronomy code for your soon-to-be-published research, you have to figure out what you want to do with it. Our suggestion? Share it! This presentation highlights the means and benefits of sharing your code. Make your code citable -- submit it to the Astrophysics Source Code Library and have it indexed by ADS! The Astrophysics Source Code Library (ASCL) is a free online registry of source codes of interest to astronomers and astrophysicists. With over 700 codes, it is continuing its rapid growth, with an average of 17 new codes a month. The editors seek out codes for inclusion; indexing by ADS improves the discoverability of codes and provides a way to cite codes as separate entries, especially codes without papers that describe them.

  9. Homoacetogenesis in Deep-Sea Chloroflexi, as Inferred by Single-Cell Genomics, Provides a Link to Reductive Dehalogenation in Terrestrial Dehalococcoidetes

    PubMed Central

    Sewell, Holly L.; Kaster, Anne-Kristin

    2017-01-01

    The deep marine subsurface is one of the largest unexplored biospheres on Earth and is widely inhabited by members of the phylum Chloroflexi. In this report, we investigated genomes of single cells obtained from deep-sea sediments of the Peruvian Margin, which are enriched in such Chloroflexi. 16S rRNA gene sequence analysis placed two of these single-cell-derived genomes (DscP3 and Dsc4) in a clade of subphylum I Chloroflexi which were previously recovered from deep-sea sediment in the Okinawa Trough and a third (DscP2-2) as a member of the previously reported DscP2 population from Peruvian Margin site 1230. The presence of genes encoding enzymes of a complete Wood-Ljungdahl pathway, glycolysis/gluconeogenesis, a Rhodobacter nitrogen fixation (Rnf) complex, glycosyltransferases, and formate dehydrogenases in the single-cell genomes of DscP3 and Dsc4 and the presence of an NADH-dependent reduced ferredoxin:NADP oxidoreductase (Nfn) and Rnf in the genome of DscP2-2 imply a homoacetogenic lifestyle of these abundant marine Chloroflexi. We also report here the first complete pathway for anaerobic benzoate oxidation to acetyl coenzyme A (CoA) in the phylum Chloroflexi (DscP3 and Dsc4), including a class I benzoyl-CoA reductase. Of remarkable evolutionary significance, we discovered a gene encoding a formate dehydrogenase (FdnI) with reciprocal closest identity to the formate dehydrogenase-like protein (complex iron-sulfur molybdoenzyme [CISM], DET0187) of terrestrial Dehalococcoides/Dehalogenimonas spp. This formate dehydrogenase-like protein has been shown to lack formate dehydrogenase activity in Dehalococcoides/Dehalogenimonas spp. and is instead hypothesized to couple HupL hydrogenase to a reductive dehalogenase in the catabolic reductive dehalogenation pathway. This finding of a close functional homologue provides an important missing link for understanding the origin and the metabolic core of terrestrial Dehalococcoides/Dehalogenimonas spp. and of reductive dehalogenation, as well as the biology of abundant deep-sea Chloroflexi. PMID:29259088

  10. "Sticky electrons" transport and interfacial transfer of electrons in the dye-sensitized solar cell.

    PubMed

    Peter, Laurence

    2009-11-17

    Dye-sensitized solar cells (DSCs, also known as Grätzel cells) mimic the photosynthetic process by using a sensitizer dye to harvest light energy to generate electrical power. Several functional features of these photochemical devices are unusual, and DSC research offers a rewarding arena in which to test new ideas, new materials, and new methodologies. Indeed, one of the most attractive chemical features of the DSC is that the basic concept can be used to construct a range of devices, replacing individual components with alternative materials. Despite two decades of increasing research activity, however, many aspects of the behavior of electrons in the DSC remain puzzling. In this Account, we highlight current understanding of the processes involved in the functioning of the DSC, with particular emphasis on what happens to the electrons in the mesoporous film following the injection step. The collection of photoinjected electrons appears to involve a random walk process in which electrons move through the network of interconnected titanium dioxide nanoparticles while undergoing frequent trapping and detrapping. During their passage to the cell contact, electrons may be lost by transfer to tri-iodide species in the redox electrolyte that permeates the mesoporous film. Competition between electron collection and back electron transfer determines the performance of a DSC: ideally, all injected electrons should be collected without loss. This Account then goes on to survey recent experimental and theoretical progress in the field, placing particular emphasis on issues that need to be resolved before we can gain a clear picture of how the DSC works. Several important questions about the behavior of "sticky" electrons, those that undergo multiple trapping and detrapping, in the DSC remain unanswered. The most fundamental of these concerns is the nature of the electron traps that appear to dominate the time-dependent photocurrent and photovoltage response of DSCs. The origin of the nonideality factor in the relationship between the intensity and the DSC photovoltage is also unclear, as is the discrepancy in electron diffusion length values determined by steady-state and non-steady-state methods. With these unanswered questions, DSC research is likely to remain an active and fruitful area for some years to come.
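
    The trapping/detrapping picture lends itself to a simple numerical illustration. Below is a toy Monte Carlo sketch of our own (not from the Account): an electron hops on a 1-D chain of sites toward the contact, waiting at each site for an exponentially distributed trap-release time; the attempt frequency, trap depth, and site count are assumed purely for illustration.

    ```python
    import math
    import random

    def collect_time(n_sites=100, nu0=1e12, kT=0.025, trap_depth=0.4, seed=1):
        """Time (s) for one electron to random-walk from site 0 to the contact
        at site n_sites, pausing in an exponential trap at every site."""
        rng = random.Random(seed)
        release_rate = nu0 * math.exp(-trap_depth / kT)  # detrapping rate, 1/s
        pos, t = 0, 0.0
        while pos < n_sites:
            t += rng.expovariate(release_rate)   # time spent trapped
            pos += rng.choice((-1, 1))           # unbiased hop
            pos = max(pos, 0)                    # reflecting injection boundary
        return t

    times = [collect_time(seed=s) for s in range(20)]
    print(f"mean transit time ~ {sum(times) / len(times):.3g} s")
    ```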

  11. Thermodynamics of micellization from heat-capacity measurements.

    PubMed

    Šarac, Bojan; Bešter-Rogač, Marija; Lah, Jurij

    2014-06-23

    Differential scanning calorimetry (DSC), the most important technique for studying the thermodynamics of structural transitions of biological macromolecules, is seldom used in quantitative thermodynamic studies of surfactant micellization/demicellization. The reason for this could be ascribed to an insufficient understanding of the temperature dependence of the heat capacity of surfactant solutions (DSC data) in terms of thermodynamics, which leads to problems with the design of experiments and interpretation of the output signals. We address these issues by careful design of DSC experiments performed with solutions of ionic and nonionic surfactants at various surfactant concentrations, and individual and global mass-action model analysis of the obtained DSC data. Our approach leads to reliable thermodynamic parameters of micellization for all types of surfactants, comparable with those obtained by using isothermal titration calorimetry (ITC). In summary, we demonstrate that DSC can be successfully used as an independent method to obtain temperature-dependent thermodynamic parameters for micellization. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Degradation of components in drug formulations: a comparison between HPLC and DSC methods.

    PubMed

    Ceschel, G C; Badiello, R; Ronchi, C; Maffei, P

    2003-08-08

    Information about the stability of drug components and drug formulations is needed to predict the shelf-life of the final products. Studies on the interaction between a drug and its excipients may be carried out by means of accelerated stability tests, followed by analytical determination of the active principle (HPLC and other methods), and by means of differential scanning calorimetry (DSC). This research focused on the physical-chemical characterisation of acetylsalicylic acid (ASA) using the DSC method in order to evaluate its compatibility with some of the most commonly used excipients. The DSC method revealed the incompatibility of magnesium stearate with ASA; the HPLC data confirm the reduction of ASA concentration in the presence of magnesium stearate. With the other excipients, the characteristic endotherms of the drug were always present, and no or little degradation was observed in the accelerated stability tests. Therefore, the results of the DSC method are comparable and in good agreement with those obtained by other methods.

  13. DSC of human hair: a tool for claim support or incorrect data analysis?

    PubMed

    Popescu, C; Gummer, C

    2016-10-01

    Differential scanning calorimetry (DSC) data are increasingly used to substantiate product claims of hair repair. Decreasing peak temperatures may indicate structural changes and chemical damage. An increasing DSC wet-peak temperature is, therefore, often considered proof of hair repair. A detailed understanding of the technique and hair structure indicates that this may not be a sound approach. Surveying the rich literature on the use of dynamic thermal analysis (DTA) and differential scanning calorimetry (DSC) for the analysis of human hair and the effect of cosmetic treatments, we underline some of the problems of hair structure and data interpretation. To overcome some of the difficulties of data interpretation, we advise that DSC-acquired data be supported by other techniques when used for claim substantiation. In this way, one can provide meaningful interpretation of the hair science and robust data for product claims support. © 2016 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  14. The random energy model in a magnetic field and joint source channel coding

    NASA Astrophysics Data System (ADS)

    Merhav, Neri

    2008-09-01

    We demonstrate that there is an intimate relationship between the magnetic properties of Derrida’s random energy model (REM) of spin glasses and the problem of joint source-channel coding in information theory. In particular, typical patterns of erroneously decoded messages in the coding problem have “magnetization” properties that are analogous to those of the REM in certain phases, where the non-uniformity of the source distribution in the coding problem plays the role of an external magnetic field applied to the REM. We also relate the ensemble performance (random coding exponents) of joint source-channel codes to the free energy of the REM in its different phases.
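
    For orientation, the freezing transition the analogy rests on has a standard textbook form (quoted here for context, not taken from the paper): for the zero-field REM with $2^N$ configurations and i.i.d. Gaussian energies of variance $NJ^2/2$, the free energy per spin is

    ```latex
    f(\beta) =
    \begin{cases}
    -\dfrac{\ln 2}{\beta} - \dfrac{\beta J^{2}}{4}, & \beta \le \beta_c,\\[1ex]
    -J\sqrt{\ln 2}, & \beta > \beta_c,
    \end{cases}
    \qquad
    \beta_c = \frac{2\sqrt{\ln 2}}{J}.
    ```

    In the coding analogy, the nonuniform source distribution enters as an external field term, shifting this phase structure just as a magnetic field does for the REM.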

  15. User Manual and Source Code for a LAMMPS Implementation of Constant Energy Dissipative Particle Dynamics (DPD-E)

    DTIC Science & Technology

    2014-06-01

    User Manual and Source Code for a LAMMPS Implementation of Constant Energy Dissipative Particle Dynamics (DPD-E) by James P. Larentzos...Laboratory Aberdeen Proving Ground, MD 21005-5069 ARL-SR-290 June 2014 User Manual and Source Code for a LAMMPS Implementation of Constant...3. DATES COVERED (From - To) September 2013–February 2014 4. TITLE AND SUBTITLE User Manual and Source Code for a LAMMPS Implementation of

  16. Effect of milling on DSC thermogram of excipient adipic acid.

    PubMed

    Ng, Wai Kiong; Kwek, Jin Wang; Yuen, Aaron; Tan, Chin Lee; Tan, Reginald

    2010-03-01

    The purpose of this research was to investigate why and how mechanical milling results in an unexpected shift in differential scanning calorimetry (DSC) measured fusion enthalpy (Delta(fus)H) and melting point (T(m)) of adipic acid, a pharmaceutical excipient. Hyper differential scanning calorimetry (hyper-DSC) was used to characterize adipic acid before and after ball-milling. An experimental study was conducted to evaluate previous postulations such as electrostatic charging using the Faraday cage method, crystallinity loss using powder X-ray diffraction (PXRD), thermal annealing using DSC, impurities removal using thermal gravimetric analysis (TGA) and Karl Fischer titration. DSC thermograms showed that after milling, the values of Delta(fus)H and T(m) were increased by approximately 9% and 5 K, respectively. Previous suggestions of increased electrostatic attraction, change in particle size distribution, and thermal annealing during measurements did not explain the differences. Instead, theoretical analysis and experimental findings suggested that the residual solvent (water) plays a key role. Water entrapped as inclusions inside adipic acid during solution crystallization was partially evaporated by localized heating at the cleaved surfaces during milling. The correlation between the removal of water and melting properties measured was shown via drying and crystallization experiments. These findings show that milling can reduce residual solvent content and causes a shift in DSC results.

  17. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    PubMed

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for process control), which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned, and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
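
    To give a flavor of what such a simulation-designed control strategy looks like before it is exported to the plant PC, here is a minimal dissolved-oxygen PI control loop in Python. The setpoint, gains, and first-order plant model are invented for illustration; they are not the tuning reported by the authors, who used the actual WWTP model and an OPC link.

    ```python
    def pi_aeration(setpoint=2.0, kp=20.0, ki=0.05, dt=60.0, steps=60):
        """Toy PI loop driving an aeration term to hold dissolved oxygen (DO)."""
        do, integral = 0.5, 0.0            # DO in mg/L; integral of the error
        for _ in range(steps):
            error = setpoint - do
            integral += error * dt
            u = max(0.0, kp * error + ki * integral)   # aeration intensity, 1/d
            # First-order toy plant: transfer toward saturation minus uptake.
            do = max(0.0, do + dt / 86400.0 * (u * (8.0 - do) - 40.0))
        return do

    print(f"DO after 1 h of simulated control: {pi_aeration():.2f} mg/L")
    ```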

  18. Voxel-based correlation between coregistered single-photon emission computed tomography and dynamic susceptibility contrast magnetic resonance imaging in subjects with suspected Alzheimer disease.

    PubMed

    Cavallin, L; Axelsson, R; Wahlund, L O; Oksengard, A R; Svensson, L; Juhlin, P; Wiberg, M Kristoffersen; Frank, A

    2008-12-01

    Current diagnosis of Alzheimer disease is made by clinical, neuropsychologic, and neuroimaging assessments. Neuroimaging techniques such as magnetic resonance imaging (MRI) and single-photon emission computed tomography (SPECT) could be valuable in the differential diagnosis of Alzheimer disease, as well as in assessing prognosis. To compare SPECT and MRI in a cohort of patients examined for suspected dementia, including patients with no objective cognitive impairment (control group), mild cognitive impairment (MCI), and Alzheimer disease (AD). 24 patients (8 with AD, 10 with MCI, and 6 controls) were investigated with SPECT using (99m)Tc-hexamethylpropyleneamine oxime (HMPAO, Ceretec; GE Healthcare Ltd., Little Chalfont, UK) and dynamic susceptibility contrast magnetic resonance imaging (DSC-MRI) with a contrast-enhancing gadobutrol formula (Gadovist; Bayer Schering Pharma, Berlin, Germany). Voxel-based correlation between coregistered SPECT and DSC-MR images was calculated. Region-of-interest (ROI) analyses were then performed in 24 different brain areas using brain registration and analysis of SPECT studies (BRASS; Nuclear Diagnostics AB, Stockholm, Sweden) on both SPECT and DSC-MRI. Voxel-based correlation between coregistered SPECT and DSC-MR showed a high correlation, with a mean correlation coefficient of 0.94. ROI analyses of 24 regions showed significant differences between the control group and AD patients in 10 regions using SPECT and five regions using DSC-MR. SPECT remains superior to DSC-MRI in differentiating normal from pathological perfusion, and DSC-MRI could not replace SPECT in the diagnosis of patients with Alzheimer disease.
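
    Voxel-based correlation of two coregistered volumes reduces to a Pearson correlation over the voxels in a shared mask. A minimal numpy sketch under that assumption (the array shapes, mask rule, and synthetic data are hypothetical stand-ins for the coregistered SPECT and DSC-MR volumes):

    ```python
    import numpy as np

    def voxel_correlation(vol_a, vol_b, mask):
        """Pearson r between two coregistered 3-D volumes inside a brain mask."""
        x = vol_a[mask].astype(float)
        y = vol_b[mask].astype(float)
        return np.corrcoef(x, y)[0, 1]

    # Synthetic volumes on the same grid, correlated by construction:
    rng = np.random.default_rng(0)
    spect = rng.random((64, 64, 32))
    dsc_mr = 0.9 * spect + 0.1 * rng.random((64, 64, 32))
    mask = spect > 0.2                      # stand-in brain mask
    print(f"r = {voxel_correlation(spect, dsc_mr, mask):.2f}")
    ```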

  19. Phosphorus transfer in runoff following application of fertilizer, manure, and sewage sludge.

    PubMed

    Withers, P J; Clay, S D; Breeze, V G

    2001-01-01

    Phosphorus (P) transfer in surface runoff from field plots receiving either no P, triple superphosphate (TSP), liquid cattle slurry (LCS), liquid anaerobically digested sludge (LDS), or dewatered sludge cake (DSC) was compared over a 2-yr period. Dissolved inorganic P concentrations in runoff increased from 0.1 to 0.2 mg L(-1) on control and sludge-treated plots to 3.8 and 6.5 mg L(-1) following application of LCS and TSP, respectively, to a cereal crop in spring. When incorporated into the soil in autumn, runoff dissolved P concentrations were typically < 0.5 mg L(-1) across all plots, and particulate P remained the dominant P form. When surface-applied in autumn to a consolidated seedbed, direct loss of LCS and LDS increased both runoff volume and P transfers, but release of dissolved P occurred only from LCS. The largest P concentrations (>70 mg L(-1)) were recorded following TSP application without any increase in runoff volume, while application of bulky DSC significantly reduced total P transfers by 70% compared with the control due to a reduced runoff volume. Treatment effects in each monitoring period were most pronounced in the first runoff event. Differences in the release of P from the different P sources were related to the amounts of P extracted by either water or sodium bicarbonate, in the order TSP > LCS > LDS > DSC. The results suggest there is a lower risk of P transfer in land runoff following application of sludge compared with other agricultural P amendments at similar P rates.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madhukumar, R.; Asha, S.; Rao, B. Lakshmeesha

    The gamma radiation-induced changes in the structural and thermal properties of Bombyx mori silk fibroin films were investigated and correlated with the applied radiation doses. Irradiation of the samples was carried out in dry air at room temperature using a Co-60 source, with radiation doses in the range of 0-300 kGy. Structural and thermal properties of the irradiated silk films were studied using X-ray diffraction (XRD), differential scanning calorimetry (DSC), and thermogravimetric analysis (TGA) and compared with an unirradiated sample. Interesting results are discussed in this report.

  1. Astronomy education and the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, Robert J.

    2016-01-01

    The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.

  2. Characterization of a digital camera as an absolute tristimulus colorimeter

    NASA Astrophysics Data System (ADS)

    Martinez-Verdu, Francisco; Pujol, Jaume; Vilaseca, Meritxell; Capilla, Pascual

    2003-01-01

    An algorithm is proposed for the spectral and colorimetric characterization of digital still cameras (DSC) which allows them to be used as tele-colorimeters with CIE-XYZ color output, in cd/m2. The spectral characterization consists of the calculation of the color-matching functions from the previously measured spectral sensitivities. The colorimetric characterization consists of transforming the RGB digital data into absolute tristimulus values CIE-XYZ (in cd/m2) under variable and unknown spectroradiometric conditions. Thus, in the first stage, a gray balance is applied to the RGB digital data to convert them into relative RGB colorimetric values. In the second stage, an algorithm for luminance adaptation versus lens aperture is inserted into the basic colorimetric profile. By capturing the ColorChecker chart under different light sources, the DSC color-analysis accuracy indexes, both in a raw state and with corrections from a linear color-correction model, were evaluated using the Pointer'86 color reproduction index with the unrelated Hunt'91 color appearance model. The results indicate that our digital image capture device, in raw performance, lightens and desaturates the colors.
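
    The second-stage colorimetric transform is, at its core, a linear map fitted from gray-balanced RGB to CIE-XYZ. A least-squares sketch of just that step (training data and names are illustrative; the published algorithm additionally handles the luminance-adaptation and aperture corrections):

    ```python
    import numpy as np

    def fit_rgb_to_xyz(rgb, xyz):
        """Least-squares 3x3 matrix M such that xyz ~ rgb @ M.T.
        rgb: (n, 3) gray-balanced responses; xyz: (n, 3) measured tristimulus."""
        M, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
        return M.T

    # Illustrative training pairs (e.g., ColorChecker patches under one source):
    rgb = np.array([[0.2, 0.1, 0.05], [0.5, 0.5, 0.5],
                    [0.1, 0.3, 0.6], [0.8, 0.7, 0.2]])
    true_M = np.array([[0.41, 0.36, 0.18],
                       [0.21, 0.72, 0.07],
                       [0.02, 0.12, 0.95]])
    xyz = rgb @ true_M.T                  # noiseless synthetic measurements
    M = fit_rgb_to_xyz(rgb, xyz)
    print(np.allclose(M, true_M))         # True: exact for noiseless toy data
    ```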

  3. The road to JCAHO disease-specific care certification: a step-by-step process log.

    PubMed

    Morrison, Kathy

    2005-01-01

    In 2002, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) implemented Disease-Specific Care (DSC) certification. This is a voluntary program in which organizations have their disease management program evaluated by this regulatory agency. Some of the DSC categories are stroke, heart failure, acute MI, diabetes, and pneumonia. The criteria for any disease management program certification are: compliance with consensus-based national standards, effective use of established clinical practice guidelines to manage and optimize care, and an organized approach to performance measurement and improvement activities. Successful accomplishment of DSC certification defines organizations as Centers of Excellence in management of that particular disease. This article will review general guidelines for DSC certification with an emphasis on Primary Stroke Center certification.

  4. Data processing with microcode designed with source coding

    DOEpatents

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  5. Combinatorial Optimization Algorithms for Dynamic Multiple Fault Diagnosis in Automotive and Aerospace Applications

    NASA Astrophysics Data System (ADS)

    Kodali, Anuradha

    In this thesis, we develop dynamic multiple fault diagnosis (DMFD) algorithms to diagnose faults that are sporadic and coupled. Firstly, we formulate a coupled factorial hidden Markov model-based (CFHMM) framework to diagnose dependent faults occurring over time (dynamic case). Here, we implement a mixed memory Markov coupling model to determine the most likely sequence of (dependent) fault states, the one that best explains the observed test outcomes over time. An iterative Gauss-Seidel coordinate ascent optimization method is proposed for solving the problem. A soft Viterbi algorithm is also implemented within the framework for decoding dependent fault states over time. We demonstrate the algorithm on simulated and real-world systems with coupled faults; the results show that this approach improves the correct isolation rate as compared to the formulation where independent fault states are assumed. Secondly, we formulate a generalization of set-covering, termed dynamic set-covering (DSC), which involves a series of coupled set-covering problems over time. The objective of the DSC problem is to infer the most probable time sequence of a parsimonious set of failure sources that explains the observed test outcomes over time. The DSC problem is NP-hard and intractable due to the fault-test dependency matrix that couples the failed tests and faults via the constraint matrix, and the temporal dependence of failure sources over time. Here, the DSC problem is motivated from the viewpoint of a dynamic multiple fault diagnosis problem, but it has wide applications in operations research, e.g., the facility location problem. Thus, we also formulated the DSC problem in the context of a dynamically evolving facility location problem. Here, a facility can be opened, closed, or temporarily unavailable at any time for a given requirement of demand points. These activities are associated with costs or penalties, viz., phase-in or phase-out for the opening or closing of a facility, respectively. The set-covering matrix encapsulates the relationship among the rows (tests or demand points) and columns (faults or locations) of the system at each time. By relaxing the coupling constraints using Lagrange multipliers, the DSC problem can be decoupled into independent subproblems, one for each column. Each subproblem is solved using the Viterbi decoding algorithm, and a primal feasible solution is constructed by modifying the Viterbi solutions via a heuristic. The proposed Viterbi-Lagrangian relaxation algorithm (VLRA) provides a measure of suboptimality via an approximate duality gap. As a major practical extension of the above problem, we also consider the problem of diagnosing faults with delayed test outcomes, termed delay-dynamic set-covering (DDSC), and experiment with real-world problems that exhibit masking faults. Also, we present simulation results on OR-library datasets (set-covering formulations are predominantly validated on these matrices in the literature), posed as facility location problems. Finally, we implement these algorithms to solve problems in aerospace and automotive applications. Firstly, we address the diagnostic ambiguity problem in aerospace and automotive applications by developing a dynamic fusion framework that includes dynamic multiple fault diagnosis algorithms. This improves the correct fault isolation rate, while minimizing the false alarm rates, by considering multiple faults instead of the traditional data-driven techniques based on a single-fault (class), single-epoch (static) assumption. The dynamic fusion problem is formulated as a maximum a posteriori decision problem of inferring the fault sequence based on uncertain outcomes of multiple binary classifiers over time. The fusion process involves three steps: the first step transforms the multi-class problem into dichotomies using error correcting output codes (ECOC), thereby solving the concomitant binary classification problems; the second step fuses the outcomes of multiple binary classifiers over time using a sliding window or block dynamic fusion method that exploits temporal data correlations over time. We solve this NP-hard optimization problem via a Lagrangian relaxation (variational) technique. The third step optimizes the classifier parameters, viz., probabilities of detection and false alarm, using a genetic algorithm. The proposed algorithm is demonstrated by computing the diagnostic performance metrics on a twin-spool commercial jet engine, an automotive engine, and UCI datasets (problems with high classification error are specifically chosen for experimentation). We show that the primal-dual optimization framework performed consistently better than any traditional fusion technique, even when it is forced to give a single fault decision across a range of classification problems. Secondly, we implement the inference algorithms to diagnose faults in vehicle systems that are controlled by a network of electronic control units (ECUs). The faults, originating from various interactions and especially between hardware and software, are particularly challenging to address. Our basic strategy is to divide the fault universe of such cyber-physical systems in a hierarchical manner, and monitor the critical variables/signals that have impact at different levels of interactions. The proposed diagnostic strategy is validated on an electrical power generation and storage system (EPGS) controlled by two ECUs in a CANoe/MATLAB co-simulation environment. Eleven faults are injected, with the failures originating in actuator hardware, sensors, controller hardware, and software components. A diagnostic matrix is established to represent the relationship between the faults and the test outcomes (also known as fault signatures) via simulations. The results show that the proposed diagnostic strategy is effective in addressing interaction-caused faults.
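
    To make the Viterbi subproblem concrete: once the coupling constraints are relaxed, each fault becomes an independent two-state (healthy/faulty) chain whose per-epoch costs absorb the test evidence and the Lagrange multipliers. A generic binary Viterbi sketch under those assumptions (the costs and switching penalty are placeholders, not the thesis's actual values):

    ```python
    def viterbi_binary(costs, switch_penalty=1.0):
        """Min-cost 0/1 state sequence; costs[t][s] = cost of state s at epoch t."""
        best = {0: costs[0][0], 1: costs[0][1]}
        back = []
        for t in range(1, len(costs)):
            step, ptr = {}, {}
            for s in (0, 1):
                stay = best[s]
                switch = best[1 - s] + switch_penalty
                ptr[s] = s if stay <= switch else 1 - s
                step[s] = min(stay, switch) + costs[t][s]
            back.append(ptr)
            best = step
        state = min(best, key=best.get)        # cheaper terminal state
        path = [state]
        for ptr in reversed(back):             # backtrack
            state = ptr[state]
            path.append(state)
        return path[::-1]

    # Per-epoch costs favoring healthy, healthy, faulty, faulty:
    print(viterbi_binary([[0, 2], [0, 2], [2, 0], [1, 0]]))  # -> [0, 0, 1, 1]
    ```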

  6. Adaptive distributed source coding.

    PubMed

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
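
    A toy version of syndrome-based Slepian-Wolf coding clarifies the encoder/decoder split described above. This is our illustration only: a brute-force search stands in for the sum-product decoder, the parity-check matrix is tiny, and the doping bits and hidden-variable estimation are omitted.

    ```python
    import itertools
    import numpy as np

    # Toy parity-check matrix: 3 syndrome bits for a length-5 binary block.
    H = np.array([[1, 0, 0, 1, 0],
                  [0, 1, 0, 1, 1],
                  [0, 0, 1, 0, 1]])

    def encode(x):
        """Encoder transmits only the syndrome s = H x (mod 2): 3 bits, not 5."""
        return H @ x % 2

    def decode(s, side_info):
        """Among all sources consistent with s, return the closest to the
        side information (exhaustive here instead of sum-product decoding)."""
        cands = (np.array(c) for c in itertools.product((0, 1), repeat=5))
        feasible = (c for c in cands if np.array_equal(H @ c % 2, s))
        return min(feasible, key=lambda c: int(np.sum(c ^ side_info)))

    x = np.array([1, 0, 1, 1, 0])   # source seen by the encoder
    y = np.array([1, 0, 1, 0, 0])   # decoder's side information (one bit flipped)
    print(decode(encode(x), y))     # -> [1 0 1 1 0], i.e., x is recovered
    ```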

  7. Phase II evaluation of clinical coding schemes: completeness, taxonomy, mapping, definitions, and clarity. CPRI Work Group on Codes and Structures.

    PubMed

    Campbell, J R; Carpenter, P; Sneiderman, C; Cohn, S; Chute, C G; Warren, J

    1997-01-01

    To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for "parent" and "child" codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mappings from the published schemes were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14; *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%; *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%; *p < .004) associated with a loss of clarity. No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record.

  8. The Particle Accelerator Simulation Code PyORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorlov, Timofey V; Holmes, Jeffrey A; Cousineau, Sarah M

    2015-01-01

    The particle accelerator simulation code PyORBIT is presented. The structure, implementation, history, parallel and simulation capabilities, and future development of the code are discussed. The PyORBIT code is a new implementation and extension of algorithms of the original ORBIT code that was developed for the Spallation Neutron Source accelerator at the Oak Ridge National Laboratory. The PyORBIT code has a two-level structure. The upper level uses the Python programming language to control the flow of intensive calculations performed by the lower-level code implemented in the C++ language. The parallel capabilities are based on MPI communications. PyORBIT is an open-source code accessible to the public through the Google Open Source Projects Hosting service.

  9. Impacts of DNAPL Source Treatment: Experimental and Modeling Assessment of the Benefits of Partial DNAPL Source Removal

    DTIC Science & Technology

    2009-09-01

    nuclear industry for conducting performance assessment calculations. The analytical FORTRAN code for the DNAPL source function, REMChlor, was...project. The first was to apply existing deterministic codes, such as T2VOC and UTCHEM, to the DNAPL source zone to simulate the remediation processes...but describe the spatial variability of source zones unlike one-dimensional flow and transport codes that assume homogeneity. The Lagrangian models

  10. Magnetic resonance imaging in dissociated strabismus complex demonstrates generalized hypertrophy of rectus extraocular muscles.

    PubMed

    Rajab, Ghada Z; Suh, Soh Youn; Demer, Joseph L

    2017-06-01

    Dissociated strabismus complex (DSC) is an enigmatic form of strabismus that includes dissociated vertical deviation (DVD) and dissociated horizontal deviation (DHD). We employed magnetic resonance imaging (MRI) to evaluate the extraocular muscles in DSC. We studied 5 patients with DSC (mean age, 25 years; range, 12-42 years) and 15 age-matched, orthotropic control subjects. All patients had DVD; 4 also had DHD. We employed high-resolution, surface-coil MRI with thin 2 mm slices and central target fixation. Volumes of the rectus and superior oblique muscles in the region from 12 mm posterior to 4 mm anterior to the globe-optic nerve junction were measured in quasi-coronal planes in central gaze. Patients with DSC had no structural abnormalities of the rectus muscles, rectus pulleys, or superior oblique muscle but exhibited a modest, statistically significant increase in the volume of all rectus muscles, ranging from 20% for the medial rectus to 9% for the lateral rectus (P < 0.05). DSC includes various combinations of sursumduction, excycloduction, and abduction not conforming to Hering's law. We have found modest generalized enlargement of all rectus muscles. DSC is associated with generalized rectus extraocular muscle hypertrophy in the absence of other orbital abnormalities. Copyright © 2017 American Association for Pediatric Ophthalmology and Strabismus. Published by Elsevier Inc. All rights reserved.

  11. A Unified Constitutive Model for Subglacial Till, Part II: Laboratory Tests, Disturbed State Modeling, and Validation for Two Subglacial Tills

    NASA Astrophysics Data System (ADS)

    Desai, C. S.; Sane, S. M.; Jenson, J. W.; Contractor, D. N.; Carlson, A. E.; Clark, P. U.

    2006-12-01

    This presentation, which is complementary to Part I (Jenson et al.), describes the application of the Disturbed State Concept (DSC) constitutive model to define the behavior of the deforming sediment (till) underlying glaciers and ice sheets. The DSC includes elastic, plastic, and creep strains, and microstructural changes leading to degradation, failure, and sometimes strengthening or healing. Here, we describe comprehensive laboratory experiments conducted on samples of two regionally significant tills deposited by the Laurentide Ice Sheet: the Tiskilwa Till and Sky Pilot Till. The tests are used to determine the parameters to calibrate the DSC model, which is validated with respect to the laboratory tests by comparing the predictions with test data used to find the parameters, and also comparing them with independent tests not used to find the parameters. Discussion of the results also includes comparison of the DSC model with the classical Mohr-Coulomb model, which has been commonly used for glacial tills. A numerical procedure based on finite element implementation of the DSC is used to simulate an idealized field problem, and its predictions are discussed. Based on these analyses, the unified DSC model is proposed to provide an improved model for subglacial tills compared to other models used commonly, and thus to provide the potential for improved predictions of ice sheet movements.
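
    For orientation, the standard form of Desai's DSC relation (quoted here as background, not from this abstract) blends the relative-intact and fully adjusted responses through a disturbance function:

    ```latex
    \sigma^{a} = (1 - D)\,\sigma^{i} + D\,\sigma^{c},
    \qquad
    D = D_{u}\left[\,1 - \exp\!\left(-A\,\xi_{D}^{\,Z}\right)\right],
    ```

    where σ^a is the observed response, σ^i and σ^c are the relative-intact and fully adjusted (critical-state) responses, ξ_D is the accumulated deviatoric plastic strain trajectory, and D_u, A, and Z are material parameters calibrated from laboratory tests such as those described here.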

  12. A study of mercuric iodide near melting using differential scanning calorimetry, Raman spectroscopy and X-ray diffraction

    NASA Astrophysics Data System (ADS)

    Burger, A.; Morgan, S.; Jiang, H.; Silberman, E.; Schieber, M.; Van Den Berg, L.; Keller, L.; Wagner, C. N. J.

    1989-11-01

    High-temperature studies of mercuric iodide (HgI2) involving differential scanning calorimetry (DSC), Raman spectroscopy, and X-ray powder diffraction have failed to confirm the existence of a red-colored tetragonal high-temperature phase called α'-HgI2 reported by S.N. Toubektsis et al. [J. Appl. Phys. 58 (1988) 2070] using DSC measurements. The multiple DSC peaks near melting reported by Toubektsis are found by the present authors only if the sample is heated in a stainless-steel container. Using a Pyrex container or inserting a platinum foil between the HgI2 and the stainless-steel container yields only one sharp, single DSC peak at the melting point. The nonexistence of the α' phase is confirmed by high-temperature X-ray diffraction and Raman spectroscopy performed in the vicinity of the melting point. These methods clearly indicate the existence of only the yellow orthorhombic β-HgI2 phase. The experimental high-temperature DSC, Raman, and X-ray diffraction data are presented and discussed.

  13. Dye-sensitized solar cells employing a SnO2-TiO2 core-shell structure made by atomic layer deposition.

    PubMed

    Karlsson, Martin; Jõgi, Indrek; Eriksson, Susanna K; Rensmo, Håkan; Boman, Mats; Boschloo, Gerrit; Hagfeldt, Anders

    2013-01-01

    This paper describes the synthesis and characterization of core-shell structures, based on SnO2 and TiO2, for use in dye-sensitized solar cells (DSC). Atomic layer deposition is employed to control and vary the thickness of the TiO2 shell. Increasing the TiO2 shell thickness to 2 nm improved the device performance of liquid electrolyte-based DSC from 0.7% to 3.5%. The increase in efficiency originates from a higher open-circuit potential and a higher short-circuit current, as well as from an improvement in the electron lifetime. SnO2-TiO2 core-shell DSC devices retain their photovoltage in darkness for longer than 500 seconds, demonstrating that the electrons are contained in the core material. Finally, core-shell structures were used for solid-state DSC applications using the hole-transporting material 2,2',7,7'-tetrakis(N,N-di-p-methoxyphenylamine)-9,9'-spirobifluorene. Similar improvements in device performance were obtained for solid-state DSC devices.

  14. Phase II Evaluation of Clinical Coding Schemes

    PubMed Central

    Campbell, James R.; Carpenter, Paul; Sneiderman, Charles; Cohn, Simon; Chute, Christopher G.; Warren, Judith

    1997-01-01

    Abstract Objective: To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). Methods: The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for “parent” and “child” codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mappings from the published schemes were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. Results: SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14; *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%; *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%; *p < .004) associated with a loss of clarity. Conclusion: No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record. PMID:9147343

  15. Doclet To Synthesize UML

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2005-01-01

    The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.

  16. shiftNMFk 1.1: Robust Nonnegative matrix factorization with kmeans clustering and signal shift, for allocation of unknown physical sources, toy version for open sourcing with publications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexandrov, Boian S.; Lliev, Filip L.; Stanev, Valentin G.

    This code is a toy (short) version of CODE-2016-83. From a general perspective, the code represents an unsupervised adaptive machine learning algorithm that allows efficient, high-performance de-mixing and feature extraction of a multitude of non-negative signals mixed and recorded by a network of uncorrelated sensor arrays. The code identifies the number of mixed original signals and their locations. Further, the code allows deciphering of signals that have been delayed relative to the mixing process at each sensor. The code is highly customizable and can be used efficiently for fast macro-analysis of data. The code is applicable to a plethora of distinct problems: chemical decomposition, pressure-transient decomposition, unknown source/signal allocation, and EM signal decomposition. An additional procedure for allocating the unknown sources is incorporated in the code.
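
    As a point of reference for the underlying technique, here is a generic nonnegative matrix factorization with multiplicative (Lee-Seung) updates. This is our illustration, not the released shiftNMFk code, which additionally handles signal delays and clusters bootstrap solutions with k-means.

    ```python
    import numpy as np

    def nmf(V, k, iters=500, eps=1e-9, seed=0):
        """Factor V (m x n, nonnegative) as W @ H with multiplicative updates."""
        rng = np.random.default_rng(seed)
        m, n = V.shape
        W = rng.random((m, k)) + eps
        H = rng.random((k, n)) + eps
        for _ in range(iters):
            H *= (W.T @ V) / (W.T @ W @ H + eps)
            W *= (V @ H.T) / (W @ H @ H.T + eps)
        return W, H

    # Mix 2 hidden nonnegative sources into 6 "sensor" channels, then un-mix.
    rng = np.random.default_rng(1)
    sources = rng.random((2, 200))
    mixing = rng.random((6, 2))
    V = mixing @ sources
    W, H = nmf(V, k=2)
    print("relative reconstruction error:",
          np.linalg.norm(V - W @ H) / np.linalg.norm(V))
    ```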

  17. Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey

    NASA Astrophysics Data System (ADS)

    Guillemot, Christine; Siohan, Pierre

    2005-12-01

    Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper surveys recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources, as well as the main approaches to source-controlled channel decoding. The survey concludes with performance illustrations on real image and video decoding systems.

  18. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.

  19. Multidimensional incremental parsing for universal source coding.

    PubMed

    Bae, Soo Hyun; Juang, Biing-Hwang

    2008-10-01

    A multidimensional incremental parsing algorithm (MDIP) for multidimensional discrete sources, as a generalization of the Lempel-Ziv coding algorithm, is investigated. It consists of three essential component schemes: maximum decimation matching, hierarchical structure of multidimensional source coding, and dictionary augmentation. As a counterpart of the longest match search in the Lempel-Ziv algorithm, two classes of maximum decimation matching are studied. Also, an underlying behavior of the dictionary augmentation scheme for estimating the source statistics is examined. For an m-dimensional source, m augmentative patches are appended into the dictionary at each coding epoch, thus requiring the transmission of a substantial amount of information to the decoder. The property of the hierarchical structure of the source coding algorithm resolves this issue by successively incorporating lower dimensional coding procedures in the scheme. In regard to universal lossy source coders, we propose two distortion functions, the local average distortion and the local minimax distortion with a set of threshold levels for each source symbol. For performance evaluation, we implemented three image compression algorithms based upon the MDIP; one is lossless and the others are lossy. The lossless image compression algorithm does not perform better than Lempel-Ziv-Welch coding, but experimentally shows efficiency in capturing the source structure. The two lossy image compression algorithms are implemented using the two distortion functions, respectively. The algorithm based on the local average distortion is efficient at minimizing the signal distortion, but the images produced by the one with the local minimax distortion have good perceptual fidelity compared with other compression algorithms.
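
    The dictionary-growth mechanism being generalized is easiest to see in one dimension. A minimal LZ78-style incremental parser (our illustration; the paper's algorithm extends the match and augmentation steps to m-dimensional patches):

    ```python
    def lz78_parse(s):
        """Parse s into (dict_index, new_symbol) phrases, growing the dictionary."""
        dictionary = {"": 0}
        phrases = []
        w = ""
        for ch in s:
            if w + ch in dictionary:
                w += ch                               # longest match continues
            else:
                phrases.append((dictionary[w], ch))   # emit phrase
                dictionary[w + ch] = len(dictionary)  # augment dictionary
                w = ""
        if w:  # flush a trailing partial match
            phrases.append((dictionary[w[:-1]], w[-1]))
        return phrases

    print(lz78_parse("abababab"))  # [(0,'a'), (0,'b'), (1,'b'), (3,'a'), (0,'b')]
    ```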

  20. An Efficient Variable Length Coding Scheme for an IID Source

    NASA Technical Reports Server (NTRS)

    Cheung, K. -M.

    1995-01-01

    A scheme is examined for using two alternating Huffman codes to encode a discrete independent and identically distributed source with a dominant symbol. This combined strategy, or alternating runlength Huffman (ARH) coding, was found to be more efficient than ordinary coding in certain circumstances.
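
    The idea is easy to prototype: run lengths of the dominant symbol are Huffman-coded with one codebook, and the literal symbols that break the runs with another, the two alternating in the output stream. A toy sketch under those assumptions (our construction for illustration, not the paper's exact scheme):

    ```python
    import heapq
    from collections import Counter

    def huffman(freqs):
        """Map each symbol to a bitstring, given a symbol -> frequency table."""
        heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
        heapq.heapify(heap)
        nxt = len(heap)
        while len(heap) > 1:
            f1, _, c1 = heapq.heappop(heap)
            f2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + b for s, b in c1.items()}
            merged.update({s: "1" + b for s, b in c2.items()})
            heapq.heappush(heap, (f1 + f2, nxt, merged))
            nxt += 1
        return heap[0][2]

    def arh_encode(seq, dominant=0):
        """Alternate two Huffman codebooks: one for run lengths of the dominant
        symbol, one for the literal symbols that terminate each run."""
        runs, literals, n = [], [], 0
        for s in seq:
            if s == dominant:
                n += 1
            else:
                runs.append(n)
                literals.append(s)
                n = 0
        runs.append(n)  # final (possibly empty) run of the dominant symbol
        run_code = huffman(Counter(runs))
        lit_code = huffman(Counter(literals))
        body = "".join(run_code[r] + lit_code[l] for r, l in zip(runs, literals))
        return body + run_code[runs[-1]]

    print(arh_encode([0, 0, 0, 1, 0, 0, 2, 0, 0, 0, 1]))
    ```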

  1. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  2. Recent advances in coding theory for near error-free communications

    NASA Technical Reports Server (NTRS)

    Cheung, K.-M.; Deutsch, L. J.; Dolinar, S. J.; Mceliece, R. J.; Pollara, F.; Shahshahani, M.; Swanson, L.

    1991-01-01

    Channel and source coding theories are discussed. The following subject areas are covered: large constraint length convolutional codes (the Galileo code); decoder design (the big Viterbi decoder); Voyager's and Galileo's data compression scheme; current research in data compression for images; neural networks for soft decoding; neural networks for source decoding; finite-state codes; and fractals for data compression.

  3. Hybrid concatenated codes and iterative decoding

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Pollara, Fabrizio (Inventor)

    2000-01-01

    Several improved turbo code apparatuses and methods. The invention encompasses several classes: (1) A data source is applied to two or more encoders with an interleaver between the source and each of the second and subsequent encoders. Each encoder outputs a code element which may be transmitted or stored. A parallel decoder provides the ability to decode the code elements to derive the original source information d without use of a received data signal corresponding to d. The output may be coupled to a multilevel trellis-coded modulator (TCM). (2) A data source d is applied to two or more encoders with an interleaver between the source and each of the second and subsequent encoders. Each of the encoders outputs a code element. In addition, the original data source d is output from the encoder. All of the output elements are coupled to a TCM. (3) At least two data sources are applied to two or more encoders with an interleaver between each source and each of the second and subsequent encoders. The output may be coupled to a TCM. (4) At least two data sources are applied to two or more encoders with at least two interleavers between each source and each of the second and subsequent encoders. (5) At least one data source is applied to one or more serially linked encoders through at least one interleaver. The output may be coupled to a TCM. The invention includes a novel way of terminating a turbo coder.
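
    Class (1) is the classical parallel concatenation. The sketch below shows its skeleton: systematic bits plus two parity streams, the second computed on an interleaved copy of the data. The two-state accumulator used as the recursive encoder is a deliberately simplified stand-in for the patented constituent encoders, and the fixed pseudorandom interleaver is our own choice.

    ```python
    import random

    def rsc_parity(bits):
        """Parity from a 2-state recursive encoder (an accumulator): s' = s ^ u."""
        state, out = 0, []
        for u in bits:
            state ^= u
            out.append(state)
        return out

    def turbo_encode(bits, seed=42):
        """Rate-1/3 parallel concatenation: systematic, parity1, parity2."""
        perm = list(range(len(bits)))
        random.Random(seed).shuffle(perm)        # fixed interleaver, shared
        interleaved = [bits[p] for p in perm]    # with the decoder
        return bits, rsc_parity(bits), rsc_parity(interleaved)

    data = [1, 0, 1, 1, 0, 0, 1, 0]
    sys_bits, p1, p2 = turbo_encode(data)
    print(sys_bits, p1, p2, sep="\n")
    ```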

  4. Performing aggressive code optimization with an ability to rollback changes made by the aggressive optimizations

    DOEpatents

    Gschwind, Michael K

    2013-07-23

    Mechanisms for aggressively optimizing computer code are provided. With these mechanisms, a compiler determines an optimization to apply to a portion of source code and determines if the optimization as applied to the portion of source code will result in unsafe optimized code that introduces a new source of exceptions being generated by the optimized code. In response to a determination that the optimization is an unsafe optimization, the compiler generates an aggressively compiled code version, in which the unsafe optimization is applied, and a conservatively compiled code version in which the unsafe optimization is not applied. The compiler stores both versions and provides them for execution. Mechanisms are provided for switching between these versions during execution in the event of a failure of the aggressively compiled code version. Moreover, predictive mechanisms are provided for predicting whether such a failure is likely.
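
    The runtime behavior is loosely analogous to the following sketch, with the two compiled versions modeled as two Python functions. This is purely an analogy: the patent operates on compiled machine code with state rollback, not on exceptions in an interpreted language, and the function names are invented.

    ```python
    def run_with_rollback(aggressive, conservative, *args):
        """Try the aggressively optimized version; on failure, roll back and
        re-execute the conservatively compiled version from the checkpoint."""
        checkpoint = args                    # stand-in for saved program state
        try:
            return aggressive(*args)
        except ArithmeticError:              # stand-in for the new exception
            return conservative(*checkpoint)

    # Hypothetical example: a speculative reciprocal that may divide by zero.
    fast = lambda x: 1.0 / x                              # "unsafe" version
    safe = lambda x: float("inf") if x == 0 else 1.0 / x  # guarded version
    print(run_with_rollback(fast, safe, 0))               # falls back: inf
    ```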

  5. Reproducibility of dynamic contrast-enhanced MRI and dynamic susceptibility contrast MRI in the study of brain gliomas: a comparison of data obtained using different commercial software.

    PubMed

    Conte, Gian Marco; Castellano, Antonella; Altabella, Luisa; Iadanza, Antonella; Cadioli, Marcello; Falini, Andrea; Anzalone, Nicoletta

    2017-04-01

    Dynamic susceptibility contrast MRI (DSC) and dynamic contrast-enhanced MRI (DCE) are useful tools in the diagnosis and follow-up of brain gliomas; nevertheless, both techniques leave open the issue of data reproducibility. We evaluated the reproducibility of data obtained using two different commercial software packages for perfusion map calculation and analysis, as the software itself can be one of the potential sources of variability. DSC and DCE analyses from 20 patients with gliomas were tested for both the intrasoftware (intraobserver and interobserver) and the intersoftware reproducibility, as well as for the impact of different postprocessing choices [vascular input function (VIF) selection and deconvolution algorithms] on the quantification of the perfusion biomarkers plasma volume (Vp), volume transfer constant (Ktrans), and rCBV. Data reproducibility was evaluated with the intraclass correlation coefficient (ICC) and Bland-Altman analysis. For all the biomarkers, the intra- and interobserver reproducibility showed almost perfect agreement within each software package, whereas for the intersoftware reproducibility the ICC ranged from 0.311 to 0.577, suggesting fair to moderate agreement; Bland-Altman analysis showed high dispersion of the data, confirming these findings. Comparison of different VIF estimation methods for the DCE biomarkers resulted in ICCs of 0.636 for Ktrans and 0.662 for Vp; comparison of two deconvolution algorithms in DSC resulted in an ICC of 0.999. The use of a single software package ensures very good intraobserver and interobserver reproducibility. Caution should be taken when comparing data obtained using different software or different postprocessing within the same software, as reproducibility is no longer guaranteed.
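
    The Bland-Altman analysis used here is straightforward to reproduce: each patient's difference between the two software packages' estimates is compared against the mean bias and the 95% limits of agreement. A minimal NumPy sketch, with made-up rCBV values for illustration only:

        import numpy as np

        def bland_altman(a, b):
            """Return mean bias and 95% limits of agreement for paired readings."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            diff = a - b
            bias = diff.mean()
            loa = 1.96 * diff.std(ddof=1)
            return bias, bias - loa, bias + loa

        # Hypothetical rCBV values for the same patients from two packages.
        soft1 = [2.1, 3.4, 1.8, 4.0, 2.9]
        soft2 = [2.5, 3.1, 2.2, 3.6, 3.3]
        bias, lo, hi = bland_altman(soft1, soft2)
        print(f"bias={bias:.2f}, limits of agreement=[{lo:.2f}, {hi:.2f}]")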

  6. SU-E-T-212: Comparison of TG-43 Dosimetric Parameters of Low and High Energy Brachytherapy Sources Obtained by MCNP Code Versions of 4C, X and 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zehtabian, M; Zaker, N; Sina, S

    2015-06-15

    Purpose: Different versions of the MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP code in the dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters, such as the dose rate constant, radial dose function, and anisotropy function, of different brachytherapy sources, i.e. Pd-103, I-125, Ir-192, and Cs-137, were calculated in a water phantom. The results obtained by three versions of the Monte Carlo code (MCNP4C, MCNPX, MCNP5) were compared for low and high energy brachytherapy sources. Then the cross section library of the MCNP4C code was changed to ENDF/B-VI release 8, which is used in the MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the MCNP4C-revised code were compared with those of the other codes. Results: The results of these investigations indicate that for high energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However, for low energy sources like I-125 and Pd-103, large discrepancies are observed in the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at a distance of 6 cm were found to be about 17% and 28% for I-125 and Pd-103, respectively. The results obtained with MCNP4C-revised and MCNPX were similar. However, the maximum difference between the results obtained with the MCNP5 and MCNP4C-revised codes was 2% at 6 cm. Conclusion: The results indicate that using the MCNP4C code for dosimetry of low energy brachytherapy sources can cause large errors in the results. Therefore it is recommended not to use this code for low energy sources, unless its cross section library is changed. Since the results obtained with MCNP4C-revised and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX lies in their cross section libraries.
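
    The code-to-code discrepancies quoted above are simple percent differences between radial dose function values at matching radial distances. A minimal sketch, with hypothetical g(r) arrays chosen to roughly reproduce the reported ~17% discrepancy for I-125 at 6 cm:

        import numpy as np

        def percent_difference(g_ref, g_test):
            """Percent difference of radial dose function values, per radial point."""
            g_ref, g_test = np.asarray(g_ref, float), np.asarray(g_test, float)
            return 100.0 * (g_test - g_ref) / g_ref

        r_cm = np.array([1.0, 2.0, 4.0, 6.0])
        g_mcnp5 = np.array([1.00, 0.81, 0.50, 0.270])   # hypothetical I-125 g(r)
        g_mcnp4c = np.array([1.00, 0.84, 0.56, 0.316])  # hypothetical I-125 g(r)
        for r, d in zip(r_cm, percent_difference(g_mcnp5, g_mcnp4c)):
            print(f"r = {r:.0f} cm: {d:+.1f}%")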

  7. Homoacetogenesis in Deep-Sea Chloroflexi, as Inferred by Single-Cell Genomics, Provides a Link to Reductive Dehalogenation in Terrestrial Dehalococcoidetes.

    PubMed

    Sewell, Holly L; Kaster, Anne-Kristin; Spormann, Alfred M

    2017-12-19

    The deep marine subsurface is one of the largest unexplored biospheres on Earth and is widely inhabited by members of the phylum Chloroflexi. In this report, we investigated genomes of single cells obtained from deep-sea sediments of the Peruvian Margin, which are enriched in such Chloroflexi. 16S rRNA gene sequence analysis placed two of these single-cell-derived genomes (DscP3 and Dsc4) in a clade of subphylum I Chloroflexi which were previously recovered from deep-sea sediment in the Okinawa Trough, and a third (DscP2-2) as a member of the previously reported DscP2 population from Peruvian Margin site 1230. The presence of genes encoding enzymes of a complete Wood-Ljungdahl pathway, glycolysis/gluconeogenesis, a Rhodobacter nitrogen fixation (Rnf) complex, glycosyltransferases, and formate dehydrogenases in the single-cell genomes of DscP3 and Dsc4, and the presence of an NADH-dependent reduced ferredoxin:NADP oxidoreductase (Nfn) and Rnf in the genome of DscP2-2, imply a homoacetogenic lifestyle of these abundant marine Chloroflexi. We also report here the first complete pathway for anaerobic benzoate oxidation to acetyl coenzyme A (CoA) in the phylum Chloroflexi (DscP3 and Dsc4), including a class I benzoyl-CoA reductase. Of remarkable evolutionary significance, we discovered a gene encoding a formate dehydrogenase (FdnI) with reciprocal closest identity to the formate dehydrogenase-like protein (complex iron-sulfur molybdoenzyme [CISM], DET0187) of terrestrial Dehalococcoides/Dehalogenimonas spp. This formate dehydrogenase-like protein has been shown to lack formate dehydrogenase activity in Dehalococcoides/Dehalogenimonas spp. and is instead hypothesized to couple HupL hydrogenase to a reductive dehalogenase in the catabolic reductive dehalogenation pathway. This finding of a close functional homologue provides an important missing link for understanding the origin and the metabolic core of terrestrial Dehalococcoides/Dehalogenimonas spp. and of reductive dehalogenation, as well as the biology of abundant deep-sea Chloroflexi. IMPORTANCE: The deep marine subsurface is one of the largest unexplored biospheres on Earth and is widely inhabited by members of the phylum Chloroflexi. In this report, we investigated genomes of single cells obtained from deep-sea sediments and provide evidence for a homoacetogenic lifestyle of these abundant marine Chloroflexi. Moreover, genome signatures and key metabolic genes indicate an evolutionary relationship between these deep-sea sediment microbes and terrestrial, reductively dehalogenating Dehalococcoides. Copyright © 2017 Sewell et al.

  8. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    ERIC Educational Resources Information Center

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…

  9. Finding Resolution for the Responsible Transparency of Economic Models in Health and Medicine.

    PubMed

    Padula, William V; McQueen, Robert Brett; Pronovost, Peter J

    2017-11-01

    The Second Panel on Cost-Effectiveness in Health and Medicine recommendations for conduct, methodological practices, and reporting of cost-effectiveness analyses has a number of questions unanswered with respect to the implementation of transparent, open source code interface for economic models. The possibility of making economic model source code could be positive and progressive for the field; however, several unintended consequences of this system should be first considered before complete implementation of this model. First, there is the concern regarding intellectual property rights that modelers have to their analyses. Second, the open source code could make analyses more accessible to inexperienced modelers, leading to inaccurate or misinterpreted results. We propose several resolutions to these concerns. The field should establish a licensing system of open source code such that the model originators maintain control of the code use and grant permissions to other investigators who wish to use it. The field should also be more forthcoming towards the teaching of cost-effectiveness analysis in medical and health services education so that providers and other professionals are familiar with economic modeling and able to conduct analyses with open source code. These types of unintended consequences need to be fully considered before the field's preparedness to move forward into an era of model transparency with open source code.

  10. A calorimetric study of precipitation in aluminum alloy 2219

    NASA Astrophysics Data System (ADS)

    Papazian, John M.

    1981-02-01

    Precipitate microstructures in aluminum alloy 2219 were characterized using transmission electron microscopy (TEM) and differential scanning calorimetry (DSC). The DSC signatures of individual precipitate phases were established by comparing the DSC and TEM results from samples that had been aged such that only one precipitate phase was present. These signatures were then used to analyze the commercial tempers. It was found that DSC could readily distinguish between the T3, T4, T6, T8 and O tempers but could not distinguish amongst T81, T851 and T87. Small amounts of plastic deformation between solution treatment and aging had a significant effect on the thermograms. Aging experiments at 130 and 190 °C showed that the aging sequence and DSC response of this alloy were similar to those of pure Al-Cu when the increased copper content is taken into account. Further aging experiments at temperatures between room temperature and 130 °C showed pronounced changes of the GP zone dissolution peak as a function of aging conditions. These changes were found to be related to the effect of GP zone size on the metastable phase boundary and on the GP zone dissolution kinetics.

  11. 40 CFR 51.50 - What definitions apply to this subpart?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... accuracy description (MAD) codes means a set of six codes used to define the accuracy of latitude/longitude data for point sources. The six codes and their definitions are: (1) Coordinate Data Source Code: The... physical piece of or a closely related set of equipment. The EPA's reporting format for a given inventory...

  12. Structural and thermal properties of γ - irradiated Bombyx mori silk fibroin films

    NASA Astrophysics Data System (ADS)

    Madhukumar, R.; Asha, S.; Sarojini, B. K.; Somashekar, R.; Rao, B. Lakshmeesha; Shivananda, C. S.; Harish, K. V.; Sangappa

    2015-06-01

    The gamma radiation-induced changes in the structural and thermal properties of Bombyx mori silk fibroin films were investigated and correlated with the applied radiation doses. Irradiation of the samples was carried out in dry air at room temperature using a Co-60 source, with radiation doses in the range of 0-300 kGy. The structural and thermal properties of the irradiated silk films were studied using X-ray diffraction (XRD), differential scanning calorimetry (DSC), and thermogravimetric analysis (TGA) and compared with those of the unirradiated sample. The results are discussed in this report.

  13. The Astrophysics Source Code Library by the numbers

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, Peter; Berriman, G. Bruce; DuPrie, Kimberly; Mink, Jessica; Nemiroff, Robert; Ryan, PW; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Wallin, John; Warmels, Rein

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) was founded in 1999 by Robert Nemiroff and John Wallin. ASCL editors seek both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and add entries for the found codes to the library. Software authors can submit their codes to the ASCL as well. This ensures a comprehensive listing covering a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL is indexed by both NASA’s Astrophysics Data System (ADS) and Web of Science, making software used in research more discoverable. This presentation covers the growth in the ASCL’s number of entries, the number of citations to its entries, and in which journals those citations appear. It also discusses what changes have been made to the ASCL recently, and what its plans are for the future.

  14. Astrophysics Source Code Library: Incite to Cite!

    NASA Astrophysics Data System (ADS)

    DuPrie, K.; Allen, A.; Berriman, B.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallen, J. F.

    2014-05-01

    The Astrophysics Source Code Library (ASCL, http://ascl.net/) is an on-line registry of over 700 source codes that are of interest to astrophysicists, with more being added regularly. The ASCL actively seeks out codes as well as accepting submissions from the code authors, and all entries are citable and indexed by ADS. All codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. In addition to being the largest directory of scientist-written astrophysics programs available, the ASCL is also an active participant in the reproducible research movement, with presentations at various conferences, numerous blog posts, and a journal article. This poster provides a description of the ASCL and the changes that we are starting to see in the astrophysics community as a result of the work we are doing.

  15. Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; DuPrie, K.; Berriman, B.; Hanisch, R. J.; Mink, J.; Teuben, P. J.

    2013-10-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at http://ascl.net. The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.

  16. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  17. Optimal power allocation and joint source-channel coding for wireless DS-CDMA visual sensor networks

    NASA Astrophysics Data System (ADS)

    Pandremmenou, Katerina; Kondi, Lisimachos P.; Parsopoulos, Konstantinos E.

    2011-01-01

    In this paper, we propose a scheme for the optimal allocation of power, source coding rate, and channel coding rate for each of the nodes of a wireless Direct Sequence Code Division Multiple Access (DS-CDMA) visual sensor network. The optimization is quality-driven, i.e. the received quality of the video that is transmitted by the nodes is optimized. The scheme takes into account the fact that the sensor nodes may be imaging scenes with varying levels of motion. Nodes that image low-motion scenes will require a lower source coding rate, so they will be able to allocate a greater portion of the total available bit rate to channel coding. Stronger channel coding will mean that such nodes will be able to transmit at lower power. This will both increase battery life and reduce interference to other nodes. Two optimization criteria are considered. One that minimizes the average video distortion of the nodes and one that minimizes the maximum distortion among the nodes. The transmission powers are allowed to take continuous values, whereas the source and channel coding rates can assume only discrete values. Thus, the resulting optimization problem lies in the field of mixed-integer optimization tasks and is solved using Particle Swarm Optimization. Our experimental results show the importance of considering the characteristics of the video sequences when determining the transmission power, source coding rate and channel coding rate for the nodes of the visual sensor network.
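
    One common way to handle such a mixed-integer search space with Particle Swarm Optimization is to let particles move in a continuous space and snap the discrete coordinates (the source and channel coding rates) to the nearest allowed value at each fitness evaluation. The sketch below shows that device on a toy two-variable problem; the fitness function is a hypothetical stand-in for the paper's distortion model:

        import random

        RATES = [0.25, 0.5, 0.75]          # allowed discrete coding rates (toy values)

        def snap(x, choices):
            """Project a continuous coordinate onto the nearest discrete choice."""
            return min(choices, key=lambda c: abs(c - x))

        def fitness(power, rate):
            """Toy stand-in for received video distortion (lower is better)."""
            return (power - 0.6) ** 2 + (rate - 0.5) ** 2

        def pso(n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
            def score(x):
                return fitness(x[0], snap(x[1], RATES))
            parts = [{"x": [random.random(), random.random()],
                      "v": [0.0, 0.0]} for _ in range(n)]
            for p in parts:
                p["best"] = p["x"][:]
            gbest = min((p["x"][:] for p in parts), key=score)
            for _ in range(iters):
                for p in parts:
                    for d in range(2):
                        r1, r2 = random.random(), random.random()
                        p["v"][d] = (w * p["v"][d]
                                     + c1 * r1 * (p["best"][d] - p["x"][d])
                                     + c2 * r2 * (gbest[d] - p["x"][d]))
                        p["x"][d] = min(1.0, max(0.0, p["x"][d] + p["v"][d]))
                    if score(p["x"]) < score(p["best"]):
                        p["best"] = p["x"][:]
                    if score(p["x"]) < score(gbest):
                        gbest = p["x"][:]
            return gbest[0], snap(gbest[1], RATES)

        print(pso())   # continuous power near 0.6, discrete rate snapped to 0.5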

  18. Data compression for satellite images

    NASA Technical Reports Server (NTRS)

    Chen, P. H.; Wintz, P. A.

    1976-01-01

    An efficient data compression system is presented for satellite pictures and two grey level pictures derived from satellite pictures. The compression techniques take advantage of the correlation between adjacent picture elements. Several source coding methods are investigated. Double delta coding is presented and shown to be the most efficient. Both the predictive differential quantizing technique and double delta coding can be significantly improved by applying a background skipping technique. An extension code is constructed. This code requires very little storage space and operates efficiently. Simulation results are presented for various coding schemes and source codes.
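
    Double delta coding transmits the differences of successive differences, which drives correlated pixel values toward zero and makes the downstream entropy code efficient. A minimal sketch of the forward transform and its exact inverse:

        def delta(seq):
            """First differences; the first sample is kept as-is."""
            return [seq[0]] + [b - a for a, b in zip(seq, seq[1:])]

        def double_delta(seq):
            """Differences of differences: small for smoothly varying scan lines."""
            return delta(delta(seq))

        def undo_delta(seq):
            out = [seq[0]]
            for d in seq[1:]:
                out.append(out[-1] + d)
            return out

        scanline = [100, 102, 105, 109, 114, 120]   # smoothly varying pixels
        dd = double_delta(scanline)
        print(dd)                                   # mostly small residuals
        assert undo_delta(undo_delta(dd)) == scanline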

  19. Comparative kinetic analysis on thermal degradation of some cephalosporins using TG and DSC data

    PubMed Central

    2013-01-01

    Background: The thermal decomposition of cephalexin, cefadroxil and cefoperazone under non-isothermal conditions was studied using the TG and DSC methods. In the case of TG, a hyphenated technique including evolved gas analysis (EGA) was used. Results: The kinetic analysis was performed using the TG and DSC data in air for the first step of the cephalosporins' decomposition at four heating rates. Both the TG and DSC data were processed, according to an appropriate strategy, with the following kinetic methods: Kissinger-Akahira-Sunose, Friedman, and NPK, in order to obtain realistic kinetic parameters even though the decomposition process is a complex one. The EGA data offer some valuable indications about a possible decomposition mechanism. The obtained data indicate a rather good agreement between the activation energy values obtained by the different methods, whereas the EGA data and the chemical structures give a possible explanation of the observed differences in thermal stability. A complete kinetic analysis needs a data processing strategy using two or more methods, and the kinetic methods must also be applied to the different types of experimental data (TG and DSC). Conclusion: The simultaneous use of DSC and TG data for the kinetic analysis, coupled with evolved gas analysis (EGA), provided a more complete picture of the degradation of the three cephalosporins. It was possible to estimate kinetic parameters by using three different kinetic methods, and this allowed us to compare the Ea values obtained from the different experimental data, TG and DSC. The thermodegradation being a complex process, both the differential and integral methods based on the single-step hypothesis are inadequate for obtaining believable kinetic parameters. Only the modified NPK method allowed an objective separation of the influences of temperature and conversion on the reaction rate and, at the same time, ascertained the existence of two simultaneous steps. PMID:23594763
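
    The Kissinger-Akahira-Sunose method extracts the activation energy from runs at several heating rates β: at a fixed conversion, ln(β/T²) versus 1/T is a straight line with slope −Ea/R. A short sketch with hypothetical temperatures at fixed conversion:

        import numpy as np

        R = 8.314  # gas constant, J/(mol*K)

        def kas_activation_energy(betas, temps_K):
            """Fit ln(beta/T^2) vs 1/T; slope = -Ea/R (Kissinger-Akahira-Sunose)."""
            betas, temps_K = np.asarray(betas, float), np.asarray(temps_K, float)
            y = np.log(betas / temps_K**2)
            slope, _ = np.polyfit(1.0 / temps_K, y, 1)
            return -slope * R

        betas = [5, 10, 15, 20]                 # heating rates, K/min (hypothetical)
        temps = [478.0, 489.0, 496.0, 501.0]    # temperatures at fixed conversion, K
        print(f"Ea ≈ {kas_activation_energy(betas, temps) / 1000:.0f} kJ/mol")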

  20. Distributed Joint Source-Channel Coding in Wireless Sensor Networks

    PubMed Central

    Zhu, Xuqi; Liu, Yu; Zhang, Lin

    2009-01-01

    Considering that sensors are energy-limited, and taking into account the wireless channel conditions in wireless sensor networks, there is an urgent need for a low-complexity coding method with a high compression ratio and noise-resistant features. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing deployments of distributed joint source-channel coding, from theory to practice, over independent channels, multiple access channels, and broadcast channels are introduced, respectively. We also present a practical scheme for compressing multiple correlated sources over independent channels. The simulation results demonstrate the desired efficiency. PMID:22408560

  1. State-Chart Autocoder

    NASA Technical Reports Server (NTRS)

    Clark, Kenneth; Watney, Garth; Murray, Alexander; Benowitz, Edward

    2007-01-01

    A computer program translates Unified Modeling Language (UML) representations of state charts into source code in the C, C++, and Python computing languages. ("State charts" signifies graphical descriptions of states and state transitions of a spacecraft or other complex system.) The UML representations constituting the input to this program are generated by using a UML-compliant graphical design program to draw the state charts. The generated source code is consistent with the "quantum programming" approach, which is so named because it involves discrete states and state transitions that have features in common with states and state transitions in quantum mechanics. Quantum programming enables efficient implementation of state charts, suitable for real-time embedded flight software. In addition to source code, the autocoder program generates a graphical-user-interface (GUI) program that, in turn, generates a display of state transitions in response to events triggered by the user. The GUI program is wrapped around, and can be used to exercise the state-chart behavior of, the generated source code. Once the expected state-chart behavior is confirmed, the generated source code can be augmented with a software interface to the rest of the software with which it is required to interact.
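
    At its core, such an autocoder is a translator from a state-transition table to source text. The toy generator below emits a Python class from a dictionary description of a chart and executes the generated source; it is a hypothetical miniature of the idea, not the NASA tool (which consumes UML and targets C, C++, and Python):

        TRANSITIONS = {            # (state, event) -> next state (toy chart)
            ("IDLE", "arm"): "ARMED",
            ("ARMED", "fire"): "FIRING",
            ("FIRING", "done"): "IDLE",
        }

        def generate_source(name, transitions, initial):
            """Emit Python source implementing the state chart as a class."""
            lines = [f"class {name}:",
                     "    def __init__(self):",
                     f"        self.state = {initial!r}",
                     "    def dispatch(self, event):"]
            for (state, event), target in transitions.items():
                lines += [f"        if self.state == {state!r} and event == {event!r}:",
                          f"            self.state = {target!r}; return"]
            lines.append("        raise ValueError(f'no transition for {event} in {self.state}')")
            return "\n".join(lines)

        src = generate_source("LauncherChart", TRANSITIONS, "IDLE")
        namespace = {}
        exec(src, namespace)                        # compile the generated source
        chart = namespace["LauncherChart"]()
        chart.dispatch("arm"); print(chart.state)   # ARMED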

  2. Practices in source code sharing in astrophysics

    NASA Astrophysics Data System (ADS)

    Shamir, Lior; Wallin, John F.; Allen, Alice; Berriman, Bruce; Teuben, Peter; Nemiroff, Robert J.; Mink, Jessica; Hanisch, Robert J.; DuPrie, Kimberly

    2013-02-01

    While software and algorithms have become increasingly important in astronomy, the majority of authors who publish computational astronomy research do not share the source code they develop, making it difficult to replicate and reuse the work. In this paper we discuss the importance of sharing scientific source code with the entire astrophysics community, and propose that journals require authors to make their code publicly available when a paper is published. That is, we suggest that a paper that involves a computer program not be accepted for publication unless the source code becomes publicly available. The adoption of such a policy by editors, editorial boards, and reviewers will improve the ability to replicate scientific results, and will also make computational astronomy methods more available to other researchers who wish to apply them to their data.

  3. A study of the homogeneity and deviations from stoichiometry in mercuric iodide

    NASA Astrophysics Data System (ADS)

    Burger, A.; Morgan, S.; He, C.; Silberman, E.; van den Berg, L.; Ortale, C.; Franks, L.; Schieber, M.

    1990-01-01

    We have been able to determine the deviations from stoichiometry of mercuric iodide (HgI2) by using differential scanning calorimetry (DSC). Mercury excess or iodine deficiency in mercuric iodide can be evaluated from the eutectic melting of α-HgI2-Hg2I2 at 235 °C, which appears as an additional peak in DSC thermograms. I2 excess can be found from the existence of the I2-α-HgI2 eutectic melting at 103 °C. An additional DSC peak appears in some samples around 112 °C, which could be explained by the presence of iodine inclusions. Using resonance fluorescence spectroscopy (RFS) we have been able to determine the presence of free I2 that is released by samples during heating at 120 °C (the crystal growth temperature), thus giving additional support to the above DSC results.

  4. A Differential Scanning Calorimetry Method for Construction of Continuous Cooling Transformation Diagram of Blast Furnace Slag

    NASA Astrophysics Data System (ADS)

    Gan, Lei; Zhang, Chunxia; Shangguan, Fangqin; Li, Xiuping

    2012-06-01

    The continuous cooling crystallization of a blast furnace slag was studied by applying the differential scanning calorimetry (DSC) method. A kinetic model describing the evolution of the degree of crystallization with time was obtained. Bulk cooling experiments on the molten slag, coupled with numerical simulation of heat transfer, were conducted to validate the results of the DSC method. The degrees of crystallization of the samples from the bulk cooling experiments were estimated by means of X-ray diffraction (XRD) and the DSC method. It was found that the results from the DSC cooling and bulk cooling experiments are in good agreement. The continuous cooling transformation (CCT) diagram of the blast furnace slag was constructed according to the crystallization kinetic model and the experimental data. The obtained CCT diagram is characterized by two crystallization noses in different temperature ranges.

  5. Dynamic system classifier.

    PubMed

    Pumpe, Daniel; Greiner, Maksim; Müller, Ewald; Enßlin, Torsten A

    2016-07-01

    Stochastic differential equations describe well many physical, biological, and sociological systems, despite the simplification often made in their derivation. Here the usage of simple stochastic differential equations to characterize and classify complex dynamical systems is proposed within a Bayesian framework. To this end, we develop a dynamic system classifier (DSC). The DSC first abstracts training data of a system in terms of time-dependent coefficients of the descriptive stochastic differential equation. Thereby the DSC identifies unique correlation structures within the training data. For definiteness we restrict the presentation of the DSC to oscillation processes with a time-dependent frequency ω(t) and damping factor γ(t). Although real systems might be more complex, this simple oscillator captures many characteristic features. The ω and γ time lines represent the abstract system characterization and permit the construction of efficient signal classifiers. Numerical experiments show that such classifiers perform well even in the low signal-to-noise regime.

  6. ACIR: automatic cochlea image registration

    NASA Astrophysics Data System (ADS)

    Al-Dhamari, Ibraheem; Bauer, Sabine; Paulus, Dietrich; Lissek, Friedrich; Jacob, Roland

    2017-02-01

    Efficient Cochlear Implant (CI) surgery requires prior knowledge of the cochlea's size and its characteristics. This information helps to select suitable implants for different patients. To obtain these measurements, a segmentation method for cochlea medical images is needed. An important pre-processing step for good cochlea segmentation is efficient image registration. The cochlea's small size and complex structure, in addition to the different resolutions and head positions during imaging, pose a big challenge for the automated registration of the different image modalities. In this paper, an Automatic Cochlea Image Registration (ACIR) method for multi-modal human cochlea images is proposed. This method is based on using small areas that have clear structures in both input images instead of registering the complete image. It uses the Adaptive Stochastic Gradient Descent Optimizer (ASGD) and Mattes's Mutual Information metric (MMI) to estimate 3D rigid transform parameters. State-of-the-art medical image registration optimizers published over the last two years are studied and compared quantitatively using the standard Dice Similarity Coefficient (DSC). ACIR requires only 4.86 seconds on average to align cochlea images automatically and to put all the modalities in the same spatial locations without human interference. The source code is based on the tool elastix and is provided for free as a 3D Slicer plugin. Another contribution of this work is a proposed public cochlea standard dataset, which can be downloaded for free from a public XNAT server.
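
    The Dice Similarity Coefficient used for the quantitative comparison is DSC = 2|A∩B| / (|A| + |B|), where A and B are the sets of voxels labeled as the structure of interest in the two registered images. A minimal NumPy sketch on binary masks:

        import numpy as np

        def dice(a, b):
            """Dice Similarity Coefficient of two binary masks (1 = perfect overlap)."""
            a, b = np.asarray(a, bool), np.asarray(b, bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        mask_fixed = np.zeros((8, 8), int); mask_fixed[2:6, 2:6] = 1
        mask_moved = np.zeros((8, 8), int); mask_moved[3:7, 2:6] = 1
        print(f"DSC = {dice(mask_fixed, mask_moved):.2f}")   # 0.75 for this overlap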

  7. W-026, Waste Receiving and Processing Facility data management system validation and verification report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmer, M.E.

    1997-12-05

    This V and V Report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD), the source code for the DMS Communication Module (DMSCOM) messages, the source code for selected DMS screens, and the code for the BWAS simulator. BDM Federal analysts used a series of matrices to: compare the requirements in the System Requirements Specification (SRS) to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; compare the source code of selected screens to the specifications, to ensure that the resultant system screens will support the design; and compare the source code of the BWAS simulator with the requirements, to ensure it interfaces with DMS messages and data transfers relating to BWAS operations.

  8. Religiousness, Spirituality, and Salivary Cortisol in Breast Cancer Survivorship: A Pilot Study.

    PubMed

    Hulett, Jennifer M; Armer, Jane M; Leary, Emily; Stewart, Bob R; McDaniel, Roxanne; Smith, Kandis; Millspaugh, Rami; Millspaugh, Joshua

    Psychoneuroimmunological theory suggests a physiological relationship exists between stress, psychosocial-behavioral factors, and neuroendocrine-immune outcomes; however, evidence has been limited. The primary aim of this pilot study was to determine the feasibility and acceptability of a salivary cortisol self-collection protocol with a mail-back option for breast cancer survivors. A secondary aim was to examine relationships between religiousness/spirituality (R/S), perceptions of health, and diurnal salivary cortisol (DSC) as a proxy measure for neuroendocrine activity. This was an observational, cross-sectional study. Participants completed measures of R/S, perceptions of health, demographics, and DSC. The sample was composed of female breast cancer survivors (n = 41). Self-collection of DSC using a mail-back option was feasible; validity of mailed salivary cortisol biospecimens was established. Positive spiritual beliefs were the only R/S variable associated with the peak cortisol awakening response (rs = 0.34, P = .03). Poorer physical health was inversely associated with positive spiritual experiences and private religious practices. Poorer mental health was inversely associated with spiritual coping and negative spiritual experiences. Feasibility, validity, and acceptability of self-collected DSC biospecimens with an optional mail-back protocol (at moderate temperatures) were demonstrated. Positive spiritual beliefs were associated with neuroendocrine-mediated peak cortisol awakening response activity; however, additional research is recommended. Objective measures of DSC sampling that include enough collection time points to assess DSC parameters would increase the rigor of future DSC measurement. Breast cancer survivors may benefit from nursing care that includes spiritual assessment and therapeutic conversations that support positive spiritual beliefs.

  9. Kuipers works with DSC Hardware in the U.S. Laboratory

    NASA Image and Video Library

    2012-01-16

    ISS030-E-155917 (16 Jan. 2012) --- European Space Agency astronaut Andre Kuipers, Expedition 30 flight engineer, prepares to place Diffusion Soret Coefficient (DSC) hardware in stowage containers in the Destiny laboratory of the International Space Station.

  10. Authorship attribution of source code by using back propagation neural network based on particle swarm optimization

    PubMed Central

    Xu, Guoai; Li, Qi; Guo, Yanhui; Zhang, Miao

    2017-01-01

    Authorship attribution is the task of identifying the most likely author of a given sample among a set of candidate known authors. It can not only be applied to discover the original author of plain text, such as novels, blogs, emails, and posts, but can also be used to identify source code programmers. Authorship attribution of source code is required in diverse applications, ranging from malicious code tracking to settling authorship disputes or detecting software plagiarism. This paper proposes a new method to identify the programmer of Java source code samples with higher accuracy. To this end, it first introduces a back propagation (BP) neural network based on particle swarm optimization (PSO) into authorship attribution of source code. It begins by computing a set of defined feature metrics, including lexical and layout metrics and structure and syntax metrics, 19 dimensions in total. These metrics are then input to the neural network for supervised learning, the weights of which are output by the PSO-BP hybrid algorithm. The effectiveness of the proposed method is evaluated on a collected dataset with 3,022 Java files belonging to 40 authors. Experimental results show that the proposed method achieves 91.060% accuracy, and a comparison with previous work on authorship attribution of source code for the Java language illustrates that the proposed method outperforms the others overall, also with acceptable overhead. PMID:29095934
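
    The front end of such a pipeline is a feature extractor that maps each source file to a fixed-length vector of lexical and layout metrics, which then feeds the PSO-trained BP network. The sketch below computes a few illustrative metrics in Python; the feature names are hypothetical and the paper's full 19-dimensional set is not reproduced:

        import re

        def layout_lexical_features(source: str):
            """A handful of illustrative authorship metrics for a Java source string."""
            lines = source.splitlines() or [""]
            words = re.findall(r"[A-Za-z_]\w*", source)
            return {
                "mean_line_length": sum(map(len, lines)) / len(lines),
                "blank_line_ratio": sum(not l.strip() for l in lines) / len(lines),
                "brace_own_line_ratio": sum(l.strip() == "{" for l in lines) / len(lines),
                "mean_identifier_length": (sum(map(len, words)) / len(words)) if words else 0.0,
                "comment_ratio": source.count("//") / len(lines),
            }

        sample = """public class Hello {
            // greet the user
            public static void main(String[] args) {
                System.out.println("Hi");
            }
        }"""
        print(layout_lexical_features(sample))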

  11. The mathematical theory of signal processing and compression-designs

    NASA Astrophysics Data System (ADS)

    Feria, Erlan H.

    2006-05-01

    The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with signal source memory space compression, while processor coding deals with signal processor computational time compression. Their combination is named compression-designs, referred to as Conde for short. A compelling and pedagogically appealing diagram will be discussed, highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.

  12. Accurate Modeling of Ionospheric Electromagnetic Fields Generated by a Low-Altitude VLF Transmitter

    DTIC Science & Technology

    2007-08-31

    Excerpt from the report's list of figures, recovered from extraction residue: …latitude) for 3 different grid spacings. 8. Low-altitude fields produced by a 10-kHz source computed using the FD and TD codes; the agreement is excellent, validating the new FD code. 9. High-altitude fields produced by a 10-kHz source computed using the FD and TD codes; the agreement is again excellent. 10. Low-altitude fields produced by a 20-kHz source computed using the FD and TD codes. 11. High-altitude fields produced…

  13. Review of particle-in-cell modeling for the extraction region of large negative hydrogen ion sources for fusion

    NASA Astrophysics Data System (ADS)

    Wünderlich, D.; Mochalskyy, S.; Montellano, I. M.; Revel, A.

    2018-05-01

    Particle-in-cell (PIC) codes have been used since the early 1960s for calculating self-consistently the motion of charged particles in plasmas, taking into account external electric and magnetic fields as well as the fields created by the particles themselves. Due to the very small time steps (on the order of the inverse plasma frequency) and mesh size used, the computational requirements can be very high, and they increase drastically with increasing plasma density and size of the calculation domain. Thus, usually small computational domains and/or reduced dimensionality are used. In recent years, the available central processing unit (CPU) power has strongly increased. Together with a massive parallelization of the codes, it is now possible to describe in 3D the extraction of charged particles from a plasma, using calculation domains with an edge length of several centimeters, consisting of one extraction aperture, the plasma in the direct vicinity of the aperture, and a part of the extraction system. Large negative hydrogen or deuterium ion sources are essential parts of the neutral beam injection (NBI) system in future fusion devices like the international fusion experiment ITER and the demonstration reactor (DEMO). For ITER NBI, RF-driven sources with a source area of 0.9 × 1.9 m² and 1280 extraction apertures will be used. The extraction of negative ions is accompanied by the co-extraction of electrons, which are deflected onto an electron dump. Typically, the maximum extracted negative ion current is limited by the amount and the temporal instability of the co-extracted electrons, especially for operation in deuterium. Different PIC codes are available for the extraction region of large negative ion sources for fusion. Additionally, some effort is ongoing in developing codes that describe in a simplified manner (coarser mesh or reduced dimensionality) the plasma of the whole ion source. The presentation first gives a brief overview of the current status of ion source development for ITER NBI and of the PIC method. Different PIC codes for the extraction region are introduced, as well as the coupling to codes describing the whole source (PIC codes or fluid codes). Different physical and numerical aspects of applying PIC codes to negative hydrogen ion sources for fusion are presented and discussed, as well as selected code results. The main focus of future calculations will be the meniscus formation and identifying measures for reducing the co-extracted electrons, in particular for deuterium operation. Recent results of the 3D PIC code ONIX (calculation domain: one extraction aperture and its vicinity) for the ITER prototype source (1/8 size of the ITER NBI source) are presented.

  14. An investigation of indomethacin-nicotinamide cocrystal formation induced by thermal stress in the solid or liquid state.

    PubMed

    Lin, Hong-Liang; Zhang, Gang-Chun; Huang, Yu-Ting; Lin, Shan-Yang

    2014-08-01

    The impact of thermal stress on indomethacin (IMC)-nicotinamide (NIC) cocrystal formation, with or without neat cogrinding, was investigated using differential scanning calorimetry (DSC), Fourier transform infrared (FTIR) microspectroscopy, and simultaneous DSC-FTIR microspectroscopy in the solid or liquid state. Different evaporation methods for preparing IMC-NIC cocrystals were also compared. The results indicated that even after cogrinding for 40 min, the FTIR spectra for all IMC-NIC ground mixtures were superimposable on the FTIR spectra of the IMC and NIC components, suggesting there was no cocrystal formation between IMC and NIC after cogrinding. However, these IMC-NIC ground mixtures readily underwent cocrystal formation upon subsequent DSC measurement. Under the thermal stress applied during DSC, the amount of cocrystal formed increased with increasing cogrinding time. Moreover, simultaneous DSC-FTIR microspectroscopy was a useful one-step technique to induce and clarify, in real time, the thermally induced stepwise mechanism of IMC-NIC cocrystal formation from the ground mixture. Different solvent evaporation rates induced by thermal stress significantly influenced IMC-NIC cocrystal formation in the liquid state. In particular, microwave heating may promote IMC-NIC cocrystal formation in a short time. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  15. Monte Carlo dose calculations of beta-emitting sources for intravascular brachytherapy: a comparison between EGS4, EGSnrc, and MCNP.

    PubMed

    Wang, R; Li, X A

    2001-02-01

    The dose parameters for the beta-particle emitting 90Sr/90Y source for intravascular brachytherapy (IVBT) have been calculated by different investigators. At distances far from the source, noticeable differences are seen in these parameters calculated using different Monte Carlo codes. The purpose of this work is to quantify as well as to understand these differences. We have compared a series of calculations using the EGS4, EGSnrc, and MCNP Monte Carlo codes. The data calculated and compared include the depth dose curve for a broad parallel beam of electrons, and radial dose distributions for point electron sources (monoenergetic or polyenergetic) and for a real 90Sr/90Y source. For the 90Sr/90Y source, the doses at the reference position (2 mm radial distance) calculated by the three codes agree within 2%. However, the differences between the doses calculated by the three codes can be over 20% in the radial distance range of interest in IVBT. The difference increases with radial distance from the source, and reaches 30% at the tail of the dose curve. These differences may be partially attributed to the different multiple scattering theories and Monte Carlo models for electron transport adopted in the three codes. Doses calculated by the EGSnrc code are more accurate than those by EGS4. The two calculations agree within 5% for radial distances <6 mm.

  16. Linking collection of stormwater runoff to managed aquifer recharge using a geographic information system and hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Teo, E. K.; Young, K. S.; Beganskas, S.; Fisher, A. T.; Lozano, S.; Weir, W. B.; Harmon, R. E.

    2016-12-01

    We are completing a regional analysis of Santa Cruz and northern Monterey Counties, CA, to assess conditions for using distributed stormwater collection to support managed aquifer recharge (DSC-MAR). DSC-MAR constitutes an important component in a portfolio of innovative techniques being developed to improve groundwater management and to adapt to prolonged drought and changes in climate and anthropogenic water demands by increasing recharge during and soon after winter precipitation events, the season when excess water is most abundant. Our analyses focus specifically on the distributed collection of stormwater runoff, a source that has historically been treated as a nuisance, with the goal of infiltrating ≥100 ac-ft/yr within individual projects. The first part of this project is a spatial analysis, using a geographic information system to combine surface and subsurface data. There is complete spatial coverage for most surface data (elevation, soil and bedrock properties, land use) for the full study region (approximately 1,400 km²), but subsurface data (aquifer distribution, properties, and storage space) are available for only 43% of the region. Sites that are most suitable for DSC-MAR have high soil infiltration capacity, are well-connected to an underlying aquifer with good transmissive and storage properties, and have space to receive water. Based on surface data, 35% of the region is suitable for MAR (480 km²). In contrast, 14% of the area for which both surface and subsurface datasets are available is suitable for MAR (84 km²). We have assessed the availability of hillslope runoff for collection in support of MAR using a distributed hydrologic model (PRMS) and a catalog of historical, high-resolution climate data. In the simulations, enclosed topographic basins are divided into hydrologic response units (HRUs) having an area of 25 to 250 acres (0.1 to 1 km²). Simulations of the San Lorenzo River Basin (SLRB), northern Santa Cruz County, suggest that during years of normal precipitation, 12% of the region is composed of HRUs that are both suitable for MAR and generate at least 100 acre-feet of runoff per year. These criteria are met by 5% of the SLRB in dry years and 19% in wet years. Collectively, these results suggest that the DSC-MAR approach can help to sustain groundwater resources over the long term.

  17. Polarization Angle Calibration and B-Mode Characterization with the BICEP and Keck Array CMB Telescopes

    NASA Astrophysics Data System (ADS)

    Bullock, Eric

    Since its discovery in 1964, the Cosmic Microwave Background (CMB) has led to widespread acceptance of the Big Bang cosmological paradigm as an explanation for the evolution of the Universe. However, this paradigm does not explain the origin of the initial conditions, leading to such issues as the "horizon problem" and "flatness problem." In the early 1980s, the inflationary paradigm was introduced as a possible source for the initial conditions. This theory postulates that the Universe underwent a period of exponential expansion within a tiny fraction of a second after the beginning. Such an expansion is predicted to inject a stochastic background of gravitational waves that could imprint a detectable B-mode (curl-like) signal in the polarization of the CMB. It is this signal that the family of telescopes used by the BICEP1, BICEP2, and Keck Array collaborations were designed to detect. These telescopes are small-aperture, on-axis, refracting telescopes. We have used the data from these telescopes, particularly BICEP2 and the Keck Array, to place the tightest constraints, as of March 2016, on the tensor-to-scalar ratio of the CMB: r_0.05 < 0.07. In this dissertation, we provide an overview of the Keck Array telescopes and analysis of the data. We also investigate, as the main focus of this dissertation, a device we call the Dielectric Sheet Calibrator (DSC) that is used to measure the polarization angles of our detectors as projected on the sky. With these measurements, we gain the potential to separate the polarization rotation effects of parity-violating physics, such as cosmic birefringence, from a systematic uncertainty on our detectors' polarization angles. Current calibration techniques for polarization-sensitive CMB detectors claim an accuracy of +/-0.5°, which sets a limit for determining the usefulness of the DSC. Through a series of consistency tests on a single Keck Array receiver, we demonstrate a statistical uncertainty on the DSC measurements of +/-0.03° and estimate a systematic uncertainty of +/-0.2°, which meets the minimum goal. We also conclude that there is no conflict between the DSC-derived polarization angles of this single receiver and the rotation derived from that receiver's CMB data under the hypothesis of no cosmic birefringence.

  18. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks.

    PubMed

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-07-09

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption.

  19. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks †

    PubMed Central

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-01-01

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption. PMID:27409616

  20. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos-Villalobos, Hector J; Gregor, Jens; Bingham, Philip R

    2014-01-01

    At present, neutron sources cannot be fabricated small and powerful enough to achieve high resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps around 50 µm. To overcome this challenge, the coded mask and object are magnified by making the distance from the coded mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.

  1. Streamlined Genome Sequence Compression using Distributed Source Coding

    PubMed Central

    Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel

    2014-01-01

    We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require a heavy client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity needs of the client side. Based on the variation between the source and the reference, our protocol adaptively picks either syndrome coding or hash coding to compress subsequences of varying code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552
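
    The adaptive choice between the two modes can be sketched at block level: when a subsequence differs from the reference in only a few positions, a short mismatch list (standing in here for a true syndrome) is cheaper; when it diverges too much, a hash is sent instead and the decoder verifies candidate reconstructions against it. This is a conceptual Python sketch of the decision rule, not the authors' protocol:

        import hashlib

        def encode_block(block: str, ref: str, max_mismatches: int = 2):
            """Pick per-block between a mismatch list ('syndrome-like' mode)
            and a hash of the block (hash mode), whichever fits the variation."""
            diffs = [(i, b) for i, (a, b) in enumerate(zip(ref, block)) if a != b]
            if len(diffs) <= max_mismatches:
                return ("syndrome", diffs)          # decoder patches the reference
            digest = hashlib.sha1(block.encode()).hexdigest()[:8]
            return ("hash", digest)                 # decoder searches + verifies

        ref   = "ACGTACGTAC"
        read1 = "ACGTTCGTAC"                        # one substitution
        read2 = "TTTTTTTTAC"                        # highly divergent
        print(encode_block(read1, ref))             # ('syndrome', [(4, 'T')])
        print(encode_block(read2, ref))             # ('hash', '...')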

  2. The FORTRAN static source code analyzer program (SAP) system description

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Merwarth, P.; Oneill, M.; Goorevich, C.; Waligora, S.

    1982-01-01

    A source code analyzer program (SAP) designed to assist personnel in conducting studies of FORTRAN programs is described. The SAP scans FORTRAN source code and produces reports that present statistics and measures of the statements and structures that make up a module. The processing performed by SAP, and the routines, COMMON blocks, and files used by SAP, are described. The system generation procedure for SAP is also presented.

  3. Data integration of structured and unstructured sources for assigning clinical codes to patient stays

    PubMed Central

    Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim

    2016-01-01

    Objective: Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods: Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results: When compared with the best individual prediction source, late data integration leads to improvements in predictive power (e.g., overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion: Structured data provide complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions: We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
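
    Late data integration as described, one model per data source plus a meta-learner over their predictions, maps directly onto a stacking ensemble. A minimal scikit-learn sketch, assuming a dense feature matrix with a column block per source (the real task uses text features and ICD-9-CM label sets):

        import numpy as np
        from sklearn.compose import ColumnTransformer
        from sklearn.ensemble import StackingClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 6))          # columns 0-2: structured, 3-5: text-derived
        y = (X[:, 0] + X[:, 4] > 0).astype(int)

        def source_model(cols):
            """A base learner that sees only one data source's columns."""
            picker = ColumnTransformer([("src", "passthrough", cols)])
            return make_pipeline(picker, LogisticRegression())

        stack = StackingClassifier(
            estimators=[("structured", source_model([0, 1, 2])),
                        ("unstructured", source_model([3, 4, 5]))],
            final_estimator=LogisticRegression())   # the meta-learner
        print(stack.fit(X, y).score(X, y))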

  4. 7 CFR 1717.850 - General.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Earned Ratio (TIER), Debt Service Coverage (DSC), and other case-specific economic and financial factors; (ii) The variability and uncertainty of future revenues, costs, margins, TIER, DSC, and other case... construction work orders and other records, all moneys disbursed from the separate subaccount during the period...

  5. Entropy-Based Bounds On Redundancies Of Huffman Codes

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J.

    1992-01-01

    Report presents extension of theory of redundancy of binary prefix code of Huffman type which includes derivation of variety of bounds expressed in terms of entropy of source and size of alphabet. Recent developments yielded bounds on redundancy of Huffman code in terms of probabilities of various components in source alphabet. In practice, redundancies of optimal prefix codes are often closer to 0 than to 1.
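
    Redundancy here is the gap between the Huffman code's expected codeword length and the source entropy, which the report bounds in terms of the source probabilities and alphabet size. A short sketch that builds a binary Huffman code with heapq and measures that gap:

        import heapq
        from math import log2

        def huffman_lengths(probs):
            """Codeword lengths of a binary Huffman code for the given distribution."""
            heap = [(p, i, {i: 0}) for i, p in enumerate(probs)]
            heapq.heapify(heap)
            counter = len(probs)                    # tiebreaker for equal weights
            while len(heap) > 1:
                p1, _, c1 = heapq.heappop(heap)
                p2, _, c2 = heapq.heappop(heap)
                merged = {s: l + 1 for s, l in {**c1, **c2}.items()}
                heapq.heappush(heap, (p1 + p2, counter, merged))
                counter += 1
            return heap[0][2]

        probs = [0.5, 0.25, 0.15, 0.1]
        lengths = huffman_lengths(probs)
        avg_len = sum(probs[s] * l for s, l in lengths.items())
        entropy = -sum(p * log2(p) for p in probs)
        print(f"redundancy = {avg_len - entropy:.4f} bits/symbol")  # close to 0 here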

  6. 40 CFR Appendix A to Subpart A of... - Tables

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... phone number ✓ ✓ (6) FIPS code ✓ ✓ (7) Facility ID codes ✓ ✓ (8) Unit ID code ✓ ✓ (9) Process ID code... for Reporting on Emissions From Nonpoint Sources and Nonroad Mobile Sources, Where Required by 40 CFR... start date ✓ ✓ (3) Inventory end date ✓ ✓ (4) Contact name ✓ ✓ (5) Contact phone number ✓ ✓ (6) FIPS...

  7. SOURCELESS STARTUP. A MACHINE CODE FOR COMPUTING LOW-SOURCE REACTOR STARTUPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacMillan, D.B.

    1960-06-01

    A revision to the sourceless start-up code is presented. The code solves a system of differential equations encountered in computing the probability distribution of activity at an observed power level during reactor start-up from a very low source level. (J.R.D.)

  8. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  9. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  10. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  11. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...

  12. Learning from adaptive neural dynamic surface control of strict-feedback systems.

    PubMed

    Wang, Min; Wang, Cong

    2015-06-01

    Learning plays an essential role in autonomous control systems. However, how to achieve learning in a nonstationary environment for nonlinear systems is a challenging problem. In this paper, we present a learning method for a class of nth-order strict-feedback systems based on adaptive dynamic surface control (DSC) technology, which achieves the human-like ability of learning by doing and then doing with the learned knowledge. To achieve the learning, this paper first proposes a stable adaptive DSC scheme with auxiliary first-order filters, which ensures the boundedness of all signals in the closed-loop system and the convergence of tracking errors in finite time. With the help of DSC, the derivative of the filter output variable is used as the neural network (NN) input instead of the traditional intermediate variables. As a result, the proposed adaptive DSC method greatly reduces the dimension of the NN inputs, especially for high-order systems. After the stable DSC design, we decompose the stable closed-loop system into a series of linear time-varying perturbed subsystems. Using a recursive design, the recurrent property of the NN input variables is easily verified, since the complexity is overcome by DSC. Subsequently, the partial persistent excitation condition of the radial basis function NN is satisfied. By combining a state transformation, accurate approximations of the closed-loop system dynamics are recursively achieved in a local region along recurrent orbits. Then, a learning control method using the learned knowledge is proposed to achieve closed-loop stability and improved control performance. Simulation studies demonstrate that the proposed scheme can not only reuse the learned knowledge to achieve better control performance, with a faster tracking convergence rate and a smaller tracking error, but also greatly alleviate the computational burden by reducing the number and complexity of the NN input variables.
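
    The central DSC device alluded to here, passing each virtual control through an auxiliary first-order filter instead of differentiating it analytically, can be sketched as follows; the filter constant and signals are illustrative assumptions, not the paper's specific design:

        import numpy as np

        def dsc_filter_step(z, alpha, tau, dt):
            """One Euler step of the DSC first-order filter tau*z' + z = alpha.

            z     : current filter state (filtered virtual control)
            alpha : virtual control produced by the current design step
            tau   : filter time constant (assumed design parameter)
            Returns the new state and the derivative estimate (alpha - z)/tau,
            which DSC uses in place of differentiating alpha analytically.
            """
            dz = (alpha - z) / tau
            return z + dt * dz, dz

        # Toy usage: filter a time-varying stand-in virtual control signal.
        dt, tau = 1e-3, 0.05
        t = np.arange(0.0, 1.0, dt)
        alpha = np.sin(2 * np.pi * t)
        z = alpha[0]
        for a in alpha:
            z, dz = dsc_filter_step(z, a, tau, dt)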

  13. High-Speed Digital Scan Converter for High-Frequency Ultrasound Sector Scanners

    PubMed Central

    Chang, Jin Ho; Yen, Jesse T.; Shung, K. Kirk

    2008-01-01

    This paper presents a high-speed digital scan converter (DSC) capable of providing more than 400 images per second, which is necessary to examine the activity of the mouse heart, whose rate is 5-10 beats per second. To achieve the desired high-speed performance in a cost-effective manner, the developed DSC adopts a linear interpolation algorithm in which the two samples nearest to each object pixel of a monitor are selected and only angular interpolation is performed. Through computer simulation with the Field II program, its accuracy was investigated by comparison with bilinear interpolation, regarded as the best algorithm in terms of accuracy and processing speed. The simulation results show that the linear interpolation algorithm is capable of providing acceptable image quality, meaning that the difference between the root mean square error (RMSE) values of the linear and bilinear interpolation algorithms is below 1%, if the sample rate of the envelope samples is at least four times the Nyquist rate for the baseband component of the echo signals. The designed DSC was implemented with a single FPGA (Stratix EP1S60F1020C6, Altera Corporation, San Jose, CA) on a DSC board that is part of a high-speed ultrasound imaging system developed by the authors. The temporal and spatial resolutions of the implemented DSC were evaluated by examining its maximum processing time, with a time stamp indicating when an image is completely formed, and by wire phantom testing, respectively. The experimental results show that the implemented DSC is capable of providing images at a rate of 400 images per second with negligible processing error. PMID:18430449
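
    A minimal sketch of the interpolation scheme described (per output pixel: nearest sample along range, linear weighting between the two nearest scan lines in angle) might look like this in Python; the sector geometry and data are assumed for illustration:

        import numpy as np

        def scan_convert(env, angles, r_max, nx, ny):
            """Map polar envelope data env[line, sample] onto an (ny, nx) image
            using linear interpolation between the two nearest scan lines only
            (nearest-sample lookup along range)."""
            n_lines, n_samples = env.shape
            img = np.zeros((ny, nx))
            xs = np.linspace(-r_max, r_max, nx)
            ys = np.linspace(0.0, r_max, ny)
            for iy, y in enumerate(ys):
                for ix, x in enumerate(xs):
                    r = np.hypot(x, y)
                    th = np.arctan2(x, y)               # angle from center line
                    if r >= r_max or th <= angles[0] or th >= angles[-1]:
                        continue                        # outside the sector
                    k = np.searchsorted(angles, th) - 1 # lower neighbor line
                    w = (th - angles[k]) / (angles[k + 1] - angles[k])
                    s = min(int(r / r_max * n_samples), n_samples - 1)
                    img[iy, ix] = (1 - w) * env[k, s] + w * env[k + 1, s]
            return img

        # Toy usage with synthetic sector data (assumed geometry).
        angles = np.linspace(-np.pi / 4, np.pi / 4, 64)  # scan-line angles (rad)
        env = np.random.rand(64, 256)                    # envelope samples per line
        image = scan_convert(env, angles, r_max=0.05, nx=128, ny=128)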

  14. LDPC-based iterative joint source-channel decoding for JPEG2000.

    PubMed

    Pu, Lingling; Wu, Zhenyu; Bilgin, Ali; Marcellin, Michael W; Vasic, Bane

    2007-02-01

    A framework is proposed for iterative joint source-channel decoding of JPEG2000 codestreams. At the encoder, JPEG2000 is used to perform source coding with certain error-resilience (ER) modes, and LDPC codes are used to perform channel coding. During decoding, the source decoder uses the ER modes to identify corrupt sections of the codestream and provides this information to the channel decoder. Decoding is carried out jointly in an iterative fashion. Experimental results indicate that the proposed method requires fewer iterations and improves overall system performance.

  15. Sorting of a multi-subunit ubiquitin ligase complex in the endolysosome system

    PubMed Central

    Yang, Xi; Arines, Felichi Mae; Zhang, Weichao

    2018-01-01

    The yeast Dsc E3 ligase complex has long been recognized as a Golgi-specific protein ubiquitination system. It shares a striking sequence similarity to the Hrd1 complex, which plays critical roles in the ER-associated degradation pathway. Using biochemical purification and mass spectrometry, we identified two novel Dsc subunits, which we named Gld1 and Vld1. Surprisingly, Gld1 and Vld1 do not coexist in the same complex. Instead, they compete with each other to form two functionally independent Dsc subcomplexes. The Vld1 subcomplex takes the AP3 pathway to reach the vacuole membrane, whereas the Gld1 subcomplex travels through the VPS pathway and is cycled between the Golgi and endosomes by the retromer. Thus, instead of being Golgi-specific, the Dsc complex can regulate protein levels at three distinct organelles, namely the Golgi, endosome, and vacuole. Our study provides a novel model of achieving multitasking for transmembrane ubiquitin ligases with interchangeable trafficking adaptors. PMID:29355480

  16. Determination of the heat of hydride formation/decomposition by high-pressure differential scanning calorimetry (HP-DSC).

    PubMed

    Rongeat, Carine; Llamas-Jansa, Isabel; Doppiu, Stefania; Deledda, Stefano; Borgschulte, Andreas; Schultz, Ludwig; Gutfleisch, Oliver

    2007-11-22

    Among the thermodynamic properties of novel materials for solid-state hydrogen storage, the heat of formation/decomposition of hydrides is the most important parameter to evaluate the stability of the compound and its temperature and pressure of operation. In this work, the desorption and absorption behaviors of three different classes of hydrides are investigated under different hydrogen pressures using high-pressure differential scanning calorimetry (HP-DSC). The HP-DSC technique is used to estimate the equilibrium pressures as a function of temperature, from which the heat of formation is derived. The relevance of this procedure is demonstrated for (i) magnesium-based compounds (Ni-doped MgH2), (ii) Mg-Co-based ternary hydrides (Mg-CoHx) and (iii) Alanate complex hydrides (Ti-doped NaAlH4). From these results, it can be concluded that HP-DSC is a powerful tool to obtain a good approximation of the thermodynamic properties of hydride compounds by a simple and fast study of desorption and absorption properties under different pressures.
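
    The step from equilibrium pressures at several temperatures to a heat of formation is a van't Hoff analysis: ln(p_eq/p°) is linear in 1/T with slope -ΔH/R. A short Python sketch with made-up illustrative data:

        import numpy as np

        R = 8.314  # J/(mol K)

        # Illustrative (not measured) equilibrium data for a hypothetical hydride:
        # temperature in K, equilibrium hydrogen pressure in bar.
        T = np.array([550.0, 575.0, 600.0, 625.0])
        p = np.array([0.8, 1.6, 3.0, 5.3])

        # van't Hoff: ln(p/p0) = -dH/(R*T) + dS/R, so fit ln(p) against 1/T.
        slope, intercept = np.polyfit(1.0 / T, np.log(p), 1)
        dH = -slope * R / 1000.0  # kJ per mol H2 (positive: desorption endothermic)
        dS = intercept * R        # J/(mol K)
        print(f"dH = {dH:.1f} kJ/mol H2, dS = {dS:.1f} J/(mol K)")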

  17. Psychological stress during exercise: cardiorespiratory and hormonal responses.

    PubMed

    Webb, Heather E; Weldy, Michael L; Fabianke-Kadue, Emily C; Orndorff, G R; Kamimori, Gary H; Acevedo, Edmund O

    2008-12-01

    The purpose of this study was to examine the cardiorespiratory (CR) and stress hormone responses to a combined physical and mental stress. Eight participants (VO2max = 41.24 +/- 6.20 ml kg(-1) min(-1)) completed two experimental conditions: a dual stress condition (DSC), consisting of a 37 min ride at 60% of VO2max during which participants responded to a computerized mental challenge, and an exercise alone condition (EAC) of the same duration and intensity without the mental challenge. Significant interactions across time were found for CR responses, with heart rate, ventilation, and respiration rate showing greater increases in the DSC. Additionally, norepinephrine was significantly greater in the DSC at the end of the combined challenge. Furthermore, cortisol area-under-the-curve (AUC) was also significantly elevated during the DSC. These results demonstrate that a mental challenge during exercise can exacerbate the stress response, including the release of hormones that have been linked to negative health consequences (cardiovascular, metabolic, and autoimmune illnesses).

  18. Ionic liquids: differential scanning calorimetry as a new indirect method for determination of vaporization enthalpies.

    PubMed

    Verevkin, Sergey P; Emel'yanenko, Vladimir N; Zaitsau, Dzmitry H; Ralys, Ricardas V; Schick, Christoph

    2012-04-12

    Differential scanning calorimetry (DSC) has been used to measure the enthalpies of the synthesis reactions of the 1-alkyl-3-methylimidazolium bromide [C(n)mim][Br] ionic liquids from 1-methylimidazole and n-alkyl bromides (with n = 4, 5, 6, 7, and 8). The optimal experimental conditions have been established. Enthalpies of formation of these ionic liquids in the liquid state have been determined from the DSC results according to Hess's law. The ideal-gas enthalpies of formation of [C(n)mim][Br] were calculated using the methods of quantum chemistry. They were used together with the DSC results to derive indirectly the enthalpies of vaporization of the ionic liquids under study. In order to validate the indirect determination, the experimental vaporization enthalpy of [C(4)mim][Br] was measured using a quartz crystal microbalance (QCM). The combination of reaction enthalpy measurements by DSC with modern high-level first-principles calculations opens valuable indirect thermochemical options for obtaining vaporization enthalpies of ionic liquids.
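
    The indirect route rests on a Hess's-law identity: the vaporization enthalpy is the difference between the ideal-gas and liquid-phase enthalpies of formation, the former computed, the latter obtained from the measured reaction enthalpies. Schematically (a generic thermodynamic identity, not the paper's specific numbers):

        \Delta_{l}^{g} H_m^{\circ}(\mathrm{IL}) \;=\; \Delta_f H_m^{\circ}(\mathrm{IL},\,\mathrm{g}) \;-\; \Delta_f H_m^{\circ}(\mathrm{IL},\,\mathrm{l})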

  19. Non-resonant dynamic stark control of vibrational motion with optimized laser pulses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Esben F.; Henriksen, Niels E.

    2016-06-28

    The term dynamic Stark control (DSC) has been used to describe methods of quantum control related to the dynamic Stark effect, i.e., a time-dependent distortion of energy levels. Here, we employ analytical models that present clear and concise interpretations of the principles behind DSC. Within a linearly forced harmonic oscillator model of vibrational excitation, we show how the vibrational amplitude is related to the pulse envelope, and independent of the carrier frequency of the laser pulse, in the DSC regime. Furthermore, we shed light on DSC regarding the construction of optimal pulse envelopes, from a time-domain as well as a frequency-domain perspective. Finally, in a numerical study beyond the linearly forced harmonic oscillator model, we show that a pulse envelope can be constructed such that a vibrational excitation into a specific excited vibrational eigenstate is accomplished. The pulse envelope is constructed such that high intensities are avoided in order to eliminate the process of ionization.
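
    For orientation, the linearly forced harmonic oscillator admits a textbook Green's-function solution that makes the envelope dependence explicit (a standard result quoted here for context, not an equation reproduced from the paper):

        \ddot{x} + \omega_0^2\, x = F(t)/m
        \quad\Rightarrow\quad
        x(t) = \frac{1}{m\,\omega_0}\int_0^{t} F(t')\,\sin\!\big(\omega_0\,(t - t')\big)\,\mathrm{d}t'

    In the non-resonant regime the effective force follows the cycle-averaged intensity, i.e. the square of the pulse envelope, which is why the excitation amplitude depends on the envelope but not on the carrier frequency.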

  20. Calorimetric evidence for two distinct molecular packing arrangements in stable glasses of indomethacin.

    PubMed

    Kearns, Kenneth L; Swallen, Stephen F; Ediger, M D; Sun, Ye; Yu, Lian

    2009-02-12

    Indomethacin glasses of varying stabilities were prepared by physical vapor deposition onto substrates at 265 K. Enthalpy relaxation and the mobility onset temperature were assessed with differential scanning calorimetry (DSC). Quasi-isothermal temperature-modulated DSC was used to measure the reversing heat capacity during annealing above the glass transition temperature Tg. At deposition rates near 8 Å/s, scanning DSC shows two enthalpy relaxation peaks and quasi-isothermal DSC shows a two-step change in the reversing heat capacity. We attribute these features to two distinct local packing structures in the vapor-deposited glass, and this interpretation is supported by the strong correlation between the two calorimetric signatures of the glass-to-liquid transformation. At lower deposition rates, a larger fraction of the sample is prepared in the more stable local packing. The transformation of the vapor-deposited glasses into the supercooled liquid above Tg is exceedingly slow, as much as 4500 times slower than the structural relaxation time of the liquid.

  1. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... funds; (ii) Studies, analyses, test data, or similar data produced for this contract, when the study...

  2. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or can be built from source code at the SolTrace open source project website. The code uses a Monte-Carlo ray-tracing methodology to model the optics of concentrating solar power systems. With the release of the SolTrace open source project, the software has adopted an open-source development model.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polan, D; Kamp, J; Lee, JY

    Purpose: To perform validation and commissioning of a commercial deformable image registration (DIR) algorithm (Velocity, Varian Medical Systems) for numerous clinical sites using single- and multi-modality images. Methods: In this retrospective study, the DIR algorithm was evaluated for 10 patients in each of the following body sites: head and neck (HN), prostate, liver, and gynecological (GYN). HN DIRs were evaluated from planning (p)CT to re-pCT and pCTs to daily CBCTs using dice similarity coefficients (DSC) of corresponding anatomical structures. Prostate DIRs were evaluated from pCT to CBCTs using DSC and target registration error (TRE) of RF beacons implanted within the prostate. Liver DIRs were evaluated from pMR to pCT using DSC and TRE of vessel bifurcations. GYN DIRs were evaluated between fractionated brachytherapy MRIs using DSC of corresponding anatomical structures. Results: Analysis to date has given average DSCs for HN pCT-to-(re)pCT DIR for the brainstem, cochleas, constrictors, spinal canal, cord, esophagus, larynx, parotids, and submandibular glands of 0.88, 0.65, 0.67, 0.91, 0.77, 0.69, 0.77, 0.87, and 0.71, respectively. Average DSCs for HN pCT-to-CBCT DIR for the constrictors, spinal canal, esophagus, larynx, parotids, and submandibular glands were 0.64, 0.90, 0.62, 0.82, 0.75, and 0.69, respectively. For prostate pCT-to-CBCT DIR, the DSCs for the bladder, femoral heads, prostate, and rectum were 0.71, 0.82, 0.69, and 0.61, respectively. Average TRE using implanted beacons was 3.35 mm. For liver pCT-to-pMR, the average liver DSC was 0.94 and the TRE was 5.26 mm. For GYN MR-to-MR DIR, the DSCs for the bladder, sigmoid colon, GTV, and rectum were 0.79, 0.58, 0.67, and 0.76, respectively. Conclusion: The Velocity DIR algorithm has been evaluated over a number of anatomical sites. This work documents the uncertainties in the DIR as part of the commissioning process so that they can be accounted for in the development of downstream clinical processes. This work was supported in part by a co-development agreement with Varian Medical Systems.
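
    The DSC reported throughout this entry is the Dice similarity coefficient between structure masks, DSC = 2|A∩B|/(|A|+|B|). A minimal NumPy sketch (toy masks assumed):

        import numpy as np

        def dice(a, b):
            """Dice similarity coefficient between two boolean masks."""
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # Toy example: two overlapping spheres on a small grid.
        z, y, x = np.ogrid[:64, :64, :64]
        m1 = (x - 30) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 15 ** 2
        m2 = (x - 34) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 15 ** 2
        print(f"DSC = {dice(m1, m2):.3f}")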

  4. SU-D-18C-02: Feasibility of Using a Short ASL Scan for Calibrating Cerebral Blood Flow Obtained From DSC-MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, P; Chang, T; Huang, K

    2014-06-01

    Purpose: This study aimed to evaluate the feasibility of using a short arterial spin labeling (ASL) scan for calibrating dynamic susceptibility contrast (DSC) MRI in a group of patients with internal carotid artery (ICA) stenosis. Methods: Six patients with unilateral ICA stenosis were enrolled in the study on a 3T clinical MRI scanner. The ASL cerebral blood flow (CBF) maps were calculated by averaging different numbers of dynamic points (N = 1-45) acquired using a Q2TIPS sequence. For DSC perfusion analysis, an arterial input function was selected to derive the relative cerebral blood flow (rCBF) map and the delay (Tmax) map. A patient-specific calibration factor (CF) was calculated from the mean ASL- and DSC-CBF obtained from three different masks: (1) Tmax < 3 s; (2) a gray matter mask combined with mask 1; (3) mask 2 with large vessels removed. One CF value was created for each number of averages by using each of the three masks for calibrating the DSC-CBF map. The CF value of the largest number of averages (NL = 45) was used to determine the acceptable range (<10%, <15%, and <20%) of CF values corresponding to the minimally acceptable number of averages (NS) for each patient. Results: Comparing DSC-CBF maps corrected by CF values of NL (CBFL) in ACA, MCA and PCA territories, all masks resulted in smaller CBF on the ipsilateral side than on the contralateral side of the MCA territory (p < .05). The values obtained from mask 1 were significantly different from those of mask 3 (p < .05). Using mask 3, the median values of NS were 4 (<10%), 2 (<15%) and 2 (<20%), with the worst case scenario (maximum NS) of 25, 4, and 4, respectively. Conclusion: This study found that reliable calibration of DSC-CBF can be achieved from a short pulsed ASL scan. We suggest using a mask based on the Tmax threshold, the inclusion of gray matter only, and the exclusion of large vessels for performing the calibration.
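
    The calibration itself reduces to a single scale factor: the ratio of mean ASL-CBF to mean relative DSC-CBF inside the chosen mask, applied voxelwise. A minimal sketch under those assumptions (toy data standing in for co-registered maps):

        import numpy as np

        def calibrate_dsc(dsc_rcbf, asl_cbf, mask):
            """Scale a relative DSC-CBF map to absolute units using ASL as reference.

            dsc_rcbf : relative DSC CBF map (arbitrary units)
            asl_cbf  : ASL CBF map (mL/100 g/min)
            mask     : boolean mask (e.g. gray matter, Tmax < 3 s, vessels removed)
            """
            cf = asl_cbf[mask].mean() / dsc_rcbf[mask].mean()  # patient-specific CF
            return cf * dsc_rcbf, cf

        rng = np.random.default_rng(0)
        dsc = rng.gamma(4.0, 1.0, (32, 32))
        asl = 15.0 * dsc + rng.normal(0.0, 2.0, dsc.shape)
        mask = dsc > np.median(dsc)
        cbf_abs, cf = calibrate_dsc(dsc, asl, mask)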

  5. Effect of additives on mineral trioxide aggregate setting reaction product formation.

    PubMed

    Zapf, Angela M; Chedella, Sharath C V; Berzins, David W

    2015-01-01

    Mineral trioxide aggregate (MTA) sets via hydration of calcium silicates to yield calcium silicate hydrates and calcium hydroxide (Ca[OH]2). However, a drawback of MTA is its long setting time. Therefore, many additives have been suggested to reduce the setting time, but the effect those additives have on setting reaction product formation has been ignored. The objective was to examine the effect additives have on MTA's setting time and setting reaction using differential scanning calorimetry (DSC). MTA powder was prepared with distilled water (control), phosphate buffered saline, 5% calcium chloride (CaCl2), 3% sodium hypochlorite (NaOCl), or lidocaine in a 3:1 mixture and placed in crucibles for DSC evaluation. The setting exothermic reactions were evaluated at 37°C for 8 hours to determine the setting time. Separate samples were stored and evaluated using dynamic DSC scans (37°C→640°C at 10°C/min) at 1 day, 1 week, 1 month, and 3 months (n = 9/group/time). Dynamic DSC quantifies the reaction product formed from the amount of heat required to decompose it. Thermographic peaks were integrated to determine enthalpy, which was analyzed with an analysis of variance/Tukey test (α = 0.05). Isothermal DSC identified 2 main exothermic peaks, occurring at 44 ± 12 and 343 ± 57 minutes for the control. Only the CaCl2 additive was an accelerant, observed as a greater exothermic peak at 101 ± 11 minutes, indicating a decreased setting time. The dynamic DSC scans produced an endothermic peak around 450°C-550°C attributed to Ca(OH)2 decomposition. A few additives (NaOCl and lidocaine) resulted in significantly less Ca(OH)2 product formation. DSC was used to discriminate calcium hydroxide formation in MTA mixed with various additives and showed that NaOCl and lidocaine are detrimental to MTA reaction product formation, whereas CaCl2 accelerated the reaction.

  6. Use of thermal analysis coupled with differential scanning calorimetry, quadrupole mass spectrometry and infrared spectroscopy (TG-DSC-QMS-FTIR) to monitor chemical properties and thermal stability of fulvic and humic acids.

    PubMed

    Boguta, Patrycja; Sokołowska, Zofia; Skic, Kamil

    2017-01-01

    Thermogravimetry-coupled with differential scanning calorimetry, quadrupole mass spectrometry, and Fourier-transform infrared spectroscopy (TG-DSC-QMS-FTIR)-was applied to monitor the thermal stability (in an N2 pyrolytic atmosphere) and chemical properties of natural polymers, fulvic (FA) and humic acids (HA), isolated from chemically different soils. Three temperature ranges, R1, 40-220°C; R2, 220-430°C; and R3, 430-650°C, were distinguished from the DSC data, related to the main thermal processes of different structures (including transformations without weight loss). Weight loss (ΔM) estimated from TG curves at the above temperature intervals revealed distinct differences within the samples in the content of physically adsorbed water (at R1), volatile and labile functional groups (at R2) as well as recalcitrant and refractory structures (at R3). QMS and FTIR modules enabled the chemical identification (by masses and by functional groups, respectively) of gaseous species evolved during thermal decomposition at R1, R2 and R3. Variability in shape, area and temperature of TG, DSC, QMS and FTIR peaks revealed differences in thermal stability and chemical structure of the samples between the FAs and HAs fractions of different origin. The statistical analysis showed that the parameters calculated from QMS (areas of m/z = 16, 17, 18, 44), DSC (MaxDSC) and TG (ΔM) at R1, R2 and R3 correlated with selected chemical properties of the samples, such as N, O and COOH content as well as E2/E6 and E2/E4 indexes. This indicated a high potential for the coupled method to monitor the chemical changes of humic substances. A new humification parameter, HTD, based on simple calculations of weight loss at specific temperature intervals proved to be a good alternative to indexes obtained from other methods. The above findings showed that the TG-DSC-QMS-FTIR coupled technique can represent a useful tool for the comprehensive assessment of FAs and HAs properties related to their various origin.

  7. Use of thermal analysis coupled with differential scanning calorimetry, quadrupole mass spectrometry and infrared spectroscopy (TG-DSC-QMS-FTIR) to monitor chemical properties and thermal stability of fulvic and humic acids

    PubMed Central

    Sokołowska, Zofia; Skic, Kamil

    2017-01-01

    Thermogravimetry–coupled with differential scanning calorimetry, quadrupole mass spectrometry, and Fourier-transform infrared spectroscopy (TG-DSC-QMS-FTIR)–was applied to monitor the thermal stability (in an N2 pyrolytic atmosphere) and chemical properties of natural polymers, fulvic (FA) and humic acids (HA), isolated from chemically different soils. Three temperature ranges, R1, 40–220°C; R2, 220–430°C; and R3, 430–650°C, were distinguished from the DSC data, related to the main thermal processes of different structures (including transformations without weight loss). Weight loss (ΔM) estimated from TG curves at the above temperature intervals revealed distinct differences within the samples in the content of physically adsorbed water (at R1), volatile and labile functional groups (at R2) as well as recalcitrant and refractory structures (at R3). QMS and FTIR modules enabled the chemical identification (by masses and by functional groups, respectively) of gaseous species evolved during thermal decomposition at R1, R2 and R3. Variability in shape, area and temperature of TG, DSC, QMS and FTIR peaks revealed differences in thermal stability and chemical structure of the samples between the FAs and HAs fractions of different origin. The statistical analysis showed that the parameters calculated from QMS (areas of m/z = 16, 17, 18, 44), DSC (MaxDSC) and TG (ΔM) at R1, R2 and R3 correlated with selected chemical properties of the samples, such as N, O and COOH content as well as E2/E6 and E2/E4 indexes. This indicated a high potential for the coupled method to monitor the chemical changes of humic substances. A new humification parameter, HTD, based on simple calculations of weight loss at specific temperature intervals proved to be a good alternative to indexes obtained from other methods. The above findings showed that the TG-DSC-QMS-FTIR coupled technique can represent a useful tool for the comprehensive assessment of FAs and HAs properties related to their various origin. PMID:29240819

  8. Neutrons Flux Distributions of the Pu-Be Source and its Simulation by the MCNP-4B Code

    NASA Astrophysics Data System (ADS)

    Faghihi, F.; Mehdizadeh, S.; Hadad, K.

    The neutron fluence rate of a low-intensity Pu-Be source is measured by neutron activation analysis (NAA) of 197Au foils, and the neutron fluence rate distribution versus energy is calculated using the MCNP-4B code with the ENDF/B-V library. The combined theoretical simulation and experimental measurement are a new experience for the Iranian group, establishing confidence in the code for further research. In the theoretical investigation, an isotropic Pu-Be source with a cylindrical volume distribution is simulated and the relative neutron fluence rate versus energy is calculated with MCNP-4B. The fast and thermal neutron fluence rates obtained by the NAA method and by the MCNP code are compared.

  9. Awareware: Narrowcasting Attributes for Selective Attention, Privacy, and Multipresence

    NASA Astrophysics Data System (ADS)

    Cohen, Michael; Newton Fernando, Owen Noel

    The domain of CSCW (computer-supported collaborative work) and DSC (distributed synchronous collaboration) spans real-time interactive multiuser systems, shared information spaces, and applications for teleexistence and artificial reality, including collaborative virtual environments (CVEs) (Benford et al., 2001). As presence awareness systems emerge, it is important to develop appropriate interfaces and architectures for managing multimodal multiuser systems. Especially in consideration of the persistent connectivity enabled by affordable networked communication, shared distributed environments require generalized control of media streams: techniques to control source → sink transmissions in synchronous groupware, including teleconferences and chatspaces, online role-playing games, and virtual concerts.

  10. Differential scanning calorimetry: An invaluable tool for a detailed thermodynamic characterization of macromolecules and their interactions

    PubMed Central

    Chiu, Michael H.; Prenner, Elmar J.

    2011-01-01

    Differential Scanning Calorimetry (DSC) is a highly sensitive technique for studying the thermotropic properties of many different biological macromolecules and extracts. Since its early development, DSC has been applied in the pharmaceutical field to excipient studies and DNA drugs. More recently, attention has turned to lipid-based drug delivery systems and drug interactions with biomimetic membranes. Highly reproducible phase transitions have been used to determine values such as the type of binding interaction, purity, stability, and release from a drug delivery mechanism. This review focuses on the use of DSC for biochemical and pharmaceutical applications. PMID:21430954

  11. Two-terminal video coding.

    PubMed

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to reduce the sum rate relative to independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both the motion vectors and the motion-compensated residual frames of the right sequence is generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences, using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients, give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  12. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    PubMed

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-03-08

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes.

  13. Determining the main thermodynamic parameters of caffeine melting by means of DSC

    NASA Astrophysics Data System (ADS)

    Agafonova, E. V.; Moshchenskii, Yu. V.; Tkachenko, M. L.

    2012-06-01

    The temperature and enthalpy of the melting of caffeine, which are 235.5 ± 0.1°C and 19.6 ± 0.2 kJ/mol, respectively, are determined by DSC. The melting entropy and the cryoscopic constant of caffeine are calculated.
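
    As an arithmetic check of these values (derived from the abstract's own numbers, not an independently reported datum):

        \Delta_{\mathrm{fus}} S \;=\; \frac{\Delta_{\mathrm{fus}} H}{T_m} \;=\; \frac{19\,600\ \mathrm{J\,mol^{-1}}}{(235.5 + 273.15)\ \mathrm{K}} \;\approx\; 38.5\ \mathrm{J\,mol^{-1}\,K^{-1}}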

  14. 47 CFR 80.225 - Requirements for selective calling equipment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... manufacture, importation, sale or installation of non-portable DSC equipment that does not comply with either..., 2011. (5) The manufacture, importation, or sale of handheld, portable DSC equipment that does not... to establish or maintain communications provided that: (i) These signalling techniques are not used...

  15. Thermal Analysis of Plastics

    ERIC Educational Resources Information Center

    D'Amico, Teresa; Donahue, Craig J.; Rais, Elizabeth A.

    2008-01-01

    This lab experiment illustrates the use of differential scanning calorimetry (DSC) and thermal gravimetric analysis (TGA) in the measurement of polymer properties. A total of seven exercises are described. These are dry exercises: students interpret previously recorded scans. They do not perform the experiments. DSC was used to determine the…

  16. Determining the critical relative humidity at which the glassy to rubbery transition occurs in polydextrose using an automatic water vapor sorption instrument.

    PubMed

    Yuan, Xiaoda; Carter, Brady P; Schmidt, Shelly J

    2011-01-01

    Similar to an increase in temperature at constant moisture content, water vapor sorption by an amorphous glassy material at constant temperature causes the material to transition into the rubbery state. However, comparatively little research has investigated the measurement of the critical relative humidity (RHc) at which the glass transition occurs at constant temperature. Thus, the central objective of this study was to investigate the relationship between the glass transition temperature (Tg), determined using thermal methods, and the RHc obtained using an automatic water vapor sorption instrument. Dynamic dewpoint isotherms were obtained for amorphous polydextrose from 15 to 40 °C. RHc was determined using an optimized 2nd-derivative method; however, 2 simpler RHc determination methods were also tested as a secondary objective. No statistical difference was found between the 3 RHc methods. Differential scanning calorimetry (DSC) Tg values were determined using polydextrose equilibrated from 11.3% to 57.6% RH. Both standard DSC and modulated DSC (MDSC) methods were employed, since some of the polydextrose thermograms exhibited a physical aging peak. Thus, a tertiary objective was to compare Tg values obtained using 3 different methods (DSC first scan, DSC rescan, and MDSC), to determine which method(s) yielded the most accurate Tg values. In general, onset and midpoint DSC first scan and MDSC Tg values were similar, whereas onset and midpoint DSC rescan values were different. State diagrams of RHc and experimental temperature and Tg and %RH were compared. These state diagrams, though obtained via very different methods, showed relatively good agreement, confirming our hypothesis that water vapor sorption isotherms can be used to directly detect the glassy to rubbery transition. Practical Application: The food polymer science (FPS) approach, pioneered by Slade and Levine, is being successfully applied in the food industry for understanding, improving, and developing food processes and products. However, despite its extreme usefulness, the Tg, a key element of the FPS approach, remains a challenging parameter to routinely measure in amorphous food materials, especially complex materials. This research demonstrates that RHc values, obtained at constant temperature using an automatic water vapor sorption instrument, can be used to detect the glassy to rubbery transition and are similar to the Tg values obtained at constant %RH, especially considering the very different approaches of these 2 methods--a transition from surface adsorption to bulk absorption (water vapor sorption) versus a step change in the heat capacity (DSC thermal method).

  17. Serious game training improves performance in combat life-saving interventions.

    PubMed

    Planchon, Jerome; Vacher, Anthony; Comblet, Jeremy; Rabatel, Eric; Darses, Françoise; Mignon, Alexandre; Pasquier, Pierre

    2018-01-01

    In modern warfare, almost 25% of combat-related deaths are considered preventable if life-saving interventions are performed. Therefore, Tactical Combat Casualty Care (TCCC) training for soldiers is a major challenge. In 2014, the French Military Medical Service supported the development of 3D-SC1®, a serious game designed for the French TCCC program, entitled Sauvetage au Combat de niveau 1 (SC1). Our study aimed to evaluate the impact on performance of additional training with 3D-SC1®. The study assessed the performance of soldiers randomly assigned to one of two groups, before (measure 1) and after (measure 2) receiving additional training. This training involved either 3D-SC1® (Intervention group) or a DVD (Control group). The principal measure was individual performance (on a 16-point scale), assessed by two investigators during a hands-on simulation. First, the mean performance score was compared between the two measures for the Intervention and Control groups using a two-tailed paired t-test. Second, a multivariable linear regression was used to determine the difference in the impacts of 3D-SC1® and DVD training, and of the order of presentation of the two scenarios, on the mean change from baseline in performance scores. A total of 96 subjects were evaluated: seven could not be followed up, while 50 were randomly allocated to the Intervention group and 39 to the Control group. Between measure 1 and measure 2, the mean (SD) performance score increased from 9.9 (3.13) to 14.1 (1.23) and from 9.4 (2.97) to 12.5 (1.83) for the Intervention group and Control group, respectively (p<0.0001). The adjusted mean difference in performance scores between 3D-SC1® and DVD training was 1.1 (95% confidence interval -0.3, 2.5) (p=0.14). Overall, the study found that supplementing SC1 training with either 3D-SC1® or DVD improved performance, assessed by a hands-on simulation. However, our analysis did not find a statistically significant difference between the effects of these two training tools. 3D-SC1® could be an efficient and pedagogical tool to train soldiers in life-saving interventions. In the current context of terrorist threat, a specifically adapted version of 3D-SC1® may be a cost-effective and engaging way to train a large civilian public.

  18. Crystallization processes in Ge2Sb2Se4Te glass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Svoboda, Roman, E-mail: roman.svoboda@upce.cz; Bezdička, Petr; Gutwirth, Jan

    2015-01-15

    Highlights: • Crystallization kinetics of Ge2Sb2Se4Te glass was studied in dependence on particle size by DSC. • All studied fractions were described in terms of the SB autocatalytic model. • A relatively high amount of Te enhances the manifestation of bulk crystallization mechanisms. • XRD analysis of samples crystallized under different conditions showed correlation with DSC data. • XRD analysis revealed a new crystallization mechanism indistinguishable by DSC. - Abstract: Differential scanning calorimetry (DSC) and X-ray diffraction (XRD) analysis were used to study crystallization in Ge2Sb2Se4Te glass under non-isothermal conditions as a function of the particle size. The crystallization kinetics was described in terms of the autocatalytic Šesták-Berggren model. An extensive discussion of all aspects of a full-scale kinetic study of a crystallization process was undertaken. Dominance of the crystallization process originating from mechanically induced strains and heterogeneities was confirmed. Substitution of Se by Te was found to enhance the manifestation of the bulk crystallization mechanisms (at the expense of surface crystallization). The XRD analysis showed significant dependence of the crystalline structural parameters on the crystallization conditions (initial particle size of the glassy grains and applied heating rate). Based on this information, a new microstructural crystallization mechanism, indistinguishable by DSC, was proposed.

  19. Social Media Impact of the Food and Drug Administration's Drug Safety Communication Messaging About Zolpidem: Mixed-Methods Analysis

    PubMed Central

    Sinha, Michael S; Freifeld, Clark C; Brownstein, John S; Donneyong, Macarius M; Rausch, Paula; Lappin, Brian M; Zhou, Esther H; Dal Pan, Gerald J; Pawar, Ajinkya M; Hwang, Thomas J; Avorn, Jerry

    2018-01-01

    Background The Food and Drug Administration (FDA) issues drug safety communications (DSCs) to health care professionals, patients, and the public when safety issues emerge related to FDA-approved drug products. These safety messages are disseminated through social media to ensure broad uptake. Objective The objective of this study was to assess the social media dissemination of 2 DSCs released in 2013 for the sleep aid zolpidem. Methods We used the MedWatcher Social program and the DataSift historic query tool to aggregate Twitter and Facebook posts from October 1, 2012 through August 31, 2013, a period beginning approximately 3 months before the first DSC and ending 3 months after the second. Posts were categorized as (1) junk, (2) mention, and (3) adverse event (AE) based on a score ranging from –0.2 (completely unrelated) to 1 (perfectly related). We also looked at Google Trends data and Wikipedia edits for the same time period. Google Trends search volume is scaled on a range of 0 to 100 and includes “Related queries” during the relevant time periods. An interrupted time series (ITS) analysis assessed the impact of DSCs on the counts of posts with specific mention of zolpidem-containing products. Chow tests for known structural breaks were conducted on data from Twitter, Facebook, and Google Trends. Finally, Wikipedia edits were pulled from the website’s editorial history, which lists all revisions to a given page and the editor’s identity. Results In total, 174,286 Twitter posts and 59,641 Facebook posts met entry criteria. Of those, 16.63% (28,989/174,286) of Twitter posts and 25.91% (15,453/59,641) of Facebook posts were labeled as junk and excluded. AEs and mentions represented 9.21% (16,051/174,286) and 74.16% (129,246/174,286) of Twitter posts and 5.11% (3,050/59,641) and 68.98% (41,138/59,641) of Facebook posts, respectively. Total daily counts of posts about zolpidem-containing products increased on Twitter and Facebook on the day of the first DSC; Google searches increased during the week of the first DSC. ITS analyses demonstrated variability but pointed to an increase in interest around the first DSC. Chow tests were significant (P<.0001) for both DSCs on Facebook and Twitter, but only for the first DSC on Google Trends. Wikipedia edits occurred soon after each DSC release, citing news articles rather than the DSC itself and presenting content that needed subsequent revisions for accuracy. Conclusions Social media offers challenges and opportunities for dissemination of DSC messages. The FDA could consider strategies for more actively disseminating DSC safety information through social media platforms, particularly when announcements require updating. The FDA may also benefit from directly contributing content to websites like Wikipedia that are frequently accessed for drug-related information. PMID:29305342
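
    The interrupted time series analysis mentioned here is typically a segmented regression with level-change and slope-change terms at the interruption. A minimal sketch on simulated daily counts (the break day and coefficients are illustrative, not the study's estimates):

        import numpy as np

        # Simulated daily post counts with a level shift at the DSC release day.
        rng = np.random.default_rng(1)
        n, t0 = 120, 60                      # 120 days, DSC issued on day 60
        t = np.arange(n)
        d = (t >= t0).astype(float)          # post-DSC indicator
        y = 50 + 0.1 * t + 30 * d + 0.2 * (t - t0) * d + rng.normal(0, 5, n)

        # Segmented regression: baseline trend, level change, slope change.
        X = np.column_stack([np.ones(n), t, d, (t - t0) * d])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("baseline, trend, level change, slope change:", np.round(beta, 2))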

  20. Using the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, P. J.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Wallin, J. F.

    2013-01-01

    The Astrophysics Source Code Library (ASCL) is a free on-line registry of source codes that are of interest to astrophysicists; with over 500 codes, it is the largest collection of scientist-written astrophysics programs in existence. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. An advisory committee formed in 2011 provides input and guides the development and expansion of the ASCL, and since January 2012, all accepted ASCL entries are indexed by ADS. Though software is increasingly important for the advancement of science in astrophysics, these methods are still often hidden from view or difficult to find. The ASCL (ascl.net/) seeks to improve the transparency and reproducibility of research by making these vital methods discoverable, and to provide recognition and incentive to those who write and release programs useful for astrophysics research. This poster provides a description of the ASCL, an update on recent additions, and the changes in the astrophysics community we are starting to see because of the ASCL.

  1. McSKY: A hybrid Monte-Carlo line-beam code for shielded gamma skyshine calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shultis, J.K.; Faw, R.E.; Stedry, M.H.

    1994-07-01

    McSKY evaluates skyshine dose from an isotropic, monoenergetic, point photon source collimated into either a vertical cone or a vertical structure with an N-sided polygon cross section. The code assumes an overhead shield of two materials, though the user can specify zero shield thickness for an unshielded calculation. The code uses a Monte-Carlo algorithm to evaluate transport through the source shields and the integral line-beam method to describe photon transport through the atmosphere. The source energy must be between 0.02 and 100 MeV. For heavily shielded sources with energies above 20 MeV, McSKY results must be used cautiously, especially at detector locations near the source.

  2. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, including both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines the cooperation gain and the channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than that of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
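
    Cancelling girth-4 cycles amounts to ensuring that no two rows of the equivalent parity-check matrix share ones in more than one column. A small generic check along those lines (not the authors' joint design procedure):

        import numpy as np
        from itertools import combinations

        def has_girth4_cycle(H):
            """True if the binary parity-check matrix H contains a length-4 cycle,
            i.e. some pair of rows shares ones in two or more columns."""
            H = np.asarray(H, dtype=int)
            for r1, r2 in combinations(range(H.shape[0]), 2):
                if np.sum(H[r1] & H[r2]) >= 2:
                    return True
            return False

        H_bad  = [[1, 1, 0, 0],
                  [1, 1, 0, 1]]   # rows share columns 0 and 1 -> 4-cycle
        H_good = [[1, 1, 0, 0],
                  [1, 0, 1, 0]]
        print(has_girth4_cycle(H_bad), has_girth4_cycle(H_good))  # True False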

  3. Performance and Architecture Lab Modeling Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model, an executable program, is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.

  4. 47 CFR 80.179 - Unattended operation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... DSC in accordance with ITU-R Recommendation M.493-11, “Digital Selective-calling System for Use in the...., Washington, DC (Reference Information Center) or at the National Archives and Records Administration (NARA... condition related to ship safety. (3) The “ROUTINE” DSC category must be used. (4) Communications must be...

  5. 47 CFR 80.359 - Frequencies for digital selective calling (DSC).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... calling frequencies for use by authorized ship and coast stations for general purpose DSC. There are three.... The “Series A” designation includes coast stations along, and ship stations in, the Atlantic Ocean... location of the called station and propagation conditions. Acknowledgement is made on the paired frequency...

  6. 47 CFR 80.359 - Frequencies for digital selective calling (DSC).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... calling frequencies for use by authorized ship and coast stations for general purpose DSC. There are three.... The “Series A” designation includes coast stations along, and ship stations in, the Atlantic Ocean... location of the called station and propagation conditions. Acknowledgement is made on the paired frequency...

  7. 47 CFR 80.359 - Frequencies for digital selective calling (DSC).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... calling frequencies for use by authorized ship and coast stations for general purpose DSC. There are three.... The “Series A” designation includes coast stations along, and ship stations in, the Atlantic Ocean... location of the called station and propagation conditions. Acknowledgement is made on the paired frequency...

  8. Structure and Phase Transitions of Poly (Hexamethylene p,p'-Bibenzoate) as Studied by DSC and Real-Time SAXS/WAXS Employing Synchrotron Radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katerska, B.; Krasteva, M.; Perez, E.

    2007-04-23

    Real-time small- and wide-angle X-ray scattering as well as DSC studies were carried out in order to analyze the structure and phase transitions of liquid crystalline thermotropic poly(hexamethylene p,p'-bibenzoate).

  9. 77 FR 42498 - Information Collection(s) Being Submitted for Review and Approval to the Office of Management and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-19

    ...: Section 80.103, Digital Selective Calling (DSC) Operating Procedures--Maritime Mobile Identity (MMSI...: Individuals or households; business or other for-profit entities; and the Federal Government. Number of... Marine VHF radios with Digital Selective Calling (DSC) capability in this collection. The licensee...

  10. Chromaticity calculations and code comparisons for x-ray lithography source XLS and SXLS rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsa, Z.

    1988-06-16

    This note presents the chromaticity calculations and code comparison results for the x-ray lithography source (XLS; Chasman-Green, XUV Cosy lattice) and the (2-magnet, 4 T) SXLS lattices, obtained with standard beam optics codes, including SYNCH88.5, MAD6, PATRICIA88.4, PATPET88.2, DIMAD, BETA, and MARYLIE. This analysis is part of our ongoing accelerator physics code studies. 4 figs., 10 tabs.

  11. The Astrophysics Source Code Library: Where Do We Go from Here?

    NASA Astrophysics Data System (ADS)

    Allen, A.; Berriman, B.; DuPrie, K.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallin, J.

    2014-05-01

    The Astrophysics Source Code Library, started in 1999, has in the past three years grown from a repository for 40 codes to a registry of over 700 codes that are now indexed by ADS. What comes next? We examine the future of the ASCL, the challenges facing it, the rationale behind its practices, and the need to balance what we might do with what we have the resources to accomplish.

  12. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    PubMed

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code, the evaluation starts by extracting the words that make up its text and continues with building full-text search queries from the combinations of these words. The queries are then run against all the ICD-10 codes until a query returns the code in question as the match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engine to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
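
    The evaluation loop described can be sketched generically: enumerate word subsets of a code's description, query the index, and record the smallest subset that ranks the target code first. A toy Python version that substitutes a naive overlap score for a real search engine (codes and texts below are hypothetical):

        from itertools import combinations

        # Toy ICD-10-like entries (hypothetical texts, not the real code set).
        CODES = {
            "J45": "asthma",
            "J44": "chronic obstructive pulmonary disease",
            "I10": "essential primary hypertension",
        }

        def score(query_words, text):
            """Naive relevance: fraction of query words found in the entry text."""
            words = text.split()
            return sum(w in words for w in query_words) / len(query_words)

        def min_words_to_match(target):
            words = CODES[target].split()
            for k in range(1, len(words) + 1):
                for q in combinations(words, k):
                    best = max(CODES, key=lambda c: score(q, CODES[c]))
                    if best == target and score(q, CODES[target]) > 0:
                        return k, q
            return None

        print(min_words_to_match("J44"))   # smallest winning query for J44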

  13. Methods for Coding Tobacco-Related Twitter Data: A Systematic Review

    PubMed Central

    Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai

    2017-01-01

    Background As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. Objective The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Methods Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. Results E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter’s Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Conclusions Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter’s databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. PMID:28363883

  14. Image authentication using distributed source coding.

    PubMed

    Lin, Yao-Chung; Varodayan, David; Girod, Bernd

    2012-01-01

    We present a novel approach using distributed source coding for image authentication. The key idea is to provide a Slepian-Wolf encoded quantized image projection as authentication data. This version can be correctly decoded with the help of an authentic image as side information. Distributed source coding provides the desired robustness against legitimate variations while detecting illegitimate modification. The decoder incorporating expectation maximization algorithms can authenticate images which have undergone contrast, brightness, and affine warping adjustments. Our authentication system also offers tampering localization by using the sum-product algorithm.
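
    A greatly simplified sketch of the idea, under stated assumptions: the real system transmits Slepian-Wolf (LDPC) syndromes of the quantized projection and decodes them with the candidate image as side information, whereas this toy compares the quantized projection directly, which conveys the robustness idea but none of the rate savings. All parameter values are assumptions.

        import numpy as np

        rng = np.random.default_rng(seed=42)  # fixed seed so the projection is reproducible

        def auth_data(image, n_proj=64, step=8.0):
            """Quantized pseudo-random projections of the image (the 'authentication data')."""
            flat = image.astype(float).ravel()
            P = rng.standard_normal((n_proj, flat.size)) / np.sqrt(flat.size)
            return np.round(P @ flat / step), P

        def verify(candidate, q_ref, P, step=8.0, tol=1.0):
            """Accept if the candidate's projections fall near the signed quantization bins."""
            q_cand = (P @ candidate.astype(float).ravel()) / step
            return bool(np.max(np.abs(q_cand - q_ref)) <= tol)

        original = rng.integers(0, 256, size=(32, 32))
        q_ref, P = auth_data(original)
        legitimate = original + rng.normal(0, 1, original.shape)  # mild global change
        tampered = original.copy(); tampered[:8, :8] = 255        # local modification
        print(verify(legitimate, q_ref, P))  # typically True
        print(verify(tampered, q_ref, P))    # typically False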

  15. Aquarius Project: Research in the System Architecture of Accelerators for the High Performance Execution of Logic Programs.

    DTIC Science & Technology

    1991-05-31

    Extraction residue from the report's front matter; what is recoverable: Appendix F lists the source code of the C and Prolog benchmarks, Appendix G lists the source code of the Aquarius Prolog compiler, and a figure sketches the compiler pipeline from standard-form Prolog through head unraveling and transformation to kernel Prolog and symbolic execution.

  16. Opening up Architectures of Software-Intensive Systems: A Functional Decomposition to Support System Comprehension

    DTIC Science & Technology

    2007-10-01

    List-of-figures extract (original abstract not recovered): Architecture; Eclipse Java Model; Eclipse Java Model at the Source Code Level; Java Source Code.

  17. Scalable Video Transmission Over Multi-Rate Multiple Access Channels

    DTIC Science & Technology

    2007-06-01

    Citation and abstract extract: "Rate-compatible punctured convolutional codes (RCPC codes) and their applications," IEEE...source encoded using the MPEG-4 video codec. The source-encoded bitstream is then channel encoded with Rate Compatible Punctured Convolutional (RCPC...Clark, and J. M. Geist, "Punctured convolutional codes of rate (n-1)/n and simplified maximum likelihood decoding," IEEE Transactions on

  18. Comparison of TG‐43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes

    PubMed Central

    Zaker, Neda; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S.

    2016-01-01

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross‐sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross‐sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in  125I and  103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code — MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low‐energy sources such as  125I and  103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for  103Pd and 10 cm for  125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for  192Ir and less than 1.2% for  137Cs between the three codes. PACS number(s): 87.56.bg PMID:27074460
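
    To make the kind of comparison reported above concrete, a small sketch that computes percent discrepancies between tabulated radial dose function values from different codes; the gL(r) numbers below are placeholders, not the paper's data.

        import numpy as np

        r_cm = np.array([1.0, 2.0, 6.0, 10.0])  # radial distances from the source
        gl = {                                   # hypothetical gL(r) tables per code
            "MCNP4C": np.array([1.00, 0.85, 0.40, 0.18]),
            "MCNP5":  np.array([1.00, 0.86, 0.41, 0.19]),
            "MCNPX":  np.array([1.00, 0.86, 0.42, 0.19]),
        }

        ref = gl["MCNP4C"]
        for code, vals in gl.items():
            if code == "MCNP4C":
                continue
            pct = 100.0 * np.abs(vals - ref) / ref  # percent discrepancy vs. MCNP4C
            print(code, dict(zip(r_cm.tolist(), np.round(pct, 1).tolist())))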

  19. Analytical solutions by squeezing to the anisotropic Rabi model in the nonperturbative deep-strong-coupling regime

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-Yu; Chen, Xiang-You

    2017-12-01

    An unexplored nonperturbative deep-strong-coupling (npDSC) regime achieved in superconducting circuits has been studied in the anisotropic Rabi model by the generalized squeezing rotating-wave approximation. Energy levels are evaluated analytically from the reformulated Hamiltonian and agree well with numerical ones over a wide range of coupling strengths. This improvement is ascribed to deformation effects in the displaced-squeezed state, captured by the squeezed momentum variance and omitted in previous displaced-state treatments. The atom population dynamics confirms the validity of our approach at npDSC strengths. Our approach offers the possibility to explore interesting phenomena analytically in the npDSC regime in qubit-oscillator experiments.

  20. Summary of Results from the Mars Phoenix Lander's Thermal Evolved Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Sutter, B.; Ming, D. W.; Boynton, W. V.; Niles, P. B.; Hoffman, J.; Lauer, H. V.; Golden, D. C.

    2009-01-01

    The Mars Phoenix Scout Mission with its diverse instrument suite successfully examined several soils on the northern plains of Mars. The Thermal and Evolved Gas Analyzer (TEGA) was employed to detect evolved volatiles and organic and inorganic materials by coupling a differential scanning calorimeter (DSC) with a magnetic-sector mass spectrometer (MS) that can detect masses in the 2 to 140 dalton range [1]. Five Martian soils were individually heated to 1000 C in the DSC ovens, where evolved gases from mineral decomposition products were examined with the MS. TEGA's DSC has the capability to detect endothermic and exothermic reactions during heating that are characteristic of minerals present in the Martian soil.

  1. Coding conventions and principles for a National Land-Change Modeling Framework

    USGS Publications Warehouse

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.

  2. ROS Hexapod

    NASA Technical Reports Server (NTRS)

    Davis, Kirsch; Bankieris, Derek

    2016-01-01

    As an intern project for NASA Johnson Space Center (JSC), my job was to familiarize myself with and operate the Robot Operating System (ROS). The project outcome converted existing software assets into ROS nodes, enabling a robotic Hexapod to communicate with, and be controlled by, an existing PlayStation 3 (PS3) controller. When the internship started, the existing control algorithms and libraries in the Hexapod C++ source code had no ROS capabilities, but that changed over the course of my internship. Converting the C++ code made it compatible with ROS, and the Hexapod is now controlled using an existing PS3 controller. Furthermore, my job was to design ROS messages and script programs that enabled assets to participate in the ROS ecosystem by subscribing to and publishing messages. The software source code is written in C++ and organized in directories. Testing of software assets included compiling the code within the Linux environment using a terminal, which ran the code from a directory. Several compilation problems occurred, so the source code was modified until it compiled cleanly. Once the code compiled and ran, it was uploaded to the Hexapod, which was then controlled by a PS3 controller. The project outcome has the Hexapod fully functional and compatible with ROS, operating under the PlayStation 3 controller. In addition, an open-source Arduino board, with its IDE, will be integrated into the ecosystem, with circuitry designed on a breadboard to add behavior through push buttons, potentiometers, and other simple electrical elements. Other Arduino projects will include a GPS module and a digital clock that uses signals from 22 satellites, received through an internal patch antenna, to show accurate real time. This internship experience has also motivated me to learn to code more efficiently and effectively so I can write, subscribe to, and publish my own source code in different programming languages. Familiarity with software programming will enhance my skills in the electrical engineering field. My experience at JSC with the Simulation and Graphics Branch (ER7) has pushed me to become more proficient at coding, to increase my knowledge of software programming, and to sharpen my skills in ROS. I will take this knowledge back to my university to implement coding in a school project that will use source code and ROS to work on the PR2 robot, which is controlled by ROS software. The skills I learned here will be used to write ROS messages that subscribe and publish to a PR2 robot, controlling it with an existing PS3 controller by changing the C++ code to subscribe to and publish ROS messages. Overall, the skills I obtained here will not be lost, but increased.
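
    A minimal rospy sketch of the subscribe/publish pattern described above; the topic names and the joystick-to-velocity mapping are illustrative assumptions, since the actual Hexapod interfaces are in C++ and are not published here.

        import rospy
        from sensor_msgs.msg import Joy
        from geometry_msgs.msg import Twist

        def joy_callback(msg):
            # Map the left stick of a PS3 pad (axes 0 and 1) to a velocity command.
            cmd = Twist()
            cmd.linear.x = msg.axes[1]
            cmd.angular.z = msg.axes[0]
            cmd_pub.publish(cmd)

        rospy.init_node("hexapod_teleop")
        cmd_pub = rospy.Publisher("/hexapod/cmd_vel", Twist, queue_size=10)
        rospy.Subscriber("/joy", Joy, joy_callback)
        rospy.spin()  # hand control to ROS; the callback fires as Joy messages arrive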

  3. Solder doped polycaprolactone scaffold enables reproducible laser tissue soldering.

    PubMed

    Bregy, Amadé; Bogni, Serge; Bernau, Vianney J P; Vajtai, Istvan; Vollbach, Felix; Petri-Fink, Alke; Constantinescu, Mihai; Hofmann, Heinrich; Frenz, Martin; Reinert, Michael

    2008-12-01

    In this in vitro feasibility study we analyzed tissue fusion using bovine serum albumin (BSA) and indocyanine green (ICG) doped polycaprolactone (PCL) scaffolds in combination with a diode laser as the energy source, focusing on the influence of irradiation power and albumin concentration on the resulting tensile strength and induced tissue damage. A porous PCL scaffold doped with either 25% or 40% (w/w) BSA in combination with 0.1% (w/w) ICG was used to fuse rabbit aortas. Soldering energy was delivered through the vessel from the endoluminal side using a continuous-wave diode laser at 808 nm via a 400 microm core fiber. Scaffold surface temperatures were analyzed with an infrared camera. Optimum parameters such as irradiation time, radiation power and temperature were determined in view of maximum tensile strength with simultaneously minimal thermally induced tissue damage. Differential scanning calorimetry (DSC) was performed to measure the influence of PCL on the denaturation temperature of BSA. Optimum parameter settings were found to be 60 seconds irradiation time and 1.5 W irradiation power, resulting in tensile strengths of around 2,000 mN. The corresponding scaffold surface temperature was 117.4 +/- 12 degrees C. Comparison of the two BSA concentrations revealed that the 40% BSA scaffold resulted in significantly higher tensile strength than the 25% scaffold. At optimum parameter settings, thermal damage was restricted to the adventitia and its interface with the outermost layer of the tunica media. The DSC showed two endothermic peaks in BSA-containing samples, both strongly depending on the water content and the presence of PCL and/or ICG. Diode laser soldering of vascular tissue using BSA-ICG-PCL scaffolds leads to strong and reproducible tissue bonds, with vessel damage limited to the adventitia. Higher BSA content results in higher tensile strengths. The DSC measurements showed that the BSA denaturation temperature is lowered by the addition of water and/or ICG-PCL. (c) 2008 Wiley-Liss, Inc.

  4. Runtime Detection of C-Style Errors in UPC Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pirkelbauer, P; Liao, C; Panas, T

    2011-09-29

    Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions to each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.

  5. Drill Sergeant Candidate Transformation

    DTIC Science & Technology

    2009-02-01

    Extract (original abstract not recovered): ARI examined the leadership styles of NCOs entering Drill Sergeant School (DSS) and developed and administered a prototype DS Assessment Battery to assess preferred leadership styles. DSS training increases both the degree to which the drill sergeant candidate (DSC) feels obligated to and identifies with the Army.

  6. Structural basis of host recognition and biofilm formation by Salmonella Saf pili

    PubMed Central

    2017-01-01

    Pili are critical in host recognition, colonization and biofilm formation during bacterial infection. Here, we report the crystal structures of SafD-dsc and SafD-SafA-SafA (SafDAA-dsc) in Saf pili. Cell adherence assays show that SafD and SafA are both required for host recognition, suggesting a poly-adhesive mechanism for Saf pili. Moreover, the SafDAA-dsc structure, as well as SAXS characterization, reveals an unexpected inter-molecular oligomerization, prompting the investigation of Saf-driven self-association in biofilm formation. Bead/cell aggregation and biofilm formation assays were used to demonstrate this novel function of Saf pili. Structure-based mutants targeting the inter-molecular hydrogen bonds and complementary architecture/surfaces in SafDAA-dsc dimers significantly impaired Saf self-association activity and biofilm formation. In summary, our results identify two novel functions of Saf pili: poly-adhesive and self-associating activities. More importantly, the Saf-Saf structures and functional characterizations help to define a pili-mediated inter-cellular oligomerization mechanism for bacterial aggregation, colonization and ultimate biofilm formation. PMID:29125121

  7. Study on Synthesis of Thoreau-modified 3, 5-Dimethyl-Thioltoluenediamine Used as Epoxy Resin Curing Agent and Its Performance

    NASA Astrophysics Data System (ADS)

    Peng, Yongli; Xiao, Wenzheng

    2017-06-01

    A novel curing agent, Thoreau-modified 3,5-dimethyl-thioltoluenediamine (modified DMTDA), was synthesized and its molecular structure was characterized by FTIR and DSC. The curing kinetics of a high-toughness, low-volume-shrinkage epoxy system (modified DMTDA/DGEBA) was studied by differential scanning calorimetry (DSC) under nonisothermal conditions. The data were fitted to an nth-order model and an autocatalytic model, respectively. The results indicate that the nth-order model deviates significantly from the experimental data. Malik's method was used to show that the curing kinetics of the system follow a single-step autocatalytic model, and a "single-point model-free" approach was employed to calculate meaningful kinetic parameters. The DSC curves derived from the autocatalytic model agree satisfactorily with experiment in the range 5 K/min to 25 K/min. As the heating rate increased, the predicted DSC curves deviated from the experimental curves, and the total exothermic enthalpy declined owing to the shifting competition between kinetic control and diffusion control.

  8. Thermal analysis of hydroxypropylmethylcellulose and methylcellulose: powders, gels and matrix tablets.

    PubMed

    Ford, J L

    1999-03-15

    This review focuses on the thermal analysis of hydroxypropylmethylcellulose (HPMC) and methylcellulose. Differential scanning calorimetry (DSC) of their powders is used to determine temperatures of moisture loss (in conjunction with thermogravimetric analysis) and glass transition temperatures; however, sample preparation and encapsulation affect the values obtained. The interaction of these cellulose ethers with water is evaluated by DSC: water is added to the powder directly in DSC pans, or preformed gels can be evaluated. Data quality depends on previous thermal history, but estimates of the quantity of water bound to the polymers may be made. Water uptake by cellulose ethers may be evaluated by the use of polymeric wafers, following the loss of free water, over a series of timed curves, into wafers in contact with water. Cloud points, which assess the reduction of polymer solubility with increasing temperature, may be assessed spectrophotometrically. DSC and rheometric studies are used to follow thermogelation, a process involving hydrophobic interaction between partly hydrated polymeric chains. The advantages and disadvantages of the various methodologies are highlighted.

  9. Effect of the diffusion parameters on the observed γ-ray spectrum of sources and their contribution to the local all-electron spectrum: The EDGE code

    NASA Astrophysics Data System (ADS)

    López-Coto, R.; Hahn, J.; BenZvi, S.; Dingus, B.; Hinton, J.; Nisa, M. U.; Parsons, R. D.; Greus, F. Salesa; Zhang, H.; Zhou, H.

    2018-11-01

    The positron excess measured by PAMELA and AMS can only be explained if one or several sources are injecting positrons. Moreover, at the highest energies, it requires the presence of nearby (∼hundreds of parsecs) and middle-aged (up to ∼hundreds of kyr) sources. Pulsars, as factories of electrons and positrons, are one of the proposed candidates to explain the origin of this excess. To calculate the contribution of these sources to the electron and positron flux at the Earth, we developed EDGE (Electron Diffusion and Gamma rays to the Earth), a code that treats the propagation of electrons and computes their diffusion from a central source with a flexible injection spectrum. Using this code, we can derive the source's gamma-ray spectrum and spatial extension, the all-electron density in space, the electron and positron flux reaching the Earth, and the positron fraction measured at the Earth. We present in this paper the foundations of the code and study how different parameters affect the gamma-ray spectrum of a source and the electron flux measured at the Earth. We also study the effect of several approximations usually performed in these studies. This code has been used to derive the positron flux measured at the Earth in [1].
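
    A simplified sketch of the burst-injection diffusion that underlies such calculations, ignoring the energy losses and extended injection histories that EDGE itself handles; the diffusion normalization D0 and index delta are assumed values typical of the literature.

        import numpy as np

        KPC_CM = 3.086e21  # kiloparsec in centimeters

        def diff_coeff(E_GeV, D0=4e28, delta=0.4):
            """Diffusion coefficient in cm^2/s, power law in energy."""
            return D0 * E_GeV**delta

        def electron_density(r_kpc, t_s, E_GeV, N0=1.0):
            """Green's function of the diffusion equation for an instantaneous point
            source: n = N0 * exp(-(r/r_d)^2) / (pi^(3/2) * r_d^3), r_d = 2*sqrt(D*t)."""
            r_d = 2.0 * np.sqrt(diff_coeff(E_GeV) * t_s)  # diffusion radius, cm
            r = r_kpc * KPC_CM
            return N0 * np.exp(-((r / r_d) ** 2)) / (np.pi**1.5 * r_d**3)

        # Density of 100 GeV electrons 0.25 kpc from a 100 kyr old burst:
        print(electron_density(0.25, 100e3 * 3.15e7, 100.0))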

  10. Cerebral perfusion alterations in epileptic patients during peri-ictal and post-ictal phase: PASL vs DSC-MRI.

    PubMed

    Pizzini, Francesca B; Farace, Paolo; Manganotti, Paolo; Zoccatelli, Giada; Bongiovanni, Luigi G; Golay, Xavier; Beltramello, Alberto; Osculati, Antonio; Bertini, Giuseppe; Fabene, Paolo F

    2013-07-01

    Non-invasive pulsed arterial spin labeling (PASL) MRI is a method to study brain perfusion that does not require the administration of a contrast agent, which makes it a valuable diagnostic tool as it reduces cost and side effects. The purpose of the present study was to establish the viability of PASL as an alternative to dynamic susceptibility contrast (DSC-MRI) and other perfusion imaging methods in characterizing changes in perfusion patterns caused by seizures in epileptic patients. We evaluated 19 patients with PASL. Of these, the 9 affected by high-frequency seizures were observed during the peri-ictal period (within 5 hours of the last seizure), while the 10 patients affected by low-frequency seizures were observed in the post-ictal period. For comparison, 17/19 patients were also evaluated with DSC-MRI and CBF/CBV. PASL imaging showed focal vascular changes, which allowed the classification of patients into three categories: 8 patients characterized by increased perfusion, 4 patients with normal perfusion and 7 patients with decreased perfusion. PASL perfusion imaging findings were comparable to those obtained by DSC-MRI. Since PASL is (a) sensitive to vascular alterations induced by epileptic seizures, (b) comparable to DSC-MRI for detecting perfusion asymmetries, and (c) potentially capable of detecting time-related perfusion changes, it can be recommended for repeated evaluations, to identify the epileptic focus, and in follow-up and/or therapy-response assessment. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. The Diabetes Initiative of South Carolina Celebrates Over 20 Years of Professional Diabetes Education.

    PubMed

    Hermayer, Kathie L

    2016-04-01

    Diabetes is a major public health problem in South Carolina; however, the Diabetes Initiative of South Carolina (DSC) provides a realistic mechanism to address issues on a statewide basis. The Diabetes Center of Excellence in the DSC provides oversight for developing and supervising professional education programs for health care workers of all types in South Carolina to increase their knowledge and ability to care for people with diabetes. The DSC has developed many programs for the education of a variety of health professionals about diabetes and its complications. The DSC has sponsored 21 Annual Diabetes Fall Symposia for primary health care professionals featuring education regarding many aspects of diabetes mellitus. The intent of the program is to enhance the lifelong learning process of physicians, advanced practice providers, nurses, pharmacists, dietitians, laboratorians and other health care professionals, by providing educational opportunities and to advance the quality and safety of patient care. The symposium is an annual 2-day statewide program that supplies both a comprehensive diabetes management update to all primary care professionals and an opportunity for attendees to obtain continuing education credits at a low cost. The overarching goal of the DSC is that the programs it sponsors and the development of new targeted initiatives will lead to continuous improvements in the care of people at risk and with diabetes along with a decrease in morbidity, mortality and costs of diabetes and its complications in South Carolina and elsewhere. Published by Elsevier Inc.

  12. Measuring the glass transition temperature of EPDM roofing materials: Comparison of DMA, TMA, and DSC techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paroli, R.M.; Penn, J.

    1994-09-01

    Two ethylene-propylene-diene monomer (EPDM) roofing membranes were aged at 100 C for 7 and 28 days. The Tg of these membranes was then determined by dynamic mechanical analysis (DMA), thermomechanical analysis (TMA), and differential scanning calorimetry (DSC) and the results compared. It was found that: (1) Tg data can be obtained easily using the DMA and TMA techniques; the DSC method requires greater care due to the broad step change in the baseline which is associated with heavily plasticized materials. (2) The closest correspondence between techniques was for TMA and DSC (half-height); the latter, within experimental error, yielded the same glass transition temperature before and after heat-aging. (3) The peak maxima associated with tan delta and E'' measurements should be cited with Tg values, as significant differences can exist. (4) The Tg(E'') values were closer to the Tg(TMA) and Tg(DSC) data than were the Tg(tan delta) values. Data obtained at 1 Hz (or possibly less) should be used when making comparisons based on various techniques. An assessment of Tg values indicated that the EPDM 112 roofing membrane is more stable than the EPDM 111 membrane. The Tg for EPDM 112 did not change significantly with heat-aging for 28 days at 130 C.

  13. Designated Stroke Center Status and Hospital Characteristics as Predictors of In-Hospital Mortality among Hemorrhagic Stroke Patients in New York, 2008-2012.

    PubMed

    Gatollari, Hajere J; Colello, Anna; Eisenberg, Bonnie; Brissette, Ian; Luna, Jorge; Elkind, Mitchell S V; Willey, Joshua Z

    2017-01-01

    Although designated stroke centers (DSCs) improve the quality of care and clinical outcomes for ischemic stroke patients, less is known about the benefits of DSCs for patients with intracerebral hemorrhage (ICH) and subarachnoid hemorrhage (SAH). Compared to non-DSCs, hospitals with DSC status have lower in-hospital mortality rates for hemorrhagic stroke patients. We hypothesized that these effects would be sustained over time after adjusting for hospital-level characteristics, including hospital size, urban location, and teaching status. We evaluated ICH (International Classification of Diseases, Ninth Revision; ICD-9: 431) and SAH (ICD-9: 430) hospitalizations documented in the 2008-2012 New York State Department of Health Statewide Planning and Research Cooperative System inpatient sample database. Generalized estimating equation logistic regression was used to evaluate the association between DSC status and in-hospital mortality. We calculated ORs and 95% CIs adjusted for clustering of patients within facilities, other hospital characteristics, and individual-level characteristics. Planned secondary analyses explored other hospital characteristics associated with in-hospital mortality. In 6,352 ICH and 3,369 SAH patients in the study sample, in-hospital mortality was higher among those with ICH compared to SAH (23.7 vs. 18.5%). Unadjusted analyses revealed that DSC status was related to reduced mortality for both ICH (OR 0.7, 95% CI 0.5-0.8) and SAH patients (OR 0.4, 95% CI 0.3-0.7). DSC remained a significant predictor of lower in-hospital mortality for SAH patients (OR 0.6, 95% CI 0.3-0.9) but not for ICH patients (OR 0.8, 95% CI 0.6-1.0) after adjusting for patient demographic characteristics, comorbidities, hospital size, teaching status and location. Admission to a DSC was independently associated with reduced in-hospital mortality for SAH patients but not for those with ICH. Other patient and hospital characteristics may explain the benefits of DSC status on outcomes after ICH. For conditions with clear treatments, such as ischemic stroke and SAH, being treated in a DSC improves outcomes; this trend was not observed for stroke types, such as ICH, that lack clear treatment guidelines. Identifying hospital-level factors associated with ICH and SAH outcomes represents a means to identify and improve gaps in stroke systems of care. © 2016 S. Karger AG, Basel.

  14. Admiralty Inlet Advanced Turbulence Measurements: final data and code archive

    DOE Data Explorer

    Kilcher, Levi (ORCID:0000000183851131); Thomson, Jim (ORCID:0000000289290088); Harding, Samuel

    2011-02-01

    Data and code used in Kilcher, Thomson, Harding, and Nylund (2017), "Turbulence Measurements from Compliant Moorings - Part II: Motion Correction," doi: 10.1175/JTECH-D-16-0213.1, that are not already available in a public location. The links point to the Python source code used in the publication; all other files are source data used in the publication.

  15. Numerical Electromagnetic Code (NEC)-Basic Scattering Code. Part 2. Code Manual

    DTIC Science & Technology

    1979-09-01

    Manual extract (original abstract not recovered): describes the imaging of source axes for a magnetic source; the arrays VSOURC and VIMAG hold the x, y, and z components defining the source and source-image coordinate-system axes, and VNC holds the x, y, and z components of the end-cap unit normal.

  16. Java Source Code Analysis for API Migration to Embedded Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winter, Victor; McCoy, James A.; Guerrero, Jonathan

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java's APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  17. Social Media Impact of the Food and Drug Administration's Drug Safety Communication Messaging About Zolpidem: Mixed-Methods Analysis.

    PubMed

    Sinha, Michael S; Freifeld, Clark C; Brownstein, John S; Donneyong, Macarius M; Rausch, Paula; Lappin, Brian M; Zhou, Esther H; Dal Pan, Gerald J; Pawar, Ajinkya M; Hwang, Thomas J; Avorn, Jerry; Kesselheim, Aaron S

    2018-01-05

    The Food and Drug Administration (FDA) issues drug safety communications (DSCs) to health care professionals, patients, and the public when safety issues emerge related to FDA-approved drug products. These safety messages are disseminated through social media to ensure broad uptake. The objective of this study was to assess the social media dissemination of 2 DSCs released in 2013 for the sleep aid zolpidem. We used the MedWatcher Social program and the DataSift historic query tool to aggregate Twitter and Facebook posts from October 1, 2012 through August 31, 2013, a period beginning approximately 3 months before the first DSC and ending 3 months after the second. Posts were categorized as (1) junk, (2) mention, and (3) adverse event (AE) based on a score ranging from -0.2 (completely unrelated) to 1 (perfectly related). We also looked at Google Trends data and Wikipedia edits for the same time period. Google Trends search volume is scaled on a range of 0 to 100 and includes "Related queries" during the relevant time periods. An interrupted time series (ITS) analysis assessed the impact of DSCs on the counts of posts with specific mention of zolpidem-containing products. Chow tests for known structural breaks were conducted on data from Twitter, Facebook, and Google Trends. Finally, Wikipedia edits were pulled from the website's editorial history, which lists all revisions to a given page and the editor's identity. In total, 174,286 Twitter posts and 59,641 Facebook posts met entry criteria. Of those, 16.63% (28,989/174,286) of Twitter posts and 25.91% (15,453/59,641) of Facebook posts were labeled as junk and excluded. AEs and mentions represented 9.21% (16,051/174,286) and 74.16% (129,246/174,286) of Twitter posts, and 5.11% (3,050/59,641) and 68.98% (41,138/59,641) of Facebook posts, respectively. Total daily counts of posts about zolpidem-containing products increased on Twitter and Facebook on the day of the first DSC; Google searches increased in the week of the first DSC. ITS analyses demonstrated variability but pointed to an increase in interest around the first DSC. Chow tests were significant (P<.0001) for both DSCs on Facebook and Twitter, but only for the first DSC on Google Trends. Wikipedia edits occurred soon after each DSC release, citing news articles rather than the DSC itself and presenting content that needed subsequent revisions for accuracy. Social media offers challenges and opportunities for dissemination of DSC messages. The FDA could consider strategies for more actively disseminating DSC safety information through social media platforms, particularly when announcements require updating. The FDA may also benefit from directly contributing content to websites like Wikipedia that are frequently accessed for drug-related information. ©Michael S Sinha, Clark C Freifeld, John S Brownstein, Macarius M Donneyong, Paula Rausch, Brian M Lappin, Esther H Zhou, Gerald J Dal Pan, Ajinkya M Pawar, Thomas J Hwang, Jerry Avorn, Aaron S Kesselheim. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 05.01.2018.
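
    A minimal sketch of a Chow test for a structural break at a DSC release date, using plain least squares on daily post counts; the data here are synthetic and the two-parameter linear trend model is an assumption.

        import numpy as np

        def ssr(t, y):
            """Sum of squared residuals of an OLS fit y ~ a + b*t."""
            X = np.column_stack([np.ones_like(t), t])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            return resid @ resid

        def chow_stat(t, y, break_idx, k=2):
            """Chow F statistic for a single known break point (k = parameters per fit)."""
            s_pooled = ssr(t, y)
            s1 = ssr(t[:break_idx], y[:break_idx])
            s2 = ssr(t[break_idx:], y[break_idx:])
            n = len(y)
            return ((s_pooled - (s1 + s2)) / k) / ((s1 + s2) / (n - 2 * k))

        t = np.arange(120, dtype=float)  # 120 days around a hypothetical DSC at day 60
        rng = np.random.default_rng(0)
        y = np.where(t < 60, 50 + 0.1 * t, 90 + 0.3 * t) + rng.normal(0, 3, 120)
        print(chow_stat(t, y, 60))       # a large F suggests a break at day 60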

  18. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  19. Thermal and dynamic mechanical properties of hydroxypropyl cellulose films

    Treesearch

    Timothy G. Rials; Wolfgang G. Glasser

    1988-01-01

    Differential scanning calorimetry (DSC) and dynamic mechanical thermal analysis (DMTA) were used to characterize the morphology of slovent cast hydroxypropyl cellulose (HPC) films. DSC results were indicative of a semicrystalline material with a melt of 220°C and a glass transition at 19°C (T1), although an additional event was suggested by a...

  20. The Structure of Mother-Child Play: Young Children with Down Syndrome and Typically Developing Children.

    ERIC Educational Resources Information Center

    Roach, Mary A.; Barratt, Marguerite Stevenson; Miller, Jon F.; Leavitt, Lewis A.

    1998-01-01

    Compared mothers' play with infants with Down syndrome (DSC) and typically developing children (TDC) matched for mental or chronological age. Found that TDC mothers exhibited more object demonstrations with their developmentally younger children, who showed less object play. DSC mothers were more directive and supportive than mothers of younger…

  1. Among the Few at Deep Springs College: Assessing a Seven-Decade Experiment in Liberal Education.

    ERIC Educational Resources Information Center

    Newell, L. Jackson

    1982-01-01

    Describes the origins and characteristics of Deep Springs College (DSC), which since 1917 has teamed liberal arts instruction with the physical labor of running a cattle ranch. Uses alumni survey responses to assess the long-term effects of attending DSC. Examines paradoxes inherent in the school and its future prospects. (DMM)

  2. 47 CFR 80.1087 - Ship radio equipment-Sea area A1.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... which the ship is normally navigated, operating either: (1) On VHF using DSC; or (2) Through the polar...; or (4) On HF using DSC; or (5) Through the INMARSAT geostationary satellite service if within... communication. (b) The VHF radio installation, required by § 80.1085(a)(1), must also be capable of transmitting...

  3. 47 CFR 80.1087 - Ship radio equipment-Sea area A1.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... which the ship is normally navigated, operating either: (1) On VHF using DSC; or (2) Through the polar...; or (4) On HF using DSC; or (5) Through the INMARSAT geostationary satellite service if within... communication. (b) The VHF radio installation, required by § 80.1085(a)(1), must also be capable of transmitting...

  4. 47 CFR 80.103 - Digital selective calling (DSC) operating procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... DSC “Acknowledgment of distress calls” and “Distress relays.” (See subpart W of this part.) (d) Group calls to vessels under the common control of a single entity are authorized. A group call identity may... (ITU), Place des Nations, CH-1211 Geneva 20, Switzerland. [68 FR 46961, Aug. 7, 2003, as amended at 73...

  5. 47 CFR 80.103 - Digital selective calling (DSC) operating procedures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... DSC “Acknowledgment of distress calls” and “Distress relays.” (See subpart W of this part.) (d) Group calls to vessels under the common control of a single entity are authorized. A group call identity may... (ITU), Place des Nations, CH-1211 Geneva 20, Switzerland. [68 FR 46961, Aug. 7, 2003, as amended at 73...

  6. Synthesis of plutonium trifluoride by hydro-fluorination and novel thermodynamic data for the PuF3-LiF system

    NASA Astrophysics Data System (ADS)

    Tosolin, A.; Souček, P.; Beneš, O.; Vigier, J.-F.; Luzzi, L.; Konings, R. J. M.

    2018-05-01

    PuF3 was synthesized by hydro-fluorination of PuO2 and subsequent reduction of the product by hydrogenation. The obtained PuF3 was analysed by X-ray diffraction (XRD) and found phase-pure. High purity was also confirmed by melting-point analysis using differential scanning calorimetry (DSC). The PuF3 was then used for a thermodynamic assessment of the PuF3-LiF system. Phase equilibrium points and the enthalpy of fusion of the eutectic composition were measured by DSC. XRD analyses of selected samples after the DSC measurements confirm that after solidification from the liquid, the system returns to a mixture of LiF and PuF3.

  7. Dynamic Synchronous Capture Algorithm for an Electromagnetic Flowmeter.

    PubMed

    Fanjiang, Yong-Yi; Lu, Shih-Wei

    2017-04-10

    This paper proposes a dynamic synchronous capture (DSC) algorithm to calculate the flow rate for an electromagnetic flowmeter. The DSC algorithm accurately calculates the flow rate signal and efficiently converts the analog signal, improving the execution performance of a microcontroller unit (MCU). Furthermore, it reduces interference from abnormal noise and is extremely steady and independent of fluctuations in the flow measurement. Moreover, it calculates the current flow rate signal immediately (m/s). The DSC algorithm can be applied to current general MCU firmware platforms without using DSP (digital signal processing) or a high-speed, high-end MCU platform, and signal amplification by hardware reduces the demand for ADC accuracy, which reduces cost.
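
    An illustrative sketch of the synchronous-capture principle: the electrode voltage is demodulated against the square-wave coil excitation so that drift and asynchronous noise average out. This is a generic textbook scheme, not the paper's firmware; the field strength B, electrode spacing d, and all signal values are assumptions.

        import numpy as np

        B, d = 0.01, 0.05  # magnetic field (T) and electrode spacing (m), assumed
        n = 1000           # ADC samples per excitation period
        excitation = np.sign(np.sin(2 * np.pi * np.arange(n) / n))  # +/-1 square wave

        def flow_velocity(adc_samples):
            """Demodulate one excitation period into a velocity (m/s).
            Faraday's law gives an induced EMF E = B*d*v, so v = E/(B*d)."""
            emf = np.mean(adc_samples * excitation)  # synchronous (lock-in) average
            return emf / (B * d)

        v_true = 1.5                                          # m/s, assumed
        signal = B * d * v_true * excitation                  # ideal electrode EMF
        noise = np.random.default_rng(1).normal(0, 1e-4, n)   # asynchronous noise
        print(flow_velocity(signal + noise))                  # close to 1.5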

  8. Neural network-based adaptive dynamic surface control for permanent magnet synchronous motors.

    PubMed

    Yu, Jinpeng; Shi, Peng; Dong, Wenjie; Chen, Bing; Lin, Chong

    2015-03-01

    This brief considers the problem of neural network (NN)-based adaptive dynamic surface control (DSC) for permanent magnet synchronous motors (PMSMs) with parameter uncertainties and load torque disturbance. First, NNs are used to approximate the unknown nonlinear functions of the PMSM drive system, and a novel adaptive DSC is constructed to avoid the explosion of complexity in the backstepping design. Next, under the proposed adaptive neural DSC, the number of adaptive parameters required is reduced to only one, and the designed neural controller's structure is much simpler than in some existing results in the literature, which can guarantee that the tracking error converges to a small neighborhood of the origin. Then, simulations are given to illustrate the effectiveness and potential of the new design technique.
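
    The core dynamic-surface trick the brief builds on can be shown in a few lines: the virtual control is passed through a first-order filter, so its derivative comes from the filter state instead of repeated analytic differentiation (the "explosion of complexity"). The double-integrator plant, gains, and reference below are illustrative stand-ins; the paper's actual plant is a PMSM with NN approximation of its nonlinearities.

        import numpy as np

        k1, k2, tau, dt = 2.0, 5.0, 0.02, 1e-3  # gains, filter constant, step (assumed)
        x1, x2, beta = 0.0, 0.0, 0.0
        for step in range(int(5.0 / dt)):
            t = step * dt
            yr, yr_dot = np.sin(t), np.cos(t)   # reference trajectory
            e1 = x1 - yr
            alpha = -k1 * e1 + yr_dot           # virtual control for x2
            beta_dot = (alpha - beta) / tau     # filter supplies d(beta)/dt cheaply
            e2 = x2 - beta
            u = -k2 * e2 + beta_dot             # actual control input
            # stand-in plant (double integrator): x1' = x2, x2' = u
            x1, x2 = x1 + dt * x2, x2 + dt * u
            beta = beta + dt * beta_dot
        print(abs(x1 - np.sin(5.0)))            # small residual tracking error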

  9. Dynamic Synchronous Capture Algorithm for an Electromagnetic Flowmeter

    PubMed Central

    Fanjiang, Yong-Yi; Lu, Shih-Wei

    2017-01-01

    This paper proposes a dynamic synchronous capture (DSC) algorithm to calculate the flow rate for an electromagnetic flowmeter. The DSC algorithm accurately calculates the flow rate signal and efficiently converts the analog signal, improving the execution performance of a microcontroller unit (MCU). Furthermore, it reduces interference from abnormal noise and is extremely steady and independent of fluctuations in the flow measurement. Moreover, it calculates the current flow rate signal immediately (m/s). The DSC algorithm can be applied to current general MCU firmware platforms without using DSP (digital signal processing) or a high-speed, high-end MCU platform, and signal amplification by hardware reduces the demand for ADC accuracy, which reduces cost. PMID:28394306

  10. Differences in the causes of death of HIV-positive patients in a cohort study by data sources and coding algorithms.

    PubMed

    Hernando, Victoria; Sobrino-Vegas, Paz; Burriel, M Carmen; Berenguer, Juan; Navarro, Gemma; Santos, Ignacio; Reparaz, Jesús; Martínez, M Angeles; Antela, Antonio; Gutiérrez, Félix; del Amo, Julia

    2012-09-10

    To compare causes of death (CoDs) from two independent sources, the National Basic Death File (NBDF) and deaths reported to the Spanish HIV Research cohort [Cohorte de adultos con infección por VIH de la Red de Investigación en SIDA (CoRIS)], and to compare two coding algorithms: the International Classification of Diseases, 10th revision (ICD-10) and the revised version of Coding Causes of Death in HIV (revised CoDe). Between 2004 and 2008, CoDs were obtained from the cohort records (free text, multiple causes) and also from the NBDF (ICD-10). CoDs from CoRIS were coded according to ICD-10 and revised CoDe by a panel. Deaths were compared across 13 disease groups: HIV/AIDS, liver diseases, malignancies, infections, cardiovascular, blood disorders, pulmonary, central nervous system, drug use, external, suicide, other causes and ill defined. There were 160 deaths. Concordance for the 13 groups was observed in 111 (69%) cases for the two sources and in 115 (72%) cases for the two coding algorithms. According to revised CoDe, the commonest CoDs were HIV/AIDS (53%), non-AIDS malignancies (11%) and liver-related (9%); these percentages were similar, 57%, 10% and 8%, respectively, for NBDF (coded as ICD-10). When using ICD-10 to code deaths in CoRIS, wherein HIV infection was known in everyone, the proportion of non-AIDS malignancies was 13% and liver-related causes accounted for 3%, while HIV/AIDS reached 70% owing to liver-related, infectious and ill-defined causes being coded as HIV/AIDS. There is substantial variation in CoDs in HIV-infected persons according to sources and algorithms. ICD-10 in patients known to be HIV-positive overestimates HIV/AIDS-related deaths at the expense of underestimating liver-related diseases, infections and ill-defined causes. CoDe seems to be the best option for cohort studies.

  11. Source Methodology for Turbofan Noise Prediction (SOURCE3D Technical Documentation)

    NASA Technical Reports Server (NTRS)

    Meyer, Harold D.

    1999-01-01

    This report provides the analytical documentation for the SOURCE3D Rotor Wake/Stator Interaction Code. It derives the equations for the rotor scattering coefficients and the stator source vector and scattering coefficients that are needed for use in TFANS (Theoretical Fan Noise Design/Prediction System). SOURCE3D treats the rotor and stator as isolated source elements. TFANS uses this information, along with scattering coefficients for inlet and exit elements, and provides complete noise solutions for turbofan engines. SOURCE3D is composed of a collection of FORTRAN programs that have been obtained by extending the approach of the earlier V072 Rotor Wake/Stator Interaction Code. Like V072, it treats the rotor and stator as a collection of blades and vanes having zero thickness and camber, contained in an infinite, hardwall annular duct. SOURCE3D adds important features to the V072 capability: a rotor element, swirl flow and vorticity waves, actuator disks for flow turning, and combined rotor/actuator disk and stator/actuator disk elements. These items allow reflections from the rotor, frequency scattering, and mode trapping, thus providing more complete noise predictions than previously possible. The code has been thoroughly verified through comparison with D. B. Hanson's CUP2D two-dimensional code using a narrow annulus test case.

  12. The Cortical Organization of Speech Processing: Feedback Control and Predictive Coding the Context of a Dual-Stream Model

    ERIC Educational Resources Information Center

    Hickok, Gregory

    2012-01-01

    Speech recognition is an active process that involves some form of predictive coding. This statement is relatively uncontroversial. What is less clear is the source of the prediction. The dual-stream model of speech processing suggests that there are two possible sources of predictive coding in speech perception: the motor speech system and the…

  13. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    NASA Astrophysics Data System (ADS)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

    We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media based on the finite difference method at local-to-regional scales. This code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer absorbing boundary condition. A hybrid-style programming model using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations, such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC) format, are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documentation in a public repository.
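
    Because OpenSWPC writes standard SAC and NetCDF files, its outputs can be inspected with general-purpose tools; a short sketch using ObsPy, with the output file name being a hypothetical placeholder:

        from obspy import read

        st = read("swpc_output/STA01.sac")  # hypothetical OpenSWPC waveform output
        tr = st[0]
        print(tr.stats.sampling_rate, tr.data.max())
        tr.plot()  # quick look at the synthetic seismogram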

  14. Hypersonic simulations using open-source CFD and DSMC solvers

    NASA Astrophysics Data System (ADS)

    Casseau, V.; Scanlon, T. J.; John, B.; Emerson, D. R.; Brown, R. E.

    2016-11-01

    Hypersonic hybrid hydrodynamic-molecular gas flow solvers are required to satisfy the two essential requirements of any high-speed reacting code, these being physical accuracy and computational efficiency. The James Weir Fluids Laboratory at the University of Strathclyde is currently developing an open-source hybrid code which will eventually reconcile the direct simulation Monte-Carlo method, making use of the OpenFOAM application called dsmcFoam, with the newly coded open-source two-temperature computational fluid dynamics solver named hy2Foam. In conjunction with employing the CVDV chemistry-vibration model in hy2Foam, novel use is made of the QK rates in a CFD solver. In this paper, further testing is performed, in particular with the CFD solver, to ensure its efficacy before considering more advanced test cases. The hy2Foam and dsmcFoam codes have been shown to compare reasonably well, thus providing a useful basis for other codes to compare against.

  15. Fan Noise Prediction System Development: Source/Radiation Field Coupling and Workstation Conversion for the Acoustic Radiation Code

    NASA Technical Reports Server (NTRS)

    Meyer, H. D.

    1993-01-01

    The Acoustic Radiation Code (ARC) is a finite element program used on the IBM mainframe to predict far-field acoustic radiation from a turbofan engine inlet. In this report, requirements for developers of internal aerodynamic codes regarding use of their program output as input for the ARC are discussed. More specifically, the particular input needed from the Bolt, Beranek and Newman/Pratt and Whitney (turbofan source noise generation) Code (BBN/PWC) is described. In a separate analysis, a method of coupling the source and radiation models that recognizes waves crossing the interface in both directions has been derived. A preliminary version of the coupled code has been developed and used for initial evaluation of coupling issues. Results thus far have shown that reflection from the inlet is sufficient to indicate that full coupling of the source and radiation fields is needed for accurate noise predictions. Also, for this contract, the ARC has been modified for use on the Sun and Silicon Graphics Iris UNIX workstations. Changes and additions involved in this effort are described in an appendix.

  16. Methods for Coding Tobacco-Related Twitter Data: A Systematic Review.

    PubMed

    Lienemann, Brianna A; Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai

    2017-03-31

    As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter's Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter's databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. ©Brianna A Lienemann, Jennifer B Unger, Tess Boley Cruz, Kar-Hai Chu. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 31.03.2017.

  17. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720, AoI 13, as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  18. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL) is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  19. Modeling the Volcanic Source at Long Valley, CA, Using a Genetic Algorithm Technique

    NASA Technical Reports Server (NTRS)

    Tiampo, Kristy F.

    1999-01-01

    In this project, we attempted to model the deformation pattern due to the magmatic source at Long Valley caldera using a real-coded genetic algorithm (GA) inversion similar to that described by Michalewicz (1992). The project has been both successful and rewarding. The genetic algorithm, coded in the C programming language, performs stable inversions over repeated trials with varying initial and boundary conditions. The original model used a GA in which the geophysical information was coded into the fitness function through the computation of surface displacements for a Mogi point source in an elastic half-space. The program was designed to invert for a spherical magmatic source - its depth, horizontal location, and volume - using the known surface deformations. It also included the capability of inverting for multiple sources.
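
    A minimal sketch of this kind of inversion is given below, assuming the standard Mogi approximation for the vertical surface displacement of a point pressure source, uz = (1-nu)*dV*d / (pi*(r^2+d^2)^(3/2)), and a simple real-coded GA (tournament selection, blend crossover, Gaussian mutation). The station geometry, parameter bounds, and GA settings are illustrative placeholders, not those of the original study.

        import numpy as np

        NU = 0.25  # Poisson's ratio (assumed)

        def mogi_uz(params, x, y):
            """Vertical displacement of a Mogi point source in an elastic half-space."""
            x0, y0, d, dV = params
            r2 = (x - x0) ** 2 + (y - y0) ** 2
            return (1 - NU) * dV * d / (np.pi * (r2 + d ** 2) ** 1.5)

        def misfit(p, x, y, obs):
            return np.sum((mogi_uz(p, x, y) - obs) ** 2)

        def ga_invert(x, y, obs, lo, hi, pop=120, gens=300, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = np.asarray(lo, float), np.asarray(hi, float)
            P = lo + rng.random((pop, 4)) * (hi - lo)        # initial population
            best, best_f = None, np.inf
            for _ in range(gens):
                f = np.array([misfit(p, x, y, obs) for p in P])
                k = np.argmin(f)
                if f[k] < best_f:
                    best, best_f = P[k].copy(), f[k]         # track the elite
                i, j = rng.integers(0, pop, (2, pop))        # tournament selection
                parents = np.where((f[i] < f[j])[:, None], P[i], P[j])
                a = rng.random((pop, 4))                     # blend crossover
                children = a * parents + (1 - a) * np.roll(parents, 1, axis=0)
                children += rng.normal(0, 0.02, (pop, 4)) * (hi - lo)  # mutation
                P = np.clip(children, lo, hi)
                P[0] = best                                  # elitism
            return best, best_f

        rng = np.random.default_rng(1)
        x, y = rng.uniform(-10e3, 10e3, (2, 60))             # station coordinates (m)
        truth = (1.5e3, -2.0e3, 6.0e3, 5e6)                  # x0, y0, depth (m), dV (m^3)
        obs = mogi_uz(truth, x, y) + rng.normal(0, 1e-4, 60) # synthetic uplift + noise
        est, err = ga_invert(x, y, obs,
                             lo=(-10e3, -10e3, 1e3, 1e5),
                             hi=(10e3, 10e3, 15e3, 1e8))
        print("estimated (x0, y0, depth, dV):", est)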

  20. Scalable video transmission over Rayleigh fading channels using LDPC codes

    NASA Astrophysics Data System (ADS)

    Bansal, Manu; Kondi, Lisimachos P.

    2005-03-01

    In this paper, we investigate an important problem of efficiently utilizing the available resources for video transmission over wireless channels while maintaining a good decoded video quality and resilience to channel impairments. Our system consists of the video codec based on the 3-D set partitioning in hierarchical trees (3-D SPIHT) algorithm and employs two different schemes using low-density parity check (LDPC) codes for channel error protection. The first method uses the serial concatenation of a constant-rate LDPC code and rate-compatible punctured convolutional (RCPC) codes. Cyclic redundancy check (CRC) is used to detect transmission errors. In the other scheme, we use a product code structure consisting of a constant-rate LDPC/CRC code across the rows of the 'blocks' of source data and an erasure-correction systematic Reed-Solomon (RS) code as the column code. In both schemes introduced here, we use fixed-length source packets protected with unequal forward error correction coding ensuring a strictly decreasing protection across the bitstream. A Rayleigh flat-fading channel with additive white Gaussian noise (AWGN) is modeled for the transmission. The rate-distortion optimization algorithm is developed and carried out for the selection of source coding and channel coding rates using Lagrangian optimization. The experimental results demonstrate the effectiveness of this system under different wireless channel conditions, and both the proposed methods (LDPC+RCPC/CRC and RS+LDPC/CRC) outperform more conventional schemes such as those employing RCPC/CRC.
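
    The Lagrangian step of such a rate-distortion optimization can be sketched as follows: for each source layer, choose the (rate, distortion) option that minimizes D + lambda*R, and bisect on lambda until the total rate meets the transmission budget. The option tables below are invented placeholders, not operating points from the paper.

        def allocate(options, budget, iters=60):
            """options[l] = list of (rate, expected_distortion) pairs for layer l.
            Minimizes total distortion subject to a total-rate budget by
            bisection on the Lagrange multiplier lambda."""
            def pick(lam):
                idx = [min(range(len(opts)),
                           key=lambda k: opts[k][1] + lam * opts[k][0])
                       for opts in options]
                rate = sum(options[l][k][0] for l, k in enumerate(idx))
                return idx, rate
            lo, hi = 0.0, 1e6   # lambda = 0 favors distortion; large lambda favors rate
            for _ in range(iters):
                mid = 0.5 * (lo + hi)
                _, rate = pick(mid)
                if rate > budget:
                    lo = mid    # need a larger lambda to cut the rate
                else:
                    hi = mid
            return pick(hi)     # hi always yields a feasible allocation

        # toy example: 3 layers, each with (rate in bits, distortion) options
        options = [
            [(1000, 50.0), (2000, 30.0), (4000, 20.0)],
            [(1000, 40.0), (2000, 25.0), (4000, 18.0)],
            [(1000, 30.0), (2000, 22.0), (4000, 17.0)],
        ]
        choices, total_rate = allocate(options, budget=6000)
        print(choices, total_rate)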

  1. Cardiorespiratory fitness does not alter plasma pentraxin 3 and cortisol reactivity to acute psychological stress and exercise.

    PubMed

    Huang, Chun-Jung; Webb, Heather E; Beasley, Kathleen N; McAlpine, David A; Tangsilsat, Supatchara E; Acevedo, Edmund O

    2014-03-01

    Pentraxin 3 (PTX3) has been recently identified as a biomarker of vascular inflammation in predicting cardiovascular events. The purpose of this study was to examine the effect of cardiorespiratory fitness on plasma PTX3 and cortisol responses to stress, utilizing a dual-stress model. Fourteen male subjects were classified into high-fit (HF) and low-fit (LF) groups and completed 2 counterbalanced experimental conditions. The exercise-alone condition (EAC) consisted of cycling at 60% maximal oxygen uptake for 37 min, while the dual-stress condition (DSC) added 20 min of mental stress during the 37 min of cycling. Plasma PTX3 revealed significant increases over time with a significant elevation at 37 min in both HF and LF groups in response to EAC and DSC. No difference in plasma PTX3 levels was observed between EAC and DSC. In addition, plasma cortisol revealed a significant condition by time interaction with greater levels during DSC at 37 min, whereas cardiorespiratory fitness level did not reveal different plasma cortisol responses in either the EAC or DSC. Aerobic exercise induces plasma PTX3 release, while additional acute mental stress, in a dual-stress condition, does not exacerbate or further modulate the PTX3 response. Furthermore, cardiorespiratory fitness may not affect the stress reactivity of plasma PTX3 to physical and combined physical and psychological stressors. Finally, the exacerbated cortisol responses to combined stress may provide the potential link to biological pathways that explain changes in physiological homeostasis that may be associated with an increase in the risk of cardiovascular disease.

  2. Use of differential scanning calorimetry to detect canola oil (Brassica napus L.) adulterated with lard stearin.

    PubMed

    Marikkar, Jalaldeen Mohammed Nazrim; Rana, Sohel

    2014-01-01

    A study was conducted to detect and quantify lard stearin (LS) content in canola oil (CaO) using differential scanning calorimetry (DSC). Authentic samples of CaO were obtained from a reliable supplier and the adulterant LS was obtained through a fractional crystallization procedure as reported previously. Pure CaO samples spiked with LS in levels ranging from 5 to 15% (w/w) were analyzed using DSC to obtain their cooling and heating profiles. The results showed that samples contaminated with LS at the 5% (w/w) level can be detected using characteristic contaminant peaks appearing in the higher temperature regions (0 to 70°C) of the cooling and heating curves. Pearson correlation analysis of LS content against individual DSC parameters of the adulterant peak, namely peak temperature, peak area, and peak onset temperature, indicated strong correlations between these parameters and the LS content of the CaO admixtures. When these three parameters were used as variables in a stepwise regression procedure, predictive models for determination of LS content in CaO were obtained. The predictive models obtained with a single DSC parameter had a relatively lower coefficient of determination (R² value) and a higher standard error than the models obtained using two DSC parameters in combination. This study concluded that the predictive model obtained with peak area and peak onset temperature of the adulteration peak would be more accurate for prediction of LS content in CaO, based on the highest coefficient of determination (R² value) and smallest standard error.
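
    A compact sketch of such a two-parameter calibration model, fitted by ordinary least squares on made-up (peak area, onset temperature, %LS) triples rather than the paper's data:

        import numpy as np

        # hypothetical calibration data: DSC adulterant-peak area (J/g),
        # peak onset temperature (deg C), and known lard stearin content (% w/w)
        peak_area = np.array([0.8, 1.6, 2.5, 3.3, 4.1, 4.9])
        onset_T   = np.array([38.5, 40.2, 41.8, 43.1, 44.0, 44.9])
        ls_pct    = np.array([5.0, 7.0, 9.0, 11.0, 13.0, 15.0])

        # design matrix with intercept: LS% ~ b0 + b1*area + b2*onset
        X = np.column_stack([np.ones_like(peak_area), peak_area, onset_T])
        coef, *_ = np.linalg.lstsq(X, ls_pct, rcond=None)

        pred = X @ coef
        ss_res = np.sum((ls_pct - pred) ** 2)
        ss_tot = np.sum((ls_pct - ls_pct.mean()) ** 2)
        print("coefficients:", coef)
        print("R^2 = %.4f" % (1 - ss_res / ss_tot))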

  3. Quantitative blood flow measurements in gliomas using arterial spin-labeling at 3T: intermodality agreement and inter- and intraobserver reproducibility study.

    PubMed

    Hirai, T; Kitajima, M; Nakamura, H; Okuda, T; Sasao, A; Shigematsu, Y; Utsunomiya, D; Oda, S; Uetani, H; Morioka, M; Yamashita, Y

    2011-12-01

    QUASAR is a particular application of the ASL method and facilitates the user-independent quantification of brain perfusion. The purpose of this study was to assess the intermodality agreement of TBF measurements obtained with ASL and DSC MR imaging and the inter- and intraobserver reproducibility of glioma TBF measurements acquired by ASL at 3T. Two observers independently measured TBF in 24 patients with histologically proved glioma. ASL MR imaging with QUASAR and DSC MR imaging were performed on 3T scanners. The observers placed 5 regions of interest in the solid tumor on rCBF maps derived from ASL and DSC MR images and 1 region of interest in the contralateral brain and recorded the measured values. Maximum and average sTBF values were calculated. Intermodality and intra- and interobserver agreement were determined by using 95% Bland-Altman limits of agreement and ICCs. The intermodality agreement for maximum sTBF was good to excellent on DSC and ASL images; ICCs ranged from 0.718 to 0.884. The 95% limits of agreement ranged from 59.2% to 65.4% of the mean. ICCs for intra- and interobserver agreement for maximum sTBF ranged from 0.843 to 0.850 and from 0.626 to 0.665, respectively. The reproducibility of maximum sTBF measurements obtained by the two methods was similar. In the evaluation of sTBF in gliomas, ASL with QUASAR at 3T yielded measurements and reproducibility similar to those of DSC perfusion MR imaging.
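
    For reference, 95% Bland-Altman limits of agreement of the kind used here are computed as the mean paired difference plus or minus 1.96 times its standard deviation; the sketch below uses invented paired TBF measurements, not the study's data.

        import numpy as np

        # hypothetical paired tumor blood flow measurements (mL/100 g/min)
        asl = np.array([62.0, 80.5, 45.3, 91.2, 70.8, 55.1])
        dsc = np.array([58.4, 85.0, 42.9, 99.6, 66.3, 57.7])

        diff = asl - dsc
        bias = diff.mean()                         # mean difference (bias)
        sd = diff.std(ddof=1)                      # SD of the differences
        loa = (bias - 1.96 * sd, bias + 1.96 * sd) # 95% limits of agreement
        print("bias = %.2f, limits of agreement = (%.2f, %.2f)" % (bias, *loa))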

  4. Effects of particle reinforcement and ECAP on the precipitation kinetics of an Al-Cu alloy

    NASA Astrophysics Data System (ADS)

    Härtel, M.; Wagner, S.; Frint, P.; F-X Wagner, M.

    2014-08-01

    The precipitation kinetics of Al-Cu alloys have recently been revisited in various studies, considering either the effect of severe plastic deformation (e.g., by equal-channel angular pressing - ECAP) or the effect of particle reinforcements. However, it is not clear how these effects interact when ECAP is performed on particle-reinforced alloys. In this study, we analyze how a combination of particle reinforcement and ECAP affects precipitation kinetics. After solution annealing, an AA2017 alloy (initial state: base material without particle reinforcement), AA2017 + 10 vol.-% Al2O3, and AA2017 + 10 vol.-% SiC were deformed in one pass in a 120° ECAP tool at a temperature of 140°C. Systematic differential scanning calorimetry (DSC) measurements of each condition were carried out. TEM specimens were prepared from samples taken from additional DSC measurements, in which the samples were immediately quenched in liquid nitrogen after reaching carefully selected temperatures. TEM analysis was performed to characterize the morphology of the different types of precipitates and to directly relate microstructural information to the endo- and exothermic peaks in our DSC data. Our results show that both ECAP and particle reinforcement are associated with a shift of exothermic precipitation peaks towards lower temperatures. This effect is even more pronounced when ECAP and particle reinforcement are combined. The DSC data agree well with our TEM observations of the nucleation and morphology of different precipitates, indicating that DSC measurements are an appropriate tool for analyzing how severe plastic deformation and particle reinforcement affect precipitation kinetics in Al-Cu alloys.

  5. Methodology of decreasing software complexity using ontology

    NASA Astrophysics Data System (ADS)

    Dąbrowska-Kubik, Katarzyna

    2015-09-01

    In this paper, a model of a web application's source code, based on the OSD ontology (Ontology for Software Development), is proposed. The model is applied to the implementation and maintenance phases of the software development process through the DevOntoCreator tool [5]. The aim of this solution is to decrease the software complexity of the source code using a range of maintenance techniques, such as creating documentation and eliminating dead code, cloned code, and previously known bugs [1][2]. This approach should make savings on the software maintenance costs of web applications possible.

  6. Bit-wise arithmetic coding for data compression

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
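
    As a rough illustration of the idea (not the article's coder), the sketch below assigns fixed-length codewords to quantizer outputs and estimates the rate an adaptive arithmetic coder would approach when each codeword bit is modeled as an independent binary source, i.e. the sum of per-bit-position binary entropies.

        import numpy as np

        def bitwise_rate(symbols, nbits):
            """Ideal bits/sample for arithmetic coding of fixed-length codewords
            with each bit position modeled as an independent binary source."""
            # fixed-length codewords: nbits-bit binary representation of each symbol
            bits = (symbols[:, None] >> np.arange(nbits)[::-1]) & 1   # (N, nbits)
            p1 = bits.mean(axis=0)                                    # P(bit = 1) per position
            eps = 1e-12                                               # guard against log(0)
            h = -(p1 * np.log2(p1 + eps) + (1 - p1) * np.log2(1 - p1 + eps))
            return h.sum()

        rng = np.random.default_rng(1)
        # IID Laplacian source, uniformly quantized to 5 bits (32 levels)
        x = rng.laplace(size=100_000)
        q = np.clip(np.round(x / 0.25), -16, 15).astype(int) + 16     # levels 0..31
        print("fixed-length codewords: 5.00 bits/sample")
        print("bit-wise arithmetic (ideal): %.2f bits/sample" % bitwise_rate(q, 5))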

  7. Astrophysics Source Code Library -- Now even better!

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Schmidt, Judy; Berriman, Bruce; DuPrie, Kimberly; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2015-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. Indexed by ADS, it now contains nearly 1,000 codes and with recent major changes, is better than ever! The resource has a new infrastructure that offers greater flexibility and functionality for users, including an easier submission process, better browsing, one-click author search, and an RSS feeder for news. The new database structure is easier to maintain and offers new possibilities for collaboration. Come see what we've done!

  8. Study of statistical coding for digital TV

    NASA Technical Reports Server (NTRS)

    Gardenhire, L. W.

    1972-01-01

    The results are presented for a detailed study to determine a pseudo-optimum statistical code to be installed in a digital TV demonstration test set. Studies of source encoding were undertaken, using redundancy removal techniques in which the picture is reproduced within a preset tolerance. A method of source encoding, which preliminary studies show to be encouraging, is statistical encoding. A pseudo-optimum code was defined and the associated performance of the code was determined. The format was fixed at 525 lines per frame, 30 frames per second, as per commercial standards.

  9. A progressive data compression scheme based upon adaptive transform coding: Mixture block coding of natural images

    NASA Technical Reports Server (NTRS)

    Rost, Martin C.; Sayood, Khalid

    1991-01-01

    A method for efficiently coding natural images using a vector-quantized, variable-blocksize transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The selection of the coder for any given image region is made through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
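
    The threshold-driven selection can be sketched as a quadtree: code a block with a coarse DCT, and if the reconstruction distortion exceeds a threshold, split it into four sub-blocks and recurse. The sketch below (SciPy's dctn/idctn, keeping only a low-frequency corner of coefficients) illustrates the mechanism and is not the paper's MBC codec.

        import numpy as np
        from scipy.fft import dctn, idctn

        def code_block(block, keep=4):
            """Reconstruct a block from only its keep x keep low-frequency DCT coefficients."""
            c = dctn(block, norm="ortho")
            mask = np.zeros_like(c)
            mask[:keep, :keep] = 1
            return idctn(c * mask, norm="ortho")

        def mbc(block, thresh, min_size=4):
            """Mixture-block-coding sketch: recurse while distortion exceeds thresh."""
            rec = code_block(block)
            mse = np.mean((block - rec) ** 2)
            n = block.shape[0]
            if mse <= thresh or n <= min_size:
                return rec
            h = n // 2
            out = np.empty_like(block)
            for i in (0, h):                     # split into four sub-blocks
                for j in (0, h):
                    out[i:i+h, j:j+h] = mbc(block[i:i+h, j:j+h], thresh, min_size)
            return out

        rng = np.random.default_rng(0)
        img = rng.random((16, 16))               # stand-in for a 16x16 image tile
        rec = mbc(img, thresh=0.01)
        print("MSE:", np.mean((img - rec) ** 2))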

  10. 7 CFR 1744.30 - Automatic lien accommodations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... supplemental mortgage is a valid and binding instrument enforceable in accordance with its terms, and recorded...: (1) The borrower has achieved a TIER of not less than 1.5 and a DSC of not less than 1.25 for each of... not less than 2.5 and a DSC of not less than 1.5 for each of the borrower's two fiscal years...

  11. 7 CFR 1744.30 - Automatic lien accommodations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... supplemental mortgage is a valid and binding instrument enforceable in accordance with its terms, and recorded...: (1) The borrower has achieved a TIER of not less than 1.5 and a DSC of not less than 1.25 for each of... not less than 2.5 and a DSC of not less than 1.5 for each of the borrower's two fiscal years...

  12. 47 CFR 80.1077 - Frequencies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... System: Alerting: EPIRBs: 406.0-406.1 MHz (Earth-to-space); 1544-1545 MHz (space-to-Earth). INMARSAT-E EPIRBs: 1626.5-1645.5 MHz (Earth-to-space). INMARSAT Ship Earth Stations capable of voice and/or direct printing: 1626.5-1645.5 MHz (Earth-to-space). VHF DSC Ch. 70: 156.525 MHz. MF/HF DSC: 2187...

  13. Estimation of Temperature Range for Cryo Cutting of Frozen Mackerel using DSC

    NASA Astrophysics Data System (ADS)

    Okamoto, Kiyoshi; Hagura, Yoshio; Suzuki, Kanichi

    Frozen mackerel flesh was subjected to measurement of its fracture stress (bending energy) in a low temperature range. The optimum conditions for low temperature cutting, "cryo cutting," were estimated from the enthalpy changes measured by a differential scanning calorimeter (DSC). There were two enthalpy changes for gross transition on the DSC chart for mackerel, one at -63°C to -77°C and the other at -96°C to -112°C. We therefore estimated that mackerel could be cut by bending below -63°C and that large decreases in bending energy would occur at around -77°C and -112°C. In testing, there were indeed two large decreases in bending energy for the test pieces of mackerel that had been frozen at -40°C, one at -70°C to -90°C and the other at -100°C to -120°C. Therefore, the test pieces of mackerel could be cut by bending at -70°C. The results showed that DSC measurement of mackerel flesh gave a good estimation of the appropriate cutting temperature of mackerel.

  14. Detection of cocrystal formation based on binary phase diagrams using thermal analysis.

    PubMed

    Yamashita, Hiroyuki; Hirakura, Yutaka; Yuda, Masamichi; Teramura, Toshio; Terada, Katsuhide

    2013-01-01

    Although a number of studies have reported that cocrystals can form by heating a physical mixture of two components, details surrounding heat-induced cocrystal formation remain unclear. Here, we attempted to clarify the thermal behavior of a physical mixture and cocrystal formation in reference to a binary phase diagram. Physical mixtures prepared using an agate mortar were heated at rates of 2, 5, 10, and 30 °C/min using differential scanning calorimetry (DSC). Some mixtures were further analyzed using X-ray DSC and polarization microscopy. When a physical mixture of two components capable of cocrystal formation was heated using DSC, an exothermic peak associated with cocrystal formation was detected immediately after an endothermic peak. In some combinations, several endothermic peaks were detected and associated with metastable eutectic melting, eutectic melting, and cocrystal melting. In contrast, when a physical mixture of two components incapable of cocrystal formation was heated using DSC, only a single endothermic peak associated with eutectic melting was detected. These experimental observations demonstrated how the thermal events could be attributed to phase transitions occurring in a binary mixture and clarified the relationship between exothermic peaks and cocrystal formation.

  15. Time-series modeling and prediction of global monthly absolute temperature for environmental decision making

    NASA Astrophysics Data System (ADS)

    Ye, Liming; Yang, Guixia; Van Ranst, Eric; Tang, Huajun

    2013-03-01

    A generalized, structural, time series modeling framework was developed to analyze the monthly records of absolute surface temperature, one of the most important environmental parameters, using a deterministic-stochastic combined (DSC) approach. Although the development of the framework was based on the characterization of the variation patterns of a global dataset, the methodology could be applied to any monthly absolute temperature record. Deterministic processes were used to characterize the variation patterns of the global trend and the cyclic oscillations of the temperature signal, involving polynomial functions and the Fourier method, respectively, while stochastic processes were employed to account for any remaining patterns in the temperature signal, involving seasonal autoregressive integrated moving average (SARIMA) models. A prediction of the monthly global surface temperature during the second decade of the 21st century using the DSC model shows that the global temperature will likely continue to rise at twice the average rate of the past 150 years. The evaluation of prediction accuracy shows that DSC models perform consistently well compared with selected models by other authors, suggesting that DSC models, when coupled with other eco-environmental models, can be used as a supplemental tool for short-term (~10-year) environmental planning and decision making.
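
    A minimal sketch of this kind of deterministic-stochastic decomposition, assuming a quadratic trend, a first-harmonic annual Fourier pair, and a SARIMA(1,0,0)(1,0,0)12 residual model; the model orders and the synthetic data are placeholders, not the paper's.

        import numpy as np
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(0)
        n = 360                                   # 30 years of monthly data
        t = np.arange(n)
        e = np.zeros(n)                           # AR(1) stochastic component
        for k in range(1, n):
            e[k] = 0.5 * e[k - 1] + rng.normal(0, 0.2)
        y = 14 + 0.001 * t + 2.0 * np.sin(2 * np.pi * t / 12) + e  # synthetic record

        # deterministic part: quadratic trend + first-harmonic Fourier terms
        X = np.column_stack([np.ones(n), t, t ** 2,
                             np.sin(2 * np.pi * t / 12), np.cos(2 * np.pi * t / 12)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta

        # stochastic part: seasonal ARIMA on the deterministic residual
        fit = SARIMAX(resid, order=(1, 0, 0), seasonal_order=(1, 0, 0, 12)).fit(disp=False)

        # 10-year prediction = deterministic extrapolation + SARIMA forecast
        tf = np.arange(n, n + 120)
        Xf = np.column_stack([np.ones(120), tf, tf ** 2,
                              np.sin(2 * np.pi * tf / 12), np.cos(2 * np.pi * tf / 12)])
        prediction = Xf @ beta + fit.forecast(steps=120)
        print(prediction[:12])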

  16. DSC studies to evaluate the impact of bio-oil on cold flow properties and oxidation stability of bio-diesel.

    PubMed

    Garcia-Perez, Manuel; Adams, Thomas T; Goodrum, John W; Das, K C; Geller, Daniel P

    2010-08-01

    This paper describes the use of Differential Scanning Calorimetry (DSC) to evaluate the impact of varying mix ratios of bio-oil (pyrolysis oil) and bio-diesel on the oxidation stability and on some cold flow properties of the resulting blends. The bio-oils employed were produced from the semi-continuous Auger pyrolysis of pine pellets and the batch pyrolysis of pine chips. The bio-diesel studied was obtained from poultry fat. The conditions used to prepare the bio-oil/bio-diesel blends as well as some of the fuel properties of these blends are reported. The experimental results suggest that the addition of bio-oil improves the oxidation stability of the resulting blends and modifies the crystallization behavior of unsaturated compounds. Upon the addition of bio-oil an increase in the oxidation onset temperature, as determined by DSC, was observed. The increase in bio-diesel oxidation stability is likely to be due to the presence of hindered phenols abundant in bio-oils. A relatively small reduction in DSC characteristic temperatures which are associated with cold flow properties was also observed but can likely be explained by a dilution effect.

  17. Acta Aeronautica et Astronautica Sinica.

    DTIC Science & Technology

    1982-07-28

    English pages: 212. Source: Acta Aeronautica et Astronautica Sinica, Vol. 2, Nr. 4, December 1981, pp. 1-... [OCR fragments of a Foreign Technology Division translation; only the standard disclaimer and part of a nomenclature list survive: "(axial) solution section code"; subscript i, "code of sectional cylindrical coordinate system"; subscripts j, k, "radial and peripheral codes of solution".]

  18. Are procedures codes in claims data a reliable indicator of intraoperative splenic injury compared with clinical registry data?

    PubMed

    Stey, Anne M; Ko, Clifford Y; Hall, Bruce Lee; Louie, Rachel; Lawson, Elise H; Gibbons, Melinda M; Zingmond, David S; Russell, Marcia M

    2014-08-01

    Identifying iatrogenic injuries using existing data sources is important for improved transparency in the occurrence of intraoperative events. There is evidence that procedure codes are reliably recorded in claims data. The objective of this study was to assess whether concurrent splenic procedure codes in patients undergoing colectomy procedures are reliably coded in claims data as compared with clinical registry data. Patients who underwent colectomy procedures in the absence of neoplastic diagnosis codes were identified from American College of Surgeons (ACS) NSQIP data linked with the Medicare inpatient claims data file (2005 to 2008). A κ statistic was used to assess coding concordance between ACS NSQIP and Medicare inpatient claims, with ACS NSQIP serving as the reference standard. A total of 11,367 colectomy patients were identified from 212 hospitals. There were 114 patients (1%) who had a concurrent splenic procedure code recorded in either ACS NSQIP or Medicare inpatient claims. There were 7 patients who had a splenic injury diagnosis code recorded in either data source. Agreement of splenic procedure codes between the data sources was substantial (κ statistic 0.72; 95% CI, 0.64-0.79). Medicare inpatient claims identified 81% of the splenic procedure codes recorded in ACS NSQIP, and 99% of the patients without a splenic procedure code. It is feasible to use Medicare claims data to identify splenic injuries occurring during colectomy procedures, as claims data have moderate sensitivity and excellent specificity for capturing concurrent splenic procedure codes compared with ACS NSQIP.

  19. Evaluation and utilization of beam simulation codes for the SNS ion source and low energy beam transport development

    NASA Astrophysics Data System (ADS)

    Han, B. X.; Welton, R. F.; Stockli, M. P.; Luciano, N. P.; Carmichael, J. R.

    2008-02-01

    Beam simulation codes PBGUNS, SIMION, and LORENTZ-3D were evaluated by modeling the well-diagnosed SNS baseline ion source and low energy beam transport (LEBT) system. An investigation was then conducted using these codes to assist our ion source and LEBT development effort, which is directed at meeting SNS operational goals as well as the power-upgrade project goals. A high-efficiency H- extraction system, together with magnetic and electrostatic LEBT configurations capable of transporting up to 100 mA, is studied using these simulation tools.

  20. Comparing Single-Point and Multi-point Calibration Methods in Modulated DSC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Buskirk, Caleb Griffith

    2017-06-14

    Heat capacity measurements for High Density Polyethylene (HDPE) and Ultra-high Molecular Weight Polyethylene (UHMWPE) were performed using Modulated Differential Scanning Calorimetry (mDSC) over a wide temperature range, -70 to 115 °C, with a TA Instruments Q2000 mDSC. The default calibration method for this instrument involves measuring the heat capacity of a sapphire standard at a single temperature near the middle of the temperature range of interest. However, this method often fails for temperature ranges that exceed a 50 °C interval, likely because of drift or non-linearity in the instrument's heat capacity readings over time or over the temperature range. Therefore, in this study a method was developed to calibrate the instrument using multiple temperatures and the same sapphire standard.
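
    The multi-point idea can be sketched as a temperature-dependent calibration factor K(T) = Cp,reference(T) / Cp,measured(T) evaluated at several temperatures and interpolated in between. All numbers below are invented for illustration; in practice the sapphire reference values come from the literature.

        import numpy as np

        # hypothetical calibration runs on a sapphire standard
        T_cal   = np.array([-60.0, -20.0, 20.0, 60.0, 100.0])    # deg C
        cp_ref  = np.array([0.520, 0.650, 0.770, 0.870, 0.950])  # J/(g K), reference
        cp_meas = np.array([0.492, 0.628, 0.761, 0.873, 0.969])  # instrument readings

        K = cp_ref / cp_meas   # multi-point calibration factors

        def calibrate(T, cp_raw):
            """Apply a piecewise-linear interpolation of the calibration factor.
            Note: np.interp holds the end values constant outside T_cal."""
            return cp_raw * np.interp(T, T_cal, K)

        # correct a raw heat-capacity trace of a polymer sample (made-up values)
        T_sample = np.linspace(-70, 115, 8)
        cp_raw = np.linspace(1.2, 2.3, 8)   # raw mDSC readings, J/(g K)
        print(calibrate(T_sample, cp_raw))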

  1. Particle-in-cell code library for numerical simulation of the ECR source plasma

    NASA Astrophysics Data System (ADS)

    Shirkov, G.; Alexandrov, V.; Preisendorf, V.; Shevtsov, V.; Filippov, A.; Komissarov, R.; Mironov, V.; Shirkova, E.; Strekalovsky, O.; Tokareva, N.; Tuzikov, A.; Vatulin, V.; Vasina, E.; Fomin, V.; Anisimov, A.; Veselov, R.; Golubev, A.; Grushin, S.; Povyshev, V.; Sadovoi, A.; Donskoi, E.; Nakagawa, T.; Yano, Y.

    2003-05-01

    The project "Numerical simulation and optimization of ion accumulation and production in multicharged ion sources" is funded by the International Science and Technology Center (ISTC). A summary of recent project development and the first version of a computer code library for simulation of electron-cyclotron resonance (ECR) source plasmas based on the particle-in-cell method are presented.

  2. Approaches in highly parameterized inversion - PEST++, a Parameter ESTimation code optimized for large environmental models

    USGS Publications Warehouse

    Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.

    2012-01-01

    An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.

  3. A new software for deformation source optimization, the Bayesian Earthquake Analysis Tool (BEAT)

    NASA Astrophysics Data System (ADS)

    Vasyura-Bathke, H.; Dutta, R.; Jonsson, S.; Mai, P. M.

    2017-12-01

    Modern studies of crustal deformation and the related source estimation, including magmatic and tectonic sources, increasingly use non-linear optimization strategies to estimate geometric and/or kinematic source parameters and often jointly consider both geodetic and seismic data. Bayesian inference is increasingly being used for estimating posterior distributions of deformation source model parameters, given measured/estimated/assumed data and model uncertainties. For instance, some studies consider uncertainties of a layered medium and propagate these into source parameter uncertainties, while others use informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed to efficiently explore the high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational burden of these methods is high and estimation codes are rarely made available along with the published results. Even if the codes are accessible, it is usually challenging to assemble them into a single optimization framework as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results have become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in deformation source estimation, we undertook the effort of developing BEAT, a python package that comprises all the above-mentioned features in one single programming environment. The package builds on the pyrocko seismological toolbox (www.pyrocko.org), and uses the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat), and we encourage and solicit contributions to the project. Here, we present our strategy for developing BEAT and show application examples, especially the effect of including the model prediction uncertainty of the velocity model in subsequent source optimizations: full moment tensor, Mogi source, and a moderate strike-slip earthquake.

  4. Optimal bit allocation for hybrid scalable/multiple-description video transmission over wireless channels

    NASA Astrophysics Data System (ADS)

    Jubran, Mohammad K.; Bansal, Manu; Kondi, Lisimachos P.

    2006-01-01

    In this paper, we consider the problem of optimal bit allocation for wireless video transmission over fading channels. We use a newly developed hybrid scalable/multiple-description codec that combines the functionality of both scalable and multiple-description codecs. It produces a base layer and multiple-description enhancement layers. Any of the enhancement layers can be decoded (in a non-hierarchical manner) with the base layer to improve the reconstructed video quality. Two different channel coding schemes (Rate-Compatible Punctured Convolutional (RCPC)/Cyclic Redundancy Check (CRC) coding and product code Reed-Solomon (RS)+RCPC/CRC coding) are used for unequal error protection of the layered bitstream. Optimal allocation of the bitrate between source and channel coding is performed for discrete sets of source coding rates and channel coding rates. Experimental results are presented for a wide range of channel conditions. Comparisons with classical scalable coding also show the effectiveness of using hybrid scalable/multiple-description coding for wireless transmission.

  5. Sharing the Code.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2003-01-01

    Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)

  6. Solid state characterization of dehydroepiandrosterone.

    PubMed

    Chang, L C; Caira, M R; Guillory, J K

    1995-10-01

    Three polymorphs (forms I-III), a monohydrate (form S2), and three new solvates [4:1 hydrate (form S1), monohydrate (form S3), and methanol half-solvate (form S4)] were isolated and characterized by X-ray powder diffractometry (XRPD), IR spectroscopy, differential scanning calorimetry (DSC), hot stage microscopy, solution calorimetry, and their dissolution rates. A new polymorph, designated as form V, melting at 146.5-148 degrees C, was observed by hot stage microscopy. Our results indicate that only forms I and S4 exhibit reproducible DSC thermograms. Five of the isolated modifications undergo phase transformation on heating, and their DSC thermograms are not reproducible. Interpretation of DSC thermograms was facilitated by use of hot stage microscopy. The identification of each modification is based on XRPD patterns (except forms S3 and S4, for which the XRPD patterns are indistinguishable) and IR spectra. In the IR spectra, a significant difference was observed in the OH stretching region of all seven modifications. In a purity determination study, 5% of a contaminant modification in binary mixtures of several modifications could be detected by use of XRPD. To obtain a better understanding of the thermodynamic properties of these modifications, a series of increasing heating rates and different pan types were used in DSC. According to Burger's rule, forms I-III are monotropic polymorphs with decreasing stability in the order form I > form II > form III. The melting onsets and heats of fusion for forms I-III are 149.1 degrees C, 25.5 kJ/mol; 140.8 degrees C, 24.6 kJ/mol; and 137.8 degrees C, 24.0 kJ/mol, respectively. For form III the heat of fusion was calculated from heat of solution and DSC data. In the case of form S1 the melting point, 127.2 degrees C, was obtained by DSC using a hermetically sealed pan. The relative stabilities of the six modifications stored under high humidity conditions were predicted, on the basis of the heat of solution and thermal analysis data, to be form S2 > form S3 > form S1 > form I > form II > form III. However, the results of the dissolution rate determination were inconsistent with the heat of solution data. The stable form I shows a higher initial dissolution rate than the metastable form II and unstable form III. All modifications were converted into the stable monohydrate, form S2, during the dissolution study, suggesting that the moisture level in solid formulations should be carefully controlled.

  7. Professional Practice and Innovation: Level of Agreement between Coding Sources of Percentage Total Body Surface Area Burnt (%TBSA).

    PubMed

    Watterson, Dina; Cleland, Heather; Picton, Natalie; Simpson, Pam M; Gabbe, Belinda J

    2011-03-01

    The percentage of total body surface area burnt (%TBSA) is a critical measure of burn injury severity and a key predictor of burn injury outcome. This study evaluated the level of agreement between four sources of %TBSA using 120 cases identified through the Victorian State Trauma Registry. Expert clinician, ICD-10-AM, Abbreviated Injury Scale, and burns registry coding were compared using measures of agreement. There was near-perfect agreement (weighted Kappa statistic 0.81-1) between all sources of data, suggesting that ICD-10-AM is a valid source of %TBSA and that use of ICD-10-AM codes could reduce the resources used by trauma and burns registries in capturing this information.
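
    Weighted kappa of the kind reported here can be computed as below; the sketch uses scikit-learn's cohen_kappa_score on invented ordinal %TBSA categories from two coding sources.

        from sklearn.metrics import cohen_kappa_score

        # hypothetical %TBSA severity categories (0: <10%, 1: 10-19%, 2: 20-29%, 3: >=30%)
        # assigned to the same 12 cases by two coding sources
        icd10_am = [0, 1, 1, 2, 3, 0, 2, 1, 3, 2, 0, 1]
        registry = [0, 1, 2, 2, 3, 0, 2, 1, 3, 1, 0, 1]

        # linear weights penalize disagreements by their ordinal distance
        kappa = cohen_kappa_score(icd10_am, registry, weights="linear")
        print("weighted kappa = %.3f" % kappa)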

  8. PARAVT: Parallel Voronoi tessellation code

    NASA Astrophysics Data System (ADS)

    González, R. E.

    2016-10-01

    In this study, we present a new open-source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbor lists are widely used. Several serial Voronoi tessellation codes exist; however, no open-source parallel implementation is available to handle the large number of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented with MPI, and the VT is computed using the Qhull library. Domain decomposition accounts for consistent boundary computation between tasks and includes periodic boundary conditions. In addition, the code computes the neighbor list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. The code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
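
    On a single process, the quantities PARAVT targets (neighbor lists and Voronoi densities) can be sketched with SciPy; the parallel domain decomposition and the periodic boundary handling are omitted here.

        import numpy as np
        from scipy.spatial import Voronoi, ConvexHull

        rng = np.random.default_rng(0)
        pts = rng.random((500, 3))              # toy particle positions in a unit box

        vor = Voronoi(pts)

        # neighbor list: each Voronoi ridge separates a pair of adjacent particles
        neighbors = [[] for _ in range(len(pts))]
        for i, j in vor.ridge_points:
            neighbors[i].append(j)
            neighbors[j].append(i)

        # Voronoi density = 1 / cell volume (bounded cells only; unbounded
        # boundary cells contain vertex index -1 and are skipped)
        density = np.full(len(pts), np.nan)
        for p, r in enumerate(vor.point_region):
            region = vor.regions[r]
            if region and -1 not in region:
                density[p] = 1.0 / ConvexHull(vor.vertices[region]).volume

        print("mean number of neighbors:", np.mean([len(nb) for nb in neighbors]))
        print("median Voronoi density:", np.nanmedian(density))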

  9. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy

    NASA Astrophysics Data System (ADS)

    Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.

    2016-12-01

    egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.

  10. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy.

    PubMed

    Chamberland, Marc J P; Taylor, Randle E P; Rogers, D W O; Thomson, Rowan M

    2016-12-07

    egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.

  11. The study of gamma irradiation effects on poly (glycolic acid)

    NASA Astrophysics Data System (ADS)

    Rao Nakka, Rajeswara; Rao Thumu, Venkatappa; Reddy SVS, Ramana; Rao Buddhiraju, Sanjeeva

    2015-05-01

    We have investigated the effects of gamma irradiation on the chemical structure and the thermal and morphological properties of biodegradable semi-crystalline poly (glycolic acid) (PGA). PGA samples were subjected to irradiation treatment using a 60Co gamma source with delivered doses of 30, 60 and 90 kGy. Gamma irradiation induces cleavage of PGA main chains, forming ∼OĊH2 and ĊH2COO∼ radicals in both amorphous and crystalline regions. The free radicals formed in the amorphous region react with atmospheric oxygen and are converted to peroxy radicals. The peroxy radical causes chain scission at the crystal interface through hydrogen abstraction from methylene groups, forming the ∼ĊHCOO∼ (I) radical. Consequently, the observed electron spin resonance (ESR) doublet of irradiated PGA is assigned to (I). The disappearance of the ESR signal above 190°C indicates that free radicals are formed in the amorphous region and decay below the melting temperature of PGA. Fourier transform infrared and optical absorption studies confirm that the ? groups are not influenced by gamma irradiation. Differential scanning calorimetry (DSC) studies showed that the melting temperature of PGA decreased from 212°C to 202°C upon irradiation. The degree of crystallinity initially increased and then decreased with increasing dose, according to DSC and X-ray diffraction studies. Irradiation changed the physical properties of PGA as well as affecting the morphology of the material.

  12. Towards adaptive radiotherapy for head and neck patients: validation of an in-house deformable registration algorithm

    NASA Astrophysics Data System (ADS)

    Veiga, C.; McClelland, J.; Moinuddin, S.; Ricketts, K.; Modat, M.; Ourselin, S.; D'Souza, D.; Royle, G.

    2014-03-01

    The purpose of this work is to validate an in-house deformable image registration (DIR) algorithm for adaptive radiotherapy of head and neck patients. We aim to use the registrations to estimate the "dose of the day" and assess the need to replan. NiftyReg is an open-source implementation of the B-splines deformable registration algorithm, developed in our institution. We registered a planning CT to a CBCT acquired midway through treatment for 5 head and neck patients that required replanning. We investigated 16 different parameter settings that previously showed promising results. To assess the registrations, structures delineated in the CT were warped and compared with contours manually drawn by the same clinical expert on the CBCT. This structure set contained vertebral bodies and soft tissue. The Dice similarity coefficient (DSC), overlap index (OI), centroid position, and distance between structure surfaces were calculated for every registration, and a set of parameters that produces good results for all datasets was found. We achieve a median value of 0.845 in DSC and 0.889 in OI, the error in centroid position is smaller than 2 mm, and over 90% of the warped surface pixels lie within 2 mm of the manually drawn ones. By using appropriate DIR parameters, we are able to register the planning geometry (pCT) to the daily geometry (CBCT).
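
    The DSC used here is the Dice similarity coefficient, 2|A∩B| / (|A|+|B|), computed directly on binary structure masks; a minimal sketch with made-up masks:

        import numpy as np

        def dice(a, b):
            """Dice similarity coefficient between two binary masks."""
            a = a.astype(bool)
            b = b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # toy 3D structure masks: manual contour vs. warped (registered) contour
        manual = np.zeros((32, 32, 32), dtype=bool)
        warped = np.zeros_like(manual)
        manual[8:24, 8:24, 8:24] = True
        warped[9:25, 8:24, 8:24] = True    # one-voxel shift along the first axis
        print("DSC = %.3f" % dice(manual, warped))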

  13. Dose rate effects in radiation degradation of polymer-based cable materials

    NASA Astrophysics Data System (ADS)

    Plaček, V.; Bartoníček, B.; Hnát, V.; Otáhal, B.

    2003-08-01

    Cable ageing under the nuclear power plant (NPP) conditions must be effectively managed to ensure that the required plant safety and reliability are maintained throughout the plant service life. Ionizing radiation is one of the main stressors causing age-related degradation of polymer-based cable materials in air. For a given absorbed dose, radiation-induced damage to a polymer in air environment usually depends on the dose rate of the exposure. In this work, the effect of dose rate on the degradation rate has been studied. Three types of NPP cables (with jacket/insulation combinations PVC/PVC, PVC/PE, XPE/XPE) were irradiated at room temperature using 60Co gamma ray source at average dose rates of 7, 30 and 100 Gy/h with the doses up to 590 kGy. The irradiated samples have been tested for their mechanical properties, thermo-oxidative stability (using differential scanning calorimetry, DSC), and density. In the case of PVC and PE samples, the tested properties have shown evident dose rate effects, while the XPE material has shown no noticeable ones. The values of elongation at break and the thermo-oxidative stability decrease with the advanced degradation, density tends to increase with the absorbed dose. For XPE samples this effect can be partially explained by the increase of crystallinity. It was tested by the DSC determination of the crystalline phase amount.

  14. 7 CFR Appendix A to Subpart C of... - Model Form of Loan Contract for Electric Distribution Borrowers

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... financial ratios: (i) TIER of 1.25; (ii) Operating TIER of 1.1; (iii) DSC of 1.25; and Operating DSC of 1.1... Coverage Ratios Requirements. Section 5.5. Depreciation Rates. Section 5.6. Property Maintenance. Section 5.7. Financial Books. Section 5.8. Rights of Inspection. Section 5.9. Area Coverage. Section 5.10...

  15. 7 CFR Appendix A to Subpart C of... - Model Form of Loan Contract for Electric Distribution Borrowers

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... financial ratios: (i) TIER of 1.25; (ii) Operating TIER of 1.1; (iii) DSC of 1.25; and Operating DSC of 1.1... Coverage Ratios Requirements. Section 5.5. Depreciation Rates. Section 5.6. Property Maintenance. Section 5.7. Financial Books. Section 5.8. Rights of Inspection. Section 5.9. Area Coverage. Section 5.10...

  16. 7 CFR Appendix A to Subpart C of... - Model Form of Loan Contract for Electric Distribution Borrowers

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... financial ratios: (i) TIER of 1.25; (ii) Operating TIER of 1.1; (iii) DSC of 1.25; and Operating DSC of 1.1... Coverage Ratios Requirements. Section 5.5. Depreciation Rates. Section 5.6. Property Maintenance. Section 5.7. Financial Books. Section 5.8. Rights of Inspection. Section 5.9. Area Coverage. Section 5.10...

  17. 7 CFR Appendix A to Subpart C of... - Model Form of Loan Contract for Electric Distribution Borrowers

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... financial ratios: (i) TIER of 1.25; (ii) Operating TIER of 1.1; (iii) DSC of 1.25; and Operating DSC of 1.1... Coverage Ratios Requirements. Section 5.5. Depreciation Rates. Section 5.6. Property Maintenance. Section 5.7. Financial Books. Section 5.8. Rights of Inspection. Section 5.9. Area Coverage. Section 5.10...

  18. Characterization of cure kinetics and physical properties of a high performance, glass fiber-reinforced epoxy prepreg and a novel fluorine-modified, amine-cured commercial epoxy

    NASA Astrophysics Data System (ADS)

    Bilyeu, Bryan

    Kinetic equation parameters for the curing reaction of a commercial glass fiber reinforced high performance epoxy prepreg composed of the tetrafunctional epoxy tetraglycidyl-4,4'-diaminodiphenylmethane (TGDDM), the tetrafunctional amine curing agent 4,4'-diaminodiphenylsulfone (DDS) and an ionic initiator/accelerator, are determined by various thermal analysis techniques and the results compared. The reaction is monitored by the heat generated, determined by differential scanning calorimetry (DSC) and by high speed DSC when the reaction rate is high. The changes in physical properties indicating increasing conversion are followed by shifts in glass transition temperature determined by DSC, temperature-modulated DSC (TMDSC), step scan DSC and high speed DSC, thermomechanical (TMA) and dynamic mechanical (DMA) analysis and thermally stimulated depolarization (TSD). Changes in viscosity, also indicative of degree of conversion, are monitored by DMA. Thermal stability as a function of degree of cure is monitored by thermogravimetric analysis (TGA). The parameters of the general kinetic equations, including activation energy and rate constant, are explained and used to compare results of the various techniques. The utility of the kinetic descriptions is demonstrated in the construction of a useful time-temperature-transformation (TTT) diagram and a continuous heating transformation (CHT) diagram for rapid determination of processing parameters in the processing of prepregs. Shrinkage due to both resin consolidation and fiber rearrangement is measured as the linear expansion of the piston on a quartz dilatometry cell using TMA. The shrinkage of prepregs was determined to depend on the curing temperature, pressure applied and the fiber orientation. Chemical modification of an epoxy was done by mixing a fluorinated aromatic amine (aniline) with a standard aliphatic amine as a curing agent for a commercial diglycidyl ether of bisphenol A (DGEBA) epoxy. The resulting cured network was tested for wear resistance using tribological techniques. Of the six anilines, 3-fluoroaniline and 4-fluoroaniline were determined to have lower wear than the unmodified epoxy, while the others showed much higher wear rates.

  19. Maximum aposteriori joint source/channel coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Gibson, Jerry D.

    1991-01-01

    A maximum aposteriori probability (MAP) approach to joint source/channel coder design is presented in this paper. This method attempts to explore a technique for designing joint source/channel codes, rather than ways of distributing bits between source coders and channel coders. For a nonideal source coder, MAP arguments are used to design a decoder which takes advantage of redundancy in the source coder output to perform error correction. Once the decoder is obtained, it is analyzed with the purpose of obtaining 'desirable properties' of the channel input sequence for improving overall system performance. Finally, an encoder design which incorporates these properties is proposed.
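
    In symbols, and as a sketch of the general principle rather than the paper's specific derivation: given a channel output sequence y and a source-coder output x whose residual redundancy is modeled as a first-order Markov chain, the MAP decoder selects

        \hat{x} = \arg\max_x \; p(y \mid x)\, p(x)
                = \arg\max_x \; \prod_k p(y_k \mid x_k)\, p(x_k \mid x_{k-1}),

    where the prior terms p(x_k | x_{k-1}) capture the redundancy left in the nonideal source coder's output; the maximization over sequences can be carried out with a Viterbi-style search.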

  20. 15 CFR 740.7 - Computers (APP).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4A003. (2) Technology and software. License Exception APP authorizes exports of technology and software... programmability. (ii) Technology and source code. Technology and source code eligible for License Exception APP..., reexports and transfers (in-country) for nuclear, chemical, biological, or missile end-users and end-uses...

  1. Spread Spectrum Visual Sensor Network Resource Management Using an End-to-End Cross-Layer Design

    DTIC Science & Technology

    2011-02-01

    [OCR fragments] Coding: In this work, we use rate-compatible punctured convolutional (RCPC) codes for channel coding [11]. Using RCPC codes allows us to utilize Viterbi's... [11] J. Hagenauer, "Rate-compatible punctured convolutional codes (RCPC codes) and their applications," IEEE Trans. Commun., vol. 36, no. 4, pp. 389... a source coding rate, a channel coding rate, and a power level to all nodes in the

  2. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu (Inventor)

    1997-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.

  3. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu (Inventor)

    1998-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
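
    A sketch of the double-difference idea on two correlated bands, with the inverse (post-decoding) step showing that the operation is lossless; the array contents are arbitrary examples.

        import numpy as np

        def precode(band1, band2):
            """Double-difference: adjacent-delta of the cross-delta of two bands.
            Returns the small overhead needed for exact reconstruction."""
            cross = band2 - band1        # cross-delta between the two bands
            dd = np.diff(cross)          # adjacent-delta of the cross-delta
            return dd, cross[0]          # first cross value kept as side info

        def postdecode(band1, dd, first_cross):
            """Inverse: rebuild band2 from band1 and the double-difference set."""
            cross = np.concatenate([[first_cross], first_cross + np.cumsum(dd)])
            return band1 + cross

        band1 = np.array([100, 102, 105, 109, 114, 120], dtype=np.int64)
        band2 = band1 + np.array([7, 8, 8, 9, 9, 10])   # strongly correlated band

        dd, c0 = precode(band1, band2)
        print("double-difference:", dd)  # small values, cheap to entropy-code
        assert np.array_equal(postdecode(band1, dd, c0), band2)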

  4. Code CUGEL: A code to unfold Ge(Li) spectrometer polyenergetic gamma photon experimental distributions

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Born, U.

    1970-01-01

    A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma-photon experimental distributions from lithium-drifted germanium [Ge(Li)] semiconductor spectrometers. It was designed to analyze the combined continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.
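
    The core numerical problem (recovering a source spectrum S from a measured pulse-height distribution N = R·S, where R is the detector response matrix) can be sketched with a nonnegative least-squares solve. The toy response below (a Gaussian photopeak plus a flat Compton shelf) is illustrative only and is not CUGEL's response function or iteration scheme.

        import numpy as np
        from scipy.optimize import nnls

        nE = 40                                  # number of energy bins
        E = np.arange(nE)

        # toy response matrix: column j = detector response to photons in bin j
        R = np.zeros((nE, nE))
        for j in range(nE):
            peak = np.exp(-0.5 * ((E - E[j]) / 1.0) ** 2)    # photopeak, sigma = 1 bin
            compton = np.where(E < j, 0.3 / max(j, 1), 0.0)  # flat shelf below the peak
            R[:, j] = peak / peak.sum() * 0.7 + compton

        # true source: two gamma lines on a weak continuum
        S_true = 0.05 * np.ones(nE)
        S_true[[12, 28]] += [40.0, 25.0]

        N = R @ S_true                           # folded ("measured") distribution
        S_est, _ = nnls(R, N)                    # unfold with nonnegativity constraint
        print("recovered peak bins:", sorted(np.argsort(S_est)[-2:]))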

  5. Power optimization of wireless media systems with space-time block codes.

    PubMed

    Yousefi'zadeh, Homayoun; Jafarkhani, Hamid; Moshfeghi, Mehran

    2004-07-01

    We present analytical and numerical solutions to the problem of power control in wireless media systems with multiple antennas. We formulate a set of optimization problems aimed at minimizing total power consumption of wireless media systems subject to a given level of QoS and an available bit rate. Our formulation takes into consideration the power consumption related to source coding, channel coding, and transmission of multiple-transmit antennas. In our study, we consider Gauss-Markov and video source models, Rayleigh fading channels along with the Bernoulli/Gilbert-Elliott loss models, and space-time block codes.

  6. Top ten reasons to register your code with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Berriman, G. Bruce; Mink, Jessica D.; Nemiroff, Robert J.; Robitaille, Thomas; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Teuben, Peter J.; Wallin, John F.; Warmels, Rein

    2017-01-01

    With 1,400 codes, the Astrophysics Source Code Library (ASCL, ascl.net) is the largest indexed resource in existence for codes used in astronomy research. This free online registry was established in 1999, is indexed by Web of Science and ADS, and is citable, with citations to its entries tracked by ADS. Registering your code with the ASCL is easy with our online submissions system. Making your software available for examination shows confidence in your research and makes your research more transparent, reproducible, and falsifiable. ASCL registration allows your software to be cited on its own merits and provides a citation that is trackable and accepted by all astronomy journals and by journals such as Science and Nature. Registration also allows others to find your code more easily. This presentation covers the benefits of registering astronomy research software with the ASCL.

  7. Soil organic matter composition from correlated thermal analysis and nuclear magnetic resonance data in Australian national inventory of agricultural soils

    NASA Astrophysics Data System (ADS)

    Moore, T. S.; Sanderman, J.; Baldock, J.; Plante, A. F.

    2016-12-01

    National-scale inventories typically include soil organic carbon (SOC) content, but not chemical composition or biogeochemical stability. Australia's Soil Carbon Research Programme (SCaRP) represents a national inventory of SOC content and composition in agricultural systems. The program used physical fractionation followed by 13C nuclear magnetic resonance (NMR) spectroscopy. While these techniques are highly effective, they are typically too expensive and time consuming for use in large-scale SOC monitoring. We seek to understand whether thermal analysis is a viable alternative. Coupled differential scanning calorimetry (DSC) and evolved gas analysis (CO2- and H2O-EGA) yields valuable data on SOC composition and stability via ramped combustion. The technique requires little training to use, and does not require fractionation or other sample pre-treatment. We analyzed 300 agricultural samples collected by SCaRP, divided into four fractions: whole soil, coarse particulates (POM), untreated mineral associated (HUM), and hydrofluoric acid (HF)-treated HUM. All samples were analyzed by DSC-EGA, but only the POM and HF-HUM fractions were analyzed by NMR. Multivariate statistical analyses were used to explore natural clustering in SOC composition and stability based on DSC-EGA data. A partial least-squares regression (PLSR) model was used to explore correlations among the NMR and DSC-EGA data. Correlations demonstrated regions of combustion attributable to specific functional groups, which may relate to SOC stability. We are increasingly challenged with developing an efficient technique to assess SOC composition and stability at large spatial and temporal scales. Correlations between NMR and DSC-EGA may demonstrate the viability of using thermal analysis in lieu of more demanding methods in future large-scale surveys, and may provide data that goes beyond chemical composition to better approach quantification of biogeochemical stability.
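    A hedged sketch of the PLSR step (with synthetic stand-in data, not the SCaRP measurements) might look like this in Python with scikit-learn:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      dsc_ega = rng.normal(size=(300, 40))        # thermogram features per sample
      nmr = (dsc_ega[:, :8] @ rng.normal(size=(8, 5))
             + 0.1 * rng.normal(size=(300, 5)))   # NMR features partly explained by DSC-EGA

      pls = PLSRegression(n_components=4).fit(dsc_ega, nmr)
      print(f"variance in NMR explained: R^2 = {pls.score(dsc_ega, nmr):.2f}")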

  8. Research on choleretic effect of menthol, menthone, pluegone, isomenthone, and limonene in DanShu capsule.

    PubMed

    Hu, Guanying; Yuan, Xing; Zhang, Sanyin; Wang, Ruru; Yang, Miao; Wu, Chunjie; Wu, Zhigang; Ke, Xiao

    2015-02-01

    Danshu capsule (DSC) is a medicinal compound in traditional Chinese medicine (TCM). It is commonly used for the treatment of acute and chronic cholecystitis as well as cholelithiasis. To study its choleretic effect, healthy rats were randomly divided into DSC high-dose (DSCH, 900 mg/kg), medium-dose (DSCM, 450 mg/kg), and low-dose (DSCL, 225 mg/kg) groups, a Xiaoyan Lidan tablet group (XYLDT, 750 mg/kg), and a saline group. The bile was collected for 1 h after a 20-minute stabilization as the base level, and at 1 h, 2 h, 3 h, and 4 h after drug administration, respectively. Bile volume, total cholesterol, and total bile acid were measured at each time point. The results revealed that DSC significantly stimulated bile secretion, decreased the total cholesterol level, and increased the total bile acid level; it therefore has choleretic effects. To identify the active components contributing to these effects, five major constituents, menthol (39.33 mg/kg), menthone (18.02 mg/kg), isomenthone (8.18 mg/kg), pulegone (3.31 mg/kg), and limonene (4.39 mg/kg), were tested in the same rat model. The results showed that menthol and limonene promoted bile secretion comparably to DSC treatment (p > 0.05); menthol, menthone, and limonene significantly decreased the total cholesterol level (p < 0.05 or p < 0.01) and increased the total bile acid level (p < 0.05 or p < 0.01); isomenthone, an isomer of menthone, exhibited slight choleretic effects; and pulegone had no obvious role in bile acid efflux. These findings indicate that the choleretic effects of DSC may be attributed mainly to three major constituents: menthol, menthone, and limonene. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  9. Towards quantitative imaging: stability of fully automated nodule segmentation across varied dose levels and reconstruction parameters in a low-dose CT screening patient cohort

    NASA Astrophysics Data System (ADS)

    Wahi-Anwar, M. Wasil; Emaminejad, Nastaran; Hoffman, John; Kim, Grace H.; Brown, Matthew S.; McNitt-Gray, Michael F.

    2018-02-01

    Quantitative imaging in lung cancer CT seeks to characterize nodules through quantitative features, usually from a region of interest delineating the nodule. The segmentation, however, can vary depending on segmentation approach and image quality, which can affect the extracted feature values. In this study, we utilize a fully-automated nodule segmentation method - to avoid reader-influenced inconsistencies - to explore the effects of varied dose levels and reconstruction parameters on segmentation. Raw projection CT images from a low-dose screening patient cohort (N=59) were reconstructed at multiple dose levels (100%, 50%, 25%, 10%), two slice thicknesses (1.0mm, 0.6mm), and a medium kernel. Fully-automated nodule detection and segmentation was then applied, from which 12 nodules were selected. The Dice similarity coefficient (DSC) was used to assess the similarity of the segmentation ROIs of the same nodule across different reconstruction and dose conditions. Nodules at 1.0mm slice thickness and dose levels of 25% and 50% resulted in DSC values greater than 0.85 when compared to 100% dose, with lower dose leading to a lower average and wider spread of DSC values. At 0.6mm, the increased bias and wider spread of DSC values from lowering dose were more pronounced. The effects of dose reduction on DSC for CAD-segmented nodules were similar in magnitude to those of reducing the slice thickness from 1.0mm to 0.6mm. In conclusion, variation of dose and slice thickness can result in very different segmentations because of noise and image quality. However, there is some stability in segmentation overlap: even at 1.0mm slice thickness, an image at 25% of full dose still yields segmentations similar to those from a full-dose scan.
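    The overlap metric used here is standard and compact enough to state exactly; a minimal implementation for boolean masks (not the authors' pipeline code) is:

      import numpy as np

      def dice(mask_a, mask_b):
          # DSC = 2|A intersect B| / (|A| + |B|); 1.0 means identical segmentations.
          a = np.asarray(mask_a, dtype=bool)
          b = np.asarray(mask_b, dtype=bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0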

  10. Impact on carbon footprint: a life cycle assessment of disposable versus reusable sharps containers in a large US hospital.

    PubMed

    Grimmond, Terry; Reiner, Sandra

    2012-06-01

    Hospitals are striving to reduce their greenhouse gas (GHG) emissions. Targeting supply chain points and replacing disposable with reusable items are among the recommendations to achieve this. Annually, US hospitals use 35 million disposable (DSC) or reusable sharps containers (RSC), generating GHG in their manufacture, use, and disposal. Using a life cycle assessment, we assessed the global warming potential (GWP) of both systems at a large US hospital which replaced DSC with RSC. GHG emissions (CO(2), CH(4), N(2)O) were calculated in metric tons of CO(2) equivalents (MTCO(2)eq). Primary energy input data were used wherever possible, and region-specific conversions were used to calculate the GWP of each activity. Unit process GHGs were collated into manufacture, transport, washing, and treatment and disposal. The DSC were not recycled and had no recycled content. Chemotherapy DSC were used in both systems. Emission totals were workload-normalized per 100 occupied beds per year, and rate ratios were analyzed using Fisher's test (P ≤ 0.05, 95% confidence level). With RSC, the hospital reduced its annual GWP by 127 MTCO(2)eq (-83.5%) and diverted 30.9 tons of plastic and 5.0 tons of cardboard from landfill. Using RSC reduced the number of containers manufactured from 34,396 DSC annually to 1,844 RSC in year one only. The study indicates sharps containment GWP in US hospitals totals 100,000 MTCO(2)eq; if RSC were used nationally, the figure could fall by 64,000 MTCO(2)eq, which, whilst only a fraction of total hospital GWP, is a positive, sustainable step.

  11. IDH mutant and 1p/19q co-deleted oligodendrogliomas: tumor grade stratification using diffusion-, susceptibility-, and perfusion-weighted MRI.

    PubMed

    Lin, Yu; Xing, Zhen; She, Dejun; Yang, Xiefeng; Zheng, Yingyan; Xiao, Zebin; Wang, Xingfu; Cao, Dairong

    2017-06-01

    Currently, isocitrate dehydrogenase (IDH) mutation and 1p/19q co-deletion are proven diagnostic biomarkers for both grade II and III oligodendrogliomas (ODs). Non-invasive diffusion-weighted imaging (DWI), susceptibility-weighted imaging (SWI), and dynamic susceptibility contrast perfusion-weighted imaging (DSC-PWI) are widely used to provide physiological information (cellularity, hemorrhage, calcifications, and angiogenesis) on neoplastic histology and tumor grade. However, it is unclear whether DWI, SWI, and DSC-PWI are able to stratify grades of IDH-mutant and 1p/19q co-deleted ODs. We retrospectively reviewed the conventional MRI (cMRI), DWI, SWI, and DSC-PWI obtained on 33 patients with IDH-mutated and 1p/19q co-deleted ODs. Features of cMRI, normalized ADC (nADC), intratumoral susceptibility signals (ITSSs), normalized maximum CBV (nCBV), and normalized maximum CBF (nCBF) were compared between low-grade ODs (LGOs) and high-grade ODs (HGOs). Receiver operating characteristic curve and logistic regression analyses were applied to determine diagnostic performance. HGOs tended to present with prominent edema and enhancement. nADC, ITSSs, nCBV, and nCBF were significantly different between groups (all P < 0.05). The combination of SWI and DSC-PWI for grading resulted in sensitivity and specificity of 100.00 and 93.33%, respectively. IDH-mutant and 1p/19q co-deleted ODs can be stratified by grade using cMRI and advanced magnetic resonance imaging techniques including DWI, SWI, and DSC-PWI. Combining ITSSs with nCBV appears to be a promising option for grading molecularly defined ODs in clinical practice.
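    A hedged sketch of the kind of analysis described, with synthetic stand-in feature values rather than the study's data, is shown below: logistic regression combining two imaging features and an ROC-based summary.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      grade = rng.integers(0, 2, size=33)            # 0 = low-grade, 1 = high-grade
      itss = 1.5 * grade + rng.normal(0, 1, 33)      # hypothetical ITSS score
      ncbv = 2.0 * grade + rng.normal(0, 1, 33)      # hypothetical normalized CBV
      X = np.column_stack([itss, ncbv])

      clf = LogisticRegression().fit(X, grade)
      auc = roc_auc_score(grade, clf.predict_proba(X)[:, 1])
      print(f"AUC of combined ITSS + nCBV model: {auc:.2f}")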

  12. Delayed Sternal Closure in Infant Heart Surgery-The Importance of Where and When: An Analysis of the STS Congenital Heart Surgery Database.

    PubMed

    Nelson-McMillan, Kristen; Hornik, Christoph P; He, Xia; Vricella, Luca A; Jacobs, Jeffrey P; Hill, Kevin D; Pasquali, Sara K; Alejo, Diane E; Cameron, Duke E; Jacobs, Marshall L

    2016-11-01

    Delayed sternal closure (DSC) is commonly used to optimize hemodynamic stability after neonatal and infant heart surgery. We hypothesized that duration of sternum left open (SLO) was associated with rate of infection complications, and that location of sternal closure may mitigate infection risk. Infants (age ≤365 days) undergoing index operations with cardiopulmonary bypass and DSC at STS Congenital Heart Surgery Database centers (from 2007 to 2013) with adequate data quality were included. Primary outcome was occurrence of infection complication, defined as one or more of the following: endocarditis, pneumonia, wound infection, wound dehiscence, sepsis, or mediastinitis. Multivariable regression models were fit to assess association of infection complication with: duration of SLO (days), location of DSC procedure (operating room versus elsewhere), and patient and procedural factors. Of 6,127 index operations with SLO at 100 centers, median age and weight were 8 days (IQR, 5-24) and 3.3 kg (IQR, 2.9-3.8); 66% of operations were STAT morbidity category 4 or 5. At least one infection complication occurred in 18.7%, compared with 6.6% among potentially eligible neonates and infants without SLO. Duration of SLO (median, 3 days; IQR, 2-5) was associated with an increased rate of infection complications (p < 0.001). Location of DSC procedure was operating room (16%), intensive care unit (67%), or other (17%). Location of DSC was not associated with rate of infection complications (p = 0.45). Rate of occurrence of infectious complications is high among infants with sternum left open following cardiac surgery. Longer duration of SLO is associated with increased infection complications. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  13. Longitudinal DSC-MRI for Distinguishing Tumor Recurrence From Pseudoprogression in Patients With a High-grade Glioma.

    PubMed

    Boxerman, Jerrold L; Ellingson, Benjamin M; Jeyapalan, Suriya; Elinzano, Heinrich; Harris, Robert J; Rogg, Jeffrey M; Pope, Whitney B; Safran, Howard

    2017-06-01

    For patients with high-grade glioma on clinical trials it is important to accurately assess time of disease progression. However, differentiation between pseudoprogression (PsP) and progressive disease (PD) is unreliable with standard magnetic resonance imaging (MRI) techniques. Dynamic susceptibility contrast perfusion MRI (DSC-MRI) can measure relative cerebral blood volume (rCBV) and may help distinguish PsP from PD. A subset of patients with high-grade glioma on a phase II clinical trial with temozolomide, paclitaxel poliglumex, and concurrent radiation were assessed. Nine patients (3 grade III, 6 grade IV), with a total of 19 enhancing lesions demonstrating progressive enhancement (≥25% increase from nadir) on postchemoradiation conventional contrast-enhanced MRI, had serial DSC-MRI. Mean leakage-corrected rCBV within enhancing lesions was computed for all postchemoradiation time points. Of the 19 progressively enhancing lesions, 10 were classified as PsP and 9 as PD by biopsy/surgery or serial enhancement patterns during interval follow-up MRI. Mean rCBV at initial progressive enhancement did not differ significantly between PsP and PD (2.35 vs. 2.17; P=0.67). However, change in rCBV at first subsequent follow-up (-0.84 vs. 0.84; P=0.001) and the overall linear trend in rCBV after initial progressive enhancement (negative vs. positive slope; P=0.04) differed significantly between PsP and PD. Longitudinal trends in rCBV may be more useful than absolute rCBV in distinguishing PsP from PD in chemoradiation-treated high-grade gliomas with DSC-MRI. Further studies of DSC-MRI in high-grade glioma as a potential technique for distinguishing PsP from PD are indicated.
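    The trend analysis reduces to fitting a line to serial rCBV values after the first progressive enhancement and reading the sign of the slope. A minimal sketch with illustrative numbers, not trial data:

      import numpy as np

      def rcbv_trend(weeks, rcbv):
          # Negative slope (falling rCBV) suggests pseudoprogression;
          # positive slope (rising rCBV) suggests true progression.
          slope, _ = np.polyfit(weeks, rcbv, 1)
          return "PsP-like" if slope < 0 else "PD-like"

      print(rcbv_trend([0, 8, 16], [2.3, 1.6, 1.1]))   # falling rCBV -> "PsP-like"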

  14. A graphically oriented specification language for automatic code generation. GRASP/Ada: A Graphical Representation of Algorithms, Structure, and Processes for Ada, phase 1

    NASA Technical Reports Server (NTRS)

    Cross, James H., II; Morrison, Kelly I.; May, Charles H., Jr.; Waddel, Kathryn C.

    1989-01-01

    The first phase of a three-phase effort to develop a new graphically oriented specification language which will facilitate the reverse engineering of Ada source code into graphical representations (GRs) as well as the automatic generation of Ada source code is described. A simplified view of the three phases of Graphical Representations for Algorithms, Structure, and Processes for Ada (GRASP/Ada) with respect to three basic classes of GRs is presented. Phase 1 concentrated on the derivation of an algorithmic diagram, the control structure diagram (CSD) (CRO88a) from Ada source code or Ada PDL. Phase 2 includes the generation of architectural and system level diagrams such as structure charts and data flow diagrams and should result in a requirements specification for a graphically oriented language able to support automatic code generation. Phase 3 will concentrate on the development of a prototype to demonstrate the feasibility of this new specification language.

  15. Some practical universal noiseless coding techniques

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1979-01-01

    Some practical adaptive techniques for the efficient noiseless coding of a broad class of data sources are developed and analyzed. Algorithms are designed for coding discrete memoryless sources which have a known symbol probability ordering but unknown probability values. A general applicability of these algorithms to solving practical problems is obtained because most real data sources can be simply transformed into this form by appropriate preprocessing. These algorithms have exhibited performance only slightly above all entropy values when applied to real data with stationary characteristics over the measurement span. Performance considerably under a measured average data entropy may be observed when data characteristics are changing over the measurement span.
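    One family of easily implemented variable-length codes of the kind such adaptive schemes select among is the Golomb-Rice code, which is near-optimal for geometrically distributed (Laplacian-like) integers. A minimal sketch, not the specific options of this paper:

      def rice_encode(n, k):
          # Non-negative integer n -> unary-coded quotient, then k-bit remainder.
          q, r = n >> k, n & ((1 << k) - 1)
          return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

      def rice_decode(bits, k):
          q = bits.index("0")                  # unary part ends at the first 0
          r = int(bits[q + 1:q + 1 + k], 2) if k else 0
          return (q << k) | r

      assert rice_decode(rice_encode(37, 3), 3) == 37   # "1111" + "0" + "101"

    An adaptive coder of this type would, per block of samples, pick the parameter k that minimizes the encoded length.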

  16. Dye-sensitized solar cells and complexes between pyridines and iodines. A NMR, IR and DFT study.

    PubMed

    Hansen, Poul Erik; Nguyen, Phuong Tuyet; Krake, Jacob; Spanget-Larsen, Jens; Lund, Torben

    2012-12-01

    Interactions between triiodide (I(3)(-)) and 4-tert-butylpyridine (4TBP) as postulated in dye-sensitized solar cells (DSC) are investigated by means of (13)C NMR and IR spectroscopy supported by DFT calculations. The charge transfer (CT) complex 4TBP·I(2) and potential salts such as (4TBP)(2)I(+), I(3)(-) were synthesized and characterized by IR and (13)C NMR spectroscopy. However, mixing (butyl)(4)N(+), I(3)(-) and 4TBP at concentrations comparable to those of the DSC solar cell did not lead to any reaction. Neither CT complexes nor cationic species like (4TBP)(2)I(+) were observed, judging from the (13)C NMR spectroscopic evidence. This questions the previously proposed formation of (4TBP)(2)I(+) in DSC cells. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Thermal behaviour and microanalysis of coal subbituminus

    NASA Astrophysics Data System (ADS)

    Heriyanti; Prendika, W.; Ashyar, R.; Sutrisno

    2018-04-01

    Differential scanning calorimetry (DSC) and X-ray powder diffraction (XRD) were used to study the thermal behaviour of sub-bituminous coal. The DSC experiment was performed in an air atmosphere up to 125 °C at a heating rate of 25 °C min⁻¹. The DSC curves showed distinct transitional stages in the coal samples studied. Thermal heating temperature intervals, peak temperatures, and dissociation energies of the coal samples were also determined. The XRD analysis was used to evaluate the diffraction pattern and crystal structure of the compounds in the coal sample at various temperatures (25-350 °C). The XRD analysis at these temperatures showed that the compounds in the coal sample are dominated by quartz (SiO2) and corundum (Al2O3). Increasing the temperature of the thermal treatment resulted in better crystal formation.

  18. Characterization of the Polycaprolactone Melt Crystallization: Complementary Optical Microscopy, DSC, and AFM Studies

    PubMed Central

    Speranza, V.; Sorrentino, A.; De Santis, F.; Pantani, R.

    2014-01-01

    The first stages of the crystallization of polycaprolactone (PCL) were studied using several techniques. The crystallization exotherms measured by differential scanning calorimetry (DSC) were analyzed and compared with results obtained by polarized optical microscopy (POM), rheology, and atomic force microscope (AFM). The experimental results suggest a strong influence of the observation scale. In particular, the AFM, even if limited on time scale, appears to be the most sensitive technique to detect the first stages of crystallization. On the contrary, at least in the case analysed in this work, rheology appears to be the least sensitive technique. DSC and POM provide closer results. This suggests that the definition of induction time in the polymer crystallization is a vague concept that, in any case, requires the definition of the technique used for its characterization. PMID:24523644

  19. Characterization of the polycaprolactone melt crystallization: complementary optical microscopy, DSC, and AFM studies.

    PubMed

    Speranza, V; Sorrentino, A; De Santis, F; Pantani, R

    2014-01-01

    The first stages of the crystallization of polycaprolactone (PCL) were studied using several techniques. The crystallization exotherms measured by differential scanning calorimetry (DSC) were analyzed and compared with results obtained by polarized optical microscopy (POM), rheology, and atomic force microscope (AFM). The experimental results suggest a strong influence of the observation scale. In particular, the AFM, even if limited on time scale, appears to be the most sensitive technique to detect the first stages of crystallization. On the contrary, at least in the case analysed in this work, rheology appears to be the least sensitive technique. DSC and POM provide closer results. This suggests that the definition of induction time in the polymer crystallization is a vague concept that, in any case, requires the definition of the technique used for its characterization.

  20. Differential Scanning Calorimetry and Evolved Gas Analysis at Mars Ambient Conditions Using the Thermal Evolved Gas Analyzer (TEGA)

    NASA Technical Reports Server (NTRS)

    Musselwhite, D. S.; Boynton, W. V.; Ming, Douglas W.; Quadlander, G.; Kerry, K. E.; Bode, R. C.; Bailey, S. H.; Ward, M. G.; Pathare, A. V.; Lorenz, R. D.

    2000-01-01

    Differential Scanning Calorimetry (DSC) combined with evolved gas analysis (EGA) is a well developed technique for the analysis of a wide variety of sample types, with broad application in material and soil sciences. However, the use of the technique for samples under conditions of pressure and temperature as found on other planets is an area of current development and cutting-edge research. The Thermal Evolved Gas Analyzer (TEGA), which was designed, built, and tested at the University of Arizona's Lunar and Planetary Lab (LPL), utilizes DSC/EGA. TEGA, which was sent to Mars on the ill-fated Mars Polar Lander, was to be the first application of DSC/EGA on the surface of Mars as well as the first direct measurement of the volatile-bearing mineralogy in Martian soil.

  1. 15 CFR 740.7 - Computers (APP).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 4A003. (2) Technology and software. License Exception APP authorizes exports of technology and software... License Exception. (2) Access and release restrictions. (i)[Reserved] (ii) Technology and source code. Technology and source code eligible for License Exception APP may not be released to nationals of Cuba, Iran...

  2. Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language

    NASA Astrophysics Data System (ADS)

    Heaphy, R. T.; Burke, M. P.; Love, J. T.

    2015-12-01

    Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems are limited in their ability to integrate with modern software and hardware and to leverage parallel computing, which has left voids in optimization, pre-, and post-processing tools. Advances in technology and in our scientific understanding of environmental processes over the last 30 years mandate that HSPF be upgraded so that it can evolve and continue to be a premiere tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source, high-performance computing language; and (2) convert model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages such as C and FORTRAN, integrating Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and has shown good agreement and similar execution times when using the Numba compiler. Continued verification of the accuracy of the converted code against more complex legacy applications, and improvement of execution times by incorporating an intelligent network change detection tool, are currently underway; preliminary results will be presented.
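    The conversion pattern described, a FORTRAN-style time-stepping loop expressed in Python and compiled just-in-time, can be sketched as follows (an illustrative hydrologic kernel, not HSPF code):

      import numpy as np
      from numba import njit

      @njit
      def linear_reservoir(inflow, k, dt):
          # Route inflow through a linear reservoir: dS/dt = I - S/k, Q = S/k.
          storage = 0.0
          outflow = np.empty_like(inflow)
          for i in range(inflow.size):
              storage += (inflow[i] - storage / k) * dt
              outflow[i] = storage / k
          return outflow

      q = linear_reservoir(np.random.rand(100_000), 24.0, 1.0)

    Numba compiles the explicit loop to machine code on first call, which is how interpreted Python can approach the execution times of the legacy FORTRAN modules.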

  3. Evaluation of Diagnostic Codes in Morbidity and Mortality Data Sources for Heat-Related Illness Surveillance

    PubMed Central

    Watkins, Sharon

    2017-01-01

    Objectives: The primary objective of this study was to identify patients with heat-related illness (HRI) using codes for heat-related injury diagnosis and external cause of injury in 3 administrative data sets: emergency department (ED) visit records, hospital discharge records, and death certificates. Methods: We obtained data on ED visits, hospitalizations, and deaths for Florida residents for May 1 through October 31, 2005-2012. To identify patients with HRI, we used codes from the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to search data on ED visits and hospitalizations and codes from the International Classification of Diseases, Tenth Revision (ICD-10) to search data on deaths. We stratified the results by data source and whether the HRI was work related. Results: We identified 23 981 ED visits, 4816 hospitalizations, and 140 deaths in patients with non–work-related HRI and 2979 ED visits, 415 hospitalizations, and 23 deaths in patients with work-related HRI. The most common diagnosis codes among patients were for severe HRI (heat exhaustion or heatstroke). The proportion of patients with a severe HRI diagnosis increased with data source severity. If ICD-9-CM code E900.1 and ICD-10 code W92 (excessive heat of man-made origin) were used as exclusion criteria for HRI, 5.0% of patients with non–work-related deaths, 3.0% of patients with work-related ED visits, and 1.7% of patients with work-related hospitalizations would have been removed. Conclusions: Using multiple data sources and all diagnosis fields may improve the sensitivity of HRI surveillance. Future studies should evaluate the impact of converting ICD-9-CM to ICD-10-CM codes on HRI surveillance of ED visits and hospitalizations. PMID:28379784
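    The case-finding step can be sketched as a simple code-list filter. The column names and exact code lists below are illustrative, not the study's full definition (ICD-9-CM 992.x covers heat-related illness; E900.0/E900.9 are excessive-heat external-cause codes, with E900.1, man-made origin, excluded).

      import pandas as pd

      HRI_DX = {f"992{d}" for d in range(10)}     # 992.0-992.9, stored undotted
      HRI_ECODE = {"E9000", "E9009"}              # excludes E900.1 (man-made origin)

      def flag_hri(df: pd.DataFrame, dx_cols, ecode_cols):
          # Keep records with an HRI code in any diagnosis or external-cause field.
          dx_hit = df[dx_cols].isin(HRI_DX).any(axis=1)
          ec_hit = df[ecode_cols].isin(HRI_ECODE).any(axis=1)
          return df[dx_hit | ec_hit]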

  4. Enhancements to the MCNP6 background source

    DOE PAGES

    McMath, Garrett E.; McKinney, Gregg W.

    2015-10-19

    The particle transport code MCNP has been used to produce a background radiation data file on a worldwide grid that can easily be sampled as a source in the code. Location-dependent cosmic showers were modeled by Monte Carlo methods to produce the resulting neutron and photon background flux at 2054 locations around Earth. An improved galactic-cosmic-ray feature was used to model the source term, as well as data from multiple sources to model the transport environment through atmosphere, soil, and seawater. A new elevation scaling feature was also added to the code to increase the accuracy of the cosmic neutron background for user locations with off-grid elevations. Furthermore, benchmarking has shown the neutron integral flux values to be within experimental error.

  5. Synthesis of Hydrophobic, Crosslinkable Resins.

    DTIC Science & Technology

    1985-12-01

    ...product by methanol precipitation, the majority of the first oligomer was lost. 4.14 DIFFERENTIAL SCANNING CALORIMETRY. The DSC trace of a typical...polymer from the DSC traces obtained to date. Preliminary studies using an automated torsional pendulum indicate that the Tg of the crosslinked polymer is...enabling water to be used in the purification steps. The diethyl phosphonates are readily prepared by heating triethyl phosphite with the chloromethyl...

  6. Raman scattering boson peak and differential scanning calorimetry studies of the glass transition in tellurium-zinc oxide glasses.

    PubMed

    Stavrou, E; Tsiantos, C; Tsopouridou, R D; Kripotou, S; Kontos, A G; Raptis, C; Capoen, B; Bouazaoui, M; Turrell, S; Khatir, S

    2010-05-19

    Raman scattering and differential scanning calorimetry (DSC) measurements have been carried out on four mixed tellurium-zinc oxide (TeO(2))(1 - x)(ZnO)(x) (x = 0.1, 0.2, 0.3, 0.4) glasses under variable temperature, with particular attention given to the respective glass transition region. From the DSC measurements, the glass transition temperature T(g) has been determined for each glass, showing a monotonic decrease of T(g) with increasing ZnO content. The Raman study is focused on the low-frequency band of the glasses, the so-called boson peak (BP), whose frequency undergoes an abrupt decrease at a temperature T(d) very close to the respective T(g) values obtained by DSC. These results show that the BP is highly sensitive to dynamical effects over the glass transition and provides a means of determining T(g) in tellurite and other network glasses that is equally reliable to DSC. The discontinuous temperature dependence of the BP frequency at the glass transition, along with the absence of such behaviour in the high-frequency Raman bands (due to local atomic vibrations), indicates that marked changes of the medium range order (MRO) occur at T(g) and confirms the correlation between the BP and the MRO of glasses.

  7. A theoretical framework to model DSC-MRI data acquired in the presence of contrast agent extravasation

    NASA Astrophysics Data System (ADS)

    Quarles, C. C.; Gochberg, D. F.; Gore, J. C.; Yankeelov, T. E.

    2009-10-01

    Dynamic susceptibility contrast (DSC) MRI methods rely on compartmentalization of the contrast agent such that a susceptibility gradient can be induced between the contrast-containing compartment and adjacent spaces, such as between intravascular and extravascular spaces. When there is a disruption of the blood-brain barrier, as is frequently the case with brain tumors, the contrast agent leaks out of the vasculature, resulting in additional T1, T2 and T2* relaxation effects in the extravascular space, thereby affecting the signal intensity time course and reducing the reliability of the computed hemodynamic parameters. In this study, a theoretical model describing these dynamic intra- and extravascular T1, T2 and T2* relaxation interactions is proposed. The applicability of the proposed model to investigating the influence of relevant MRI pulse sequence parameters (e.g. echo time, flip angle), physical parameters (e.g. susceptibility calibration factors, pre-contrast relaxation rates) and physiological parameters (e.g. permeability, blood flow, compartmental volume fractions) on DSC-MRI signal time curves is demonstrated. Such a model could yield important insights into the biophysical basis of contrast-agent-extravasation-induced effects on measured DSC-MRI signals and provide a means to investigate pulse sequence optimization and appropriate data analysis methods for the extraction of physiologically relevant imaging metrics.
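    For orientation, conventional DSC-MRI analysis (with no leakage) converts the gradient-echo signal drop into a change in the effective transverse relaxation rate, taken to be proportional to the tissue contrast agent concentration; the model above generalizes this by adding the compartment-specific T1, T2 and T2* terms. The standard leakage-free relation is

      \Delta R_2^*(t) = -\frac{1}{T_E}\,\ln\!\left(\frac{S(t)}{S_0}\right),
      \qquad \Delta R_2^*(t) \propto C_t(t),

    where S_0 is the pre-contrast baseline signal, T_E the echo time, and C_t(t) the tissue concentration time course.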

  8. Deconvolution of complex differential scanning calorimetry profiles for protein transitions under kinetic control.

    PubMed

    Toledo-Núñez, Citlali; Vera-Robles, L Iraís; Arroyo-Maya, Izlia J; Hernández-Arana, Andrés

    2016-09-15

    A frequent outcome in differential scanning calorimetry (DSC) experiments carried out with large proteins is the irreversibility of the observed endothermic effects. In these cases, DSC profiles are analyzed according to methods developed for temperature-induced denaturation transitions occurring under kinetic control. In the one-step irreversible model (native → denatured) the characteristics of the observed single-peaked endotherm depend on the denaturation enthalpy and the temperature dependence of the reaction rate constant, k. Several procedures have been devised to obtain the parameters that determine the variation of k with temperature. Here, we have elaborated on one of these procedures in order to analyze more complex DSC profiles. Synthetic data for a heat capacity curve were generated according to a model with two sequential reactions; the temperature dependence of each of the two rate constants involved was determined, according to the Eyring equation, by two fixed parameters. It was then shown that our deconvolution procedure, by making use of heat capacity data alone, permits extraction of the parameter values that were initially used. Finally, experimental DSC traces showing two and three maxima were analyzed and reproduced with relative success according to two- and four-step sequential models. Copyright © 2016 Elsevier Inc. All rights reserved.
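    A minimal sketch of the forward model being inverted (not the authors' code; parameter values are arbitrary examples) is a two-step sequential irreversible scheme N → I → D scanned at rate β, with Eyring rate constants and excess heat capacity C_p(T) = [ΔH_1 k_1 x_N + ΔH_2 k_2 x_I]/β:

      import numpy as np

      R_GAS, KB_OVER_H = 8.314, 2.0837e10        # J/(mol K); k_B/h in 1/(s K)

      def eyring(T, dH_act, dS_act):
          # Eyring equation: k = (k_B T / h) exp(dS/R) exp(-dH/(R T))
          return KB_OVER_H * T * np.exp(dS_act / R_GAS - dH_act / (R_GAS * T))

      def dsc_trace(T, beta, act1, act2, dH1, dH2):
          xN, xI, dT = 1.0, 0.0, T[1] - T[0]
          cp = np.zeros_like(T)
          for i, Ti in enumerate(T):             # coarse Euler integration
              r1 = eyring(Ti, *act1) * xN        # rate of N -> I, 1/s
              r2 = eyring(Ti, *act2) * xI        # rate of I -> D, 1/s
              cp[i] = (dH1 * r1 + dH2 * r2) / beta
              xN -= r1 * dT / beta
              xI += (r1 - r2) * dT / beta
          return cp                              # excess heat capacity, J/(mol K)

      T = np.linspace(310, 400, 3000)            # K
      trace = dsc_trace(T, beta=1 / 60,          # 1 K/min scan rate, in K/s
                        act1=(3.0e5, 540.0), act2=(3.2e5, 560.0),
                        dH1=3.0e5, dH2=2.0e5)    # activation and calorimetric params

    With well-separated rate constants this produces the kind of double-peaked trace the deconvolution procedure is designed to fit.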

  9. Toward a Psychology of Social Change: A Typology of Social Change

    PubMed Central

    de la Sablonnière, Roxane

    2017-01-01

    Millions of people worldwide are affected by dramatic social change (DSC). While sociological theory aims to understand its precipitants, the psychological consequences remain poorly understood. A large-scale literature review pointed to the desperate need for a typology of social change that might guide theory and research toward a better understanding of the psychology of social change. Over 5,000 abstracts from peer-reviewed articles were assessed from sociological and psychological publications. Based on stringent inclusion criteria, a final 325 articles were used to construct a novel, multi-level typology designed to conceptualize and categorize social change in terms of its psychological threat to psychological well-being. The typology of social change includes four social contexts: Stability, Inertia, Incremental Social Change and, finally, DSC. Four characteristics of DSC were further identified: the pace of social change, rupture to the social structure, rupture to the normative structure, and the level of threat to one's cultural identity. A theoretical model that links the characteristics of social change together and with the social contexts is also suggested. The typology of social change as well as our theoretical proposition may serve as a foundation for future investigations and increase our understanding of the psychologically adaptive mechanisms used in the wake of DSC. PMID:28400739

  10. Heat resistance of viable but non-culturable Escherichia coli cells determined by differential scanning calorimetry.

    PubMed

    Castro-Rosas, Javier; Gómez-Aldapa, Carlos Alberto; Villagómez Ibarra, José Roberto; Santos-López, Eva María; Rangel-Vargas, Esmeralda

    2017-10-16

    Several reports have suggested that the viable but non-culturable (VBNC) state is a resistant form of bacterial cells that allows them to remain in a dormant form in the environment. Nevertheless, studies on the resistance of VBNC bacterial cells to ecological factors are limited, mainly because techniques that allow this type of evaluation are lacking. Differential scanning calorimetry (DSC) has been used to study the thermal resistance of culturable bacteria but has never been used to study VBNC cells. In this work, the heat resistance of Escherichia coli cells in the VBNC state was studied using the DSC technique. The VBNC state was induced in E. coli ATCC 25922 by suspending bacterial cells in artificial sea water, followed by storage at 3 ± 2°C for 110 days. Periodically, the behaviour of E. coli cells was monitored by plate counts, direct viable counts and DSC. The entire bacterial population entered the VBNC state after 110 days of storage. The results obtained with DSC suggest that the VBNC state does not confer thermal resistance to E. coli cells in the temperature range analysed here. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Toward a Psychology of Social Change: A Typology of Social Change.

    PubMed

    de la Sablonnière, Roxane

    2017-01-01

    Millions of people worldwide are affected by dramatic social change (DSC). While sociological theory aims to understand its precipitants, the psychological consequences remain poorly understood. A large-scale literature review pointed to the desperate need for a typology of social change that might guide theory and research toward a better understanding of the psychology of social change. Over 5,000 abstracts from peer-reviewed articles were assessed from sociological and psychological publications. Based on stringent inclusion criteria, a final 325 articles were used to construct a novel, multi-level typology designed to conceptualize and categorize social change in terms of its psychological threat to psychological well-being. The typology of social change includes four social contexts: Stability, Inertia, Incremental Social Change and, finally, DSC. Four characteristics of DSC were further identified: the pace of social change, rupture to the social structure, rupture to the normative structure, and the level of threat to one's cultural identity. A theoretical model that links the characteristics of social change together and with the social contexts is also suggested. The typology of social change as well as our theoretical proposition may serve as a foundation for future investigations and increase our understanding of the psychologically adaptive mechanisms used in the wake of DSC.

  12. On the optimality of code options for a universal noiseless coder

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu; Rice, Robert F.; Miller, Warner

    1991-01-01

    A universal noiseless coding structure was developed that provides efficient performance over an extremely broad range of source entropy. This is accomplished by adaptively selecting the best of several easily implemented variable length coding algorithms. Custom VLSI coder and decoder modules capable of processing over 20 million samples per second are currently under development. The first of the code options used in this module development is shown to be equivalent to a class of Huffman code under the Humblet condition; other options are shown to be equivalent to the Huffman codes of a modified Laplacian symbol set at specified symbol entropy values. Simulation results are obtained on actual aerial imagery, and they confirm the optimality of the scheme. On sources having Gaussian or Poisson distributions, coder performance is also projected through analysis and simulation.

  13. Toward Intelligent Software Defect Detection

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.

  14. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
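    The comment-translation step can be mimicked in a few lines. The paper uses a Perl script; the following hypothetical Python analogue conveys the idea, rewriting MATLAB '%' comment lines into C-style '///' lines that Doxygen can parse (real M-file syntax, e.g. '%' inside strings or block comments, needs more care than this sketch takes).

      import re

      def mfile_comments_to_doxygen(source: str) -> str:
          out = []
          for line in source.splitlines():
              # Turn leading %-comments into Doxygen-style /// comments.
              m = re.match(r"^(\s*)%+\s?(.*)$", line)
              out.append(f"{m.group(1)}/// {m.group(2)}" if m else line)
          return "\n".join(out)

      print(mfile_comments_to_doxygen("% Residual wavefront\nphi = r - rhat;"))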

  15. Supporting Source Code Comprehension during Software Evolution and Maintenance

    ERIC Educational Resources Information Center

    Alhindawi, Nouh

    2013-01-01

    This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…

  16. Automating RPM Creation from a Source Code Repository

    DTIC Science & Technology

    2012-02-01

    apps/usr --with-libpq=/apps/postgres make rm -rf $RPM_BUILD_ROOT umask 0077 mkdir -p $RPM_BUILD_ROOT/usr/local/bin mkdir -p $RPM_BUILD_ROOT...from a source code repository. %pre %prep %setup %build ./autogen.sh ; ./configure --with-db=/apps/db --with-libpq=/apps/postgres make

  17. Testing the Multispecimen Absolute Paleointensity Method with Archaeological Baked Clays and Bricks: New Data for Central Europe

    NASA Astrophysics Data System (ADS)

    Schnepp, Elisabeth; Leonhardt, Roman

    2014-05-01

    The domain-state corrected multiple-specimen paleointensity determination technique (MSP-DSC, Fabian & Leonhardt, EPSL 297, 84, 2010) has been tested on archaeological baked clays and bricks. The following procedure was applied: (1) Exclusion of secondary overprints using alternating field (AF) or thermal demagnetization and assignment of the characteristic remanent magnetization (ChRM) direction. (2) Determination of magnetomineralogical alteration using anhysteretic remanent magnetization (ARM) or the temperature dependence of susceptibility. (3) Measurement of the ARM anisotropy tensor and calculation of the ancient magnetic field direction. (4) Sister specimens were subjected to the MSP-DSC technique aligned (anti-)parallel to the ancient magnetic field direction. (5) Several checks were applied in order to exclude data points from further evaluation: (a) the accuracy of orientation (< 10°), (b) absence of secondary components (< 10°), (c) use of a considerable NRM fraction (20 to 80%), (d) weak alteration (smaller than for domain state change), and finally (e) the domain state correction was applied. Bricks and baked clays from archaeological sites with ages between 645 BC and 2003 AD have been subjected to MSP-DSC absolute paleointensity (PI) determination. The aims of the study are to check the precision and reliability of the method. The obtained PI values are compared with direct field observations, the IGRF, the GUFM1 model, or Thellier results. The Thellier experiments often show curved lines, and pTRM checks fail at higher temperatures. Nevertheless, in the low temperature range straight lines have been obtained, but they provide scattered paleointensity values. Mean paleointensities have relative errors often exceeding 10%, which are not considered high quality PI estimates. MSP-DSC experiments for the structures older than 300 years are still in progress. The paleointensities obtained from the MSP-DSC experiments for the young materials (after 1700 AD) have small relative errors of a few or even less than one per cent, although the data points are scattered in some cases. For these sites, comparison with the historical field values shows very good agreement. Small deviations could be explained by the higher cooling rates used in the laboratory. These young structures were made of bricks, and the unweathered baked clay of the 2003 experimental kiln was brick-like as well. The sites provided ample material, so tests were done to investigate the MSP-DSC methodology further. For example, it was tested whether different NRM deblocking fractions influence the paleointensity estimate. It seems that use of fractions lower than 20% of the NRM can lead to an underestimation of PI. Although MSP-DSC experiments carried out on different blocks of the same structure can provide very similar results, the use of several fragments from at least five different units (potshards, bricks, in situ burnt blocks or rocks) of the same structure is recommended in order to obtain a reliable estimate of the experimental errors. Five data points may already define a well constrained straight line, but for better precision (< 2%) 15 data points may be required. For the young structures, the MSP-DSC method provided reliable PI estimates, which have been included in the archaeointensity database.
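    For readers unfamiliar with the approach, the estimate ultimately rests on a zero-crossing regression: each specimen is partially remagnetized in a laboratory field applied parallel to its NRM, and the overprint ratio Q crosses zero where the laboratory field equals the ancient field. The sketch below shows only this basic multispecimen idea with invented numbers; the DSC protocol adds further heating steps and a domain-state correction not reproduced here.

      import numpy as np

      H_lab = np.array([10., 20., 30., 40., 50., 60.])       # applied field, uT
      Q = np.array([-0.55, -0.32, -0.12, 0.09, 0.28, 0.51])  # e.g. (m1 - m0)/m0

      slope, intercept = np.polyfit(H_lab, Q, 1)
      print(f"paleointensity ~= {-intercept / slope:.1f} uT")  # field where Q = 0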

  18. An adaptable binary entropy coder

    NASA Technical Reports Server (NTRS)

    Kiely, A.; Klimesh, M.

    2001-01-01

    We present a novel entropy coding technique which is based on recursive interleaving of variable-to-variable length binary source codes. We discuss code design and performance estimation methods, as well as practical encoding and decoding algorithms.

  19. Plasma separation process. Betacell (BCELL) code, user's manual

    NASA Astrophysics Data System (ADS)

    Taherzadeh, M.

    1987-11-01

    The emergence of clearly defined applications for (small or large) amounts of long-life and reliable power has given the design and production of betavoltaic systems a new life. Moreover, because of the availability of the Plasma Separation Program (PSP) at TRW, it is now possible to separate the most desirable radioisotopes for betacell power generating devices. A computer code, named BCELL, has been developed to model the betavoltaic concept by utilizing the available up-to-date source/cell parameters. In this program, attempts have been made to determine the maximum efficiency of the betacell energy device, the degradation due to the emitting source radiation, and the source/cell lifetime power reduction processes. Additionally, comparison is made between the Schottky and PN junction devices for betacell battery design purposes. Certain computer code runs have been made to determine the JV distribution function and the upper limit of the betacell generated power for specified energy sources. A Ni beta-emitting radioisotope was used for the energy source, and certain semiconductors were used for the converter subsystem of the betacell system. Some results for a promethium source are also given for comparison.

  20. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  1. Using a geographic information system and hillslope runoff modeling to support decision-making for managed aquifer recharge using distributed stormwater collection

    NASA Astrophysics Data System (ADS)

    Teo, E. K.; Beganskas, S.; Young, K. S.; Weir, W. B.; Harmon, R. E.; Lozano, S.; Fisher, A. T.

    2017-12-01

    Many aquifer systems in central coastal California face a triple threat of excess demand, changing land use, and a shifting climate. These last two factors can contribute to reductions in groundwater recharge. Managed aquifer recharge using distributed stormwater collection (DSC-MAR) is an adaptation technique for collecting excess stormwater runoff from hillslopes for infiltration into underlying aquifers, before that water reaches a "blue line" stream. We are developing a decision support system (DSS) that combines surface and subsurface hydrogeological data with high-resolution predictions of hillslope runoff, with specific application to Santa Cruz and northern Monterey Counties. Other studies presented at AGU will focus on the northern and southern parts of our study region (San Lorenzo River Basin, Lower Pajaro River Basin). This presentation focuses on mid-Santa Cruz County, including the Soquel-Aptos Groundwater Basin. The DSS uses a geographic information system to compile and merge data from numerous local, state, and federal sources to identify locations on the landscape where DSC-MAR may be most suitable. This requires classification of disparate data types so that they can be combined. Stormwater runoff for individual river basins in the study region was simulated using historical streamflow data for calibration and validation. Both analyses were completed with relatively fine resolution, from 10 m2 pixels for elevation to 0.1-1.0 km hydrologic response units for properties such as soil and vegetation. Future climate is uncertain, so we used historical data to create a catalog of dry, normal, and wet hydrologic conditions, then created synthetic future climate scenarios for simulation. The DSS shows that there are numerous regions in mid-Santa Cruz County with a confluence of MAR suitability and generation of stormwater runoff that could supply recharge projects (with a nominal target of 100 ac-ft/yr of infiltration), even under dry climate scenarios, and it allows us to assess the potential benefits to be derived from implementation of DSC-MAR projects in strategic locations. The tools and methods developed with this DSS should be broadly applicable to other basins.

  2. PREFACE: INERA Workshop: Transition Metal Oxide Thin Films-functional Layers in "Smart windows" and Water Splitting Devices. Parallel session of the 18th International School on Condensed Matter Physics

    NASA Astrophysics Data System (ADS)

    2014-11-01

    The Special issue presents the papers from the INERA Workshop entitled "Transition Metal Oxides as Functional Layers in Smart Windows and Water Splitting Devices", which was held in Varna, St. Konstantin and Elena, Bulgaria, from the 4th-6th September 2014. The Workshop was organized within the context of the INERA "Research and Innovation Capacity Strengthening of ISSP-BAS in Multifunctional Nanostructures" FP7 Project REGPOT 316309 program, a European project of the Institute of Solid State Physics at the Bulgarian Academy of Sciences. There were 42 participants at the workshop, 16 from Sweden, Germany, Romania and Hungary, 11 invited lecturers, and 28 young participants. Researchers were present from prestigious European laboratories which are leaders in the field of transition metal oxide thin film technologies. The event contributed to training young researchers in innovative thin film technologies, as well as in thin film characterization techniques. The topics of the Workshop cover the field of technology and investigation of thin oxide films as functional layers in "Smart windows" and "Water splitting" devices. The topics are related to the application of novel technologies for the preparation of transition metal oxide films and the modification of chromogenic properties towards the improvement of electrochromic and thermochromic device parameters for possible industrial deployment. The Workshop addressed the following topics: Metal oxide films-functional layers in energy efficient devices; Photocatalysts and chemical sensing; Novel thin film technologies and applications; Methods of thin film characterization. Of the 37 abstracts submitted, 21 manuscripts were written and later refereed. We appreciate the comments from all the referees, and we are grateful for their valuable contributions. Guest Editors: Assoc. Prof. Dr. Tatyana Ivanova, Prof. DSc Kostadinka Gesheva, Prof. DSc Hassan Chamatti, Assoc. Prof. Dr. Georgi Popkirov. Workshop Organizing Committee: Prof. DSc Kostadinka Gesheva, Central Laboratory of Solar Energy and New Energy Sources, Bulgarian Academy of Sciences (CL SENES-BAS) - Chairperson; Assoc. Prof. Dr. Anna Szekeres - Institute of Solid State Physics - BAS; Assoc. Prof. Dr. Tatyana Ivanova - CL SENES-BAS; Assist. Prof. Radostina Kamburova - ISSP-BAS.

  3. Characterization of silicon carbide and diamond detectors for neutron applications

    NASA Astrophysics Data System (ADS)

    Hodgson, M.; Lohstroh, A.; Sellin, P.; Thomas, D.

    2017-10-01

    The presence of carbon atoms in silicon carbide and diamond makes these materials ideal candidates for direct fast neutron detectors. Furthermore, the low atomic number, strong covalent bonds, high displacement energies, wide bandgap and low intrinsic carrier concentrations make these semiconductor detectors potentially suitable for applications where rugged, high-temperature, low-gamma-sensitivity detectors are required, such as active interrogation, electronic personal neutron dosimetry and harsh environment detectors. A thorough direct performance comparison of the detection capabilities of semi-insulating silicon carbide (SiC-SI), single crystal diamond (D-SC), polycrystalline diamond (D-PC) and a self-biased epitaxial silicon carbide (SiC-EP) detector has been conducted and benchmarked against a commercial silicon PIN (Si-PIN) diode, in a wide range of alpha (Am-241), beta (Sr/Y-90), ionizing photon (65 keV to 1332 keV) and neutron radiation fields (including 1.2 MeV to 16.5 MeV mono-energetic neutrons, as well as neutrons from AmBe and Cf-252 sources). All detectors were shown to be able to directly detect and distinguish both the different radiation types and energies by using a simple energy threshold discrimination method. The SiC devices demonstrated the best neutron energy discrimination ratio (Emax(n = 5 MeV)/Emax(n = 1 MeV) ≈ 5), whereas a superior neutron/photon cross-sensitivity ratio was observed in the D-PC detector (Emax(AmBe)/Emax(Co-60) ≈ 16). Further work also demonstrated that the cross-sensitivity ratios can be improved through use of a simple proton-recoil conversion layer. Stability issues were also observed in the D-SC, D-PC and SiC-SI detectors while under irradiation, namely a change of energy peak position and/or count rate with time (often referred to as the polarization effect). This phenomenon within the detectors was non-debilitating over the time period tested (> 5 h) and, as such, stable operation was possible. Furthermore, the D-SC, self-biased SiC-EP and semi-insulating SiC detectors were shown to operate over the temperature range -60 °C to +100 °C.

  4. Use of thermal analysis techniques (TG-DSC) for the characterization of diverse organic municipal waste streams to predict biological stability prior to land application.

    PubMed

    Fernández, José M; Plaza, César; Polo, Alfredo; Plante, Alain F

    2012-01-01

    The use of organic municipal wastes as soil amendments is an increasing practice that can divert significant amounts of waste from landfill, and provides a potential source of nutrients and organic matter to ameliorate degraded soils. Due to the high heterogeneity of organic municipal waste streams, it is difficult to rapidly and cost-effectively establish their suitability as soil amendments using a single method. Thermal analysis has been proposed as an evolving technique to assess the stability and composition of the organic matter present in these wastes. In this study, three different organic municipal waste streams (i.e., a municipal waste compost (MC), a composted sewage sludge (CS) and a thermally dried sewage sludge (TS)) were characterized using conventional and thermal methods. The conventional methods used to test organic matter stability included laboratory incubation with measurement of respired C, and spectroscopic methods to characterize chemical composition. Carbon mineralization was measured during a 90-day incubation, and samples before and after incubation were analyzed by chemical (elemental analysis) and spectroscopic (infrared and nuclear magnetic resonance) methods. Results were compared with those obtained by thermogravimetry (TG) and differential scanning calorimetry (DSC) techniques. Total amounts of CO2 respired indicated that the organic matter in the TS was the least stable, while that in the CS was the most stable. This was confirmed by changes detected with the spectroscopic methods in the composition of the organic wastes due to C mineralization. Differences were especially pronounced for TS, which showed a remarkable loss of aliphatic and proteinaceous compounds during the incubation process. TG, and especially DSC analysis, clearly reflected these differences between the three organic wastes before and after the incubation. Furthermore, the calculated energy density, which represents the energy available per unit of organic matter, showed a strong correlation with cumulative respiration. Results obtained support the hypothesis of a potential link between the thermal and biological stability of the studied organic materials, and consequently the ability of thermal analysis to characterize the maturity of municipal organic wastes and composts. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Coding For Compression Of Low-Entropy Data

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu

    1994-01-01

    Improved method of encoding digital data provides for efficient lossless compression of partially or even mostly redundant data from low-information-content source. Method of coding implemented in relatively simple, high-speed arithmetic and logic circuits. Also increases coding efficiency beyond that of established Huffman coding method in that average number of bits per code symbol can be less than 1, which is the lower bound for Huffman code.
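
    The record's claim that the average can fall below one bit per symbol (the floor for any symbol-by-symbol Huffman code) is easy to demonstrate with run-length ideas. The sketch below is a hedged, generic illustration, not the NASA method: it encodes runs of zeros from a sparse binary source with Elias-gamma codewords and reports the average bits per input symbol.

        # Hedged illustration: run-length coding of a sparse binary source,
        # showing an average well below 1 bit per input symbol. Generic
        # demonstration only, not the coding method of the record above.
        import random

        def elias_gamma(n):
            """Elias gamma codeword for an integer n >= 1, as a bit string."""
            b = bin(n)[2:]
            return "0" * (len(b) - 1) + b

        def run_length_encode(bits):
            """Send each run of k zeros plus its terminating one as gamma(k+1)."""
            out, run = [], 0
            for bit in bits:
                if bit == 0:
                    run += 1
                else:
                    out.append(elias_gamma(run + 1))
                    run = 0
            if run:  # flush a trailing all-zero run (a real coder would flag this)
                out.append(elias_gamma(run + 1))
            return "".join(out)

        random.seed(0)
        source = [1 if random.random() < 0.02 else 0 for _ in range(100_000)]
        code = run_length_encode(source)
        print(f"{len(code) / len(source):.3f} bits/symbol")  # well below 1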

  6. Study of an External Neutron Source for an Accelerator-Driven System using the PHITS Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sugawara, Takanori; Iwasaki, Tomohiko; Chiba, Takashi

    A code system for the Accelerator Driven System (ADS) has been under development for analyzing dynamic behaviors of a subcritical core coupled with an accelerator. This code system, named DSE (Dynamics calculation code system for a Subcritical system with an External neutron source), consists of an accelerator part and a reactor part. The accelerator part employs a database, calculated by using PHITS, for investigating effects related to the accelerator such as changes of beam energy, beam diameter, void generation, and target level. This analysis method using the database may introduce some errors into dynamics calculations since the neutron source data derived from the database has some errors in fitting or interpolating procedures. In this study, the effects of various events are investigated to confirm that the method based on the database is appropriate.

  7. Correlation estimation and performance optimization for distributed image compression

    NASA Astrophysics Data System (ADS)

    He, Zhihai; Cao, Lei; Cheng, Hui

    2006-01-01

    Correlation estimation plays a critical role in resource allocation and rate control for distributed data compression. A Wyner-Ziv encoder for distributed image compression is often considered as a lossy source encoder followed by a lossless Slepian-Wolf encoder. The source encoder consists of spatial transform, quantization, and bit plane extraction. In this work, we find that Gray code, which has been extensively used in digital modulation, is able to significantly improve the correlation between the source data and its side information. Theoretically, we analyze the behavior of Gray code within the context of distributed image compression. Using this theoretical model, we are able to efficiently allocate the bit budget and determine the code rate of the Slepian-Wolf encoder. Our experimental results demonstrate that the Gray code, coupled with accurate correlation estimation and rate control, significantly improves the picture quality, by up to 4 dB, over the existing methods for distributed image compression.
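
    The bit-plane benefit of Gray labeling is easy to verify numerically. The sketch below is a generic illustration, not the authors' codec: it counts how many bit planes flip when a quantized sample drifts by one level under natural binary versus Gray mapping, which is why Gray-coded bit planes agree more often with correlated side information.

        # Hedged illustration: Gray labeling changes fewer bit planes than
        # natural binary when correlated samples differ by small amounts.

        def gray(n):
            return n ^ (n >> 1)

        def changed_planes(a, b, nbits=8):
            return bin((a ^ b) & ((1 << nbits) - 1)).count("1")

        binary_cost = sum(changed_planes(x, x + 1) for x in range(255))
        gray_cost = sum(changed_planes(gray(x), gray(x + 1)) for x in range(255))
        print(f"natural binary: {binary_cost} plane flips over 255 steps")
        print(f"Gray code:      {gray_cost} plane flips over 255 steps")  # exactly 255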

  8. Thermal Stability of Otto Fuel Prepolymer

    NASA Technical Reports Server (NTRS)

    Tompa, Albert S.; Sandagger, Karrie H.; Bryant, William F., Jr.; McConnell, William T.; Lacot, Fernando; Carr, Walter A.

    2000-01-01

    Otto Fuel II contains a nitrate ester, plasticizer, and 2-NDPA as a stabilizer. Otto Fuel with stabilizers from three vendors was investigated by dynamic and isothermal DSC, using samples sealed in a glass ampoule, and by Isothermal Microcalorimetry (IMC), using 10 gram samples aged at 75 C for 35 days. DSC kinetics did not show differences between the stabilizers; the samples had an activation energy of 36.7 +/- 0.6 kcal/mol. However, IMC analysis was sensitive enough to detect small differences between the stabilizers, namely energy of interaction values of 7 to 14 joules. DSC controlled cooling and heating experiments at 5 C/min (from 30 to -60 to 40 C) were similar and showed a crystallization peak at -48 +/- 1 C during cooling; upon heating there was a glass transition temperature step at approx. -54 +/- 0.5 C and a melting peak at -28 +/- 0.4 C.

  9. Thermal Stability of Otto Fuel Prepolymer

    NASA Technical Reports Server (NTRS)

    Tompa, Albert S.; Sandagger, Karrie H.; Bryant, William F., Jr.; McConnell, William T.; Lacot, Fernando; Carr, Walter A.

    2000-01-01

    Otto Fuel II contains a nitrate ester, plasticizer, and 2-NDPA as a stabilizer. Otto Fuel with stabilizers from three vendors was investigated by dynamic and isothermal differential scanning calorimetry (DSC), using samples sealed in a glass ampoule, and by Isothermal Microcalorimetry (IMC), using 10 gram samples aged at 75 C for 35 days. DSC kinetics did not show differences between the stabilizers; the samples had an activation energy of 36.7 +/- 0.6 kcal/mol. However, IMC analysis was sensitive enough to detect small differences between the stabilizers, namely energy of interaction values of 7 to 14 joules. DSC controlled cooling and heating experiments at 5 C/min (from 30 to -60 to 40 C) were similar and showed a crystallization peak at -48 +/- 1 C during cooling; upon heating there was a glass transition temperature step at approx. -54 +/- 0.5 C and a melting peak at -28 +/- 0.4 C.

  10. Differential Scanning Calorimetry and Evolved Gas Analysis at Mars Ambient Conditions Using the Thermal Evolved Gas Analyser (TEGA)

    NASA Technical Reports Server (NTRS)

    Musselwhite, D. S.; Boynton, W. V.; Ming, D. W.; Quadlander, G.; Kerry, K. E.; Bode, R. C.; Bailey, S. H.; Ward, M. G.; Pathare, A. V.; Lorenz, R. D.

    2000-01-01

    Differential Scanning Calorimetry (DSC) combined with evolved gas analysis (EGA) is a well developed technique for the analysis of a wide variety of sample types, with broad application in material and soil sciences. However, the use of the technique for samples under the pressure and temperature conditions found on other planets is an area of ongoing development and cutting-edge research. The Thermal Evolved Gas Analyzer (TEGA), which was designed, built and tested at the University of Arizona's Lunar and Planetary Lab (LPL), utilizes DSC/EGA. TEGA, which was sent to Mars on the ill-fated Mars Polar Lander, was to be the first application of DSC/EGA on the surface of Mars as well as the first direct measurement of the volatile-bearing mineralogy in Martian soil. Additional information is available in the original extended abstract.

  11. Thermal Stability and Kinetic Study of Fluvoxamine Stability in Binary Samples with Lactose.

    PubMed

    Ghaderi, Faranak; Nemati, Mahboob; Siahi-Shadbad, Mohammad Reza; Valizadeh, Hadi; Monajjemzadeh, Farnaz

    2017-04-01

    Purpose: In the present study the incompatibility of FLM (fluvoxamine) with lactose in solid-state mixtures was investigated. The compatibility was evaluated using different physicochemical methods such as differential scanning calorimetry (DSC), Fourier-transform infrared (FTIR) spectroscopy and mass spectrometry. Methods: Non-isothermally stressed physical mixtures were used to calculate the solid-state kinetic parameters. Different thermal models, such as Friedman, Flynn-Wall-Ozawa (FWO) and Kissinger-Akahira-Sunose (KAS), were used for the characterization of the drug-excipient interaction. Results: Overall, the incompatibility of FLM with lactose as a reducing carbohydrate was successfully evaluated and the activation energy of this interaction was calculated. Conclusion: In this research the Maillard interaction between lactose and FLM was demonstrated using physicochemical techniques including DSC and FTIR. It was shown that DSC-based kinetic analysis provides a fast and versatile kinetic comparison of Arrhenius activation energies for different pharmaceutical samples.

  12. Effect of critical molecular weight of PEO in epoxy/PEO blends as characterized by advanced DSC and solid-state NMR

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoliang; Lu, Shoudong; Sun, Pingchuan; Xue, Gi

    2013-03-01

    Differential scanning calorimetry (DSC) and solid-state NMR have been used to systematically study the length scale of miscibility and the local dynamics of epoxy resin/poly(ethylene oxide) (ER/PEO) blends with different PEO molecular weights. By DSC, we found that the diffusion behavior of PEO with different Mw is an important factor in controlling these behaviors upon curing. We further employed a two-dimensional 13C-{1H} PISEMA NMR experiment to elucidate the possible weak interactions and detailed local dynamics in ER/PEO blends. The CH2O group of PEO forms a hydrogen bond with the hydroxyl proton of the cured-ER ether group, and its local dynamics are frozen by this interaction. Our findings indicate that the molecular weight (Mw) of PEO is a crucial factor in controlling the miscibility, chain dynamics and hydrogen-bonding interactions in these blends.

  13. Thermal Stability and Kinetic Study of Fluvoxamine Stability in Binary Samples with Lactose

    PubMed Central

    Ghaderi, Faranak; Nemati, Mahboob; Siahi-Shadbad, Mohammad Reza; Valizadeh, Hadi; Monajjemzadeh, Farnaz

    2017-01-01

    Purpose: In the present study the incompatibility of FLM (fluvoxamine) with lactose in solid-state mixtures was investigated. The compatibility was evaluated using different physicochemical methods such as differential scanning calorimetry (DSC), Fourier-transform infrared (FTIR) spectroscopy and mass spectrometry. Methods: Non-isothermally stressed physical mixtures were used to calculate the solid-state kinetic parameters. Different thermal models, such as Friedman, Flynn-Wall-Ozawa (FWO) and Kissinger-Akahira-Sunose (KAS), were used for the characterization of the drug-excipient interaction. Results: Overall, the incompatibility of FLM with lactose as a reducing carbohydrate was successfully evaluated and the activation energy of this interaction was calculated. Conclusion: In this research the Maillard interaction between lactose and FLM was demonstrated using physicochemical techniques including DSC and FTIR. It was shown that DSC-based kinetic analysis provides a fast and versatile kinetic comparison of Arrhenius activation energies for different pharmaceutical samples. PMID:28507936

  14. One-step simultaneous differential scanning calorimetry-FTIR microspectroscopy to quickly detect continuous pathways in the solid-state glucose/asparagine Maillard reaction.

    PubMed

    Hwang, Deng-Fwu; Hsieh, Tzu-Feng; Lin, Shan-Yang

    2013-01-01

    The stepwise reaction pathway of the solid-state Maillard reaction between glucose (Glc) and asparagine (Asn) was investigated using simultaneous differential scanning calorimetry (DSC)-FTIR microspectroscopy. The color change and FTIR spectra of Glc-Asn physical mixtures (molar ratio = 1:1), preheated to different temperatures and then cooled, were also examined. The successive reaction products, such as the Schiff base intermediate, the Amadori product, and the decarboxylated Amadori product, in the solid-state Glc-Asn Maillard reaction were simultaneously evidenced for the first time by this unique DSC-FTIR microspectroscopy. The color changed from white to yellow-brown to dark brown, and the appearance of new IR peaks confirmed the formation of Maillard reaction products. The present study clearly indicates that this unique DSC-FTIR technique not only accelerates the Maillard reaction but also detects its precursors and products in real time.

  15. The characterization of copper alloys for the application of fusion reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ishiyama, S.; Fukaya, K.; Eto, M.

    Three kinds of candidate copper alloys for divertor structural materials of fusion experimental reactors, that is, Oxygen Free High thermal conductivity Copper (OFHC), alumina dispersion-strengthened copper (DSC) and a W/Cu composite (W/Cu), were prepared for strength and fatigue tests at temperatures ranging from room temperature to 500 C in a vacuum. The high-temperature strength of DSC and W/Cu, which fracture rapidly after peak loading at these temperatures, is higher than that of OFHC by a factor of 2, but the fracture strains of DSC and W/Cu are smaller than that of OFHC. The fatigue life of DSC, which shows the same fatigue behavior as OFHC at room temperature, is longer than that of the other materials at 400 C. The remarkable fatigue life reduction of OFHC found in this experiment is attributed to recrystallization of OFHC occurring above 400 C.

  16. Study of the Thermal Polymerization of Linseed and Passion Fruit Oils

    NASA Astrophysics Data System (ADS)

    Lopes, R. V. V.; Loureiro, N. P. D.; Fonseca, P. S.; Macedo, J. L.; Santos, M. L.; Sales, M. J.

    2008-08-01

    Research involving eco-friendly materials is growing, as is current interest in developing materials from inexpensive and renewable resources. Vegetable oils show a number of excellent properties which could be utilized to produce valuable polymeric materials. This work describes the synthesis of polymeric materials from linseed oil (Linum usitatissimum L.) and passion fruit oil (Passiflora edulis) and their characterization by thermogravimetry (TG), differential scanning calorimetry (DSC) and Raman spectroscopy. The TG curves show that these polymeric materials present two stages of decomposition. DSC plots of the vegetable oils showed some endothermic and exothermic transitions which are not present in the DSC curves of the corresponding oil-based polymers. The Raman spectra of the polymers indicate a decline in absorbance in the C=C stretching region (~1600 cm-1); this absorption was used to estimate the degree of polymerization (79% and 67.5% for linseed and passion fruit oils, respectively).

  17. Influence of supercritical CO(2) pressurization on the phase behavior of mixed cholesteryl esters.

    PubMed

    Huang, Zhen; Feng, Mei; Su, Junfeng; Guo, Yuhua; Liu, Tie-Yan; Chiew, Yee C

    2010-09-15

    Evidence indicating the presence of phase transformations in mixed cholesteryl benzoate (CBE) and cholesteryl butyrate (CBU) under supercritical CO2 pressurization, obtained by means of differential scanning calorimetry (DSC) and X-ray diffraction (XRD), is presented in this work. This evidence includes (1) the DSC heating curve of pure CBU; (2) the DSC heating curves of CBU/CBE mixtures; (3) the XRD spectra of pure CBU; (4) the XRD spectra of CBU/CBE mixtures; and (5) the observation that CBU and CBE are miscible in both the solid and liquid phases over the whole composition range. As a result of these pressurization-induced phase transformations, it can be deduced that a solid solution of the CBU/CBE mixture may have formed at the interfaces under supercritical conditions, subsequently influencing their dissolving behavior in supercritical CO2. Copyright 2010 Elsevier B.V. All rights reserved.

  18. Correlation of Tumor Immunohistochemistry with Dynamic Contrast-Enhanced and DSC-MRI Parameters in Patients with Gliomas.

    PubMed

    Nguyen, T B; Cron, G O; Bezzina, K; Perdrizet, K; Torres, C H; Chakraborty, S; Woulfe, J; Jansen, G H; Thornhill, R E; Zanette, B; Cameron, I G

    2016-12-01

    Tumor CBV is a prognostic and predictive marker for patients with gliomas. Tumor CBV can be measured noninvasively with different MR imaging techniques; however, it is not clear which of these techniques most closely reflects histologically measured tumor CBV. Our aim was to investigate the correlations between dynamic contrast-enhanced and DSC-MR imaging parameters and immunohistochemistry in patients with gliomas. Forty-three patients with a new diagnosis of glioma underwent a preoperative MR imaging examination with dynamic contrast-enhanced and DSC sequences. Unnormalized and normalized cerebral blood volume was obtained from DSC-MR imaging. Two sets of plasma volume and volume transfer constant maps were obtained from dynamic contrast-enhanced MR imaging. Plasma volume obtained from the phase-derived vascular input function and bookend T1 mapping (Vp_Φ) and the volume transfer constant obtained from the phase-derived vascular input function and bookend T1 mapping (Ktrans_Φ) were determined. Plasma volume obtained from the magnitude-derived vascular input function (Vp_SI) and the volume transfer constant obtained from the magnitude-derived vascular input function (Ktrans_SI) were acquired, without T1 mapping. Using CD34 staining, we measured microvessel density and microvessel area within 3 representative areas of the resected tumor specimen. The Mann-Whitney U test was used to test for differences according to grade and degree of enhancement. The Spearman correlation was performed to determine the relationship between dynamic contrast-enhanced and DSC parameters and histopathologic measurements. Microvessel area, microvessel density, dynamic contrast-enhanced, and DSC-MR imaging parameters varied according to the grade and degree of enhancement (P < .05). A strong correlation was found between microvessel area and Vp_Φ and between microvessel area and unnormalized blood volume (r_s ≥ 0.61). A moderate correlation was found between microvessel area and normalized blood volume, microvessel area and Vp_SI, microvessel area and Ktrans_Φ, microvessel area and Ktrans_SI, microvessel density and Vp_Φ, microvessel density and unnormalized blood volume, and microvessel density and normalized blood volume (0.44 ≤ r_s ≤ 0.57). A weaker correlation was found between microvessel density and Ktrans_Φ and between microvessel density and Ktrans_SI (r_s ≤ 0.41). With dynamic contrast-enhanced MR imaging, use of a phase-derived vascular input function and bookend T1 mapping improves the correlation between immunohistochemistry and plasma volume, but not between immunohistochemistry and the volume transfer constant. With DSC-MR imaging, normalization of tumor CBV could decrease the correlation with microvessel area. © 2016 by American Journal of Neuroradiology.

  19. Comparison of atlas-based techniques for whole-body bone segmentation.

    PubMed

    Arabi, Hossein; Zaidi, Habib

    2017-02-01

    We evaluate the accuracy of whole-body bone extraction from whole-body MR images using a number of atlas-based segmentation methods. The motivation behind this work is to find the most promising approach for the purpose of MRI-guided derivation of PET attenuation maps in whole-body PET/MRI. To this end, a variety of atlas-based segmentation strategies commonly used in medical image segmentation and pseudo-CT generation were implemented and evaluated in terms of whole-body bone segmentation accuracy. Bone segmentation was performed on 23 whole-body CT/MR image pairs via a leave-one-out cross-validation procedure. The evaluated segmentation techniques include: (i) intensity averaging (IA), (ii) majority voting (MV), (iii) global and (iv) local (voxel-wise) weighting atlas fusion frameworks implemented utilizing normalized mutual information (NMI), normalized cross-correlation (NCC) and mean square distance (MSD) as image similarity measures for calculating the weighting factors, along with other atlas-dependent algorithms, such as (v) shape-based averaging (SBA) and (vi) Hofmann's pseudo-CT generation method. The performance evaluation of the different segmentation techniques was carried out in terms of estimating bone extraction accuracy from whole-body MRI using standard metrics, such as the Dice similarity coefficient (DSC) and relative volume difference (RVD), considering bony structures obtained from intensity thresholding of the reference CT images as the ground truth. Considering the Dice criterion, global weighting atlas fusion methods provided moderate improvement of whole-body bone segmentation (DSC = 0.65 ± 0.05) compared to non-weighted IA (DSC = 0.60 ± 0.02). The local weighted atlas fusion approach using the MSD similarity measure outperformed the other strategies by achieving a DSC of 0.81 ± 0.03, while using the NCC and NMI measures resulted in a DSC of 0.78 ± 0.05 and 0.75 ± 0.04, respectively. Despite very long computation times, the extracted bone obtained from both the SBA (DSC = 0.56 ± 0.05) and Hofmann's (DSC = 0.60 ± 0.02) methods exhibited no improvement compared to non-weighted IA. Finding the optimum parameters for implementation of the atlas fusion approach, such as the weighting factors and image similarity patch size, has a great impact on the performance of atlas-based segmentation approaches. The voxel-wise atlas fusion approach exhibited excellent performance in terms of cancelling out non-systematic registration errors, leading to accurate and reliable segmentation results. Denoising and normalization of MR images, together with optimization of the involved parameters, play a key role in improving bone extraction accuracy. Copyright © 2016 Elsevier B.V. All rights reserved.
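
    For reference, the two reported metrics and the simplest fusion rule named above can be written in a few lines of numpy. This is a hedged, generic sketch on toy masks, not the authors' pipeline.

        # Hedged sketch of the Dice similarity coefficient (DSC), relative
        # volume difference (RVD), and majority-voting atlas fusion.
        import numpy as np

        def dice(seg, ref):
            """Dice similarity coefficient between two boolean masks."""
            inter = np.logical_and(seg, ref).sum()
            return 2.0 * inter / (seg.sum() + ref.sum())

        def relative_volume_difference(seg, ref):
            return (seg.sum() - ref.sum()) / ref.sum()

        def majority_vote(atlas_masks):
            """Fuse N registered atlas masks: a voxel is bone if most atlases agree."""
            stack = np.stack(atlas_masks).astype(np.int8)
            return stack.sum(axis=0) > (stack.shape[0] / 2)

        # toy check on synthetic masks
        rng = np.random.default_rng(0)
        ref = rng.random((32, 32, 32)) > 0.7
        atlases = [ref ^ (rng.random(ref.shape) > 0.95) for _ in range(5)]
        fused = majority_vote(atlases)
        print(f"DSC = {dice(fused, ref):.3f}, RVD = {relative_volume_difference(fused, ref):+.3f}")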

  20. Optimizing LX-17 Thermal Decomposition Model Parameters with Evolutionary Algorithms

    NASA Astrophysics Data System (ADS)

    Moore, Jason; McClelland, Matthew; Tarver, Craig; Hsu, Peter; Springer, H. Keo

    2017-06-01

    We investigate and model the cook-off behavior of LX-17 because this knowledge is critical to understanding system response in abnormal thermal environments. Thermal decomposition of LX-17 has been explored in conventional ODTX (One-Dimensional Time-to-eXplosion), PODTX (ODTX with pressure-measurement), TGA (thermogravimetric analysis), and DSC (differential scanning calorimetry) experiments using varied temperature profiles. These experimental data are the basis for developing multiple reaction schemes with coupled mechanics in LLNL's multi-physics hydrocode, ALE3D (Arbitrary Lagrangian-Eulerian code in 2D and 3D). We employ evolutionary algorithms to optimize reaction rate parameters on high performance computing clusters. Once experimentally validated, this model will be scalable to a number of applications involving LX-17 and can be used to develop more sophisticated experimental methods. Furthermore, the optimization methodology developed herein should be applicable to other high explosive materials. This work was performed under the auspices of the U.S. DOE by LLNL under contract DE-AC52-07NA27344. LLNS, LLC.
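
    The optimization strategy can be sketched generically: a small (mu + lambda) evolutionary loop fitting Arrhenius rate parameters k(T) = A exp(-Ea/RT) to synthetic data. The toy model, data, and hyperparameters below are illustrative assumptions, not the LX-17 reaction scheme or the LLNL tool chain.

        # Hedged sketch of evolutionary optimization of reaction rate
        # parameters; toy Arrhenius model and synthetic data only.
        import math, random

        R = 8.314  # J/(mol K)

        def rate(T, logA, Ea):
            return math.exp(logA) * math.exp(-Ea / (R * T))

        # synthetic "experiment": true parameters we pretend not to know
        true = (math.log(1e13), 150e3)
        temps = [450 + 10 * i for i in range(10)]
        data = [rate(T, *true) for T in temps]

        def loss(ind):
            return sum((math.log(rate(T, *ind)) - math.log(d)) ** 2
                       for T, d in zip(temps, data))

        random.seed(1)
        pop = [(random.uniform(20, 40), random.uniform(1e5, 2e5)) for _ in range(40)]
        for gen in range(200):
            pop.sort(key=loss)
            parents = pop[:10]                      # (mu + lambda) selection
            children = [(p[0] + random.gauss(0, 0.5),
                         p[1] + random.gauss(0, 2e3))
                        for p in parents for _ in range(3)]
            pop = parents + children
        best = min(pop, key=loss)
        print(f"log A = {best[0]:.2f} (true {true[0]:.2f}), "
              f"Ea = {best[1]/1e3:.1f} kJ/mol (true {true[1]/1e3:.1f})")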

  1. FEDEF: A High Level Architecture Federate Development Framework

    DTIC Science & Technology

    2010-09-01

    require code changes for operability between HLA specifications. Configuration of federate requirements such as publications, subscriptions, time management, and management protocol should occur outside of federate source code, allowing for federate reusability without code modification and re

  2. Simulation study on ion extraction from electron cyclotron resonance ion sources

    NASA Astrophysics Data System (ADS)

    Fu, S.; Kitagawa, A.; Yamada, S.

    1994-04-01

    In order to study the beam optics of the NIRS-ECR ion source used in the HIMAC project, the EGUN code has been modified to make it capable of modeling ion extraction from a plasma. Two versions of the modified code were developed using two different methods, in which 1D and 2D sheath theories are used, respectively. The convergence problem of the strongly nonlinear self-consistent equations is investigated. Simulations of the NIRS-ECR ion source and the HYPER-ECR ion source are presented in this paper, exhibiting agreement with the experimental results.

  3. Plagiarism Detection Algorithm for Source Code in Computer Science Education

    ERIC Educational Resources Information Center

    Liu, Xin; Xu, Chan; Ouyang, Boyu

    2015-01-01

    Nowadays, computer programming is becoming more important in program design courses in college education. However, the trick of plagiarizing with minor modifications exists in some students' homework. It is not easy for teachers to judge whether source code has been plagiarized or not. Traditional detection algorithms cannot fit this…

  4. Automated Source-Code-Based Testing of Object-Oriented Software

    NASA Astrophysics Data System (ADS)

    Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten

    2014-08-01

    With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing and specifically automated testing arise. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source-code-based testing for C++.

  5. Particle model of a cylindrical inductively coupled ion source

    NASA Astrophysics Data System (ADS)

    Ippolito, N. D.; Taccogna, F.; Minelli, P.; Cavenago, M.; Veltri, P.

    2017-08-01

    In spite of the wide use of RF sources, a complete understanding of the mechanisms regulating the RF coupling of the plasma is still lacking, so self-consistent simulations of the involved physics are highly desirable. For this reason we are developing a 2.5D fully kinetic Particle-In-Cell Monte-Carlo-Collision (PIC-MCC) model of a cylindrical ICP-RF source, keeping the time step of the simulation small enough to resolve the plasma frequency scale. The grid cell dimension is currently about seven times larger than the average Debye length because of the large computational demand of the code; it will be scaled down in the next phase of the development of the code. The filling gas is xenon, chosen to minimize the time spent in the MCC collision module during this first stage of development of the code. The results presented here are preliminary, with the code already showing good robustness. The final goal will be the modeling of the NIO1 (Negative Ion Optimization phase 1) source, operating in Padua at Consorzio RFX.
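
    For orientation, one cycle of a textbook 1D electrostatic PIC scheme (charge deposition, FFT Poisson solve, leapfrog push) looks like the hedged numpy sketch below; the normalized units and parameters are assumptions for illustration, not the 2.5D PIC-MCC model described above.

        # Hedged sketch of a 1D electrostatic PIC cycle in normalized units
        # (assumed); generic textbook scheme, not the record's 2.5D model.
        import numpy as np

        ng, L, n_part, dt = 64, 1.0, 10_000, 0.05
        dx = L / ng
        rng = np.random.default_rng(0)
        x = rng.uniform(0, L, n_part)           # particle positions
        v = rng.normal(0, 0.1, n_part)          # particle velocities
        q_over_m, weight = -1.0, L / n_part     # normalized charge/mass, weight

        for step in range(100):
            # 1) deposit charge on the grid (nearest grid point, periodic)
            cells = (x / dx).astype(int) % ng
            rho = np.bincount(cells, minlength=ng) * weight / dx - 1.0  # neutral bg
            # 2) solve Poisson d2phi/dx2 = -rho via FFT, then E = -dphi/dx
            k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
            k[0] = 1.0                          # avoid divide-by-zero; mean mode
            phi_k = np.fft.fft(rho) / k**2
            phi_k[0] = 0.0                      # zero-mean potential
            E = -np.real(np.fft.ifft(1j * k * phi_k))
            # 3) gather the field at particles and push (leapfrog)
            v += q_over_m * E[cells] * dt
            x = (x + v * dt) % L

        print("final mean kinetic energy:", 0.5 * (v**2).mean())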

  6. Adaptive variable-length coding for efficient compression of spacecraft television data.

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Plaunt, J. R.

    1971-01-01

    An adaptive variable length coding system is presented. Although developed primarily for the proposed Grand Tour missions, many features of this system clearly indicate a much wider applicability. Using sample to sample prediction, the coding system produces output rates within 0.25 bit/picture element (pixel) of the one-dimensional difference entropy for entropy values ranging from 0 to 8 bit/pixel. This is accomplished without the necessity of storing any code words. Performance improvements of 0.5 bit/pixel can be simply achieved by utilizing previous line correlation. A Basic Compressor, using concatenated codes, adapts to rapid changes in source statistics by automatically selecting one of three codes to use for each block of 21 pixels. The system adapts to less frequent, but more dramatic, changes in source statistics by adjusting the mode in which the Basic Compressor operates on a line-to-line basis. Furthermore, the compression system is independent of the quantization requirements of the pulse-code modulation system.
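
    The per-block code selection idea can be sketched generically: compute the exact length each candidate code would produce for a block and keep the cheapest, spending a couple of bits to signal the choice. The candidate codes below (unary plus two split-sample options) are illustrative stand-ins, not the three codes of the Basic Compressor.

        # Hedged sketch of per-block adaptive code selection over blocks of
        # 21 samples; candidate codes are illustrative, not the record's set.
        def unary_len(n):               # n encoded as n zeros then a one
            return n + 1

        def split_len(n, k):            # split-sample: k LSBs raw + unary MSBs
            return (n >> k) + 1 + k

        def best_option(block):
            options = {
                "unary": sum(unary_len(n) for n in block),
                "split1": sum(split_len(n, 1) for n in block),
                "split3": sum(split_len(n, 3) for n in block),
            }
            name = min(options, key=options.get)
            return name, options[name] + 2      # +2 bits to signal the choice

        smooth = [0, 1, 0, 0, 2, 1, 0] * 3      # 21-sample blocks
        busy = [9, 4, 12, 7, 5, 10, 6] * 3
        for block in (smooth, busy):
            name, bits = best_option(block)
            print(name, bits, "bits for", len(block), "samples")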

  7. A comparison of skyshine computational methods.

    PubMed

    Hertel, Nolan E; Sweezy, Jeremy E; Shultis, J Kenneth; Warkentin, J Karl; Rose, Zachary J

    2005-01-01

    A variety of methods employing radiation transport and point-kernel codes have been used to model two skyshine problems. The first problem is a 1 MeV point source of photons on the surface of the earth inside a 2 m tall and 1 m radius silo having black walls. The skyshine radiation downfield from the point source was estimated with and without a 30-cm-thick concrete lid on the silo. The second benchmark problem is to estimate the skyshine radiation downfield from 12 cylindrical canisters emplaced in a low-level radioactive waste trench. The canisters are filled with ion-exchange resin with a representative radionuclide loading, largely 60Co, 134Cs and 137Cs. The solution methods include use of the MCNP code to solve the problem by directly employing variance reduction techniques, the single-scatter point kernel code GGG-GP, the QADMOD-GP point kernel code, the COHORT Monte Carlo code, the NAC International version of the SKYSHINE-III code, the KSU hybrid method and the associated KSU skyshine codes.

  8. QUANTUM ESPRESSO: a modular and open-source software project for quantum simulations of materials.

    PubMed

    Giannozzi, Paolo; Baroni, Stefano; Bonini, Nicola; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Chiarotti, Guido L; Cococcioni, Matteo; Dabo, Ismaila; Dal Corso, Andrea; de Gironcoli, Stefano; Fabris, Stefano; Fratesi, Guido; Gebauer, Ralph; Gerstmann, Uwe; Gougoussis, Christos; Kokalj, Anton; Lazzeri, Michele; Martin-Samos, Layla; Marzari, Nicola; Mauri, Francesco; Mazzarello, Riccardo; Paolini, Stefano; Pasquarello, Alfredo; Paulatto, Lorenzo; Sbraccia, Carlo; Scandolo, Sandro; Sclauzero, Gabriele; Seitsonen, Ari P; Smogunov, Alexander; Umari, Paolo; Wentzcovitch, Renata M

    2009-09-30

    QUANTUM ESPRESSO is an integrated suite of computer codes for electronic-structure calculations and materials modeling, based on density-functional theory, plane waves, and pseudopotentials (norm-conserving, ultrasoft, and projector-augmented wave). The acronym ESPRESSO stands for opEn Source Package for Research in Electronic Structure, Simulation, and Optimization. It is freely available to researchers around the world under the terms of the GNU General Public License. QUANTUM ESPRESSO builds upon newly-restructured electronic-structure codes that have been developed and tested by some of the original authors of novel electronic-structure algorithms and applied in the last twenty years by some of the leading materials modeling groups worldwide. Innovation and efficiency are still its main focus, with special attention paid to massively parallel architectures, and a great effort being devoted to user friendliness. QUANTUM ESPRESSO is evolving towards a distribution of independent and interoperable codes in the spirit of an open-source project, where researchers active in the field of electronic-structure calculations are encouraged to participate in the project by contributing their own codes or by implementing their own ideas into existing codes.

  9. Bayesian Atmospheric Radiative Transfer (BART)Thermochemical Equilibrium Abundance (TEA) Code and Application to WASP-43b

    NASA Astrophysics Data System (ADS)

    Blecic, Jasmina; Harrington, Joseph; Bowman, Matthew O.; Cubillos, Patricio E.; Stemm, Madison; Foster, Andrew

    2014-11-01

    We present a new, open-source, Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. TEA uses the Gibbs-free-energy minimization method with an iterative Lagrangian optimization scheme. It initializes the radiative-transfer calculation in our Bayesian Atmospheric Radiative Transfer (BART) code. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. The code is tested against the original method developed by White et al. (1958), the analytic method developed by Burrows and Sharp (1999), and the Newton-Raphson method implemented in the open-source Chemical Equilibrium with Applications (CEA) code. TEA is written in Python and is available to the community via the open-source development site GitHub.com. We also present BART applied to eclipse depths of the exoplanet WASP-43b, constraining atmospheric thermal and chemical parameters. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.

  10. Physical and numerical sources of computational inefficiency in integration of chemical kinetic rate equations: Etiology, treatment and prognosis

    NASA Technical Reports Server (NTRS)

    Pratt, D. T.; Radhakrishnan, K.

    1986-01-01

    The design of a very fast, automatic black-box code for homogeneous, gas-phase chemical kinetics problems requires an understanding of the physical and numerical sources of computational inefficiency. Some major sources reviewed in this report are stiffness of the governing ordinary differential equations (ODE's) and its detection, choice of appropriate method (i.e., integration algorithm plus step-size control strategy), nonphysical initial conditions, and too frequent evaluation of thermochemical and kinetic properties. Specific techniques are recommended (and some advised against) for improving or overcoming the identified problem areas. It is argued that, because reactive species increase exponentially with time during induction, and all species exhibit asymptotic, exponential decay with time during equilibration, exponential-fitted integration algorithms are inherently more accurate for kinetics modeling than classical, polynomial-interpolant methods for the same computational work. But current codes using the exponential-fitted method lack the sophisticated step-size control logic of existing black-box ODE solver codes, such as EPISODE and LSODE. The ultimate chemical kinetics code does not exist yet, but the general characteristics of such a code are becoming apparent.
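
    The argument for exponential-fitted integration can be seen on the linear relaxation y' = -lambda (y - y_eq), for which the exponential update is exact at any step size while explicit Euler diverges once lambda*h > 2. A minimal, hedged comparison (illustrative only, not EPISODE or LSODE):

        # Hedged comparison of explicit Euler vs an exponential-fitted update
        # on a stiff linear relaxation; illustrative of the argument above.
        import math

        lam, y_eq, h, y0 = 1e4, 1.0, 1e-3, 0.0   # lam*h = 10: stiff for Euler

        def euler_step(y):
            return y + h * (-lam * (y - y_eq))

        def exp_step(y):                          # exact for this linear ODE
            return y_eq + (y - y_eq) * math.exp(-lam * h)

        ye, yx = y0, y0
        for _ in range(10):
            ye, yx = euler_step(ye), exp_step(yx)
        print(f"explicit Euler: {ye:.3e}  (diverges, since |1 - lam*h| > 1)")
        print(f"exponential:    {yx:.6f} (relaxes to y_eq = {y_eq})")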

  11. The Astrophysics Source Code Library: Supporting software publication and citation

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, Peter

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net), established in 1999, is a free online registry for source codes used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is indexed by the SAO/NASA Astrophysics Data System (ADS) and Web of Science and is citable by using the unique ascl ID assigned to each code. In addition to registering codes, the ASCL can house archive files for download and assign them DOIs. The ASCL advocates for software citation on par with article citation, participates in multidisciplinary events such as Force11, OpenCon, and the annual Workshop on Sustainable Software for Science, works with journal publishers, and organizes Special Sessions and Birds of a Feather meetings at national and international conferences such as Astronomical Data Analysis Software and Systems (ADASS), European Week of Astronomy and Space Science, and AAS meetings. In this presentation, I will discuss some of the challenges of gathering credit for publishing software and ideas and efforts from other disciplines that may be useful to astronomy.

  12. Self-consistent modeling of electron cyclotron resonance ion sources

    NASA Astrophysics Data System (ADS)

    Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lécot, C.

    2004-05-01

    In order to predict the performance of electron cyclotron resonance ion sources (ECRIS), it is necessary to perfectly model the different parts of these sources: (i) the magnetic configuration; (ii) the plasma characteristics; (iii) the extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, either in two dimensions or even in three dimensions (to take into account the shape of the plasma at the extraction, influenced by the hexapole). However, the characteristics of the plasma are not always mastered. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether or not a biased probe is installed. These input parameters are used to feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally.

  13. Some practical universal noiseless coding techniques, part 3, module PSl14,K+

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.

    1991-01-01

    The algorithmic definitions, performance characterizations, and application notes for a high-performance adaptive noiseless coding module are provided. Subsets of these algorithms are currently under development in custom very large scale integration (VLSI) at three NASA centers. The generality of coding algorithms recently reported is extended. The module incorporates a powerful adaptive noiseless coder for Standard Data Sources (i.e., sources whose symbols can be represented by uncorrelated non-negative integers, where smaller integers are more likely than larger ones). Coders can be specified to provide performance close to the data entropy over any desired dynamic range (of entropy) above 0.75 bit/sample. This is accomplished by adaptively choosing the best of many efficient variable-length coding options to use on each short block of data (e.g., 16 samples). All code options used for entropies above 1.5 bits/sample are 'Huffman Equivalent', but they require no table lookups to implement. The coding can be performed directly on data that have been preprocessed to exhibit the characteristics of a standard source. Alternatively, a built-in predictive preprocessor can be used where applicable. This built-in preprocessor includes the familiar 1-D predictor followed by a function that maps the prediction error sequences into the desired standard form. Additionally, an external predictor can be substituted if desired. A broad range of issues dealing with the interface between the coding module and the data systems it might serve are further addressed. These issues include: multidimensional prediction, archival access, sensor noise, rate control, code rate improvements outside the module, and the optimality of certain internal code options.
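
    The mapping of signed prediction errors onto the non-negative "standard source" alphabet, where smaller integers are more likely, is commonly done by interleaving signs; the sketch below shows that standard construction as a hedged illustration, not necessarily the exact preprocessor of module PSl14,K+.

        # Hedged sketch: map signed prediction residuals onto non-negative
        # integers by sign interleaving: 0, -1, +1, -2, +2, ... -> 0, 1, 2, 3, 4, ...
        def map_residual(delta):
            return 2 * delta if delta >= 0 else -2 * delta - 1

        def unmap(m):
            return m // 2 if m % 2 == 0 else -(m + 1) // 2

        samples = [100, 101, 99, 99, 104, 100]
        pred = samples[0]
        mapped = []
        for s in samples[1:]:
            mapped.append(map_residual(s - pred))   # 1-D predictor: previous sample
            pred = s
        print(mapped)                               # small integers dominate
        assert all(unmap(map_residual(d)) == d for d in range(-50, 51))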

  14. Coupled Hydrodynamic and Wave Propagation Modeling for the Source Physics Experiment: Study of Rg Wave Sources for SPE and DAG series.

    NASA Astrophysics Data System (ADS)

    Larmat, C. S.; Delorey, A.; Rougier, E.; Knight, E. E.; Steedman, D. W.; Bradley, C. R.

    2017-12-01

    This presentation reports numerical modeling efforts to improve knowledge of the processes that affect seismic wave generation and propagation from underground explosions, with a focus on Rg waves. The numerical model is based on the coupling of hydrodynamic simulation codes (Abaqus, CASH and HOSS) with a 3D full waveform propagation code, SPECFEM3D. Validation datasets are provided by the Source Physics Experiment (SPE), which is a series of highly instrumented chemical explosions at the Nevada National Security Site with yields from 100 kg to 5000 kg. A first series of explosions in a granite emplacement has just been completed and a second series in an alluvium emplacement is planned for 2018. The long-term goal of this research is to review and improve current seismic source models (e.g., Mueller & Murphy, 1971; Denny & Johnson, 1991) by providing first-principles calculations through the coupled-codes capability. The hydrodynamic codes, Abaqus, CASH and HOSS, model the shocked, hydrodynamic region via equations of state for the explosive, borehole stemming and jointed/weathered granite. A new material model for unconsolidated alluvium materials has been developed and validated against past nuclear explosions, including the 10 kT 1965 Merlin event (Perret, 1971; Perret and Bass, 1975). We use the efficient Spectral Element Method code SPECFEM3D (e.g., Komatitsch, 1998; 2002) and Geologic Framework Models to model the evolution of the wavefield as it propagates across 3D complex structures. The coupling interface is a series of grid points of the SEM mesh situated at the edge of the hydrodynamic code domain. We will present validation tests and waveforms modeled for several SPE tests, which provide evidence that the damage processes happening in the vicinity of the explosions create secondary seismic sources. These sources interfere with the original explosion moment and reduce the apparent seismic moment at the origin of Rg waves by up to 20%.

  15. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    Software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. Purpose of this type of coding to achieve data compression in sense that coded data represents original data perfectly (noiselessly) while taking fewer bits to do so. Routines universal because they apply to virtually any "real-world" data source.

  16. Safe, Multiphase Bounds Check Elimination in Java

    DTIC Science & Technology

    2010-01-28

    production of mobile code from source code, JIT compilation in the virtual machine, and application code execution. The code producer uses...invariants, and inequality constraint analysis) to identify and prove redundancy of bounds checks. During class-loading and JIT compilation, the virtual...unoptimized code if the speculated invariants do not hold. The combined effect of the multiple phases is to shift the effort associated with bounds

  17. Cross-Layer Design for Video Transmission over Wireless Rician Slow-Fading Channels Using an Adaptive Multiresolution Modulation and Coding Scheme

    NASA Astrophysics Data System (ADS)

    Pei, Yong; Modestino, James W.

    2007-12-01

    We describe a multilayered video transport scheme for wireless channels capable of adapting to channel conditions in order to maximize end-to-end quality of service (QoS). This scheme combines a scalable H.263+ video source coder with unequal error protection (UEP) across layers. The UEP is achieved by employing different channel codes together with a multiresolution modulation approach to transport the different priority layers. Adaptivity to channel conditions is provided through a joint source-channel coding (JSCC) approach which attempts to jointly optimize the source and channel coding rates together with the modulation parameters to obtain the maximum achievable end-to-end QoS for the prevailing channel conditions. In this work, we model the wireless links as a slow-fading Rician channel whose conditions can be described in terms of the channel signal-to-noise ratio (SNR) and the ratio of specular-to-diffuse energy. The multiresolution modulation/coding scheme consists of binary rate-compatible punctured convolutional (RCPC) codes used together with nonuniform phase-shift keyed (PSK) signaling constellations. Results indicate that this adaptive JSCC scheme employing scalable video encoding together with a multiresolution modulation/coding approach leads to significant improvements in delivered video quality for specified channel conditions. In particular, the approach results in considerably improved graceful degradation properties for decreasing channel SNR.
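
    The rate-compatible puncturing idea is simple to demonstrate: start from one low-rate mother convolutional code and delete coded bits according to nested patterns, so each higher-rate code's transmitted bits are a subset of the lower-rate code's. The (7,5)-octal encoder and the patterns below are illustrative assumptions, not the RCPC family used in the paper.

        # Hedged sketch of RCPC puncturing: one rate-1/2 mother convolutional
        # code punctured by nested patterns; illustrative patterns only.
        def conv_encode(bits, g1=0b111, g2=0b101):  # (7,5) octal, constraint length 3
            state, out = 0, []
            for b in bits:
                state = ((state << 1) | b) & 0b111
                out += [bin(state & g1).count("1") % 2,
                        bin(state & g2).count("1") % 2]
            return out

        def puncture(coded, pattern):
            """Keep coded bit i iff pattern[i % len(pattern)] == 1."""
            return [c for i, c in enumerate(coded) if pattern[i % len(pattern)]]

        msg = [1, 0, 1, 1, 0, 0, 1, 0] * 4
        mother = conv_encode(msg)                      # rate 1/2
        for pattern, label in [((1, 1), "1/2"),
                               ((1, 1, 1, 0), "2/3"),
                               ((1, 1, 1, 0, 1, 0, 1, 0), "4/5")]:
            sent = puncture(mother, pattern)
            print(f"rate {label}: {len(msg)} info bits -> {len(sent)} coded bits")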

  18. Long term change in atmospheric dust absorption, dust scattering and black carbon aerosols scattering coefficient parameters over western Indian locations

    NASA Astrophysics Data System (ADS)

    Satoliya, Anil Kumar; Vyas, B. M.; Shekhawat, M. S.

    2018-05-01

    For the first time, satellite-based measurements of the atmospheric black carbon (BC) aerosol scattering coefficient at 550 nm (BC SC at 550 nm) and the dust aerosol scattering and extinction coefficients (DSC at 550 nm and DEC at 550 nm) have been used to understand the long-term trends of natural and anthropogenic aerosol behavior, together with their close association with ground-based precipitation parameters, namely Total Rain Fall (TRF) and Total Number of Rainy Days (TNRD), measured over the same period for western Indian regions dominated by primary natural aerosol sources. The basic objective of this study is to investigate the inter-correlation between dust and black carbon aerosol loading characteristics and variations in rainfall pattern parameters as an indirect aerosol-induced effect, i.e., aerosol-cloud interaction. Black carbon aerosols generated by diverse anthropogenic activities are studied through the measured atmospheric BC SC at 550 nm, whereas desert dust mineral aerosols, produced primarily by natural activities over the hot, Thar desert influenced area and a rural tropical site, are investigated through the DSC at 550 nm and DEC at 550 nm of a semi-urban site, Udaipur (UDP, 24.6°N, 73.35°E, 580 m above surface level (asl)), situated in southern Rajasthan, as well as two Great Indian Thar desert locations, Jaisalmer (JSM, 26.90°N, 69.90°E, 220 m asl) and Bikaner (BKN, 28.03°N, 73.30°E, 224 m asl), located in the vicinity of the Thar desert region in Rajasthan state of western India. The present study draws on monthly values of the above parameters spanning 35 years, i.e., 1980 to 2015. This type of atmospheric aerosol-cloud-monsoon interaction investigation helps in understanding the direct and indirect roles of aerosols in the optical absorption and scattering of solar radiation at the useful wavelength of 550 nm, as well as in the heating of clouds, over the little-explored Thar desert region and over provinces less influenced by dust, for a long period. The analysis also gives clear scientific evidence of enhancement in DSC at 550 nm, DEC at 550 nm and BC SC at 550 nm, with a simultaneous corresponding reduction in five-yearly mean precipitation parameters such as TRF and TNRD. It is quite evident that anthropogenic BC aerosol activity shows a significant increasing trend at all three locations, but the trend is more prominent over the central Thar desert influenced regime, i.e., JSM and BKN, than over the semi-urban region, i.e., UDP. A systematic increasing pattern of the average monthly mean values of DSC at 550 nm and DEC at 550 nm, i.e., increasing aerosol loading, is revealed, with the lowest value in January, the highest value in July, and a broad peak retained through the pre-monsoon months; the respective values then decrease sharply from August to December onwards. The peak values of the dust aerosol parameters DSC at 550 nm and DEC at 550 nm increase systematically from UDP to BKN and are maximized at JSM, so the following order of desert aerosol loading activity is recorded: JSM > BKN > UDP. Several other interesting features of earth-climate change implications, in reference to the altered pattern of reduced precipitation accompanying the observed elevated dust and BC aerosol loading, were also noticed in the course of the present investigation. The overall effect of reduced rainfall with increasing dust aerosol loading, or vice versa, is more pronounced over JSM and less prevalent over UDP. More detailed investigations of other interesting results on aerosol-Indian monsoon interactions over western Indian locations are also discussed thoroughly in this paper.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guener, M.; Gueler, E.; Aktas, H.

    Kinetic, morphological and some thermal properties of thermally induced and deformation-induced martensite were studied in an Fe-32%Ni-0.4%Cr alloy. Scanning electron microscopy (SEM), differential scanning calorimetry (DSC) and compression deformation test techniques were used for these studies. SEM observations revealed the occurrence of both athermal and isothermal martensitic transformation kinetics, producing a lenticular martensite morphology for different homogenization conditions of the prior austenite phase. The DSC measurement results showed fair agreement with those of previous studies on ferrous alloys.

  20. Detection of Collapse and Crystallization of Saccharide, Protein, and Mannitol Formulations by Optical Fibers in Lyophilization

    PubMed Central

    Horn, Jacqueline; Friess, Wolfgang

    2018-01-01

    The collapse temperature (Tc) and the glass transition temperature of freeze-concentrated solutions (Tg') as well as the crystallization behavior of excipients are important physicochemical characteristics which guide the cycle development in freeze-drying. The most frequently used methods to determine these values are differential scanning calorimetry (DSC) and freeze-drying microscopy (FDM). The objective of this study was to evaluate the optical fiber system (OFS) unit as an alternative tool for the analysis of Tc, Tg' and crystallization events. The OFS unit was also tested as a potential online monitoring tool during freeze-drying. Freeze/thawing and freeze-drying experiments of sucrose, trehalose, stachyose, mannitol, and highly concentrated IgG1 and lysozyme solutions were carried out and monitored by the OFS. Comparative analyses were performed by DSC and FDM. OFS and FDM results correlated well. The crystallization behavior of mannitol could be monitored by the OFS during freeze/thawing as it can be done by DSC. Online monitoring of freeze-drying runs detected collapse of amorphous saccharide matrices. The OFS unit enabled the analysis of both Tc and crystallization processes, which is usually carried out by FDM and DSC. The OFS can hence be used as a novel measuring device. Additionally, detection of these events during lyophilization facilitates online monitoring. Thus the OFS is a new beneficial tool for the development and monitoring of freeze-drying processes. PMID:29435445

  1. Brain perfusion alterations in tick-borne encephalitis-preliminary report.

    PubMed

    Tyrakowska-Dadełło, Zuzanna; Tarasów, Eugeniusz; Janusek, Dariusz; Moniuszko-Malinowska, Anna; Zajkowska, Joanna; Pancewicz, Sławomir

    2018-03-01

    Magnetic resonance imaging (MRI) changes in tick-borne encephalitis (TBE) are non-specific, and the pathophysiological mechanisms leading to their formation remain unclear. This study investigated brain perfusion in TBE patients using dynamic susceptibility-weighted contrast-enhanced magnetic resonance perfusion imaging (DSC-MRI perfusion). MRI scans were performed for 12 patients in the acute phase, 3-5 days after the diagnosis of TBE. Conventional MRI and DSC-MRI perfusion studies were performed. Cerebral blood flow (CBF), cerebral blood volume (CBV), mean transit time (MTT), and time to peak (TTP) parametric maps were created. The bilateral frontal, parietal, and temporal subcortical regions and the thalamus were selected as regions of interest. Perfusion parameters of TBE patients were compared to those of a control group. There was a slight increase in CBF and CBV, with significant prolongation of TTP in subcortical areas in the study subjects, while MTT values were comparable to those of the control group. A significant increase in thalamic CBF (p<0.001) and increased CBV (p<0.05) were observed. Increased TTP and a slight reduction in MTT were also observed within this area. The DSC-MRI perfusion study showed that TBE patients had brain perfusion disturbances, expressed mainly in the thalami. These results suggest that DSC-MRI perfusion may provide important information regarding the areas affected in TBE patients. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
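
    The parametric maps named here are per-voxel summaries of the contrast bolus passage: CBV is proportional to the area under the concentration-time curve, TTP is the time of its peak, and CBF follows from the central volume theorem CBF = CBV/MTT. The numpy sketch below uses a synthetic gamma-variate bolus and a simple first-moment surrogate for MTT; it is an illustration, not a clinical DSC-MRI pipeline (which would deconvolve with an arterial input function).

        # Hedged sketch of DSC-MRI summary parameters from a synthetic
        # concentration-time curve; illustrative only.
        import numpy as np

        t = np.arange(0, 60, 1.0)                  # s, one sample per second
        t0 = 10.0                                  # assumed bolus arrival time

        def gamma_variate(t, t0, A=1.0, a=3.0, b=1.5):
            tt = np.clip(t - t0, 0.0, None)        # zero before bolus arrival
            return A * tt**a * np.exp(-tt / b)

        conc = gamma_variate(t, t0)                # ~ Delta R2* in one voxel
        cbv = np.trapz(conc, t)                    # area under the curve (relative)
        ttp = t[np.argmax(conc)]                   # time to peak
        mtt = np.trapz(conc * (t - t0), t) / cbv   # first-moment surrogate (assumption)
        cbf = cbv / mtt                            # central volume theorem
        print(f"rel. CBV {cbv:.1f}, TTP {ttp:.0f} s, MTT {mtt:.1f} s, rel. CBF {cbf:.2f}")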

  2. Detection of Collapse and Crystallization of Saccharide, Protein and Mannitol Formulations by Optical Fibers in Lyophilization

    NASA Astrophysics Data System (ADS)

    Horn, Jacqueline; Friess, Wolfgang

    2018-01-01

    The collapse temperature (Tc) and the glass transition temperature of freeze-concentrated solutions (Tg’) as well as the crystallization behavior of excipients are important physicochemical characteristics which guide the cycle development in freeze-drying. The most frequently used methods to determine these values are differential scanning calorimetry (DSC) and freeze-drying microscopy (FDM). The objective of this study was to evaluate the optical fiber system (OFS) unit as an alternative tool for the analysis of Tc, Tg’ and crystallization events. The OFS unit was also tested as a potential online monitoring tool during freeze-drying. Freeze/thawing and freeze-drying experiments of sucrose, trehalose, stachyose, mannitol and highly concentrated IgG1 and lysozyme solutions were carried out and monitored by the OFS. Comparative analyses were performed by DSC and FDM. OFS and FDM results correlated well. The crystallization behavior of mannitol could be monitored by the OFS during freeze/thawing as it can be done by DSC. Online monitoring of freeze-drying runs detected collapse of amorphous saccharide matrices. The OFS unit enabled the analysis of both Tc and crystallization processes, which is usually carried out by FDM and DSC. The OFS can hence be used as a novel measuring device. Additionally, detection of these events during lyophilization facilitates online monitoring. Thus the OFS is a new beneficial tool for the development and monitoring of freeze-drying processes.

  3. Multivariate analysis of DSC-XRD simultaneous measurement data: a study of multistage crystalline structure changes in a linear poly(ethylene imine) thin film.

    PubMed

    Kakuda, Hiroyuki; Okada, Tetsuo; Otsuka, Makoto; Katsumoto, Yukiteru; Hasegawa, Takeshi

    2009-01-01

    A multivariate analytical technique has been applied to the analysis of simultaneous measurement data from differential scanning calorimetry (DSC) and X-ray diffraction (XRD) in order to study thermal changes in the crystalline structure of a linear poly(ethylene imine) (LPEI) film. A large number of XRD patterns generated from the simultaneous measurements were subjected to an augmented alternating least-squares (ALS) regression analysis, by which the XRD patterns were readily decomposed into chemically independent XRD patterns while their thermal profiles were obtained at the same time. The decomposed XRD patterns and the profiles were useful in discussing the minute peaks in the DSC. The analytical results revealed the following changes of polymorphism in detail: an LPEI film prepared by casting an aqueous solution was composed of sesquihydrate and hemihydrate crystals. The sesquihydrate crystal was lost at an early stage of heating, and the film changed into an amorphous state. Once the sesquihydrate was lost by heating, it was not recovered even when the sample was cooled back to room temperature. When the sample was heated again, structural changes were found between the hemihydrate and the amorphous components. In this manner, the simultaneous DSC-XRD measurements combined with ALS analysis proved to be powerful for obtaining a better understanding of the thermally induced changes of the crystalline structure in a polymer film.
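
    The decomposition described, factoring the stack of XRD patterns into component patterns and their thermal profiles, follows the alternating least-squares template: solve D ≈ C Sᵀ by updating C and S in turn under non-negativity. The numpy sketch below runs on synthetic data and is a hedged illustration, not the authors' augmented implementation.

        # Hedged sketch of alternating least squares (MCR-ALS style):
        # D (temperatures x angles) ~= C (thermal profiles) @ S.T (pure patterns).
        import numpy as np

        rng = np.random.default_rng(0)
        n_temp, n_angle, n_comp = 50, 200, 2
        C_true = np.abs(rng.random((n_temp, n_comp)))
        S_true = np.abs(rng.random((n_angle, n_comp)))
        D = C_true @ S_true.T + 0.01 * rng.random((n_temp, n_angle))

        C = np.abs(rng.random((n_temp, n_comp)))       # random initial profiles
        for it in range(100):
            S = np.linalg.lstsq(C, D, rcond=None)[0].T # update pure patterns
            S = np.clip(S, 0, None)                    # non-negativity constraint
            C = np.linalg.lstsq(S, D.T, rcond=None)[0].T
            C = np.clip(C, 0, None)

        residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
        print(f"relative residual after ALS: {residual:.3f}")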

  4. An adaptive distributed data aggregation based on RCPC for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Hua, Guogang; Chen, Chang Wen

    2006-05-01

    One of the most important design issues in wireless sensor networks is energy efficiency. Data aggregation has a significant impact on the energy efficiency of wireless sensor networks. With massive deployment of sensor nodes and limited energy supply, data aggregation has been considered an essential paradigm for data collection in sensor networks. Recently, distributed source coding has been demonstrated to possess several advantages in data aggregation for wireless sensor networks: it is able to encode sensor data at a lower bit rate without direct communication among sensor nodes. To ensure reliable and high-throughput transmission of the aggregated data, we propose in this research progressive transmission and decoding of Rate-Compatible Punctured Convolutional (RCPC) coded data aggregation with distributed source coding. Our proposed rate-1/2 RSC codes with the Viterbi algorithm for distributed source coding guarantee that, even without any correlation between the data, the decoder can always decode the data correctly without wasting energy. The proposed approach achieves two aspects of adaptive data aggregation for wireless sensor networks. First, the RCPC coding facilitates adaptive compression corresponding to the correlation of the sensor data: when the data correlation is high, a higher compression ratio is achieved; otherwise, a lower compression ratio is achieved. Second, the data aggregation is adaptively accumulated: no transmission energy is wasted, and even if there is no correlation among the data, the energy consumed remains at the level of raw data collection. Experimental results have shown that the proposed distributed data aggregation based on RCPC is able to achieve high-throughput and low-energy-consumption data collection for wireless sensor networks.
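
    The rate adaptation in RCPC codes comes from puncturing a low-rate mother code. As a rough illustration (the puncturing patterns below are invented for the sketch and are not the paper's tables), a rate-1/2 output stream can be thinned to higher rates and then de-punctured with erasures before Viterbi decoding:

      # Hypothetical puncturing patterns over the output bits of a
      # rate-1/2 mother code (illustrative only).
      PATTERNS = {
          "1/2": [1, 1],
          "2/3": [1, 1, 1, 0],
          "4/5": [1, 1, 1, 0, 1, 0, 1, 0],
      }

      def puncture(coded_bits, rate):
          mask = PATTERNS[rate]
          return [b for i, b in enumerate(coded_bits) if mask[i % len(mask)]]

      def depuncture(received, rate, n_coded):
          # Re-insert erasures (None) so a Viterbi decoder sees the full stream.
          mask = PATTERNS[rate]
          out, it = [], iter(received)
          for i in range(n_coded):
              out.append(next(it) if mask[i % len(mask)] else None)
          return out

      coded = [0, 1, 1, 0, 0, 0, 1, 1]     # mother-code output (rate 1/2)
      sent = puncture(coded, "2/3")        # only 6 of 8 bits transmitted
      assert depuncture(sent, "2/3", len(coded))[3] is None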

  5. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yidong Xia; Mitch Plummer; Robert Podgorney

    2016-02-01

    Performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle for the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite-element-based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercially available, this new open-source code demonstrates a code development strategy that aims to provide exceptional ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.

  6. VizieR Online Data Catalog: ynogkm: code for calculating time-like geodesics (Yang+, 2014)

    NASA Astrophysics Data System (ADS)

    Yang, X.-L.; Wang, J.-C.

    2013-11-01

    Here we present the source file for a new public code named ynogkm, aimed at fast calculation of time-like geodesics in a Kerr-Newman spacetime. In the code the four Boyer-Lindquist coordinates and the proper time are expressed as functions of a parameter p semi-analytically, i.e., r(p), μ(p), φ(p), t(p), and σ(p), by using Weierstrass' and Jacobi's elliptic functions and integrals. All of the elliptic integrals are computed by Carlson's elliptic integral method, which guarantees the fast speed of the code. The source Fortran file ynogkm.f90 contains four modules: constants, rootfind, ellfunction, and blcoordinates. (3 data files).
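
    Carlson's symmetric forms, which the abstract credits for the code's speed, are exposed in SciPy (scipy.special.elliprf, available in SciPy 1.8 and later), so the relation to Legendre's complete integral can be checked directly; a minimal sketch:

      from scipy.special import ellipk, elliprf

      # Legendre's complete integral K(m) via Carlson's symmetric R_F:
      # K(m) = R_F(0, 1 - m, 1).
      m = 0.7
      print(elliprf(0.0, 1.0 - m, 1.0))   # ~2.0754
      print(ellipk(m))                    # same value, for comparison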

  7. Pseudo color ghost coding imaging with pseudo thermal light

    NASA Astrophysics Data System (ADS)

    Duan, De-yang; Xia, Yun-jie

    2018-04-01

    We present a new pseudo color imaging scheme named pseudo color ghost coding imaging, based on ghost imaging but with a multiwavelength source modulated by a spatial light modulator. In contrast to conventional pseudo color imaging, where the absence of nondegenerate-wavelength spatial correlations yields only extra monochromatic images, the degenerate-wavelength and nondegenerate-wavelength spatial correlations between the idler beam and signal beam can be obtained simultaneously. This scheme can obtain a more colorful image with higher quality than conventional pseudo color coding techniques. More importantly, a significant advantage of the scheme over conventional pseudo color coding imaging techniques is that images with different colors can be obtained without changing the light source or spatial filter.

  8. Distributed single source coding with side information

    NASA Astrophysics Data System (ADS)

    Vila-Forcen, Jose E.; Koval, Oleksiy; Voloshynovskiy, Sviatoslav V.

    2004-01-01

    In this paper we advocate an image compression technique within the distributed source coding framework. The novelty of the proposed approach is twofold: classical image compression is considered from the position of source coding with side information, and, contrary to existing scenarios where side information is given explicitly, side information is created based on a deterministic approximation of local image features. We consider an image in the transform domain as a realization of a source with a bounded codebook of symbols, where each symbol represents a particular edge shape. The codebook is image-independent and plays the role of an auxiliary source. Due to the partial availability of side information at both encoder and decoder, we treat our problem as a modification of the Berger-Flynn-Gray problem and investigate the possible gain over solutions where side information is either unavailable or available only at the decoder. Finally, we present a practical compression algorithm for passport photo images based on our concept that demonstrates superior performance in the very low bit rate regime.
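
    The intuition behind source coding with decoder side information can be made concrete with the classic three-bit syndrome example (a toy sketch, not the authors' edge-codebook construction): the encoder sends only a two-bit syndrome instead of the three source bits, and the decoder resolves the coset using side information within Hamming distance one:

      import numpy as np

      H = np.array([[1, 1, 0],
                    [0, 1, 1]])   # parity-check of the 3-bit repetition code

      def encode(x):
          return H @ x % 2        # send the 2-bit syndrome, not all 3 bits

      def decode(syndrome, y):
          # Pick the coset member closest (in Hamming distance) to the
          # side information y available at the decoder.
          coset = [np.array([a, b, c])
                   for a in (0, 1) for b in (0, 1) for c in (0, 1)
                   if ((H @ np.array([a, b, c])) % 2 == syndrome).all()]
          return min(coset, key=lambda v: int(np.sum(v ^ y)))

      x = np.array([1, 0, 1])     # source block
      y = np.array([1, 1, 1])     # side information, differs in <= 1 bit
      assert np.array_equal(decode(encode(x), y), x)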

  9. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3, Part 4.

    DTIC Science & Technology

    1983-09-01

    General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) computer code documentation, Version 3, prepared by the BDM Corporation (final technical report, February 1981 - July 1983). Only fragments of the abstract survive extraction; the recoverable portion concerns the method for computing the electric field at a segment observation point due to a source patch j, with t1 and t2 directions defined on the source patch.

  10. Simulations of the plasma dynamics in high-current ion diodes

    NASA Astrophysics Data System (ADS)

    Boine-Frankenheim, O.; Pointon, T. D.; Mehlhorn, T. A.

    Our time-implicit fluid/Particle-In-Cell (PIC) code DYNAID [1] is applied to problems relevant to applied-B ion diode operation. We present simulations of the laser ion source, which will soon be employed on the SABRE accelerator at SNL, and of the dynamics of the anode source plasma in the applied electric and magnetic fields. DYNAID is still a test-bed for a higher-dimensional simulation code. Nevertheless, the code can already give new theoretical insight into the dynamics of plasmas in pulsed power devices.

  11. Numerical Electromagnetic Code (NEC)-Basic Scattering Code. Part I. User’s Manual.

    DTIC Science & Technology

    1979-09-01

    User's manual for the Numerical Electromagnetic Code (NEC) Basic Scattering Code. Only the command index survives extraction; it lists input commands including RT (translate and/or rotate coordinates), SG (source geometry input), TO (test data generation options), UN (units of input), and PG, GP, CG, AM, PR, and NP. A surviving fragment notes that the source presently considered in the computer code is electric.

  12. Nutraceutical, Anti-Inflammatory, and Immune Modulatory Effects of β-Glucan Isolated from Yeast

    PubMed Central

    Bacha, Umar; Iqbal, Sanaullah; Anjum, Aftab Ahmad

    2017-01-01

    β-Glucan is a dietary fibre found in many natural sources that effectively controls chronic metabolic diseases. However, β-glucan from yeast has rarely been investigated. In this study, conditions were optimized to isolate β-glucan from yeast (max. 66% yield); the optimized conditions included 1.0 M NaOH, pH 7.0, and 90°C. The purity and identity of the isolated β-glucan were characterized through FT-IR, SEM, DSC, and physicofunctional properties. The DSC results revealed a highly stable β-glucan (m.p. 125°C) with antioxidant activity (TAC value 0.240 ± 0.0021 µg/mg; 38% H2O2 scavenging), promising bile acid binding (40.463%), and glucose control (in vitro). In line with these results, we evaluated the in vivo anti-inflammatory potential: myeloperoxidase activity improved, MDA and NO were reduced, proteins were protected, and viscosity was kept within the normal range. Also, the in vivo cholesterol binding and reduction in skin thickness by β-glucan were highly encouraging. Finally, our results confirmed that yeast β-glucan is effective against some of the inflammatory and oxidative stress markers studied in this investigation. In general, the effect of 4% β-glucan was more noticeable than that of 2% β-glucan. Therefore, our results support the utilization of β-glucan as a novel, economically cheap, and functional food ingredient.

  13. Colloidal graphene quantum dots incorporated with a Cobalt electrolyte in a dye sensitized solar cell

    NASA Astrophysics Data System (ADS)

    Lim, Hyuna

    The utilization of sunlight as a renewable energy source has been pursued for a long time, but the ultimate goal of developing inexpensive and highly efficient photovoltaic devices remains elusive. To address this problem, colloidal graphene quantum dots (GQDs) were synthesized and used as a new sensitizer in dye sensitized solar cells (DSCs). Not only do the GQDs have a well-defined structure, but their large absorptivity, tunable bandgap, and size- and functional-group-dependent redox potentials make them promising candidates for photovoltaic applications. Because volatile organic solvents in electrolyte solutions hinder long-term use and mass production of DSC devices, imidazolium-based ionic liquids (ILs) were investigated. Cobalt-bipyridine complexes were successfully synthesized and characterized for use as new redox shuttles in DSCs. In the tested DSCs, J-V (current density-voltage) curves illustrate that the short circuit current and fill factor decrease significantly as the active area of the TiO2 photoanode increases. Dark current measurement indicated that the diode factor is greater than one, unlike in conventional p-n junction solar cells, due to the high efficiency of photoelectron injection. The variation of the diode factor in the dark and under illumination reveals the various types of recombination behavior in DSCs. The performance of the DSC stained by GQDs incorporated with the cobalt redox couple was tested, but further study is needed to improve the efficiency and to understand the photochemical reactions in the DSCs.
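
    The diode factor mentioned above is conventionally extracted from the slope of ln(J) versus V in the exponential region of the dark J-V curve. A minimal sketch with synthetic data (the numbers are illustrative, not measurements from this work):

      import numpy as np

      # Estimate the diode (ideality) factor n from the slope of ln(J)
      # against V in the exponential region: J ~ J0 * exp(q V / (n k T)).
      q_over_kT = 1.0 / 0.02585                  # 1/V at room temperature
      V = np.array([0.40, 0.45, 0.50, 0.55, 0.60])
      J = 1e-9 * np.exp(q_over_kT * V / 1.8)     # synthetic data with n = 1.8
      slope, _ = np.polyfit(V, np.log(J), 1)
      print(round(q_over_kT / slope, 2))         # ~1.8, i.e. n > 1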

  14. Thermal and rheological properties of a family of botryosphaerans produced by Botryosphaeria rhodina MAMB-05.

    PubMed

    Fonseca, Paulo R M S; Dekker, Robert F H; Barbosa, Aneli M; Silveira, Joana L M; Vasconcelos, Ana F D; Monteiro, Nilson K; Aranda-Selverio, Gabriel; da Silva, Maria de Lourdes Corradi

    2011-09-02

    Differential scanning calorimetry (DSC), thermogravimetry (TG) and Fourier-transform infra-red spectroscopy (FT-IR) analyses were performed to investigate changes in the physico-chemical properties of botryosphaerans, a family of exopolysaccharides (EPS) produced by the fungus Botryosphaeria rhodina MAMB-05 grown on glucose (EPS(GLC)), sucrose (EPS(SUC)) and fructose (EPS(FRU)). A slight endothermic transition and a small mass loss attributable to the removal of water of hydration were observed in the DSC and TG analyses, respectively, for the three EPS samples. The FT-IR spectra confirmed that no structural changes occurred during thermal treatment. Viscometry was utilized to obtain information on the rheological behaviour of the EPS in aqueous solutions. The Power Law and Cross equations established the natural pseudoplastic characteristics of the EPS. Comparatively, the results obtained when B. rhodina MAMB-05 was grown on each of the three carbohydrate sources demonstrated similar apparent viscosity values for EPS(GLC) and EPS(SUC), while EPS(FRU) displayed the lowest apparent viscosity of the three botryosphaerans, suggesting a higher degree of ramification and lower Mw. EPS(GLC) and EPS(SUC) possessed similar degrees of ramification. The slight differences found in their viscosities can be explained by differences in the type of branching among the three botryosphaerans, which vary the strength of intermolecular interactions and, consequently, the consistency and viscosity. The physico-chemical characterization of botryosphaerans constitutes the original contribution of this work, and knowledge of these properties is an important criterion for potential applications.

  15. Effect of Extraction Method on the Oxidative Stability of Camelina Seed Oil Studied by Differential Scanning Calorimetry.

    PubMed

    Belayneh, Henok D; Wehling, Randy L; Cahoon, Edgar B; Ciftci, Ozan N

    2017-03-01

    Camelina seed is a new alternative omega-3 source attracting growing interest. However, it is susceptible to oxidation due to its high omega-3 content. The objective of this study was to improve the oxidative stability of camelina seed oil at the extraction stage in order to eliminate or minimize the use of additive antioxidants. Camelina seed oil extracts were enriched in natural antioxidants using ethanol-modified supercritical carbon dioxide (SC-CO2) extraction. The oxidative stability of the camelina seed oils extracted by ethanol-modified SC-CO2 was studied by differential scanning calorimetry (DSC) and compared with cold press, hexane, and SC-CO2 methods. Nonisothermal oxidation kinetics of the oils obtained by the different extraction methods were studied by DSC at varying heating rates (2.5, 5, 10, and 15 °C/min). Increasing the ethanol level in the ethanol-modified SC-CO2 increased the oxidative stability. Based on oxidation onset temperatures (Ton), SC-CO2 containing 10% ethanol yielded the most stable oil. Oxidative stability depended on the type and content of the polar fractions, namely, phenolic compounds and phospholipids. Phenolic compounds acted as natural antioxidants, whereas increased phospholipid contents decreased the stability. The study has shown that the oxidative stability of the oils can be improved at the extraction stage, which may eliminate the need for additive antioxidants. © 2017 Institute of Food Technologists®.

  16. Correlation of the penetration enhancement with the influence of an alcohol/tocopheryl polyethylene glycol succinate (TPGS) cosolvent system on the molecular structure of the stratum corneum of nude mouse skin as examined by microscopic FTIR/DSC

    NASA Astrophysics Data System (ADS)

    Liou, Yi-Bo; Ho, Hsiu-O.; Chen, Shin-Yi; Sheu, Ming-Thau

    2009-10-01

    Tocopheryl polyethylene glycol succinate (TPGS) is a water-soluble derivative of natural-source vitamin E, which possesses a dual lipophilic and hydrophilic nature similar to a surface-active agent. The penetration enhancement of estradiol by an ethanol and TPGS cosolvent system (EtOH/TPGS) has been confirmed. In this study, the correlation of the penetration enhancement with the influence of the EtOH/TPGS cosolvent system on biophysical changes of the stratum corneum (SC), as examined by Fourier-transform infrared spectrometry/differential scanning calorimetry (FTIR/DSC), was investigated. Thermotropic changes in the asymmetrical and symmetrical C-H stretching of the hydrocarbon chains of lipids, and in the amide I and II bands that characterize the protein structure of the SC, were examined after treatment with different concentrations of the EtOH/TPGS cosolvent. The results demonstrated no strong correlation between the influence of the EtOH/TPGS cosolvent system on biophysical changes of the SC and the penetration enhancement of estradiol by the corresponding cosolvent system. It was concluded that the incorporation of TPGS in the cosolvent system modified the structural features of the SC only insignificantly, and it was not obvious that the penetrant encountered these modifications in a way that improved the penetration of estradiol by TPGS.

  17. Hybrid digital-analog coding with bandwidth expansion for correlated Gaussian sources under Rayleigh fading

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2017-12-01

    Consider communicating a correlated Gaussian source over a Rayleigh fading channel with no knowledge of the channel signal-to-noise ratio (CSNR) at the transmitter. In this case, a digital system cannot be optimal for a range of CSNRs. Analog transmission, however, is optimal at all CSNRs if the source and channel are memoryless and bandwidth matched. This paper presents new hybrid digital-analog (HDA) systems for sources with memory and channels with bandwidth expansion, which outperform both digital-only and analog-only systems over a wide range of CSNRs. The digital part is either a predictive quantizer or a transform code, used to achieve a coding gain. The analog part uses linear encoding to transmit the quantization error, which improves the performance under CSNR variations. The hybrid encoder is optimized to achieve the minimum AMMSE (average minimum mean square error) over the CSNR distribution. To this end, analytical expressions are derived for the AMMSE of asymptotically optimal systems. It is shown that the outage CSNR of the channel code and the analog-digital power allocation must be jointly optimized to achieve the minimum AMMSE. In the case of HDA predictive quantization, a simple algorithm is presented to solve the optimization problem. Experimental results are presented for both Gauss-Markov sources and speech signals.
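
    The joint optimization of outage CSNR and digital-analog power split can be illustrated with a toy grid search (an illustrative distortion model, not the paper's analytical AMMSE expressions): the digital layer is lost below its outage threshold, while the analog layer refines the quantization error at higher CSNRs:

      import numpy as np

      def ammse(rho_d, snr_out, csnr, rate=2.0):
          # Toy model: the digital layer fails when its power share cannot
          # meet the outage CSNR (distortion 1); otherwise the quantizer
          # leaves 2^-2R error, refined by the analog layer's power share.
          return np.where(
              rho_d * csnr < snr_out,
              1.0,
              2.0 ** (-2 * rate) / (1.0 + (1.0 - rho_d) * csnr),
          ).mean()

      rng = np.random.default_rng(1)
      csnr = rng.exponential(scale=10.0, size=100_000)   # Rayleigh fading
      grid = [(r, s) for r in np.linspace(0.1, 0.9, 9)
                     for s in np.linspace(1.0, 20.0, 20)]
      best = min(grid, key=lambda p: ammse(p[0], p[1], csnr))
      print("digital power fraction, outage CSNR:", best)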

  18. Relay selection in energy harvesting cooperative networks with rateless codes

    NASA Astrophysics Data System (ADS)

    Zhu, Kaiyan; Wang, Fei

    2018-04-01

    This paper investigates relay selection in energy harvesting cooperative networks, where the relays harvest energy from the radio frequency (RF) signals transmitted by a source, and the optimal relay is selected and uses the harvested energy to assist the information transmission from the source to its destination. Both the source and the selected relay transmit information using rateless codes, which allow the destination to recover the original information once the collected code bits marginally surpass the entropy of the original information. In order to improve transmission performance and efficiently utilize the harvested power, the optimal relay is selected, and the optimization problem is formulated to maximize the achievable information rate of the system. Simulation results demonstrate that our proposed relay selection scheme outperforms other strategies.
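
    A simplified version of the selection rule (a decode-and-forward sketch with a linear energy-harvesting model; the paper's rateless-code formulation is not reproduced here) picks the relay whose two-hop bottleneck rate is largest:

      import math

      def two_hop_rate(p_src, h_sr, h_rd, eta=0.6, noise=1e-3):
          # The relay harvests eta times the received power, then
          # retransmits; the end-to-end rate is the two-hop bottleneck.
          p_relay = eta * p_src * h_sr
          r1 = 0.5 * math.log2(1 + p_src * h_sr / noise)
          r2 = 0.5 * math.log2(1 + p_relay * h_rd / noise)
          return min(r1, r2)

      def select_relay(p_src, channels):
          # channels: (h_sr, h_rd) gain pairs, one per candidate relay.
          return max(range(len(channels)),
                     key=lambda i: two_hop_rate(p_src, *channels[i]))

      print(select_relay(1.0, [(0.8, 0.1), (0.4, 0.5), (0.2, 0.9)]))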

  19. Test Generator for MATLAB Simulations

    NASA Technical Reports Server (NTRS)

    Henry, Joel

    2011-01-01

    MATLAB Automated Test Tool, version 3.0 (MATT 3.0) is a software package that provides automated tools that reduce the time needed for extensive testing of simulation models that have been constructed in the MATLAB programming language by use of the Simulink and Real-Time Workshop programs. MATT 3.0 runs on top of the MATLAB engine application-program interface to communicate with the Simulink engine. MATT 3.0 automatically generates source code from the models, generates custom input data for testing both the models and the source code, and generates graphs and other presentations that facilitate comparison of the outputs of the models and the source code for the same input data. Context-sensitive and fully searchable help is provided in HyperText Markup Language (HTML) format.

  20. The FORTRAN static source code analyzer program (SAP) user's guide, revision 1

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Eslinger, S.

    1982-01-01

    The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.
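
    The kind of statement-level scanning SAP performs can be sketched in a few lines (a hypothetical, much-reduced classifier for fixed-form FORTRAN, not the SEL tool itself):

      from collections import Counter

      KEYWORDS = ("IF", "DO", "GOTO", "CALL", "RETURN", "ASSIGN")

      def scan(source_lines):
          # Tally a few statement classes in fixed-form FORTRAN source.
          stats = Counter()
          for line in source_lines:
              if not line.strip() or line[:1].upper() in ("C", "*"):
                  stats["comment_or_blank"] += 1
                  continue
              stmt = line[6:72].strip().upper()   # fixed-form statement field
              kind = next((k for k in KEYWORDS if stmt.startswith(k)), "other")
              stats[kind] += 1
          return stats

      print(scan(["C     SAMPLE", "      DO 10 I = 1, N", "   10 CONTINUE"]))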

  1. The Need for Vendor Source Code at NAS. Revised

    NASA Technical Reports Server (NTRS)

    Carter, Russell; Acheson, Steve; Blaylock, Bruce; Brock, David; Cardo, Nick; Ciotti, Bob; Poston, Alan; Wong, Parkson; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    The Numerical Aerodynamic Simulation (NAS) Facility has a long-standing practice of maintaining buildable source code for installed hardware. There are two reasons for this: NAS's designated pathfinding role, and the need to maintain a smoothly running operational capacity given the widely diversified nature of the vendor installations. NAS needs to maintain support capabilities when vendors are not able to; to diagnose and remedy hardware or software problems where applicable; and to support ongoing system software development activities whether or not the relevant vendors feel support is justified. This note provides an informal history of these activities at NAS, and brings together the general principles that drive the requirement that systems integrated into the NAS environment run binaries built from source code, onsite.

  2. Power Balance and Impurity Studies in TCS

    NASA Astrophysics Data System (ADS)

    Grossnickle, J. A.; Pietrzyk, Z. A.; Vlases, G. C.

    2003-10-01

    A "zero-dimension" power balance model was developed based on measurements of absorbed power, radiated power, absolute D_α, temperature, and density for the TCS device. Radiation was determined to be the dominant source of power loss for medium to high density plasmas. The total radiated power was strongly correlated with the Oxygen line radiation. This suggests Oxygen is the dominant radiating species, which was confirmed by doping studies. These also extrapolate to a Carbon content below 1.5%. Determining the source of the impurities is an important question that must be answered for the TCS upgrade. Preliminary indications are that the primary sources of Oxygen are the stainless steel end cones. A Ti gettering system is being installed to reduce this Oxygen source. A field line code has been developed for use in tracking where open field lines terminate on the walls. Output from this code is also used to generate grids for an impurity tracking code.

  3. Status report on the development of a tubular electron beam ion source

    NASA Astrophysics Data System (ADS)

    Donets, E. D.; Donets, E. E.; Becker, R.; Liljeby, L.; Rensfelt, K.-G.; Beebe, E. N.; Pikin, A. I.

    2004-05-01

    Theoretical estimations and numerical simulations of tubular electron beams in both the beam and reflex modes of source operation, as well as off-axis ion extraction from a tubular electron beam ion source (TEBIS), are presented. Numerical simulations have been carried out using the IGUN and OPERA-3D codes. Simulations with the IGUN code show that the effective electron current can reach more than 100 A with a beam current density of about 300-400 A/cm2 and electron energies in the region of several keV, with a corresponding increase of the ion output. Off-axis ion extraction from the TEBIS, being a non-axially-symmetric problem, was simulated with the OPERA-3D (SCALA) code. The conceptual design and main parameters of new tubular sources which are under consideration at JINR, MSL, and BNL are based on these simulations.

  4. Growth and characterization of Na2Mo2O7 crystal scintillators for rare event searches

    NASA Astrophysics Data System (ADS)

    Pandey, Indra Raj; Kim, H. J.; Kim, Y. D.

    2017-12-01

    Disodium dimolybdate (Na2Mo2O7) crystals were grown using the Czochralski technique. The thermal characteristics of the compound were analyzed using thermogravimetric analysis (TGA) and differential scanning calorimetry (DSC) measurements. The crystal structure of the grown sample was confirmed using X-ray diffraction (XRD). Luminescence properties were measured at room and low temperatures using a light emitting diode (LED) source. Very weak luminescence was observed at room temperature; however, the luminescence intensity was enhanced at low temperatures. The crystal's transmittance spectrum was measured to estimate its optical quality and energy band gap. The grown crystal exhibited a luminescence light yield of 55% compared with CaMoO4 crystals at 10 K when excited by a 280-nm-wavelength LED source, but does not have the drawbacks of radioactive Ca isotopes. These results suggest that at cryogenic temperatures, Na2Mo2O7 crystal scintillators are promising for the detection of dark matter and neutrinoless double beta decay of 100Mo.

  5. Temperature-responsive grafted polymer brushes obtained from renewable sources with potential application as substrates for tissue engineering

    NASA Astrophysics Data System (ADS)

    Raczkowska, Joanna; Stetsyshyn, Yurij; Awsiuk, Kamil; Lekka, Małgorzata; Marzec, Monika; Harhay, Khrystyna; Ohar, Halyna; Ostapiv, Dmytro; Sharan, Mykola; Yaremchuk, Iryna; Bodnar, Yulia; Budkowski, Andrzej

    2017-06-01

    Novel temperature-responsive poly(cholesteryl methacrylate) (PChMa) coatings derived from renewable sources were synthesized and characterized. Temperature-induced changes in wettability were accompanied by surface roughness modifications, traced with AFM. Topographies recorded for temperatures increasing from 5 to 25 °C showed a slight but noticeable increase of the calculated root mean square (RMS) roughness by a factor of 1.5, suggesting a horizontal rearrangement in the structure of the PChMa coatings. Another structural reordering was observed in the 55-85 °C temperature range. The recorded topography changed noticeably from smooth at 55 °C to very structured and rough at 60 °C and returned eventually to relatively smooth at 85 °C. In addition, temperature transitions of PChMa molecules were revealed by DSC measurements. The biocompatibility of the PChMa-grafted coatings was shown for cultures of granulosa cells and a non-malignant bladder cancer cell (HCV29 line) culture.

  6. Optimization of DSC MRI Echo Times for CBV Measurements Using Error Analysis in a Pilot Study of High-Grade Gliomas.

    PubMed

    Bell, L C; Does, M D; Stokes, A M; Baxter, L C; Schmainda, K M; Dueck, A C; Quarles, C C

    2017-09-01

    The optimal TE must be calculated to minimize the variance in CBV measurements made with DSC MR imaging. Simulations can be used to determine the influence of the TE on CBV, but they may not adequately recapitulate the in vivo heterogeneity of precontrast T2*, contrast agent kinetics, and the biophysical basis of contrast agent-induced T2* changes. The purpose of this study was to combine quantitative multiecho DSC MRI T2* time curves with error analysis in order to compute the optimal TE for a traditional single-echo acquisition. Eleven subjects with high-grade gliomas were scanned at 3T with a dual-echo DSC MR imaging sequence to quantify contrast agent-induced T2* changes in this retrospective study. Optimized TEs were calculated with propagation-of-error analysis for high-grade glial tumors, normal-appearing white matter, and arterial input function estimation. The optimal TE is a weighted average of the T2* values that occur as a contrast agent bolus traverses a voxel. The mean optimal TEs were 30.0 ± 7.4 ms for high-grade glial tumors, 36.3 ± 4.6 ms for normal-appearing white matter, and 11.8 ± 1.4 ms for arterial input function estimation (repeated-measures ANOVA, P < .001). Greater heterogeneity was observed in the optimal TE values for high-grade gliomas, and the differences among the mean values of all 3 ROIs were statistically significant. The optimal TE for arterial input function estimation is much shorter; this finding implies that quantitative DSC MR imaging acquisitions would benefit from multiecho acquisitions. In the case of a single-echo acquisition, the optimal TE prescribed should be 30-35 ms (without a preload) and 20-30 ms (with a standard full-dose preload). © 2017 by American Journal of Neuroradiology.
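
    The statement that the optimal TE is a weighted average of the T2* values over the bolus passage suggests the following sketch (the weighting here, by contrast-induced change in R2*, is an assumption for illustration; the paper derives its weights from propagation-of-error analysis):

      import numpy as np

      def optimal_te(t2star_curve):
          # Weighted average of T2*(t) across the bolus passage; weights
          # here are the contrast-induced change in R2* = 1/T2*
          # (an assumed weighting, not the paper's error analysis).
          t2 = np.asarray(t2star_curve, float)
          w = np.abs(1.0 / t2 - 1.0 / t2[0])
          w[0] = 0.0
          return np.average(t2, weights=w)

      print(optimal_te([50, 42, 30, 22, 28, 40, 47]))   # ms, toy bolus curve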

  7. Effectiveness of dye sensitised solar cell under low light condition using wide band dye

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sahmer, Ahmad Zahrin, E-mail: ahmadzsahmer@gmail.com; Mohamed, Norani Muti, E-mail: noranimuti-mohamed@petronas.com.my; Zaine, Siti Nur Azella, E-mail: ct.azella@gmail.com

    2015-07-22

    Dye sensitised solar cells (DSC) based on nanocrystalline TiO{sub 2} have the potential to be used in indoor consumer power applications. To realize this, the DSC must be optimized to generate power under low lighting conditions and over a wider visible light range. The use of the wide band dye N749, which has a wider spectral sensitivity, increases photon-to-electron conversion across the visible light spectrum of 390 nm to 700 nm. This paper reports a study on the effectiveness of the dye solar cell with N749 dye under low light conditions in generating usable power for indoor consumer applications. The DSC was fabricated on fluorine doped tin oxide (FTO) glass by the screen printing method, and the deposited TiO{sub 2} film was sintered at 500°C. The TiO{sub 2}-coated FTO glass was then soaked in the N749 dye, assembled into a test cell, and tested under the standard test condition at an irradiance of 1000 W/m{sup 2} with an AM1.5 solar soaker. The use of the 43T mesh for dual-pass screen printing of the TiO{sub 2} paste gives a uniform TiO{sub 2} film layer of 16 µm. The low light condition was simulated using 1/3 filtered irradiance with the solar soaker. The fabricated DSC test cell with the N749 dye was found to have a higher efficiency of 6.491% under the low light condition compared to the N719 dye. Under the standard test condition at 1 sun, the N749 test cell efficiency is 4.55%. The increase in efficiency is attributed to the wider spectral capture of photons by the DSC with N749 dye. Furthermore, the use of N749 dye is more effective under the low light condition as the V{sub OC} decrement is less significant than with N719.

  8. Comparison of [{sup 11}C]choline Positron Emission Tomography With T2- and Diffusion-Weighted Magnetic Resonance Imaging for Delineating Malignant Intraprostatic Lesions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Joe H.; University of Melbourne, Victoria; Lim Joon, Daryl

    2015-06-01

    Purpose: The purpose of this study was to compare the accuracy of [{sup 11}C]choline positron emission tomography (CHOL-PET) with that of the combination of T2-weighted and diffusion-weighted (T2W/DW) magnetic resonance imaging (MRI) for delineating malignant intraprostatic lesions (IPLs) for guiding focal therapies, and to investigate factors predicting the accuracy of CHOL-PET. Methods and Materials: This study included 21 patients who underwent CHOL-PET and T2W/DW MRI prior to radical prostatectomy. Two observers manually delineated IPL contours for each scan, and automatic IPL contours were generated on CHOL-PET based on varying proportions of the maximum standardized uptake value (SUV). IPLs identified on prostatectomy specimens defined the reference standard contours. The imaging-based contours were compared with the reference standard contours using the Dice similarity coefficient (DSC) and sensitivity and specificity values. Factors that could potentially predict the DSC of the best contouring method were analyzed using linear models. Results: The best automatic contouring method, 60% of the maximum SUV (SUV{sub 60}), had similar correlations (DSC: 0.59) with the manual PET contours (DSC: 0.52, P=.127) and significantly better correlations than the manual MRI contours (DSC: 0.37, P<.001). The sensitivity and specificity values were 72% and 71% for SUV{sub 60}; 53% and 86% for PET manual contouring; and 28% and 92% for MRI manual contouring. The tumor volume and transition zone pattern could independently predict the accuracy of CHOL-PET. Conclusions: CHOL-PET is superior to the combination of T2W/DW MRI for delineating IPLs. The accuracy of CHOL-PET is insufficient for gland-sparing focal therapies but may be accurate enough for focal boost therapies. The transition zone pattern is a new classification that may predict how well CHOL-PET delineates IPLs.
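
    The DSC used here to score contour agreement is simple to compute from binary masks; a minimal sketch with hypothetical masks:

      import numpy as np

      def dice(a, b):
          # DSC = 2|A intersect B| / (|A| + |B|) for two binary masks.
          a, b = np.asarray(a, bool), np.asarray(b, bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      auto = np.array([[0, 1, 1], [0, 1, 0]])    # e.g. an SUV60 contour mask
      truth = np.array([[0, 1, 0], [1, 1, 0]])   # histology reference mask
      print(round(dice(auto, truth), 2))         # 0.67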

  9. Brain Gliomas: Multicenter Standardized Assessment of Dynamic Contrast-enhanced and Dynamic Susceptibility Contrast MR Images.

    PubMed

    Anzalone, Nicoletta; Castellano, Antonella; Cadioli, Marcello; Conte, Gian Marco; Cuccarini, Valeria; Bizzi, Alberto; Grimaldi, Marco; Costa, Antonella; Grillea, Giovanni; Vitali, Paolo; Aquino, Domenico; Terreni, Maria Rosa; Torri, Valter; Erickson, Bradley J; Caulo, Massimo

    2018-06-01

    Purpose To evaluate the feasibility of a standardized protocol for acquisition and analysis of dynamic contrast material-enhanced (DCE) and dynamic susceptibility contrast (DSC) magnetic resonance (MR) imaging in a multicenter clinical setting and to verify its accuracy in predicting glioma grade according to the new World Health Organization 2016 classification. Materials and Methods The local research ethics committees of all centers approved the study, and informed consent was obtained from patients. One hundred patients with glioma were prospectively examined at 3.0 T in seven centers that performed the same preoperative MR imaging protocol, including DCE and DSC sequences. Two independent readers identified the perfusion hotspots on maps of the volume transfer constant (Ktrans), plasma (vp) and extravascular-extracellular space (ve) volumes, initial area under the concentration curve, and relative cerebral blood volume (rCBV). Differences in parameters between grades and molecular subtypes were assessed by using Kruskal-Wallis and Mann-Whitney U tests. Diagnostic accuracy was evaluated by using receiver operating characteristic curve analysis. Results The whole protocol was tolerated in all patients. Perfusion maps were successfully obtained in 94 patients. An excellent interreader reproducibility of DSC- and DCE-derived measures was found. Among DCE-derived parameters, vp and ve had the highest accuracy (area under the receiver operating characteristic curve [Az] = 0.847 and 0.853) for glioma grading. DSC-derived rCBV had the highest accuracy (Az = 0.894), but the difference was not statistically significant (P > .05). Among lower-grade gliomas, a moderate increase in both vp and rCBV was evident in isocitrate dehydrogenase wild-type tumors, although this was not significant (P > .05). Conclusion A standardized multicenter acquisition and analysis protocol of DCE and DSC MR imaging is feasible and highly reproducible. Both techniques showed a comparable, high diagnostic accuracy for grading gliomas. © RSNA, 2018 Online supplemental material is available for this article.
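
    The diagnostic accuracies quoted above are areas under the ROC curve, which can be computed directly from the Mann-Whitney interpretation; a short sketch with hypothetical rCBV values:

      import numpy as np

      def auc(scores_pos, scores_neg):
          # ROC area via its Mann-Whitney reading: the probability that a
          # randomly chosen positive case scores above a negative one.
          pos = np.asarray(scores_pos, float)
          neg = np.asarray(scores_neg, float)
          wins = (pos[:, None] > neg[None, :]).sum()
          ties = (pos[:, None] == neg[None, :]).sum()
          return (wins + 0.5 * ties) / (pos.size * neg.size)

      # Hypothetical rCBV values for high-grade vs lower-grade gliomas:
      print(auc([4.1, 3.6, 5.2, 2.9], [1.2, 2.1, 3.0, 1.7]))   # 0.9375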

  10. Comparison of Clinical Results and Injury Risk of Posterior Tibial Cortex Between Attune and Press Fit Condylar Sigma Knee Systems.

    PubMed

    Song, Sang Jun; Park, Cheol Hee; Liang, Hu; Kang, Se Gu; Park, Jong Jun; Bae, Dae Kyung

    2018-02-01

    We compared clinical and radiographic results after total knee arthroplasty (TKA) using Attune and Press Fit Condylar Sigma, and investigated whether use of the current prosthesis increased the injury risk to the posterior tibial cortex in Asian patients. We also assessed whether the preoperative posterior tibial slope angle (PSA) is associated with this injury when using the current prosthesis. The 300 TKAs with Attune (group A) were compared to the 300 TKAs with Press Fit Condylar Sigma (group B). Demographics were not different, except for follow-up periods (24.8 vs 33.3 months, P < .001). The Western Ontario and McMaster Universities Index and range of motion were compared. The minimum distance between the tibial component stem and the posterior tibial cortex (mDSC) was also compared, and the correlation between preoperative PSA and mDSC was analyzed in group A. The postoperative Western Ontario and McMaster Universities Index and range of motion of group A were better than those of group B (17.7 vs 18.8, P = .004; 131.4° vs 129.0°, P = .008). The mDSC was shorter in group A (6.3 vs 7.0 mm, P < .001), which resulted in a higher proportion of knees at high risk of posterior tibial cortical injury, defined as an mDSC of <4 mm (20.0% vs 10.7%, P = .002). A negative correlation was found between the preoperative PSA and mDSC in group A (r = -0.205, P < .001). TKA using the current prosthesis provided more satisfactory results than TKA using the previous prosthesis. However, the injury risk to the posterior tibial cortex increased in knees with a large PSA when using the current prosthesis in Asian patients. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Temperature-modulated DSC provides new insight about nickel-titanium wire transformations.

    PubMed

    Brantley, William A; Iijima, Masahiro; Grentzer, Thomas H

    2003-10-01

    Differential scanning calorimetry (DSC) is a well-known method for investigating phase transformations in nickel-titanium orthodontic wires; the microstructural phases and phase transformations in these wires have central importance for their clinical performance. The purpose of this study was to use the more recently developed technique of temperature-modulated DSC (TMDSC) to gain insight into transformations in 3 nickel-titanium orthodontic wires: Neo Sentalloy (GAC International, Islandia, NY), 35 degrees C Copper Ni-Ti (Ormco, Glendora, Calif) and Nitinol SE (3M Unitek, Monrovia, Calif). In the oral environment, the first 2 superelastic wires have shape memory, and the third wire has superelastic behavior but not shape memory. All wires had cross-section dimensions of 0.016 x 0.022 in. Archwires in the as-received condition and after bending 135 degrees were cut into 5 or 6 segments for test specimens. TMDSC analyses (Model 2910 DSC, TA Instruments, Wilmington, Del) were conducted between -125 degrees C and 100 degrees C, using a linear heating and cooling rate of 2 degrees C per min, an oscillation amplitude of 0.318 degrees C with a period of 60 seconds, and helium as the purge gas. For all 3 wire alloys, strong low-temperature martensitic transformations, resolved on the nonreversing heat-flow curves, were not present on the reversing heat-flow curves, and bending appeared to increase the enthalpy change for these peaks in some cases. For Neo Sentalloy, TMDSC showed that transformation between martensitic and austenitic nickel-titanium, suggested as occurring directly in the forward and reverse directions by conventional DSC, was instead a 2-step process involving the R-phase. Two-step transformations in the forward and reverse directions were also found for 35 degrees C Copper Ni-Ti and Nitinol SE. The TMDSC results show that structural transformations in these wires are complex. Some possible clinical implications of these observations are discussed.

  12. Pseudo-extravasation rate constant of dynamic susceptibility contrast-MRI determined from pharmacokinetic first principles.

    PubMed

    Li, Xin; Varallyay, Csanad G; Gahramanov, Seymur; Fu, Rongwei; Rooney, William D; Neuwelt, Edward A

    2017-11-01

    Dynamic susceptibility contrast-magnetic resonance imaging (DSC-MRI) is widely used to obtain informative perfusion imaging biomarkers, such as the relative cerebral blood volume (rCBV). The related post-processing software packages for DSC-MRI are available from major MRI instrument manufacturers and third-party vendors. One unique aspect of DSC-MRI with low-molecular-weight gadolinium (Gd)-based contrast reagent (CR) is that CR molecules leak into the interstitium space and therefore confound the DSC signal detected. Several approaches to correct this leakage effect have been proposed throughout the years. Amongst the most popular is the Boxerman-Schmainda-Weisskoff (BSW) K2 leakage correction approach, in which the K2 pseudo-first-order rate constant quantifies the leakage. In this work, we propose a new method for the BSW leakage correction approach. Based on the pharmacokinetic interpretation of the data, the commonly adopted R2* expression accounting for contributions from both intravascular and extravasating CR components is transformed using a method mathematically similar to Gjedde-Patlak linearization. Then, the leakage rate constant (KL) can be determined as the slope of the linear portion of a plot of the transformed data. Using the DSC data of high-molecular-weight (~750 kDa), iron-based, intravascular Ferumoxytol (FeO), the pharmacokinetic interpretation of the new paradigm is empirically validated. The primary objective of this work is to empirically demonstrate that a linear portion often exists in the graph of the transformed data. This linear portion provides a clear definition of the Gd CR pseudo-leakage rate constant, which equals the slope derived from the linear segment. A secondary objective is to demonstrate that transformed points from the initial transient period during the CR wash-in often deviate from the linear trend of the linearized graph. The inclusion of these points will have a negative impact on the accuracy of the leakage rate constant, and even make it time dependent. Copyright © 2017 John Wiley & Sons, Ltd.
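
    The practical recipe in this abstract, fitting only the late linear portion of the transformed data and discarding the wash-in transient, can be sketched as follows (hypothetical transformed coordinates; the actual transformation from the DSC signal is not reproduced here):

      import numpy as np

      def leakage_rate(x, y, tail=0.5):
          # Fit only the late, linear portion of the transformed data;
          # `tail` is the fraction of points assumed past the wash-in.
          start = int(len(x) * (1.0 - tail))
          slope, _ = np.polyfit(x[start:], y[start:], 1)
          return slope                       # pseudo-leakage rate constant

      x = np.linspace(0, 10, 50)             # transformed abscissa (toy)
      y = 0.12 * x + 0.3 + np.where(x < 3, 0.4 * np.exp(-x), 0.0)
      print(round(leakage_rate(x, y), 3))    # ~0.12 despite early deviation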

  13. Plasma Separation Process: Betacell (BCELL) code: User's manual. [Bipolar barrier junction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taherzadeh, M.

    1987-11-13

    The emergence of clearly defined applications for small or large amounts of long-life and reliable power has given the design and production of betavoltaic systems a new life. Moreover, because of the availability of the plasma separation program (PSP) at TRW, it is now possible to separate the most desirable radioisotopes for betacell power generating devices. A computer code, named BCELL, has been developed to model the betavoltaic concept by utilizing the available up-to-date source/cell parameters. In this program, attempts have been made to determine the betacell energy device maximum efficiency, the degradation due to the emitting source radiation, and source/cell lifetime power reduction processes. Additionally, comparison is made between the Schottky and PN junction devices for betacell battery design purposes. Certain computer code runs have been made to determine the JV distribution function and the upper limit of the betacell generated power for specified energy sources. A Ni beta-emitting radioisotope was used for the energy source and certain semiconductors were used for the converter subsystem of the betacell system. Some results for a Promethium source are also given for comparison.

  14. Coding Instead of Splitting - Algebraic Combinations in Time and Space

    DTIC Science & Technology

    2016-06-09

    For certain classes of two-unicast-Z networks, we show that the rate-tuple (N, 1) is achievable as long as the individual source-destination cuts for the two source-destination pairs are respectively at least as large as N and 1, and the generalized network sharing cut - a bound previously defined by Kamath et al. - is at least as large as N + 1. We show this through a novel achievable scheme based on random linear coding at ...

  15. A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics

    NASA Technical Reports Server (NTRS)

    Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela

    2015-01-01

    Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
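
    The separation between a preprocessing pass and runtime tracking can be illustrated with a toy branch-coverage monitor (hypothetical decision map and event format; MC/DC bookkeeping would extend the same idea to individual conditions):

      from collections import defaultdict

      # Hypothetical output of a preprocessing pass: the set of decision
      # ids whose true and false outcomes must both be exercised.
      DECISIONS = {"d1", "d2"}

      def branch_coverage(events):
          # events: (decision_id, outcome) pairs observed by an execution
          # monitor rather than by instrumenting the code under test.
          seen = defaultdict(set)
          for dec, outcome in events:
              seen[dec].add(outcome)
          return sum(len(v) for v in seen.values()) / (2 * len(DECISIONS))

      print(branch_coverage([("d1", True), ("d1", False), ("d2", True)]))  # 0.75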

  16. 75 FR 14331 - Disaster Assistance Loan Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-25

    ... meet current building code requirements. If your business is a major source of employment, SBA may ... granting tax exemption under sections 510(c), (d), or (e) of the Internal Revenue Code of 1954 ...

  17. Spectral-Element Seismic Wave Propagation Codes for both Forward Modeling in Complex Media and Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Peter, D. B.; Tromp, J.; Komatitsch, D.; Lefebvre, M. P.

    2015-12-01

    We present both SPECFEM3D_Cartesian and SPECFEM3D_GLOBE open-source codes, representing high-performance numerical wave solvers simulating seismic wave propagation for local-, regional-, and global-scale applications. These codes are suitable for both forward propagation in complex media and tomographic imaging. Both solvers compute highly accurate seismic wave fields using the continuous Galerkin spectral-element method on unstructured meshes. Lateral variations in compressional- and shear-wave speeds, density, as well as 3D attenuation Q models, topography and fluid-solid coupling are all readily included in both codes. For global simulations, effects due to rotation, ellipticity, the oceans, 3D crustal models, and self-gravitation are additionally included. Both packages provide forward and adjoint functionality suitable for adjoint tomography on high-performance computing architectures. We highlight the most recent release of the global version which includes improved performance, simultaneous MPI runs, OpenCL and CUDA support via an automatic source-to-source transformation library (BOAST), parallel I/O readers and writers for databases using ADIOS and seismograms using the recently developed Adaptable Seismic Data Format (ASDF) with built-in provenance. This makes our spectral-element solvers current state-of-the-art, open-source community codes for high-performance seismic wave propagation on arbitrarily complex 3D models. Together with these solvers, we provide full-waveform inversion tools to image the Earth's interior at unprecedented resolution.

  18. Physical Properties of Hydrogenated Dimers of Norbornadiene and Exo-Tetrahydrodicyclopentadiene and their Mixtures

    DTIC Science & Technology

    1977-12-01

    The structures of these compounds are shown in Fig. 1, along with the abbreviations (HXX, XTHDCPD, etc.) by which they will be referred to in the ... crystal, because of the difficulty in obtaining good DSC data at the very low temperatures below the XTHDCPD melting point. The four NBD hydrogenated ... estimated from the DSC fusion curve, was used for the ΔCpf of XTHDCPD. The results of these calculations are shown in Fig. 3 as plots of Ti (in K and °F) versus ...

  19. High Performance Composites Based on Polyurethanes Reinforced with Polydiacetylenes

    DTIC Science & Technology

    1989-04-04

    The Niax triol LHT240 (ex. Union Carbide) is a polyoxypropylene adduct of 1,2,6-hexanetriol and after drying by rotary film evaporation had an ... homopolyurethane hard segment material, HDD/MDI, which has been quench-cooled from 280 to -100°C; after DSC measurement on the same material giving the ... feature in the DSC curves (Fig. 15(c)) for HDD/MDI is the development of a glass transition at 85°C in curve B' following quench-cooling. The ladder-like, hard ...

  20. Highly selective BSA imprinted polyacrylamide hydrogels facilitated by a metal-coding MIP approach.

    PubMed

    El-Sharif, H F; Yapati, H; Kalluru, S; Reddy, S M

    2015-12-01

    We report the fabrication of metal-coded molecularly imprinted polymers (MIPs) using hydrogel-based protein imprinting techniques. A Co(II) complex was prepared using (E)-2-((2 hydrazide-(4-vinylbenzyl)hydrazono)methyl)phenol; along with iron(III) chloroprotoporphyrin (Hemin), vinylferrocene (VFc), zinc(II) protoporphyrin (ZnPP) and protoporphyrin (PP), these complexes were introduced into the MIPs as co-monomers for metal-coding of non-metalloprotein imprints. Results indicate a 66% enhancement in bovine serum albumin (BSA) protein binding capacities (Q, mg/g) via the metal-ion/ligand exchange properties within the metal-coded MIPs. Specifically, Co(II)-complex-based MIPs exhibited 92 ± 1% specific binding, with Q values of 5.7 ± 0.45 mg BSA/g polymer and imprinting factors (IF) of 14.8 ± 1.9 (MIP/non-imprinted (NIP) control). The selectivity of our Co(II)-coded BSA MIPs was also tested using bovine haemoglobin (BHb), lysozyme (Lyz), and trypsin (Tryp). By evaluating imprinting factors, each of the latter proteins was found to have a lower affinity in comparison to the cognate BSA template. The hydrogels were further characterised by thermal analysis and differential scanning calorimetry (DSC) to assess the optimum polymer composition. The development of hydrogel-based molecularly imprinted polymer (HydroMIP) technology for the memory imprinting of proteins and for protein biosensor development presents many possibilities, including uses in bio-sample clean-up or selective extraction, replacement of biological antibodies in immunoassays, and biosensors for medicine and the environment. Biosensors for proteins and viruses are currently expensive to develop because they require the use of expensive antibodies. Because of their biomimicry capabilities (and their potential to act as synthetic antibodies), HydroMIPs potentially offer a route to the development of new low-cost biosensors. Herein, a metal ion-mediated imprinting approach was employed to metal-code our hydrogel-based MIPs for the selective recognition of bovine serum albumin (BSA). Specifically, the Co(II)-complex-based MIPs exhibited a 66% enhancement (in comparison to our normal MIPs), showing 92 ± 1% specific binding with Q values of 5.7 ± 0.45 mg BSA/g polymer and imprinting factors (IF) of 14.8 ± 1.9 (MIP/non-imprinted (NIP) control). The proposed metal-coded MIPs for protein recognition are intended to lead to unprecedented improvement in MIP selectivity and to future biosensor development relying on electrochemical redox processes. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
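
    The imprinting factor used throughout is a simple ratio; a one-line sketch (the NIP capacity below is back-calculated from the reported Q and IF, for illustration only):

      def imprinting_factor(q_mip, q_nip):
          # IF = Q(MIP) / Q(NIP): rebinding capacity of the imprinted gel
          # relative to the non-imprinted control.
          return q_mip / q_nip

      # Q(NIP) back-calculated from the reported values, for illustration:
      print(round(imprinting_factor(5.7, 0.385), 1))   # ~14.8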

  1. AN OPEN-SOURCE NEUTRINO RADIATION HYDRODYNAMICS CODE FOR CORE-COLLAPSE SUPERNOVAE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Connor, Evan, E-mail: evanoconnor@ncsu.edu; CITA, Canadian Institute for Theoretical Astrophysics, Toronto, M5S 3H8

    2015-08-15

    We present an open-source update to the spherically symmetric, general-relativistic hydrodynamics, core-collapse supernova (CCSN) code GR1D. The source code is available at http://www.GR1Dcode.org. We extend its capabilities to include a general-relativistic treatment of neutrino transport based on the moment formalisms of Shibata et al. and Cardall et al. We pay special attention to implementing and testing numerical methods and approximations that lessen the computational demand of the transport scheme by removing the need to invert large matrices. This is especially important for the implementation and development of moment-like transport methods in two and three dimensions. A critical component of neutrino transport calculations is the neutrino-matter interaction coefficients that describe the production, absorption, scattering, and annihilation of neutrinos. In this article we also describe our open-source neutrino interaction library NuLib (available at http://www.nulib.org). We believe that an open-source approach to describing these interactions is one of the major steps needed to progress toward robust models of CCSNe and robust predictions of the neutrino signal. We show, via comparisons to full Boltzmann neutrino-transport simulations of CCSNe, that our neutrino transport code performs remarkably well. Furthermore, we show that the methods and approximations we employ to increase efficiency do not decrease the fidelity of our results. We also test the ability of our general-relativistic transport code to model failed CCSNe by evolving a 40-solar-mass progenitor to the onset of collapse to a black hole.

  2. Towards Holography via Quantum Source-Channel Codes.

    PubMed

    Pastawski, Fernando; Eisert, Jens; Wilming, Henrik

    2017-07-14

    While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.

  3. Towards Holography via Quantum Source-Channel Codes

    NASA Astrophysics Data System (ADS)

    Pastawski, Fernando; Eisert, Jens; Wilming, Henrik

    2017-07-01

    While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.

  4. A Conference on Spacecraft Charging Technology - 1978, held at U.S. Air Force Academy, Colorado Springs, Colorado, October 31 - November 2, 1978.

    DTIC Science & Technology

    1978-01-01

    [Fragmentary OCR excerpt] ... applications of the code. NASCAP CODE DESCRIPTION: The NASCAP code is a finite-element spacecraft-charging simulation written in FORTRAN. ... The transport code POEM (ref. 1) is applicable to arbitrary dielectrics, source spectra, and current time histories. The code calculations are illustrated by example.

  5. Technology Infusion of CodeSonar into the Space Network Ground Segment

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2009-01-01

    This slide presentation reviews the applicability of CodeSonar to Space Network software. CodeSonar is a commercial off-the-shelf system that analyzes programs written in C, C++, or Ada for defects in the code. Software engineers use CodeSonar results as an input to the existing source-code inspection process. The study focuses on large-scale software developed using formal processes. The systems studied are mission-critical in nature, but some use commodity computer systems.

  6. Experimental benchmark of the NINJA code for application to the Linac4 H- ion source plasma

    NASA Astrophysics Data System (ADS)

    Briefi, S.; Mattei, S.; Rauner, D.; Lettry, J.; Tran, M. Q.; Fantz, U.

    2017-10-01

    For a dedicated performance optimization of negative hydrogen ion sources applied at particle accelerators, a detailed assessment of the plasma processes is required. Due to the compact design of these sources, diagnostic access is typically limited to optical emission spectroscopy yielding only line-of-sight integrated results. In order to allow for a spatially resolved investigation, the electromagnetic particle-in-cell Monte Carlo collision code NINJA has been developed for the Linac4 ion source at CERN. This code considers the RF field generated by the ICP coil as well as the external static magnetic fields and calculates self-consistently the resulting discharge properties. NINJA is benchmarked at the diagnostically well accessible lab experiment CHARLIE (Concept studies for Helicon Assisted RF Low pressure Ion sourcEs) at varying RF power and gas pressure. A good general agreement is observed between experiment and simulation although the simulated electron density trends for varying pressure and power as well as the absolute electron temperature values deviate slightly from the measured ones. This can be explained by the assumption of strong inductive coupling in NINJA, whereas the CHARLIE discharges show the characteristics of loosely coupled plasmas. For the Linac4 plasma, this assumption is valid. Accordingly, both the absolute values of the accessible plasma parameters and their trends for varying RF power agree well in measurement and simulation. At varying RF power, the H- current extracted from the Linac4 source peaks at 40 kW. For volume operation, this is perfectly reflected by assessing the processes in front of the extraction aperture based on the simulation results where the highest H- density is obtained for the same power level. In surface operation, the production of negative hydrogen ions at the converter surface can only be considered by specialized beam formation codes, which require plasma parameters as input. It has been demonstrated that this input can be provided reliably by the NINJA code.
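
    NINJA's particle-in-cell internals are not described in the abstract; as a generic illustration of the particle side of an electromagnetic PIC-MCC cycle, here is the standard Boris velocity update used by many explicit PIC codes (a sketch, not NINJA's implementation):

        import numpy as np

        def boris_push(v, E, B, q, m, dt):
            """Standard Boris velocity update: half electric kick,
            magnetic rotation, half electric kick. Exactly preserves
            speed in a pure magnetic field."""
            qmdt2 = q * dt / (2.0 * m)
            v_minus = v + qmdt2 * E
            t = qmdt2 * B
            s = 2.0 * t / (1.0 + np.dot(t, t))
            v_prime = v_minus + np.cross(v_minus, t)
            v_plus = v_minus + np.cross(v_prime, s)
            return v_plus + qmdt2 * E

        # electron gyrating in a uniform field B = 0.01 T along z
        v = np.array([1.0e5, 0.0, 0.0])
        E, B = np.zeros(3), np.array([0.0, 0.0, 0.01])
        q, m, dt = -1.602e-19, 9.109e-31, 1.0e-12
        for _ in range(1000):
            v = boris_push(v, E, B, q, m, dt)
        print(np.linalg.norm(v))  # still ~1.0e5 m/s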

  7. Acoustic Scattering by Three-Dimensional Stators and Rotors Using the SOURCE3D Code. Volume 2; Scattering Plots

    NASA Technical Reports Server (NTRS)

    Meyer, Harold D.

    1999-01-01

    This second volume of Acoustic Scattering by Three-Dimensional Stators and Rotors Using the SOURCE3D Code provides the scattering plots referenced by Volume 1. There are 648 plots. Half are for the 8750 rpm "high speed" operating condition and the other half are for the 7031 rpm "mid speed" operating condition.

  8. Multispectral data compression through transform coding and block quantization

    NASA Technical Reports Server (NTRS)

    Ready, P. J.; Wintz, P. A.

    1972-01-01

    Transform coding and block quantization techniques are applied to multispectral aircraft scanner data, and digitized satellite imagery. The multispectral source is defined and an appropriate mathematical model proposed. The Karhunen-Loeve, Fourier, and Hadamard encoders are considered and are compared to the rate distortion function for the equivalent Gaussian source and to the performance of the single sample PCM encoder.
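
    As a toy version of the transform-coding pipeline described above, the sketch below applies a separable orthonormal Hadamard transform to 8x8 blocks and quantizes the coefficients; a plain uniform scalar quantizer stands in for the paper's block quantizer:

        import numpy as np

        def hadamard(n):
            """Sylvester construction of an n x n Hadamard matrix (n = 2^k)."""
            H = np.array([[1.0]])
            while H.shape[0] < n:
                H = np.block([[H, H], [H, -H]])
            return H

        def encode_block(block, step):
            """Separable 2-D Hadamard transform + uniform quantization."""
            H = hadamard(8) / np.sqrt(8)     # orthonormal rows
            return np.round(H @ block @ H.T / step)

        def decode_block(q, step):
            H = hadamard(8) / np.sqrt(8)
            return H.T @ (q * step) @ H      # dequantize, invert transform

        rng = np.random.default_rng(0)
        block = rng.normal(size=(8, 8))
        rec = decode_block(encode_block(block, step=0.5), step=0.5)
        print(float(np.abs(block - rec).max()))  # small, set by the step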

  9. NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Paxson, Daniel E.

    2014-01-01

    The work presented in this paper is intended to promote research leading to a closed-loop control system to actively suppress thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB-based process is similar to a precursor one-dimensional combustor simulation that was formatted as FORTRAN 77 source code. The previous simulation process required modification of the FORTRAN 77 source code, compiling, and linking when creating a new combustor simulation executable file. The MATLAB-based simulation does not require making changes to the source code, recompiling, or linking. Furthermore, the MATLAB-based simulation can be run from script files within the MATLAB environment or with a compiled copy of the executable file running in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview. Details regarding how to set up and initiate a simulation are also presented. Finally, the post-processing section describes the two types of files created while running the simulation, and it also includes simulation results for a default simulation included with the source code.

  10. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.
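
    As a flavor of the first kind of analysis mentioned (finding where variables are used), here is a short sketch using Clang's Python bindings rather than the C++ tooling the record refers to; it assumes libclang and the clang Python package are installed, and the input file name is hypothetical:

        import clang.cindex

        def variable_uses(path):
            """Print every variable reference in a C/C++ file, using the
            AST exposed by libclang's Python bindings."""
            index = clang.cindex.Index.create()
            tu = index.parse(path)
            for node in tu.cursor.walk_preorder():
                if node.kind == clang.cindex.CursorKind.DECL_REF_EXPR:
                    loc = node.location
                    print(f"{node.spelling}: {loc.file}:{loc.line}:{loc.column}")

        variable_uses("example.c")  # hypothetical input file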

  11. CACTI: free, open-source software for the sequential coding of behavioral interactions.

    PubMed

    Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.

  12. TEA: A Code Calculating Thermochemical Equilibrium Abundances

    NASA Astrophysics Data System (ADS)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver

    2016-07-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. We tested the code against the method of Burrows & Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows & Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
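
    TEA's core step, Gibbs free-energy minimization under element-balance constraints, can be sketched compactly. The toy below uses scipy's SLSQP solver instead of TEA's iterative Lagrangian scheme, with made-up dimensionless g0 values (illustrative only, not real thermodynamic data):

        import numpy as np
        from scipy.optimize import minimize

        # Toy H/O system: species H2, O2, H2O (ideal gas, fixed T and P).
        g0 = np.array([0.0, 0.0, -30.0])   # mu0/RT, illustrative numbers
        A = np.array([[2.0, 0.0, 2.0],     # H balance
                      [0.0, 2.0, 1.0]])    # O balance
        b = A @ np.array([1.0, 0.5, 0.0])  # start: 1 mol H2 + 0.5 mol O2

        def gibbs(n):
            """Dimensionless total Gibbs energy of an ideal-gas mixture."""
            n = np.maximum(n, 1e-12)
            return float(np.sum(n * (g0 + np.log(n / n.sum()))))

        res = minimize(gibbs, x0=np.array([0.4, 0.2, 0.4]), method="SLSQP",
                       bounds=[(1e-10, None)] * 3,
                       constraints={"type": "eq", "fun": lambda n: A @ n - b})
        for name, n in zip(["H2", "O2", "H2O"], res.x):
            print(f"{name}: {n:.4f} mol")  # nearly all H2O for this g0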

  13. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver, E-mail: jasmina@physics.ucf.edu

    2016-07-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.

  14. (I Can't Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research.

    PubMed

    van Rijnsoever, Frank J

    2017-01-01

    I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: "random chance," which is based on probability sampling, "minimal information," which yields at least one new code per sampling step, and "maximum information," which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario.
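
    The "random chance" scenario described above is straightforward to re-create in simulation: draw information sources at random and record the sample size at which every code in the population has been observed at least once. A minimal sketch with illustrative parameters (30 codes, each exhibited by a given source with probability 0.2; not the paper's actual simulation code):

        import random

        rng = random.Random(1)

        def sample_until_saturation(n_codes=30, p_observe=0.2):
            """'Random chance' scenario: keep drawing information sources;
            each source independently exhibits each code with probability
            p_observe. Return the sample size at which every code has
            been observed at least once (theoretical saturation)."""
            unseen = set(range(n_codes))
            n_sources = 0
            while unseen:
                n_sources += 1
                unseen = {c for c in unseen if rng.random() > p_observe}
            return n_sources

        sizes = sorted(sample_until_saturation() for _ in range(1000))
        print("median sample size to saturation:", sizes[len(sizes) // 2])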

  15. Monte Carlo dosimetric characterization of the Flexisource Co-60 high-dose-rate brachytherapy source using PENELOPE.

    PubMed

    Almansa, Julio F; Guerrero, Rafael; Torres, Javier; Lallena, Antonio M

    60Co sources have been commercialized as an alternative to 192Ir sources for high-dose-rate (HDR) brachytherapy. One of them is the Flexisource Co-60 HDR source manufactured by Elekta. The only available dosimetric characterization of this source is that of Vijande et al. [J Contemp Brachytherapy 2012; 4:34-44], whose results were not included in the AAPM/ESTRO consensus document. In that work, the dosimetric quantities were calculated as averages of the results obtained with the Geant4 and PENELOPE Monte Carlo (MC) codes, though for other sources, significant differences have been quoted between the values obtained with these two codes. The aim of this work is to perform the dosimetric characterization of the Flexisource Co-60 HDR source using PENELOPE. The MC simulation code PENELOPE (v. 2014) has been used. Following the recommendations of the AAPM/ESTRO report, the radial dose function, the anisotropy function, the air-kerma strength, the dose rate constant, and the absorbed dose rate in water have been calculated. The results we have obtained exceed those of Vijande et al. In particular, the absorbed dose rate constant is ∼0.85% larger. A similar difference is also found in the other dosimetric quantities. The effect of the electrons emitted in the decay of 60Co, usually neglected in this kind of simulation, is significant up to distances of 0.25 cm from the source. The systematic and significant differences we have found between PENELOPE results and the average values found by Vijande et al. point out that the dosimetric characterizations carried out with the various MC codes should be provided independently. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  16. Zipf's Law in Short-Time Timbral Codings of Speech, Music, and Environmental Sound Signals

    PubMed Central

    Haro, Martín; Serrà, Joan; Herrera, Perfecto; Corral, Álvaro

    2012-01-01

    Timbre is a key perceptual feature that allows discrimination between different sounds. Timbral sensations are highly dependent on the temporal evolution of the power spectrum of an audio signal. In order to quantitatively characterize such sensations, the shape of the power spectrum has to be encoded in a way that preserves certain physical and perceptual properties. Therefore, it is common practice to encode short-time power spectra using psychoacoustical frequency scales. In this paper, we study and characterize the statistical properties of such encodings, here called timbral code-words. In particular, we report on rank-frequency distributions of timbral code-words extracted from 740 hours of audio coming from disparate sources such as speech, music, and environmental sounds. Analogously to text corpora, we find a heavy-tailed Zipfian distribution with exponent close to one. Importantly, this distribution is found independently of different encoding decisions and regardless of the audio source. Further analysis of the intrinsic characteristics of the most and least frequent code-words reveals that the most frequent code-words tend to have a more homogeneous structure. We also find that speech and music databases have specific, distinctive code-words while, in the case of the environmental sounds, these database-specific code-words are not present. Finally, we find that a Yule-Simon process with memory provides a reasonable quantitative approximation for our data, suggesting the existence of a common simple generative mechanism for all considered sound sources. PMID:22479497
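
    A quick way to estimate the Zipf exponent mentioned above is a least-squares fit of log frequency against log rank; this crude estimator is fine for illustration (maximum-likelihood estimators are preferred for serious work). A sketch on synthetic code-word counts:

        import numpy as np

        def zipf_exponent(counts):
            """Least-squares slope of log(frequency) vs log(rank); a
            rank-frequency distribution with exponent ~1 is Zipfian."""
            freqs = np.sort(np.asarray(counts, dtype=float))[::-1]
            ranks = np.arange(1, len(freqs) + 1)
            slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
            return -slope

        # synthetic counts drawn from an ideal Zipf law with exponent 1
        counts = (1e6 / np.arange(1, 2001)).astype(int)
        print(zipf_exponent(counts))  # ~1.0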

  17. The HYPE Open Source Community

    NASA Astrophysics Data System (ADS)

    Strömbäck, L.; Pers, C.; Isberg, K.; Nyström, K.; Arheimer, B.

    2013-12-01

    The Hydrological Predictions for the Environment (HYPE) model is a dynamic, semi-distributed, process-based, integrated catchment model. It uses well-known hydrological and nutrient transport concepts and can be applied for both small- and large-scale assessments of water resources and status. In the model, the landscape is divided into classes according to soil type, vegetation and altitude. The soil representation is stratified and can be divided into up to three layers. Water and substances are routed through the same flow paths and storages (snow, soil, groundwater, streams, rivers, lakes) considering turn-over and transformation on the way towards the sea. HYPE has been successfully used in many hydrological applications at SMHI. For Europe, we currently have three different models: the S-HYPE model for Sweden, the BALT-HYPE model for the Baltic Sea, and the E-HYPE model for the whole of Europe. These models simulate hydrological conditions and nutrients for their respective areas and are used for characterization, forecasts, and scenario analyses. Model data can be downloaded from hypeweb.smhi.se. In addition, we provide models for the Arctic region, the Arab (Middle East and Northern Africa) region, India, the Niger River basin, and the La Plata Basin. This demonstrates the applicability of the HYPE model for large-scale modeling in different regions of the world. An important goal of our work is to make our data and tools available as open data and services. To this end, we created the HYPE Open Source Community (OSC), which makes the source code of HYPE available to anyone interested in further development of HYPE. The HYPE OSC (hype.sourceforge.net) is an open source initiative under the Lesser GNU Public License taken by SMHI to strengthen international collaboration in hydrological modeling and hydrological data production. The hypothesis is that more brains and more testing will result in better models and better code. The code is transparent and can be changed and learnt from. New versions of the main code are delivered frequently. HYPE OSC is open to everyone interested in hydrology, hydrological modeling and code development - e.g. scientists, authorities, and consultancies. By joining the HYPE OSC you get access to a state-of-the-art operational hydrological model. The HYPE source code is designed to efficiently handle large-scale modeling for forecast, hindcast and climate applications. The code is under constant development to improve the hydrological processes, efficiency and readability. At the beginning of 2013 we released a version with new and better modularization based on hydrological processes. This will make the code easier to understand and further develop for a new user. An important challenge in this process is to produce code that is easy for anyone to understand and work with, but that still maintains the properties that make the code efficient enough for large-scale applications. Input from the HYPE Open Source Community is an important source of future improvements to the HYPE model. Therefore, by joining the community you become an active part of the development, get access to the latest features and can influence future versions of the model.

  18. 40 CFR Appendix A to Subpart A of... - Tables

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... precursor of PM2.5. Table 2a to Appendix A of Subpart A—Data Elements for Reporting on Emissions From Point Sources, Where Required by 40 CFR 51.30 Data elements Every-year reporting Three-year reporting (1... phone number ✓ ✓ (6) FIPS code ✓ ✓ (7) Facility ID codes ✓ ✓ (8) Unit ID code ✓ ✓ (9) Process ID code...

  19. Cerebral Structure and Cognitive Performance in African Americans and European Americans With Type 2 Diabetes.

    PubMed

    Hsu, Fang-Chi; Sink, Kaycee M; Hugenschmidt, Christina E; Williamson, Jeff D; Hughes, Timothy M; Palmer, Nicholette D; Xu, Jianzhao; Smith, S Carrie; Wagner, Benjamin C; Whitlow, Christopher T; Bowden, Donald W; Maldjian, Joseph A; Divers, Jasmin; Freedman, Barry I

    2018-03-02

    African Americans typically perform worse than European Americans on cognitive testing. Contributions of cardiovascular disease (CVD) risk factors and educational quality to cognitive performance and brain volumes were compared in European Americans and African Americans with type 2 diabetes. Associations between magnetic resonance imaging-determined cerebral volumes of white matter (WMV), gray matter (GMV), white matter lesions (WMLV), hippocampal GMV, and modified mini-mental state exam (3MSE), digit symbol coding (DSC), Rey Auditory Verbal Learning Test (RAVLT), Stroop, and verbal fluency performance were assessed in Diabetes Heart Study Memory in Diabetes (MIND) participants. Marginal models incorporating generalized estimating equations were employed with serial adjustment for risk factors. The sample included 520 African Americans and 684 European Americans; 56 per cent were female, with mean ± SD age 62.8 ± 10.3 years and diabetes duration 14.3 ± 7.8 years. Adjusting for age, sex, diabetes duration, BMI, HbA1c, total intracranial volume, scanner, statins, CVD, smoking, and hypertension, WMV was lower (p = .001) and WMLV higher in African Americans than European Americans (p = .001), with similar GMV (p = .30). Adjusting for age, sex, education, HbA1c, diabetes duration, hypertension, BMI, statins, CVD, smoking, and depression, poorer performance on 3MSE, RAVLT, and DSC was seen in African Americans (p = 6 × 10^-23 to 7 × 10^-62). Racial differences in cognitive performance were attenuated after additional adjustment for WMLV and nearly fully resolved after adjustment for wide-range achievement test (WRAT) performance (p = .0009 to .65). African Americans with type 2 diabetes had higher WMLV and poorer cognitive performance than European Americans. Differences in cognitive performance were attenuated after considering WMLV and apparently poorer educational quality based on the WRAT. © The Author(s) 2018. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. Nondestructive Examination Guidance for Dry Storage Casks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Ryan M.; Suffield, Sarah R.; Hirt, Evelyn H.

    In this report, an assessment of NDE methods is performed for NUHOMS 80 and 102 dry storage system components in an effort to assist NRC staff with review of license renewal applications. The report considers concrete components associated with the horizontal storage modules (HSMs) as well as metal components in the HSMs. In addition, the report considers the dry shielded canister (DSC). Scope is limited to NDE methods that are considered most likely to be proposed by licensees. The document ACI 349.3R, Evaluation of Existing Nuclear Safety-Related Concrete Structures, is used as the basis for the majority of the NDE methods summarized for inspecting HSM concrete components. Two other documents, ACI 228.2R, Nondestructive Test Methods for Evaluation of Concrete in Structures, and ORNL/TM-2007/191, Inspection of Nuclear Power Plant Structure--Overview of Methods and Related Application, supplement the list with additional technologies that are considered applicable. For the canister, the ASME B&PV Code is used as the basis for the NDE methods considered, along with currently funded efforts through industry (Electric Power Research Institute [EPRI]) and the U.S. Department of Energy (DOE) to develop inspection technologies for canisters. The report provides a description of HSM and DSC components with a focus on those aspects of design considered relevant to inspection. This is followed by a brief description of other concrete structural components, such as bridge decks, dams, and reactor containment structures, in an effort to facilitate comparison between these structures and HSM concrete components and to infer which NDE methods may work best for certain HSM concrete components based on experience with these other structures. Brief overviews of the NDE methods are provided with a focus on issues and influencing factors that may impact implementation or performance. An analysis is performed to determine which NDE methods are most applicable to specific components.

  1. Protecting Location Privacy for Outsourced Spatial Data in Cloud Storage

    PubMed Central

    Gui, Xiaolin; An, Jian; Zhao, Jianqiang; Zhang, Xuejun

    2014-01-01

    As cloud computing services and location-aware devices are fully developed, a large amount of spatial data needs to be outsourced to the cloud storage provider, so the research on privacy protection for outsourced spatial data gets increasing attention from academia and industry. As a kind of spatial transformation method, Hilbert curve is widely used to protect the location privacy for spatial data. But sufficient security analysis for standard Hilbert curve (SHC) is seldom proceeded. In this paper, we propose an index modification method for SHC (SHC∗) and a density-based space filling curve (DSC) to improve the security of SHC; they can partially violate the distance-preserving property of SHC, so as to achieve better security. We formally define the indistinguishability and attack model for measuring the privacy disclosure risk of spatial transformation methods. The evaluation results indicate that SHC∗ and DSC are more secure than SHC, and DSC achieves the best index generation performance. PMID:25097865
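
    The distance-preserving property at issue above comes from how a Hilbert curve maps 2-D cells to 1-D indices. Below is a compact sketch of the classic bitwise mapping (the standard construction, not the paper's SHC* or DSC variants, which deliberately weaken this property):

        def xy2d(order, x, y):
            """Index of cell (x, y) along a Hilbert curve filling a
            2^order x 2^order grid (classic bitwise construction).
            Nearby cells tend to get nearby indices -- the
            distance-preserving behavior that SHC* and DSC weaken."""
            d = 0
            s = 1 << (order - 1)
            while s > 0:
                rx = 1 if x & s else 0
                ry = 1 if y & s else 0
                d += s * s * ((3 * rx) ^ ry)
                x &= s - 1          # keep only the unprocessed low bits
                y &= s - 1
                if ry == 0:
                    if rx == 1:     # flip the sub-quadrant
                        x, y = s - 1 - x, s - 1 - y
                    x, y = y, x     # transpose
                s >>= 1
            return d

        # sanity check: the mapping is a bijection on an 8 x 8 grid
        idx = sorted(xy2d(3, x, y) for x in range(8) for y in range(8))
        print(idx == list(range(64)))  # True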

  2. Kinetics of phase transformation in glass forming systems

    NASA Technical Reports Server (NTRS)

    Ray, Chandra S.

    1994-01-01

    The objectives of this research were to (1) develop computer models for realistic simulations of nucleation and crystal growth in glasses, which would also have the flexibility to accommodate the different variables related to sample characteristics and experimental conditions, and (2) design and perform nucleation and crystallization experiments using calorimetric measurements, such as differential scanning calorimetry (DSC) and differential thermal analysis (DTA), to verify these models. The variables related to sample characteristics mentioned in (1) above include the size of the glass particles, nucleating agents, and the relative concentration of surface and internal nuclei. A change in any of these variables changes the mode of the transformation (crystallization) kinetics. A variation in experimental conditions includes isothermal and nonisothermal DSC/DTA measurements. This research would lead to the development of improved, more realistic methods for analyzing DSC/DTA peak profiles to determine the kinetic parameters of nucleation and crystal growth, as well as to an assessment of the relative merits and demerits of the thermoanalytical models presently used to study phase transformation in glasses.
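
    One standard way to extract kinetic parameters from nonisothermal DSC/DTA peak profiles, of the kind this project set out to improve on, is the Kissinger method: ln(beta/Tp^2) = -Ea/(R*Tp) + const across several heating rates beta. A sketch with illustrative (not measured) peak temperatures:

        import numpy as np

        R = 8.314  # J / (mol K)

        def kissinger_ea(heating_rates, peak_temps):
            """Kissinger analysis: ln(beta/Tp^2) = -Ea/(R*Tp) + const,
            so Ea follows from a linear fit against 1/Tp."""
            beta = np.asarray(heating_rates, float)  # K/min
            tp = np.asarray(peak_temps, float)       # K
            slope, _ = np.polyfit(1.0 / tp, np.log(beta / tp**2), 1)
            return -slope * R                        # J/mol

        # illustrative DTA peak temperatures, not measured data
        print(kissinger_ea([5, 10, 20, 40], [820.0, 831.0, 843.0, 856.0]) / 1e3,
              "kJ/mol")  # ~320 kJ/mol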

  3. Long noncoding RNAs in digestive system cancers: Functional roles, molecular mechanisms, and clinical implications (Review).

    PubMed

    Fu, Min; Zou, Chen; Pan, Lei; Liang, Wei; Qian, Hui; Xu, Wenrong; Jiang, Pengcheng; Zhang, Xu

    2016-09-01

    Long noncoding RNAs (lncRNAs) are emerging as new players in various diseases including cancer. LncRNAs have been shown to play multifaceted roles in the development, progression, and metastasis of cancer. In this review, we highlight the lncRNAs that are critically involved in the pathogenesis of digestive system cancers (DSCs). We summarize the roles of the lncRNAs in DSCs and the underlying mechanisms responsible for their functions. The DSC-associated lncRNAs interact with a wide spectrum of molecules to regulate gene expression at transcriptional, post-transcriptional, and translational levels. We also provide new insights into the clinical significance of these lncRNAs, which are found to be closely associated with the aggressiveness of DSCs and could predict the prognosis of DSC patients. Moreover, lncRNAs have been suggested as promising therapeutic targets in DSCs. Therefore, better understanding of the functional roles of lncRNAs will provide new biomarkers for DSC diagnosis, prognosis, and therapy.

  4. Protecting location privacy for outsourced spatial data in cloud storage.

    PubMed

    Tian, Feng; Gui, Xiaolin; An, Jian; Yang, Pan; Zhao, Jianqiang; Zhang, Xuejun

    2014-01-01

    As cloud computing services and location-aware devices are fully developed, a large amount of spatial data needs to be outsourced to the cloud storage provider, so the research on privacy protection for outsourced spatial data gets increasing attention from academia and industry. As a kind of spatial transformation method, Hilbert curve is widely used to protect the location privacy for spatial data. But sufficient security analysis for standard Hilbert curve (SHC) is seldom proceeded. In this paper, we propose an index modification method for SHC (SHC(∗)) and a density-based space filling curve (DSC) to improve the security of SHC; they can partially violate the distance-preserving property of SHC, so as to achieve better security. We formally define the indistinguishability and attack model for measuring the privacy disclosure risk of spatial transformation methods. The evaluation results indicate that SHC(∗) and DSC are more secure than SHC, and DSC achieves the best index generation performance.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D. C.; Gu, X.; Haldenman, S.

    The curing of cross-linkable encapsulation is a critical consideration for photovoltaic (PV) modules manufactured using a lamination process. Concerns related to ethylene-co-vinyl acetate (EVA) include the quality (e.g., expiration and uniformity) of the films or completion (duration) of the cross-linking of the EVA within a laminator. Because these issues are important to both EVA and module manufacturers, an international standard has recently been proposed by the Encapsulation Task-Group within the Working Group 2 (WG2) of the International Electrotechnical Commission (IEC) Technical Committee 82 (TC82) for the quantification of the degree of cure for EVA encapsulation. The present draft of the standard calls for the use of differential scanning calorimetry (DSC) as the rapid, enabling secondary (test) method. Both the residual enthalpy- and melt/freeze-DSC methods are identified. The DSC methods are calibrated against the gel content test, the primary (reference) method. Aspects of other established methods, including indentation and rotor cure metering, were considered by the group. Key details of the test procedure will be described.
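
    The residual-enthalpy DSC method named above reduces to simple arithmetic once the two enthalpies are measured: the exotherm still present in a laminated sample, relative to that of uncured film, gives the uncured fraction. A sketch with illustrative values (the standard itself specifies calibration against gel content):

        def degree_of_cure(residual_enthalpy, total_enthalpy):
            """Residual-enthalpy estimate: the exotherm still measurable
            in a laminated sample, relative to that of uncured film, is
            the uncured fraction; cure is the complement, in percent."""
            return 100.0 * (1.0 - residual_enthalpy / total_enthalpy)

        # illustrative enthalpies in J/g, not values from the standard
        print(degree_of_cure(residual_enthalpy=2.1, total_enthalpy=14.0))  # 85.0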

  6. Melting temperature and enthalpy variations of phase change materials (PCMs): a differential scanning calorimetry (DSC) analysis

    NASA Astrophysics Data System (ADS)

    Sun, Xiaoqin; Lee, Kyoung Ok; Medina, Mario A.; Chu, Youhong; Li, Chuanchang

    2018-06-01

    Differential scanning calorimetry (DSC) analysis is a standard thermal analysis technique used to determine the phase transition temperature, enthalpy, heat of fusion, specific heat and activation energy of phase change materials (PCMs). To determine the appropriate heating rate and sample mass, various DSC measurements were carried out using two kinds of PCMs, namely N-octadecane paraffin and calcium chloride hexahydrate. The variations in phase transition temperature, enthalpy, heat of fusion, specific heat and activation energy were observed within applicable heating rates and sample masses. It was found that the phase transition temperature range increased with increasing heating rate and sample mass, while the heat of fusion varied without any established pattern. The specific heat decreased with increasing heating rate and sample mass. For accuracy purposes, it is recommended that, for PCMs with high thermal conductivity (e.g., hydrated salts), the focus be on heating rate rather than sample mass.

  7. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  8. Method for coding low entropy data

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu (Inventor)

    1995-01-01

    A method of lossless data compression for efficient coding of an electronic signal from information sources of very low information rate is disclosed. In this method, S represents a non-negative source symbol set (s_0, s_1, s_2, ..., s_(N-1)) of N symbols with s_i = i. The differences between binary digital data are mapped into symbol set S. Consecutive symbols in symbol set S are then paired into a new symbol set Gamma, a non-negative symbol set containing the symbols (gamma_m) obtained as the extension of the original symbol set S. These pairs are then mapped into a comma code, defined as a coding scheme in which every codeword is terminated with the same comma pattern, such as a 1. This allows direct coding and decoding of the n-bit positive integer digital data differences without the use of codebooks.
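
    The defining property, every codeword ending in the same comma pattern, is what makes decoding codebook-free. A minimal sketch of one such comma code, in which each symbol m becomes m zeros terminated by the comma '1' (the patent's symbol-pairing step is omitted):

        def comma_encode(symbols):
            """Each non-negative symbol m becomes m zeros followed by the
            comma pattern '1', so codeword boundaries are self-evident."""
            return "".join("0" * m + "1" for m in symbols)

        def comma_decode(bits):
            symbols, run = [], 0
            for b in bits:
                if b == "0":
                    run += 1
                else:          # comma reached: one codeword ends
                    symbols.append(run)
                    run = 0
            return symbols

        msg = [0, 3, 1, 0, 2]
        coded = comma_encode(msg)
        print(coded)                       # 10001011001
        assert comma_decode(coded) == msg  # decoded without a codebook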

  9. Water cycle algorithm: A detailed standard code

    NASA Astrophysics Data System (ADS)

    Sadollah, Ali; Eskandar, Hadi; Lee, Ho Min; Yoo, Do Guen; Kim, Joong Hoon

    Inspired by the observation of the water cycle process and the movements of rivers and streams toward the sea, a population-based metaheuristic algorithm, the water cycle algorithm (WCA), has recently been proposed. Lately, an increasing number of WCA applications have appeared and the WCA has been utilized in different optimization fields. This paper provides detailed open source code for the WCA, whose performance and efficiency have been demonstrated for solving optimization problems. The WCA has an interesting and simple concept and this paper aims to use its source code to provide a step-by-step explanation of the process it follows.

  10. Flowgen: Flowchart-based documentation for C++ codes

    NASA Astrophysics Data System (ADS)

    Kosower, David A.; Lopez-Villarejo, J. J.

    2015-11-01

    We present the Flowgen tool, which generates flowcharts from annotated C++ source code. The tool generates a set of interconnected high-level UML activity diagrams, one for each function or method in the C++ sources. It provides a simple and visual overview of complex implementations of numerical algorithms. Flowgen is complementary to the widely-used Doxygen documentation tool. The ultimate aim is to render complex C++ computer codes accessible, and to enhance collaboration between programmers and algorithm or science specialists. We describe the tool and a proof-of-concept application to the VINCIA plug-in for simulating collisions at CERN's Large Hadron Collider.

  11. Study and simulation of low rate video coding schemes

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Kipp, G.

    1992-01-01

    The semiannual report is included. Topics covered include communication, information science, data compression, remote sensing, color mapped images, robust coding scheme for packet video, recursively indexed differential pulse code modulation, image compression technique for use on token ring networks, and joint source/channel coder design.
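
    Among the topics listed, differential pulse code modulation is easy to illustrate. The sketch below is plain closed-loop DPCM with a uniform quantizer (the report's recursively indexed variant adds a re-indexing of large residuals, omitted here):

        import numpy as np

        def dpcm_encode(samples, step):
            """Closed-loop DPCM: quantize each sample's difference from the
            decoder's running reconstruction so both sides stay in sync."""
            indices, recon = [], 0.0
            for s in samples:
                q = int(round((s - recon) / step))
                indices.append(q)
                recon += q * step            # mirror the decoder
            return indices

        def dpcm_decode(indices, step):
            out, recon = [], 0.0
            for q in indices:
                recon += q * step
                out.append(recon)
            return out

        x = np.cumsum(np.random.default_rng(0).normal(size=16))
        x_hat = dpcm_decode(dpcm_encode(x, step=0.25), step=0.25)
        print(float(np.max(np.abs(np.asarray(x_hat) - x))))  # <= step/2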

  12. A Proposed Mechanism for the Thermal Denaturation of a Recombinant Bacillus Halmapalus Alpha-amylase - the Effect of Calcium Ions

    NASA Technical Reports Server (NTRS)

    Nielsen, Anders D.; Pusey, Marc L.; Fuglsang, Claus C.; Westh, Peter

    2003-01-01

    The thermal stability of a recombinant Bacillus halmapalus alpha-amylase (BHA) has been investigated using circular dichroism spectroscopy (CD) and differential scanning calorimetry (DSC). This alpha-amylase is homologous to other Bacillus alpha-amylases for which previous crystallographic studies have identified the existence of three calcium binding sites in the structure. Denaturation of BHA is irreversible, with a Tm of approximately 89 °C, and DSC thermograms can be described using a one-step irreversible model. A 5 °C increase in Tm in the presence of 10-fold excess CaCl2 was observed. However, a concomitant increase in the tendency to aggregate was also observed. The presence of 30-40-fold excess calcium chelator (EDTA or EGTA) results in a large destabilization of BHA, corresponding to an about 40 °C lower Tm, as determined by both CD and DSC. Ten-fold excess EGTA reveals complex DSC thermograms corresponding to both reversible and irreversible transitions, which possibly originate from different populations of BHA:calcium complexes. The observations in the present study have, in combination with structural information on homologous alpha-amylases, provided the basis for the proposal of a simple denaturation mechanism for BHA. The proposed mechanism describes the irreversible thermal denaturation of different BHA:calcium complexes and the calcium binding equilibrium involved. Furthermore, the model accounts for a temperature-induced reversible structural change associated with calcium binding.

  13. Investigation of Phase Mixing in Amorphous Solid Dispersions of AMG 517 in HPMC-AS Using DSC, Solid-State NMR, and Solution Calorimetry.

    PubMed

    Calahan, Julie L; Azali, Stephanie C; Munson, Eric J; Nagapudi, Karthik

    2015-11-02

    Intimate phase mixing between the drug and the polymer is considered a prerequisite for achieving good physical stability in amorphous solid dispersions. In this article, spray-dried amorphous solid dispersions (ASDs) of AMG 517 and HPMC-AS were studied by differential scanning calorimetry (DSC), solid-state NMR (SSNMR), and solution calorimetry. DSC analysis showed a weakly asymmetric (ΔTg ≈ 13.5) system with a single glass transition for blends of different compositions, indicating phase mixing. The Tg-composition data was modeled using the BKCV equation to accommodate the observed negative deviation from ideality. Proton spin-lattice relaxation times in the laboratory and rotating frames ((1)H T1 and T1ρ), as measured by SSNMR, were consistent with the observation that the components of the dispersion were in intimate contact over a 10-20 nm length scale. Based on the heat of mixing calculated from solution calorimetry and the entropy of mixing calculated from the Flory-Huggins theory, the free energy of mixing was calculated. The free energy of mixing was found to be positive for all ASDs, indicating that the drug and polymer are thermodynamically predisposed to phase separation at 25 °C. This suggests that the miscibility measured by DSC and SSNMR is achieved kinetically as the result of intimate mixing between drug and polymer during the spray drying process. This kinetic phase mixing is responsible for the physical stability of the ASD.

  14. Saccadic interception of a moving visual target after a spatiotemporal perturbation.

    PubMed

    Fleuriet, Jérome; Goffart, Laurent

    2012-01-11

    Animals can make saccadic eye movements to intercept a moving object at the right place and time. Such interceptive saccades indicate that, despite variable sensorimotor delays, the brain is able to estimate the current spatiotemporal (hic et nunc) coordinates of a target at saccade end. The present work further tests the robustness of this estimate in the monkey when a change in eye position and a delay are experimentally added before the onset of the saccade and in the absence of visual feedback. These perturbations are induced by brief microstimulation in the deep superior colliculus (dSC). When the microstimulation moves the eyes in the direction opposite to the target motion, a correction saccade brings gaze back on the target path or very near. When it moves the eye in the same direction, the performance is more variable and depends on the stimulated sites. Saccades fall ahead of the target with an error that increases when the stimulation is applied more caudally in the dSC. The numerous cases of compensation indicate that the brain is able to maintain an accurate and robust estimate of the location of the moving target. The inaccuracies observed when stimulating the dSC that encodes the visual field traversed by the target indicate that dSC microstimulation can interfere with signals encoding the target motion path. The results are discussed within the framework of the dual-drive and the remapping hypotheses.

  15. Prospective randomized double-blind study of atlas-based organ-at-risk autosegmentation-assisted radiation planning in head and neck cancer.

    PubMed

    Walker, Gary V; Awan, Musaddiq; Tao, Randa; Koay, Eugene J; Boehling, Nicholas S; Grant, Jonathan D; Sittig, Dean F; Gunn, Gary Brandon; Garden, Adam S; Phan, Jack; Morrison, William H; Rosenthal, David I; Mohamed, Abdallah Sherif Radwan; Fuller, Clifton David

    2014-09-01

    Target volumes and organs-at-risk (OARs) for radiotherapy (RT) planning are manually defined, which is a tedious and inaccurate process. We sought to assess the feasibility, time reduction, and acceptability of an atlas-based autosegmentation (AS) compared to manual segmentation (MS) of OARs. A commercial platform generated 16 OARs. Resident physicians were randomly assigned to modify AS OAR (AS+R) or to draw MS OAR followed by attending physician correction. Dice similarity coefficient (DSC) was used to measure overlap between groups compared with attending approved OARs (DSC=1 means perfect overlap). 40 cases were segmented. Mean ± SD segmentation time in the AS+R group was 19.7 ± 8.0 min, compared to 28.5 ± 8.0 min in the MS cohort, amounting to a 30.9% time reduction (Wilcoxon p<0.01). For each OAR, AS DSC was statistically different from both AS+R and MS ROIs (all Steel-Dwass p<0.01) except the spinal cord and the mandible, suggesting oversight of AS/MS processes is required; AS+R and MS DSCs were non-different. AS compared to attending approved OAR DSCs varied considerably, with a chiasm mean ± SD DSC of 0.37 ± 0.32 and brainstem of 0.97 ± 0.03. Autosegmentation provides a time savings in head and neck regions of interest generation. However, attending physician approval remains vital. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
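
    The Dice similarity coefficient used above is worth writing out: DSC = 2|A ∩ B| / (|A| + |B|) for two segmentations A and B. A sketch on synthetic binary masks (illustrative shapes, not study data):

        import numpy as np

        def dice(a, b):
            """DSC = 2|A ∩ B| / (|A| + |B|); 1 is perfect overlap, 0 none."""
            a, b = np.asarray(a, bool), np.asarray(b, bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        auto = np.zeros((64, 64), bool);   auto[10:40, 10:40] = True
        manual = np.zeros((64, 64), bool); manual[12:42, 12:42] = True
        print(round(dice(auto, manual), 3))  # ~0.871 for these toy masks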

  16. Activity in the superior colliculus reflects dynamic interactions between voluntary and involuntary influences on orienting behaviour.

    PubMed

    Bell, Andrew H; Munoz, Douglas P

    2008-10-01

    Performance in a behavioural task can be influenced by both bottom-up and top-down processes such as stimulus modality and prior probability. Here, we exploited differences in behavioural strategy to explore the role of the intermediate and deep layers of the superior colliculus (dSC) in covert orienting. Two monkeys were trained on a predictive cued-saccade task in which the cue predicted the target's upcoming location with 80% validity. When the delay between cue and target onset was 250 ms, both monkeys showed faster responses to the uncued (Invalid) location. This was associated with a reduced target-aligned response in the dSC on Valid trials for both monkeys and is consistent with a bottom-up (i.e. involuntary) bias. When the delay was increased to 650 ms, one monkey continued to show faster responses to the Invalid location whereas the other monkey showed faster responses to the Valid location, consistent with a top-down (i.e. voluntary) bias. This latter behaviour was correlated with an increase in activity in dSC neurons preceding target onset that was absent in the other monkey. Thus, using the information provided by the cue shifted the emphasis towards top-down processing, while ignoring this information allowed bottom-up processing to continue to dominate. Regardless of the selected strategy, however, neurons in the dSC consistently reflected the current bias between the two processes, emphasizing its role in both the bottom-up and top-down control of orienting behaviour.

  17. Pushing the Performance Limit of Sub-100 nm Molybdenum Disulfide Transistors.

    PubMed

    Liu, Yuan; Guo, Jian; Wu, Yecun; Zhu, Enbo; Weiss, Nathan O; He, Qiyuan; Wu, Hao; Cheng, Hung-Chieh; Xu, Yang; Shakir, Imran; Huang, Yu; Duan, Xiangfeng

    2016-10-12

    Two-dimensional semiconductors (2DSCs) such as molybdenum disulfide (MoS2) have attracted intense interest as an alternative electronic material in the postsilicon era. However, the ON-current density achieved in 2DSC transistors to date is considerably lower than that of silicon devices, and it remains an open question whether 2DSC transistors can offer competitive performance. A high-current device requires simultaneous minimization of the contact resistance and channel length, which is a nontrivial challenge for atomically thin 2DSCs, since the typical low-contact-resistance approaches for 2DSCs either degrade the electronic properties of the channel or are incompatible with the fabrication process for short-channel devices. Here, we report a new approach toward high-performance MoS2 transistors by using a physically assembled nanowire as a lift-off mask to create ultrashort channel devices with a pristine MoS2 channel and a self-aligned, low-resistance metal/graphene hybrid contact. With the optimized contact in short-channel devices, we demonstrate a sub-100 nm MoS2 transistor delivering a record-high ON-current of 0.83 mA/μm at 300 K and 1.48 mA/μm at 20 K, which compares well with that of silicon devices. Our study, for the first time, demonstrates that 2DSC transistors can offer comparable performance to the 2017 target for silicon transistors in the International Technology Roadmap for Semiconductors (ITRS), marking an important milestone in 2DSC electronics.

  18. RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphreys, S.L.; Miller, L.A.; Monroe, D.K.

    1998-04-01

    This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.

  19. A subset of conserved mammalian long non-coding RNAs are fossils of ancestral protein-coding genes.

    PubMed

    Hezroni, Hadas; Ben-Tov Perry, Rotem; Meir, Zohar; Housman, Gali; Lubelsky, Yoav; Ulitsky, Igor

    2017-08-30

    Only a small portion of human long non-coding RNAs (lncRNAs) appear to be conserved outside of mammals, but the events underlying the birth of new lncRNAs in mammals remain largely unknown. One potential source is remnants of protein-coding genes that transitioned into lncRNAs. We systematically compare lncRNA and protein-coding loci across vertebrates, and estimate that up to 5% of conserved mammalian lncRNAs are derived from lost protein-coding genes. These lncRNAs have specific characteristics, such as broader expression domains, that set them apart from other lncRNAs. Fourteen lncRNAs have sequence similarity with the loci of the contemporary homologs of the lost protein-coding genes. We propose that selection acting on enhancer sequences is mostly responsible for retention of these regions. As an example of an RNA element from a protein-coding ancestor that was retained in the lncRNA, we describe in detail a short translated ORF in the JPX lncRNA that was derived from an upstream ORF in a protein-coding gene and retains some of its functionality. We estimate that ~ 55 annotated conserved human lncRNAs are derived from parts of ancestral protein-coding genes, and loss of coding potential is thus a non-negligible source of new lncRNAs. Some lncRNAs inherited regulatory elements influencing transcription and translation from their protein-coding ancestors and those elements can influence the expression breadth and functionality of these lncRNAs.

  20. A Review of Single Source Precursors for the Deposition of Ternary Chalcopyrite Materials

    NASA Technical Reports Server (NTRS)

    Banger, K. K.; Cowen, J.; Harris, J.; McClarnon, R.; Hehemann, D. G.; Duraj, S. A.; Scheiman, D.; Hepp, A. F.

    2002-01-01

    The development of thin-film solar cells on flexible, lightweight, space-qualified durable substrates (i.e., Kapton) provides an attractive solution for fabricating solar arrays with high specific power (W/kg). The syntheses and thermal modulation of ternary single source precursors based on the [{LR}2Cu(SR')2In(SR')2] architecture in good yields are described. Thermogravimetric analyses (TGA) and low-temperature differential scanning calorimetry (DSC) demonstrate that controlled manipulation of the steric and electronic properties of either the group V donor and/or the chalcogenide moiety permits directed adjustment of the thermal stability and physical properties of the precursors. TGA evolved-gas analysis confirms that the single precursors decompose by the initial extrusion of the sulphide moiety, followed by the loss of the neutral donor group (L) to release the ternary chalcopyrite matrix. X-ray diffraction studies, EDS, and SEM on the non-volatile pyrolyzed material demonstrate that these derivatives afford single-phase CuInS2/CuInSe2 materials at low temperature. Thin-film fabrication studies demonstrate that these single source precursors can be used in a spray chemical vapor deposition process for depositing CuInS2 onto flexible polymer substrates at temperatures less than 400 °C.

  1. Design Considerations of a Virtual Laboratory for Advanced X-ray Sources

    NASA Astrophysics Data System (ADS)

    Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.

    2004-11-01

    The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area to further push the state-of-the-art in high fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment using the Rational Unified Process and the Unified Modeling Language (UML) to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the various physics algorithms' fidelity will be presented.

  2. Some Practical Universal Noiseless Coding Techniques

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.

    1994-01-01

    Report discusses noiseless data-compression-coding algorithms, performance characteristics, and practical considerations in the implementation of algorithms in coding modules composed of very-large-scale integrated circuits. Report also has value as tutorial document on data-compression-coding concepts. Coding techniques and concepts in question are "universal" in the sense that, in principle, they are applicable to streams of data from a variety of sources. However, discussion is oriented toward compression of high-rate data generated by spaceborne sensors for lower-rate transmission back to earth.
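
    The workhorse codeword in the Rice family of universal coders is the Golomb-Rice code: a unary quotient followed by a k-bit remainder. A minimal sketch of just that codeword (the full Rice machinery, which adaptively selects k among several options, is omitted):

        def rice_encode(n, k):
            """Golomb-Rice codeword: unary quotient (q ones, then a zero)
            followed by the k-bit remainder."""
            q, r = n >> k, n & ((1 << k) - 1)
            return "1" * q + "0" + format(r, f"0{k}b")

        def rice_decode(bits, k):
            q = bits.index("0")              # unary part ends at first 0
            return (q << k) | int(bits[q + 1 : q + 1 + k], 2)

        for n in (0, 5, 17):
            cw = rice_encode(n, k=2)
            assert rice_decode(cw, k=2) == n
            print(n, "->", cw)               # 0 -> 000, 5 -> 1001, 17 -> 1111001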

  3. 10 CFR 851.27 - Reference sources.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) American Society of Mechanical Engineers (ASME), P.O. Box 2300 Fairfield, NJ 07007. Telephone: 800-843-2763... Electrical Code,” (2005). (5) NFPA 70E, “Standard for Electrical Safety in the Workplace,” (2004). (6... Engineers (ASME) Boilers and Pressure Vessel Code, sections I through XII including applicable Code Cases...

  4. 10 CFR 851.27 - Reference sources.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) American Society of Mechanical Engineers (ASME), P.O. Box 2300 Fairfield, NJ 07007. Telephone: 800-843-2763... Electrical Code,” (2005). (5) NFPA 70E, “Standard for Electrical Safety in the Workplace,” (2004). (6... Engineers (ASME) Boilers and Pressure Vessel Code, sections I through XII including applicable Code Cases...

  5. Leveraging Code Comments to Improve Software Reliability

    ERIC Educational Resources Information Center

    Tan, Lin

    2009-01-01

    Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…

  6. 48 CFR 501.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Publication and code arrangement. 501.105-1 Section 501.105-1 Federal Acquisition Regulations System GENERAL SERVICES... 501.105-1 Publication and code arrangement. The GSAR is published in the following sources: (a) Daily...

  7. 48 CFR 501.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Publication and code arrangement. 501.105-1 Section 501.105-1 Federal Acquisition Regulations System GENERAL SERVICES... 501.105-1 Publication and code arrangement. The GSAR is published in the following sources: (a) Daily...

  8. 48 CFR 501.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Publication and code arrangement. 501.105-1 Section 501.105-1 Federal Acquisition Regulations System GENERAL SERVICES... 501.105-1 Publication and code arrangement. The GSAR is published in the following sources: (a) Daily...

  9. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    NASA Astrophysics Data System (ADS)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and was completed in Fall 2015. Phase 2 is focused on WEC performance and is scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields and motions in 6 DOF, as well as multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power-take-off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be presented. These simulations highlight the code features included in the latest release of WEC-Sim (v1.2), including: wave directionality, nonlinear hydrostatics and hydrodynamics, user-defined wave elevation time series, state-space radiation, and WEC-Sim compatibility with BEMIO (an open source AQWA/WAMIT/NEMOH coefficient parser).
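    For a single degree of freedom, the Cummins formulation that WEC-Sim solves can be written schematically as (textbook form with generic symbols, not WEC-Sim's internal notation):

      \left(m + A_\infty\right)\ddot{x}(t) + \int_0^{t} K(t-\tau)\,\dot{x}(\tau)\,d\tau + C\,x(t) = F_{\mathrm{exc}}(t) + F_{\mathrm{PTO}}(t)

    where m is the body mass, A_\infty the infinite-frequency added mass, K the radiation impulse-response kernel (the convolution that the state-space radiation feature approximates), C the hydrostatic stiffness, and the right-hand side collects the wave-excitation and power-take-off forces.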

  10. Utilization of recently developed codes for high power Brayton and Rankine cycle power systems

    NASA Technical Reports Server (NTRS)

    Doherty, Michael P.

    1993-01-01

    Two recently developed FORTRAN computer codes for high power Brayton and Rankine thermodynamic cycle analysis for space power applications are presented. The codes were written in support of an effort to develop a series of subsystem models for multimegawatt Nuclear Electric Propulsion, but their use is not limited just to nuclear heat sources or to electric propulsion. Code development background, a description of the codes, some sample input/output from one of the codes, and future plans/implications for the use of these codes by NASA's Lewis Research Center are provided.

  11. New coding advances for deep space communications

    NASA Technical Reports Server (NTRS)

    Yuen, Joseph H.

    1987-01-01

    Advances made in error-correction coding for deep space communications are described. The code believed to be the best is a (15, 1/6) convolutional code with maximum-likelihood decoding; when it is concatenated with a 10-bit Reed-Solomon code, it achieves a bit error rate of 10 to the -6th at a bit SNR of 0.42 dB. This code outperforms the Voyager code by 2.11 dB. The use of source statistics in decoding convolutionally encoded Voyager images from the Uranus encounter is investigated, and it is found that a 2 dB decoding gain can be achieved.
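    As a small illustration of the encoding principle behind such codes (not the (15, 1/6) deep-space code itself), a rate-1/2, constraint-length-3 convolutional encoder with the classic generators 7 and 5 (octal) can be written as:

      def conv_encode(bits, g=(0b111, 0b101), K=3):
          """Rate-1/2 convolutional encoder, constraint length K=3 (toy example).
          Each input bit shifts into a register; each generator taps the register
          and emits the parity of the tapped bits as one output bit."""
          state = 0
          out = []
          for b in bits + [0] * (K - 1):                   # flush with zeros to terminate
              state = ((state << 1) | b) & ((1 << K) - 1)
              for gen in g:
                  out.append(bin(state & gen).count("1") & 1)   # parity of tapped bits
          return out

      print(conv_encode([1, 0, 1, 1]))   # two coded bits per input bit, plus tail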

  12. Spectral-element Seismic Wave Propagation on CUDA/OpenCL Hardware Accelerators

    NASA Astrophysics Data System (ADS)

    Peter, D. B.; Videau, B.; Pouget, K.; Komatitsch, D.

    2015-12-01

    Seismic wave propagation codes are essential tools to investigate a variety of wave phenomena in the Earth. Furthermore, they can now be used for seismic full-waveform inversions in regional- and global-scale adjoint tomography. Although these seismic wave propagation solvers are crucial ingredients for improving the resolution of tomographic images to answer important questions about the nature of Earth's internal processes and subsurface structure, their practical application is often limited by high computational costs. They thus need high-performance computing (HPC) facilities to improve the current state of knowledge. At present, numerous large HPC systems embed many-core architectures such as graphics processing units (GPUs) to enhance numerical performance. Such hardware accelerators can be programmed using either the CUDA programming environment or the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL has been adopted by additional hardware accelerators, such as AMD graphics cards, ARM-based processors, and Intel Xeon Phi coprocessors. For seismic wave propagation simulations using the open-source spectral-element code package SPECFEM3D_GLOBE, we incorporated an automatic source-to-source code generation tool (BOAST) which allows us to use meta-programming of all computational kernels for forward and adjoint runs. Using our BOAST kernels, we generate optimized source code for both the CUDA and OpenCL languages within the source code package. Seismic wave simulations are thus now able to fully utilize CUDA and OpenCL hardware accelerators. We show benchmarks of forward seismic wave propagation simulations using SPECFEM3D_GLOBE on CUDA/OpenCL GPUs, validating results and comparing performance for different simulations and hardware usage.

  13. A source-channel coding approach to digital image protection and self-recovery.

    PubMed

    Sarreshtedari, Saeed; Akhaee, Mohammad Ali

    2015-07-01

    Watermarking algorithms have been widely applied to the field of image forensics recently. One such forensic application is the protection of images against tampering. For this purpose, we need to design a watermarking algorithm fulfilling two purposes in case of image tampering: 1) detecting the tampered area of the received image and 2) recovering the lost information in the tampered zones. State-of-the-art techniques accomplish these tasks using watermarks consisting of check bits and reference bits. Check bits are used for tampering detection, whereas reference bits carry information about the whole image. The problem of recovering the lost reference bits still stands. This paper is aimed at showing that, with the tampering location known, image tampering can be modeled and dealt with as an erasure error. Therefore, an appropriate design of the channel code can protect the reference bits against tampering. In the proposed method, the total watermark bit-budget is dedicated to three groups: 1) source encoder output bits; 2) channel code parity bits; and 3) check bits. In the watermark embedding phase, the original image is source coded and the output bit stream is protected using an appropriate channel encoder. For image recovery, erasure locations detected by the check bits help the channel erasure decoder to retrieve the original source-encoded image. Experimental results show that our proposed scheme significantly outperforms recent techniques in terms of image quality for both watermarked and recovered images. The watermarked image quality gain is achieved by spending less bit-budget on the watermark, while image recovery quality is considerably improved as a consequence of the consistent performance of the designed source and channel codes.
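    The central idea, that tampering at a known location is an erasure which parity bits can repair, can be demonstrated with the simplest erasure code of all, a single XOR parity block (a stand-in for the stronger channel code designed in the paper):

      from functools import reduce

      def add_parity(blocks):
          """Append one XOR parity block; any single erased block is then recoverable."""
          return blocks + [reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)]

      def recover(blocks, erased):
          """Rebuild the block at index `erased` (location known, as flagged by check bits)."""
          rest = [b for i, b in enumerate(blocks) if i != erased]
          return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), rest)

      coded = add_parity([b"ref-", b"bits", b"here"])
      assert recover(coded, 1) == b"bits"   # 'tampered' block restored from the others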

  14. Benchmarking Defmod, an open source FEM code for modeling episodic fault rupture

    NASA Astrophysics Data System (ADS)

    Meng, Chunfang

    2017-03-01

    We present Defmod, an open source (linear) finite element code that enables us to efficiently model crustal deformation due to (quasi-)static and dynamic loadings, poroelastic flow, viscoelastic flow, and frictional fault slip. Ali (2015) provides the original code, introducing an implicit solver for (quasi-)static problems and an explicit solver for dynamic problems. The fault constraint is implemented via a Lagrange multiplier. Meng (2015) combines these two solvers into a hybrid solver that uses failure criteria and friction laws to adaptively switch between the (quasi-)static and dynamic states. The code is capable of modeling episodic fault rupture driven by quasi-static loadings, e.g. due to reservoir fluid withdrawal or injection. Here, we focus on benchmarking the Defmod results against some established results.
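    The adaptive switching Meng (2015) describes can be sketched as a control loop (hypothetical structure only; the real logic lives in Defmod's hybrid solver):

      def run_hybrid(state, loads, failed, qs_step, dyn_step):
          """Skeleton of adaptive quasi-static/dynamic switching. Implicit
          quasi-static steps are taken until a failure criterion trips; explicit
          dynamic steps then resolve the rupture before control returns."""
          for load in loads:
              if failed(state, load):                # e.g., Coulomb failure on a fault
                  while failed(state, load):
                      state = dyn_step(state, load)  # small explicit steps, friction law active
              else:
                  state = qs_step(state, load)       # large implicit step
          return state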

  15. Fast computation of quadrupole and hexadecapole approximations in microlensing with a single point-source evaluation

    NASA Astrophysics Data System (ADS)

    Cassan, Arnaud

    2017-07-01

    The exoplanet detection rate from gravitational microlensing has grown significantly in recent years thanks to a great enhancement of resources and improved observational strategy. Current observatories include ground-based wide-field and/or robotic world-wide networks of telescopes, as well as space-based observatories such as the Spitzer and Kepler/K2 satellites. This results in a large quantity of data to be processed and analysed, which is a challenge for modelling codes because of the complexity of the parameter space to be explored and the intensive computations required to evaluate the models. In this work, I present a method that computes the quadrupole and hexadecapole approximations of the finite-source magnification more efficiently than previously available codes, with routines about six times and four times faster, respectively. The quadrupole takes just about twice the time of a point-source evaluation, which advocates for generalizing its use to large portions of the light curves. The corresponding routines are available as open-source Python codes.
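    The quantity being approximated is the finite-source magnification of a single lens. The point-source magnification is the standard Paczynski formula, and the expensive baseline that quadrupole and hexadecapole expansions accelerate is a direct average over the source disc. A toy sketch of that brute-force baseline (uniform source, no limb darkening; this is not Cassan's single-evaluation method):

      import numpy as np

      def A_point(u):
          """Paczynski point-source, point-lens magnification."""
          return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

      def A_disc(u0, rho, n_r=200, n_phi=400):
          """Finite-source magnification by averaging A_point over a uniform
          source disc of radius rho at lens-source separation u0."""
          r = (np.arange(n_r) + 0.5) * rho / n_r
          phi = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
          R, P = np.meshgrid(r, phi)
          u = np.sqrt(u0**2 + R**2 + 2.0 * u0 * R * np.cos(P))
          return np.sum(A_point(u) * R) / np.sum(R)   # area weight r dr dphi

      print(A_point(0.1), A_disc(0.1, 0.05))   # finite source size smooths the peak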

  16. Fast in-memory elastic full-waveform inversion using consumer-grade GPUs

    NASA Astrophysics Data System (ADS)

    Sivertsen Bergslid, Tore; Birger Raknes, Espen; Arntsen, Børge

    2017-04-01

    Full-waveform inversion (FWI) is a technique to estimate subsurface properties by using the recorded waveform produced by a seismic source and applying inverse theory. This is done through an iterative optimization procedure, where each iteration requires solving the wave equation many times and then trying to minimize the difference between the modeled and the measured seismic data. Having to model many of these seismic sources per iteration makes this a highly computationally demanding procedure, which usually involves writing a lot of data to disk. We have written code that does forward modeling and inversion entirely in memory. A typical HPC cluster has many more CPUs than GPUs. Since FWI involves modeling many seismic sources per iteration, the obvious approach is to parallelize the code on a source-by-source basis, where each CPU core performs one modeling and all modelings run simultaneously. With this approach, the GPU is already at a major disadvantage in pure numbers. Fortunately, GPUs can more than make up for this hardware disadvantage by performing each modeling much faster than a CPU. Another benefit of parallelizing each individual modeling is that it lets each modeling use a lot more RAM. If one node has 128 GB of RAM and 20 CPU cores, each modeling can use only 6.4 GB of RAM if the node runs at full capacity with source-by-source parallelization on the CPU. A per-modeling parallelized GPU code can use 64 GB of RAM per modeling. Whenever a modeling uses more RAM than is available and has to start using regular disk space, the runtime increases dramatically due to slow file I/O. The extremely high computational speed of the GPUs, combined with the large amount of RAM available for each modeling, lets us do high-frequency FWI for fairly large models very quickly. For a single modeling, our GPU code outperforms the single-threaded CPU code by a factor of about 75. Successful inversions have been run on data with frequencies up to 40 Hz for a model of 2001 by 600 grid points with 5 m grid spacing and 5000 time steps, in less than 2.5 minutes per source. In practice, using 15 nodes (30 GPUs) to model 101 sources, each iteration took approximately 9 minutes. For reference, the same inversion run with our CPU code uses two hours per iteration. This was done using only a very simple wavefield interpolation technique, saving every second timestep. Using a more sophisticated checkpointing or wavefield reconstruction method would allow us to increase this model size significantly. Our results show that ordinary gaming GPUs are a viable alternative to the expensive professional GPUs often used today when performing large-scale modeling and inversion in geophysics.
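    As a back-of-envelope check, the resource figures quoted above are mutually consistent (numbers taken directly from the abstract; illustrative arithmetic only):

      # Figures quoted in the abstract.
      ram_per_modeling_cpu = 128 / 20          # GB per modeling on a source-parallel CPU node -> 6.4
      ram_per_modeling_gpu = 64                # GB per modeling with per-modeling GPU parallelization
      single_modeling_speedup = 75             # GPU vs. single-threaded CPU, one modeling
      iteration_speedup = (2 * 60) / 9         # 2 h CPU vs. 9 min GPU per iteration -> ~13x
      print(ram_per_modeling_cpu, iteration_speedup)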

  17. Calorimetric analysis of fungal degraded wood

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blankenhorn, P.R.; Baldwin, R.C.; Merrill, W. Jr.

    1980-01-01

    Endothermic transitions and gross heats of combustion of aspenwood subjected to degradation by Lenzites trabea and Polyporus versicolor were determined using differential scanning calorimetry (DSC) and an adiabatic oxygen bomb calorimeter. Endothermic peak areas of undegraded and fungus-degraded wood differed from each other at all levels of weight loss. Regression analysis of the DSC data vs. weight loss revealed a significant, although not highly correlated, relation for P. versicolor-degraded specimens and a nonsignificant relation for L. trabea-degraded specimens; weight loss and gross heat of combustion values of degraded specimens were significantly correlated.

  18. Phase transitions in methyl paraben doped dipalmitoyl phosphatidylethanolamine vesicles

    NASA Astrophysics Data System (ADS)

    Panicker, Lata

    2013-02-01

    The influence of the preservative methyl paraben (MPB) on the thermal properties of dipalmitoyl phosphatidylethanolamine (DPPE) vesicles was investigated using DSC. DSC measurement of the lipid acyl chain melting transition in DPPE membranes doped with MPB showed MPB-concentration-dependent modifications of the membrane thermal properties. The interesting findings are: (1) the presence of paraben increases the membrane fluidity; (2) the MPB molecules appear to be present in the aqueous bilayer interfacial region, intercalated between the neighboring lipid polar headgroups; and (3) a high concentration of MPB favored the formation of crystalline and glassy phases.

  19. Metastable and equilibrium phase formation in sputter-deposited Ti/Al multilayer thin films

    NASA Astrophysics Data System (ADS)

    Lucadamo, G.; Barmak, K.; Lavoie, C.; Cabral, C., Jr.; Michaelsen, C.

    2002-06-01

    The sequence and kinetics of metastable and equilibrium phase formation in sputter-deposited multilayer thin films were investigated by combining in situ synchrotron x-ray diffraction (XRD) with ex situ electron diffraction and differential scanning calorimetry (DSC). The sequence included both cubic and tetragonal modifications of the equilibrium TiAl3 crystal structure. Values for the formation activation energies of the various phases in the sequence were determined using the XRD and DSC data obtained here, as well as activation energy data reported in the literature.

  20. Synthesis of Amorphous Powders of Ni-Si and Co-Si Alloys by Mechanical Alloying

    NASA Astrophysics Data System (ADS)

    Omuro, Keisuke; Miura, Harumatsu

    1991-05-01

    Amorphous powders of the Ni-Si and Co-Si alloys are synthesized by mechanical alloying (MA) from crystalline elemental powders using a high energy ball mill. The alloying and amorphization process is examined by X-ray diffraction, differential scanning calorimetry (DSC), and scanning electron microscopy. For the Ni-Si alloy, it is confirmed that the crystallization temperature of the MA powder, measured by DSC, is in good agreement with that of the powder sample prepared by mechanical grinding from the cast alloy ingot products of the same composition.

  1. Final report for tank 241-AP-108, grab samples 8AP-96-1, 8AP-96-2 and 8AP-96-FB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Esch, R.A.

    1996-04-19

    This document is the final report deliverable for the tank 241-AP-108 grab samples. The samples were subsampled and analyzed in accordance with the TSAP. Included in this report are the results for the Waste Compatibility analyses, with the exception of the DSC and thermogravimetric analysis (TGA) results, which were presented in the 45-Day report (Part 2 of this document). The raw data for all analyses, with the exception of DSC and TGA, are also included in this report.

  2. Electrical properties of CZTS thin films

    NASA Astrophysics Data System (ADS)

    Rao, M. C.; Kumar, M. Seshu; Lakshmi, K.; Rao, K. Koteswara; Parimala, M. P. D.; Basha, S. K. Shahenoor

    2018-05-01

    CZTS (Cu2ZnSnS4) thin films have been coated onto FTO and MO glass substrates by a single-step electrodeposition process. Different characterization techniques, such as DSC and Raman studies, were performed on the prepared samples. The phase transitions and weight loss of the precursors were measured by DSC analysis. The Raman spectrum is used to identify the functional groups and chemical structure of the materials. Electrical measurements confirm the nature of the film and its dependence on the charge concentration present in the samples.

  3. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.
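    At the core of ANEMOS is the standard straight-line Gaussian plume with ground reflection. A minimal sketch of that kernel (dispersion parameters sigma_y and sigma_z are supplied by the caller; ANEMOS itself layers deposition, plume rise, daughter in-growth, and sector averaging on top):

      import numpy as np

      def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
          """Ground-reflected Gaussian plume concentration.
          Q: source strength (e.g., Bq/s), u: wind speed (m/s),
          H: effective release height (m), y/z: crosswind and vertical coordinates (m)."""
          lateral = np.exp(-y**2 / (2 * sigma_y**2))
          vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                      + np.exp(-(z + H)**2 / (2 * sigma_z**2)))   # image source = ground reflection
          return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical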

  4. Random Variable Read Me File

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Sankararaman, Shankar; Cullo, Aiden

    2017-01-01

    Readme for the Random Variable Toolbox, which is hosted on GitHub. GitHub is a Web-based Git version control repository hosting service, mostly used for computer code. It offers all of the distributed version control and source code management (SCM) functionality of Git as well as adding its own features. It provides access control and several collaboration features such as bug tracking, feature requests, task management, and wikis for every project. GitHub offers both private and free repository plans on the same account; the latter are commonly used to host open-source software projects. As of April 2017, GitHub reports having almost 20 million users and 57 million repositories, making it the largest host of source code in the world. GitHub has a mascot called Octocat, a cat with five tentacles and a human-like face.

  5. Constructing graph models for software system development and analysis

    NASA Astrophysics Data System (ADS)

    Pogrebnoy, Andrey V.

    2017-01-01

    We propose a concept for creating instrumentation for the rationale behind functional and structural decisions during software system (SS) development. We propose to develop the SS simultaneously on two models: a functional model (FM) and a structural model (SM). The FM is the source code of the SS. An adequate representation of the FM in the form of a graph model (GM) is generated automatically and is called the SM. The problem of creating and visualizing the GM is considered from the standpoint of applying it as a uniform platform for the adequate representation of the SS source code. We propose three levels of GM detail: GM1 for visual analysis of the source code and for SS version control, GM2 for resource optimization and analysis of connections between SS components, and GM3 for analysis of the SS functioning in dynamics. The paper includes examples of constructing all levels of the GM.
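    The automatic FM-to-GM step is, at its core, static extraction of a graph from source code. As a much-simplified sketch of the idea (a function-level call graph for Python source via its AST; the paper's GM levels are far richer than this):

      import ast

      def call_graph(source: str) -> set[tuple[str, str]]:
          """Edges (caller, callee) for plain-name calls found inside function bodies."""
          tree = ast.parse(source)
          edges = set()
          for fn in [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]:
              for node in ast.walk(fn):
                  if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                      edges.add((fn.name, node.func.id))
          return edges

      print(call_graph("def f():\n    g()\n\ndef g():\n    pass\n"))   # {('f', 'g')}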

  6. Effect of a 24-Month Physical Activity Intervention vs Health Education on Cognitive Outcomes in Sedentary Older Adults: The LIFE Randomized Trial.

    PubMed

    Sink, Kaycee M; Espeland, Mark A; Castro, Cynthia M; Church, Timothy; Cohen, Ron; Dodson, John A; Guralnik, Jack; Hendrie, Hugh C; Jennings, Janine; Katula, Jeffery; Lopez, Oscar L; McDermott, Mary M; Pahor, Marco; Reid, Kieran F; Rushing, Julia; Verghese, Joe; Rapp, Stephen; Williamson, Jeff D

    2015-08-25

    Epidemiological evidence suggests that physical activity benefits cognition, but results from randomized trials are limited and mixed. To determine whether a 24-month physical activity program results in better cognitive function, lower risk of mild cognitive impairment (MCI) or dementia, or both, compared with a health education program. A randomized clinical trial, the Lifestyle Interventions and Independence for Elders (LIFE) study, enrolled 1635 community-living participants at 8 US centers from February 2010 until December 2011. Participants were sedentary adults aged 70 to 89 years who were at risk for mobility disability but able to walk 400 m. A structured, moderate-intensity physical activity program (n = 818) that included walking, resistance training, and flexibility exercises or a health education program (n = 817) of educational workshops and upper-extremity stretching. Prespecified secondary outcomes of the LIFE study included cognitive function measured by the Digit Symbol Coding (DSC) task subtest of the Wechsler Adult Intelligence Scale (score range: 0-133; higher scores indicate better function) and the revised Hopkins Verbal Learning Test (HVLT-R; 12-item word list recall task) assessed in 1476 participants (90.3%). Tertiary outcomes included global and executive cognitive function and incident MCI or dementia at 24 months. At 24 months, DSC task and HVLT-R scores (adjusted for clinic site, sex, and baseline values) were not different between groups. The mean DSC task scores were 46.26 points for the physical activity group vs 46.28 for the health education group (mean difference, -0.01 points [95% CI, -0.80 to 0.77 points], P = .97). The mean HVLT-R delayed recall scores were 7.22 for the physical activity group vs 7.25 for the health education group (mean difference, -0.03 words [95% CI, -0.29 to 0.24 words], P = .84). No differences for any other cognitive or composite measures were observed. Participants in the physical activity group who were 80 years or older (n = 307) and those with poorer baseline physical performance (n = 328) had better changes in executive function composite scores compared with the health education group (P = .01 for interaction for both comparisons). Incident MCI or dementia occurred in 98 participants (13.2%) in the physical activity group and 91 participants (12.1%) in the health education group (odds ratio, 1.08 [95% CI, 0.80 to 1.46]). Among sedentary older adults, a 24-month moderate-intensity physical activity program compared with a health education program did not result in improvements in global or domain-specific cognitive function. clinicaltrials.gov Identifier: NCT01072500.

  7. SiC JFET Transistor Circuit Model for Extreme Temperature Range

    NASA Technical Reports Server (NTRS)

    Neudeck, Philip G.

    2008-01-01

    A technique for simulating extreme-temperature operation of integrated circuits that incorporate silicon carbide (SiC) junction field-effect transistors (JFETs) has been developed. The technique involves modification of NGSPICE, which is an open-source version of the popular Simulation Program with Integrated Circuit Emphasis (SPICE) general-purpose analog-integrated-circuit-simulating software. NGSPICE in its unmodified form is used for simulating and designing circuits made from silicon-based transistors that operate at or near room temperature. Two rapid modifications of NGSPICE source code enable SiC JFETs to be simulated to 500 C using the well-known Level 1 model for silicon metal oxide semiconductor field-effect transistors (MOSFETs). First, the default value of the MOSFET surface potential must be changed. In the unmodified source code, this parameter has a value of 0.6, which corresponds to slightly more than half the bandgap of silicon. In NGSPICE modified to simulate SiC JFETs, this parameter is changed to a value of 1.6, corresponding to slightly more than half the bandgap of SiC. The second modification consists of changing the temperature dependence of MOSFET transconductance and saturation parameters. The unmodified NGSPICE source code implements a T(sup -1.5) temperature dependence for these parameters. In order to mimic the temperature behavior of experimental SiC JFETs, a T(sup -1.3) temperature dependence must be implemented in the NGSPICE source code. Following these two simple modifications, the Level 1 MOSFET model of the NGSPICE circuit simulation program reasonably approximates the measured high-temperature behavior of experimental SiC JFETs properly operated with zero or reverse bias applied to the gate terminal. Modification of additional silicon parameters in the NGSPICE source code was not necessary to model experimental SiC JFET current-voltage performance across the entire temperature range from 25 to 500 C.
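    The second modification amounts to changing one temperature exponent, and its numerical effect can be illustrated stand-alone (hypothetical helper function, not the actual NGSPICE C source, which applies this scaling inside the Level 1 MOSFET model):

      def temp_factor(t_celsius, exponent, t_nom=300.15):
          """Scaling of transconductance/saturation parameters with temperature,
          relative to the SPICE nominal 27 C (300.15 K)."""
          return ((t_celsius + 273.15) / t_nom) ** exponent

      for exponent in (-1.5, -1.3):   # unmodified silicon default vs. SiC JFET fit
          print(exponent, round(temp_factor(500.0, exponent), 3))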

  8. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies

    PubMed Central

    Russ, Daniel E.; Ho, Kwan-Yuet; Colt, Joanne S.; Armenti, Karla R.; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P.; Karagas, Margaret R.; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T.; Johnson, Calvin A.; Friesen, Melissa C.

    2016-01-01

    Background Mapping job titles to standardized occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiologic studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Methods Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14,983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in two occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. Results For 11,991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6- and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (kappa: 0.6–0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Conclusions Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiologic studies. PMID:27102331
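    A minimal sketch of the scoring architecture described above, a logistic model combining the three classifier scores (the synthetic data and feature ordering here are hypothetical; SOCcer's real model was trained on the 14,983 expert-coded jobs):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Hypothetical training data: one row per (job description, candidate SOC) pair,
      # columns = [job-title score, task score, industry-prevalence score];
      # label = 1 if the expert assigned that SOC code to the job.
      rng = np.random.default_rng(0)
      X = rng.random((1000, 3))
      y = (X @ np.array([2.0, 1.0, 0.5]) + rng.normal(0, 0.5, 1000) > 1.9).astype(int)

      model = LogisticRegression().fit(X, y)      # empirical weights for the combined score
      scores = model.predict_proba(X[:5])[:, 1]   # score candidate SOC codes; assign the highest
      print(model.coef_, scores.argmax())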

  9. A Counterexample Guided Abstraction Refinement Framework for Verifying Concurrent C Programs

    DTIC Science & Technology

    2005-05-24

    source code are routinely executed. The source code is written in languages ranging from C/C++/Java to ML/OCaml. These languages differ not only in...from the difficulty of modeling computer programs—due to the complexity of programming languages as compared to hardware description languages—to...intermediate specification language lying between high-level Statechart-like formalisms and transition systems. Actions are encoded as changes in

  10. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, Jennifer M.

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  11. Programming in HAL/S

    NASA Technical Reports Server (NTRS)

    Ryer, M. J.

    1978-01-01

    HAL/S is a computer programming language; it is a representation for algorithms which can be interpreted by either a person or a computer. HAL/S compilers transform blocks of HAL/S code into machine language which can then be directly executed by a computer. When the machine language is executed, the algorithm specified by the HAL/S code (source) is performed. This document describes how to read and write HAL/S source.

  12. Scoping Calculations of Power Sources for Nuclear Electric Propulsion

    NASA Technical Reports Server (NTRS)

    Difilippo, F. C.

    1994-01-01

    This technical memorandum describes models and calculational procedures to fully characterize the nuclear island of power sources for nuclear electric propulsion. Two computer codes were written: one for the gas-cooled NERVA derivative reactor and the other for liquid metal-cooled fuel pin reactors. These codes are going to be interfaced by NASA with the balance of plant in order to make scoping calculations for mission analysis.

  13. Automated Discovery of Machine-Specific Code Improvements

    DTIC Science & Technology

    1984-12-01

    operation of the source language. Additional analysis may reveal special features of the target architecture that may be exploited to generate efficient code. Such analysis is optional...incorporate knowledge of the source language, but do not refer to features of the target machine. These early phases are sometimes referred to as the

  14. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)

    NASA Technical Reports Server (NTRS)

    Manteufel, R.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  15. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  16. (I Can’t Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research

    PubMed Central

    2017-01-01

    I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: “random chance,” which is based on probability sampling, “minimal information,” which yields at least one new code per sampling step, and “maximum information,” which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario. PMID:28746358
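    The "random chance" scenario lends itself to a compact simulation: keep drawing information sources until every code in the population has been observed at least once. A toy version follows (the paper's simulations vary the hypothetical populations systematically; this illustrates only the mechanism):

      import numpy as np

      def sample_until_saturation(code_probs, rng):
          """Number of sampled sources until every code has been observed once.
          code_probs[i] = probability that a sampled source holds code i."""
          seen = np.zeros(len(code_probs), dtype=bool)
          n = 0
          while not seen.all():
              seen |= rng.random(len(code_probs)) < code_probs
              n += 1
          return n

      rng = np.random.default_rng(1)
      draws = [sample_until_saturation(np.full(20, 0.3), rng) for _ in range(200)]
      print(np.mean(draws))   # mean sample size to reach theoretical saturation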

  17. ASTROPOP: ASTROnomical Polarimetry and Photometry pipeline

    NASA Astrophysics Data System (ADS)

    Campagnolo, Julio C. N.

    2018-05-01

    AstroPOP reduces almost any CCD photometry and image polarimetry data. For photometry reduction, the code performs source finding, aperture and PSF photometry, astrometry calibration using different automated and non-automated methods, and automated source identification and magnitude calibration based on online and local catalogs. For polarimetry, the code resolves linear and circular Stokes parameters produced by image beam splitter or polarizer polarimeters. In addition to the modular functions, ready-to-use pipelines based on configuration files and header keys are also provided with the code. AstroPOP was initially developed to reduce data from the IAGPOL polarimeter installed at Observatório Pico dos Dias (Brazil).

  18. Application discussion of source coding standard in voyage data recorder

    NASA Astrophysics Data System (ADS)

    Zong, Yonggang; Zhao, Xiandong

    2018-04-01

    This paper analyzes the disadvantages of the audio and video compression coding technology used by the Voyage Data Recorder, taking into account the improved performance of audio and video acquisition equipment. An approach to improving the audio and video compression coding technology of the voyage data recorder is proposed, and the feasibility of adopting the new compression coding technology is analyzed from both economic and technical perspectives.

  19. TIME-DEPENDENT MULTI-GROUP MULTI-DIMENSIONAL RELATIVISTIC RADIATIVE TRANSFER CODE BASED ON SPHERICAL HARMONIC DISCRETE ORDINATE METHOD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tominaga, Nozomu; Shibata, Sanshiro; Blinnikov, Sergei I., E-mail: tominaga@konan-u.ac.jp, E-mail: sshibata@post.kek.jp, E-mail: Sergei.Blinnikov@itep.ru

    We develop a time-dependent, multi-group, multi-dimensional relativistic radiative transfer code, which is required to numerically investigate radiation from relativistic fluids that are involved in, e.g., gamma-ray bursts and active galactic nuclei. The code is based on the spherical harmonic discrete ordinate method (SHDOM) which evaluates a source function including anisotropic scattering in spherical harmonics and implicitly solves the static radiative transfer equation with ray tracing in discrete ordinates. We implement treatments of time dependence, multi-frequency bins, Lorentz transformation, and elastic Thomson and inelastic Compton scattering to the publicly available SHDOM code. Our code adopts a mixed-frame approach; the source function is evaluated in the comoving frame, whereas the radiative transfer equation is solved in the laboratory frame. This implementation is validated using various test problems and comparisons with the results from a relativistic Monte Carlo code. These validations confirm that the code correctly calculates the intensity and its evolution in the computational domain. The code enables us to obtain an Eddington tensor that relates the first and third moments of intensity (energy density and radiation pressure) and is frequently used as a closure relation in radiation hydrodynamics calculations.

  20. Finite-SNR analysis for partial relaying cooperation with channel coding and opportunistic relay selection

    NASA Astrophysics Data System (ADS)

    Vu, Thang X.; Duhamel, Pierre; Chatzinotas, Symeon; Ottersten, Bjorn

    2017-12-01

    This work studies the performance of a cooperative network which consists of two channel-coded sources, multiple relays, and one destination. To achieve high spectral efficiency, we assume that a single time slot is dedicated to relaying. Conventional network-coded-based cooperation (NCC) selects the best relay which uses network coding to serve the two sources simultaneously. The bit error rate (BER) performance of NCC with channel coding, however, is still unknown. In this paper, we firstly study the BER of NCC via a closed-form expression and analytically show that NCC only achieves diversity of order two regardless of the number of available relays and the channel code. Secondly, we propose a novel partial relaying-based cooperation (PARC) scheme to improve the system diversity in the finite signal-to-noise ratio (SNR) regime. In particular, closed-form expressions for the system BER and diversity order of PARC are derived as a function of the operating SNR value and the minimum distance of the channel code. We analytically show that the proposed PARC achieves full (instantaneous) diversity order in the finite SNR regime, given that an appropriate channel code is used. Finally, numerical results verify our analysis and demonstrate a large SNR gain of PARC over NCC in the SNR region of interest.
