Neuroimaging of Brain Injuries and Disorders at Cleveland Clinic
2013-12-01
...LFBFs) in the human brain during a state of alert rest. These spontaneous fluctuations are correlated in brain regions with a high degree of connectivity ... behavioral performance on the finger tapping fMRI scan. One subject was excluded due to significantly different anatomy from the group (large ventricles ... FSL 4.0 image processing software package (http://www.fmrib.ox.ac.uk/fsl/tbss/index.html) to create a mean white matter skeleton. Specifically, all ...
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules such as FSL.FEAT and NPAIRS.CVA were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the rank of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
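The two metrics named above, prediction (classification) accuracy and statistical parametric image (SPI) reproducibility, can be illustrated with a brief sketch. This is not the NPAIRS or Fiswidgets code; the synthetic data, split scheme and classifier below are hypothetical stand-ins for the general idea.

```python
# Hedged sketch: prediction accuracy and split-half SPI reproducibility,
# in the spirit of the NPAIRS-style metrics described above.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: 40 scans x 500 voxels, two task conditions (0/1).
X = rng.normal(size=(40, 500))
y = np.repeat([0, 1], 20)

# Prediction metric: cross-validated classification accuracy.
accuracy = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()

# Reproducibility metric: correlation between statistical parametric images
# (here, simple condition-difference maps) computed on two independent halves.
half1, half2 = np.arange(0, 40, 2), np.arange(1, 40, 2)
def spi(idx):
    return X[idx][y[idx] == 1].mean(axis=0) - X[idx][y[idx] == 0].mean(axis=0)
reproducibility = np.corrcoef(spi(half1), spi(half2))[0, 1]

print(f"prediction accuracy={accuracy:.2f}, SPI reproducibility={reproducibility:.2f}")
```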
Nipype: a flexible, lightweight and extensible neuroimaging data processing framework in python.
Gorgolewski, Krzysztof; Burns, Christopher D; Madison, Cindee; Clark, Dav; Halchenko, Yaroslav O; Waskom, Michael L; Ghosh, Satrajit S
2011-01-01
Current neuroimaging software offers users an incredible opportunity to analyze their data in different ways, with different underlying assumptions. Several sophisticated software packages (e.g., AFNI, BrainVoyager, FSL, FreeSurfer, Nipy, R, SPM) are used to process and analyze large and often diverse (highly multi-dimensional) data. However, this heterogeneous collection of specialized applications creates several issues that hinder replicable, efficient, and optimal use of neuroimaging analysis approaches: (1) No uniform access to neuroimaging analysis software and usage information; (2) No framework for comparative algorithm development and dissemination; (3) Personnel turnover in laboratories often limits methodological continuity and training new personnel takes time; (4) Neuroimaging software packages do not address computational efficiency; and (5) Methods sections in journal articles are inadequate for reproducing results. To address these issues, we present Nipype (Neuroimaging in Python: Pipelines and Interfaces; http://nipy.org/nipype), an open-source, community-developed software package and scriptable library. Nipype solves these issues by providing Interfaces to existing neuroimaging software with uniform usage semantics and by facilitating interaction between these packages using Workflows. Nipype provides an environment that encourages interactive exploration of algorithms, eases the design of Workflows within and between packages, allows rapid comparative development of algorithms and reduces the learning curve necessary to use different packages. Nipype supports both local and remote execution on multi-core machines and clusters, without additional scripting. Nipype is Berkeley Software Distribution licensed, allowing anyone unrestricted usage. An open, community-driven development philosophy allows the software to quickly adapt and address the varied needs of the evolving neuroimaging community, especially in the context of increasing demand for reproducible research.
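A minimal sketch of the Interface/Workflow pattern described above, assuming Nipype and FSL are installed; the input filename is a hypothetical placeholder.

```python
# Minimal sketch of a Nipype Workflow chaining two FSL interfaces.
from nipype import Node, Workflow
from nipype.interfaces import fsl

# Brain extraction followed by motion correction, connected via a Workflow.
bet = Node(fsl.BET(in_file="sub-01_bold.nii.gz", functional=True), name="bet")
mcflirt = Node(fsl.MCFLIRT(), name="mcflirt")

wf = Workflow(name="preproc", base_dir="work")
wf.connect(bet, "out_file", mcflirt, "in_file")
wf.run()  # use wf.run(plugin="MultiProc") for local parallel execution
```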
Sharing brain mapping statistical results with the neuroimaging data model
Maumet, Camille; Auer, Tibor; Bowring, Alexander; Chen, Gang; Das, Samir; Flandin, Guillaume; Ghosh, Satrajit; Glatard, Tristan; Gorgolewski, Krzysztof J.; Helmer, Karl G.; Jenkinson, Mark; Keator, David B.; Nichols, B. Nolan; Poline, Jean-Baptiste; Reynolds, Richard; Sochat, Vanessa; Turner, Jessica; Nichols, Thomas E.
2016-01-01
Only a tiny fraction of the data and metadata produced by an fMRI study is finally conveyed to the community. This lack of transparency not only hinders the reproducibility of neuroimaging results but also impairs future meta-analyses. In this work we introduce NIDM-Results, a format specification providing a machine-readable description of neuroimaging statistical results along with key image data summarising the experiment. NIDM-Results provides a unified representation of mass univariate analyses including a level of detail consistent with available best practices. This standardized representation allows authors to relay methods and results in a platform-independent regularized format that is not tied to a particular neuroimaging software package. Tools are available to export NIDM-Results graphs and associated files from the widely used SPM and FSL software packages, and the NeuroVault repository can import NIDM-Results archives. The specification is publicly available at: http://nidm.nidash.org/specs/nidm-results.html. PMID:27922621
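NIDM-Results documents are serialized as RDF, so a generic RDF library can inspect an exported pack. A hedged sketch with rdflib follows; the Turtle filename is a placeholder and the exact contents depend on the exporting package.

```python
# Hedged sketch: inspecting an (unzipped) NIDM-Results document with rdflib.
import rdflib

g = rdflib.Graph()
g.parse("nidm.ttl", format="turtle")

# List entity types and how often they occur, to see what the export
# describes (e.g. statistic maps, contrast estimates, software used).
query = """
SELECT DISTINCT ?type (COUNT(?s) AS ?n)
WHERE { ?s a ?type . }
GROUP BY ?type
"""
for row in g.query(query):
    print(row.type, row.n)
```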
Rane, Swati; Plassard, Andrew; Landman, Bennett A.; Claassen, Daniel O.; Donahue, Manus J.
2017-01-01
This work explores the feasibility of combining anatomical MRI data across two public repositories, namely the Alzheimer’s Disease Neuroimaging Initiative (ADNI) and the Progressive Parkinson’s Markers Initiative (PPMI). We compared cortical thickness and subcortical volumes in cognitively normal older adults between datasets with distinct imaging parameters to assess if they would provide equivalent information. Three distinct datasets were identified. Major differences in data were scanner manufacturer and the use of magnetization inversion to enhance tissue contrast. Equivalent datasets, i.e., those providing similar volumetric measurements in cognitively normal controls, were identified in ADNI and PPMI. These were datasets obtained on the Siemens scanner with TI = 900 ms. Our secondary goal was to assess the agreement between subcortical volumes that are obtained with different software packages. Three subcortical measurement applications (FSL, FreeSurfer, and a recent multi-atlas approach) were compared. Our results show significant agreement in the measurements of caudate, putamen, pallidum, and hippocampus across the packages and poor agreement between measurements of accumbens and amygdala. This is likely due to their smaller size and lack of gray matter-white matter tissue contrast for accurate segmentation. This work provides a segue to combine imaging data from ADNI and PPMI to increase statistical power as well as to interrogate common mechanisms in disparate pathologies such as Alzheimer’s and Parkinson’s diseases. It lays the foundation for comparison of anatomical data acquired with disparate imaging parameters and analyzed with disparate software tools. Furthermore, our work partly explains the variability in the results of studies using different software packages. PMID:29756095
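The cross-package agreement check described above can be sketched as a simple per-structure comparison. This is an illustration only; the CSV filenames and column names are hypothetical, and the paper's own statistics may differ.

```python
# Hedged sketch: agreement between subcortical volumes from two tools
# for the same subjects (filenames and column names are hypothetical).
import pandas as pd

fsl = pd.read_csv("fsl_first_volumes.csv", index_col="subject")   # mm^3 per structure
fs = pd.read_csv("freesurfer_volumes.csv", index_col="subject")

for structure in ["Caudate", "Putamen", "Pallidum", "Hippocampus", "Amygdala", "Accumbens"]:
    paired = pd.DataFrame({"fsl": fsl[structure], "fs": fs[structure]}).dropna()
    r = paired["fsl"].corr(paired["fs"])                 # across-subject correlation
    bias = (paired["fsl"] - paired["fs"]).mean()         # mean FSL-FreeSurfer difference
    print(f"{structure}: r={r:.2f}, mean difference={bias:.0f} mm^3")
```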
Suppa, Per; Hampel, Harald; Kepp, Timo; Lange, Catharina; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph
2016-01-01
MRI-based hippocampus volume, a core feasible biomarker of Alzheimer's disease (AD), is not yet widely used in clinical patient care, partly due to lack of validation of software tools for hippocampal volumetry that are compatible with routine workflow. Here, we evaluate fully-automated and computationally efficient hippocampal volumetry with FSL-FIRST for prediction of AD dementia (ADD) in subjects with amnestic mild cognitive impairment (aMCI) from phase 1 of the Alzheimer's Disease Neuroimaging Initiative. Receiver operating characteristic analysis of FSL-FIRST hippocampal volume (corrected for head size and age) revealed an area under the curve of 0.79, 0.70, and 0.70 for prediction of aMCI-to-ADD conversion within 12, 24, or 36 months, respectively. Thus, FSL-FIRST provides about the same power for prediction of progression to ADD in aMCI as other volumetry methods.
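The ROC analysis described above reduces to scoring each aMCI subject by (head-size- and age-corrected) hippocampal volume and computing an AUC against later conversion. A hedged sketch, with hypothetical column names:

```python
# Hedged sketch of the ROC analysis; the CSV layout is invented.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("amci_subjects.csv")   # columns: hippo_vol_corrected, converted_24m (0/1)

# Smaller hippocampi predict conversion, so negate the volume to obtain a
# score that increases with risk before computing the AUC.
auc = roc_auc_score(df["converted_24m"], -df["hippo_vol_corrected"])
print(f"AUC for 24-month conversion: {auc:.2f}")
```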
Real-Time fMRI Pattern Decoding and Neurofeedback Using FRIEND: An FSL-Integrated BCI Toolbox
Sato, João R.; Basilio, Rodrigo; Paiva, Fernando F.; Garrido, Griselda J.; Bramati, Ivanei E.; Bado, Patricia; Tovar-Moll, Fernanda; Zahn, Roland; Moll, Jorge
2013-01-01
The demonstration that humans can learn to modulate their own brain activity based on feedback of neurophysiological signals opened up exciting opportunities for fundamental and applied neuroscience. Although EEG-based neurofeedback has been long employed both in experimental and clinical investigation, functional MRI (fMRI)-based neurofeedback emerged as a promising method, given its superior spatial resolution and ability to gauge deep cortical and subcortical brain regions. In combination with improved computational approaches, such as pattern recognition analysis (e.g., Support Vector Machines, SVM), fMRI neurofeedback and brain decoding represent key innovations in the field of neuromodulation and functional plasticity. Expansion in this field and its applications critically depend on the existence of freely available, integrated and user-friendly tools for the neuroimaging research community. Here, we introduce FRIEND, a graphic-oriented user-friendly interface package for fMRI neurofeedback and real-time multivoxel pattern decoding. The package integrates routines for image preprocessing in real-time, ROI-based feedback (single-ROI BOLD level and functional connectivity) and brain decoding-based feedback using SVM. FRIEND delivers an intuitive graphic interface with flexible processing pipelines involving optimized procedures embedding widely validated packages, such as FSL and libSVM. In addition, a user-defined visual neurofeedback module allows users to easily design and run fMRI neurofeedback experiments using ROI-based or multivariate classification approaches. FRIEND is open-source and free for non-commercial use. Processing tutorials and extensive documentation are available. PMID:24312569
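The decoding component described above, SVM classification of preprocessed volumes, can be illustrated offline with scikit-learn rather than FRIEND's libSVM bindings; the data below are synthetic placeholders.

```python
# Hedged sketch: SVM decoding of ROI feature vectors from two mental states.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Hypothetical data: 120 volumes x 50 ROI-averaged features, two conditions.
X = rng.normal(size=(120, 50))
y = np.tile([0, 1], 60)

clf = SVC(kernel="linear")
print("decoding accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# In a real-time setting each new preprocessed volume would be classified with
# clf.predict(new_features.reshape(1, -1)) to drive the feedback display.
```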
Reproducibility of neuroimaging analyses across operating systems
Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.
2015-01-01
Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757
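The subcortical comparison above relies on the Dice coefficient between segmentations of the same subject produced on different systems. A minimal sketch, assuming two binary label images with hypothetical filenames:

```python
# Hedged sketch: Dice coefficient between two binary segmentations of the
# same subject produced on different operating systems.
import nibabel as nib
import numpy as np

a = nib.load("seg_centos.nii.gz").get_fdata() > 0
b = nib.load("seg_ubuntu.nii.gz").get_fdata() > 0

dice = 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())
print(f"Dice = {dice:.3f}")   # values near 1 indicate nearly identical labelings
```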
Final Syllable Lengthening (FSL) in infant vocalizations.
Nathani, Suneeti; Oller, D Kimbrough; Cobo-Lewis, Alan B
2003-02-01
Final Syllable Lengthening (FSL) has been extensively examined in infant vocalizations in order to determine whether its basis is biological or learned. Findings suggest there may be a U-shaped developmental trajectory for FSL. The present study sought to verify this pattern and to determine whether vocal maturity and deafness influence FSL. Eight normally hearing infants, aged 0;3 to 1;0, and eight deaf infants, aged 0;8 to 4;0, were examined at three levels of prelinguistic vocal development: precanonical, canonical, and postcanonical. FSL was found at all three levels suggesting a biological basis for this phenomenon. Individual variability was, however, considerable. Reduction in the magnitude of FSL across the three sessions provided some support for a downward trend for FSL in infancy. Findings further indicated that auditory deprivation can significantly affect temporal aspects of infant speech production.
Head circumference as a useful surrogate for intracranial volume in older adults.
Hshieh, Tammy T; Fox, Meaghan L; Kosar, Cyrus M; Cavallari, Michele; Guttmann, Charles R G; Alsop, David; Marcantonio, Edward R; Schmitt, Eva M; Jones, Richard N; Inouye, Sharon K
2016-01-01
Intracranial volume (ICV) has been proposed as a measure of maximum lifetime brain size. Accurate ICV measures require neuroimaging which is not always feasible for epidemiologic investigations. We examined head circumference as a useful surrogate for ICV in older adults. 99 older adults underwent Magnetic Resonance Imaging (MRI). ICV was measured by Statistical Parametric Mapping 8 (SPM8) software or Functional MRI of the Brain Software Library (FSL) extraction with manual editing, typically considered the gold standard. Head circumferences were determined using standardized tape measurement. We examined estimated correlation coefficients between head circumference and the two MRI-based ICV measurements. Head circumference and ICV by SPM8 were moderately correlated (overall r = 0.73, men r = 0.67, women r = 0.63). Head circumference and ICV by FSL were also moderately correlated (overall r = 0.69, men r = 0.63, women r = 0.49). Head circumference measurement was strongly correlated with MRI-derived ICV. Our study presents a simple method to approximate ICV among older patients, which may prove useful as a surrogate for cognitive reserve in large scale epidemiologic studies of cognitive outcomes. This study also suggests the stability of head circumference correlation with ICV throughout the lifespan.
Brand, Sarel Jacobus; Harvey, Brian Herbert
2017-08-01
Co-morbid depression with post-traumatic stress disorder (PTSD) is often treatment resistant. In developing a preclinical model of treatment-resistant depression (TRD), we combined animal models of depression and PTSD to produce an animal with more severe as well as treatment-resistant depressive-like behaviours. Male Flinders sensitive line (FSL) rats, a genetic animal model of depression, were exposed to a stress re-stress model of PTSD [time-dependent sensitisation (TDS)] and compared with stress-naive controls. Seven days after TDS stress, depressive-like and coping behaviours as well as hippocampal and cortical noradrenaline (NA) and 5-hydroxyindoleacetic acid (5HIAA) levels were analysed. Response to sub-chronic imipramine treatment (IMI; 10 mg/kg s.c.×7 days) was subsequently studied. FSL rats demonstrated bio-behavioural characteristics of depression. Exposure to TDS stress in FSL rats correlated negatively with weight gain, while demonstrating reduced swimming behaviour and increased immobility versus unstressed FSL rats. IMI significantly reversed depressive-like (immobility) behaviour and enhanced active coping behaviour (swimming and climbing) in FSL rats. The latter was significantly attenuated in FSL rats exposed to TDS versus unstressed FSL rats. IMI reversed reduced 5HIAA levels in unstressed FSL rats, whereas exposure to TDS negated this effect. Lowered NA levels in FSL rats were sustained after TDS with IMI significantly reversing this in the hippocampus. Combining a gene-X-environment model of depression with a PTSD paradigm produces exaggerated depressive-like symptoms that display an attenuated response to antidepressant treatment. This work confirms combining FSL rats with TDS exposure as a putative animal model of TRD.
PANDA: a pipeline toolbox for analyzing brain diffusion images.
Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang
2013-01-01
Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named "Pipeline for Analyzing braiN Diffusion imAges" (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.
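One of PANDA's core steps, tensor fitting to obtain FA/MD maps, can be sketched through Nipype's FSL wrappers (PANDA itself calls the same FSL tools from MATLAB). Filenames are hypothetical and FSL must be installed.

```python
# Hedged sketch: fit the diffusion tensor with FSL's dtifit via Nipype.
from nipype.interfaces import fsl

dti = fsl.DTIFit(
    dwi="dwi_eddy_corrected.nii.gz",
    bvecs="dwi.bvec",
    bvals="dwi.bval",
    mask="brain_mask.nii.gz",
    base_name="dtifit",
)
result = dti.run()
print(result.outputs.FA, result.outputs.MD)  # paths to the FA and MD images
```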
Kwon, Tae-Rin; Mun, Seog Kyun; Oh, Chang Taek; Hong, Hyuckki; Choi, Yeon Shik; Kim, Bong-Jun; Kim, Beom Joon
2014-01-01
Full spectrum light (FSL) includes UVA, visible light and infrared light. Many studies have investigated the application of FSL in severe cases of atopic dermatitis (AD) in humans; however, FSL has not yet been studied in an animal model. The purpose of this study was to evaluate the therapeutic effects of FSL on AD-like skin lesions using NC/Nga mice, with the aim of mitigating itching and attenuating the expression of adhesion molecules. We examined the effects of FSL on mite allergen-treated NC/Nga mice by assessing skin symptom severity, ear thickness, serum IgE levels, and the cytokine expression. We examined the histology of lesions using hematoxylin-eosin, toluidine blue and immunohistochemical staining. Our findings suggest that FSL phototherapy exerts positive therapeutic effects on Dermatophagoides farinae (Df)-induced AD-like skin lesions in NC/Nga mice by reducing IgE levels, thus promoting recovery of the skin barrier. The mechanisms by which FSL phototherapy exerts its effects may also involve the inhibition of scratching behavior, reduction of IL-6 levels and reductions in adhesion molecule expression. The present study indicates that FSL phototherapy inhibits the development of AD in NC/Nga mice by suppressing cytokine, chemokine and adhesion molecule expression, and thus, could potentially be useful in treating AD.
Head Circumference as a Useful Surrogate for Intracranial Volume in Older Adults
Hshieh, Tammy T.; Fox, Meaghan L.; Kosar, Cyrus M.; Cavallari, Michele; Guttmann, Charles R.G.; Alsop, David; Marcantonio, Edward R.; Schmitt, Eva M.; Jones, Richard N.; Inouye, Sharon K.
2015-01-01
Background: Intracranial volume (ICV) has been proposed as a measure of maximum lifetime brain size. Accurate ICV measures require neuroimaging which is not always feasible for epidemiologic investigations. We examined head circumference as a useful surrogate for intracranial volume in older adults. Methods: 99 older adults underwent Magnetic Resonance Imaging (MRI). ICV was measured by Statistical Parametric Mapping 8 (SPM8) software or Functional MRI of the Brain Software Library (FSL) extraction with manual editing, typically considered the gold standard. Head circumferences were determined using standardized tape measurement. We examined estimated correlation coefficients between head circumference and the two MRI-based ICV measurements. Results: Head circumference and ICV by SPM8 were moderately correlated (overall r=0.73, men r=0.67, women r=0.63). Head circumference and ICV by FSL were also moderately correlated (overall r=0.69, men r=0.63, women r=0.49). Conclusions: Head circumference measurement was strongly correlated with MRI-derived ICV. Our study presents a simple method to approximate ICV among older patients, which may prove useful as a surrogate for cognitive reserve in large scale epidemiologic studies of cognitive outcomes. This study also suggests the stability of head circumference correlation with ICV throughout the lifespan. PMID:26631180
The Flinders Sensitive Line rat: a selectively bred putative animal model of depression.
Overstreet, David H; Friedman, Elliot; Mathé, Aleksander A; Yadid, Gal
2005-01-01
The Flinders Sensitive Line (FSL) rats were originally selectively bred for increased responses to an anticholinesterase agent. The FSL rat partially resembles depressed individuals because it exhibits reduced appetite and psychomotor function but exhibits normal hedonic responses and cognitive function. The FSL rat also exhibits sleep and immune abnormalities that are observed in depressed individuals. Neurochemical and/or pharmacological evidence suggests that the FSL rat exhibits changes consistent with the cholinergic, serotonergic, dopaminergic, NPY, and circadian rhythm models but not the noradrenergic, HPA axis or GABAergic models of depression. However, evidence for the genetic basis of these changes is lacking and it remains to be determined which, if any, of the neurochemical changes are primary to the behavioral alterations. The FSL rat model has been very useful as a screen for antidepressants because known antidepressants reduced swim test immobility when given chronically and psychomotor stimulants did not. Furthermore, rolipram and a melatonin agonist were shown to have anti-immobility effects in the FSL rats and later to have antidepressant effects in humans. Thus, the FSL rat model of depression exhibits some behavioral, neurochemical, and pharmacological features that have been reported in depressed individuals and has been very effective in detecting antidepressants.
Angunawela, Romesh I; Riau, Andri; Chaurasia, Shyam S; Tan, Donald T; Mehta, Jodhbir S
2012-05-04
To measure real-time intraocular pressure (IOP) during trephination with a manual suction trephine (MST) and the femtosecond laser (FSL), and to assess endothelial cell damage, incision geometry, and wound healing response with these procedures. IOP was monitored with an intracameral sensor. Eight rabbits underwent manual suction trephination. Eight rabbits had FSL trephination (FSL-T). Slit lamp photography, confocal microscopy, and anterior segment optical coherence tomography (AS-OCT) were performed at baseline and postoperatively. Animals were sacrificed at 4 hours and 3 days. Tissue was examined with scanning electron microscopy (SEM) and immunohistochemistry for an array of wound-healing markers. Separately, 6 human corneas had MST (3) and FSL-T (3). Incision geometry was imaged with high resolution Optovue AS-OCT. The average IOP during MST and FSL-T was similar (37 mm Hg). There was wider IOP fluctuation during the MST cutting phase (60 mm Hg maximum). There were 1-2 rows of endothelial loss on either side of the incision for FSL-T and 2-5 rows deep for MST. Immune cell responses at 4 hours (CD11b) were comparable, greater apoptosis with FSL-T (TUNEL) occurred at 4 hours, and there was increased keratocyte proliferation at 3 days (Ki67) with FSL-T. There was significantly greater undercutting of the cornea with MST (46.86 degrees versus 16.72 degrees). There is more IOP variation during MST. Average IOP is 37 mm Hg for both techniques. More endothelial damage and undercutting of the cornea occurs with MST. The wound healing response to FSL-T appears greater at 3 days.
Mahon, Katie; Burdick, Katherine E; Wu, Jinghui; Ardekani, Babak A; Szeszko, Philip R
2012-01-01
Background: Impulsivity is characteristic of individuals with bipolar disorder and may be a contributing factor to the high rate of suicide in patients with this disorder. Although white matter abnormalities have been implicated in the pathophysiology of bipolar disorder, their relationship to impulsivity and suicidality in this disorder has not been well-investigated. Methods: Diffusion tensor imaging scans were acquired in 14 bipolar disorder patients with a prior suicide attempt, 15 bipolar disorder patients with no prior suicide attempt, and 15 healthy volunteers. Bipolar disorder patients received clinical assessments including measures of impulsivity, depression, mania, and anxiety. Images were processed using the Tract-Based Spatial Statistics method in the FSL software package. Results: Bipolar disorder patients with a prior suicide attempt had lower fractional anisotropy (FA) within the left orbital frontal white matter (p < 0.05, corrected) and higher overall impulsivity compared to patients without a previous suicide attempt. Among patients with a prior suicide attempt, FA in the orbital frontal white matter region correlated inversely with motor impulsivity. Conclusions: Abnormal orbital frontal white matter may play a role in impulsive and suicidal behavior among patients with bipolar disorder. PMID:22329475
Final Syllable Lengthening (FSL) in Infant Vocalizations.
ERIC Educational Resources Information Center
Nathani, Suneeti; Oller, D. Kimbrough; Cobo-Lewis, Alan B.
2003-01-01
Sought to verify research findings that suggest there may be a U-shaped developmental trajectory for final syllable lengthening (FSL). Attempted to determine whether vocal maturity and deafness influence FSL. Eight normally hearing infants and eight deaf infants were examined at three levels of prelinguistic vocal development. (Author/VWL)
Adapted Finnegan scoring list for observation of anti-depressant exposed infants.
Kieviet, Noera; van Ravenhorst, Mariëtte; Dolman, Koert M; van de Ven, Peter M; Heres, Marion; Wennink, Hanneke; Honig, Adriaan
2015-01-01
The Finnegan scoring list (FSL) is widely used to screen for poor neonatal adaptation in infants exposed to anti-depressants in utero. However, the large number of FSL items and the differential weighting of each item make the FSL time consuming. The aim of this study was to shorten and simplify the FSL while preserving its clinimetric properties. This observational study examined infants exposed to an anti-depressant during pregnancy who were admitted to a maternity ward for at least 72 h. Trained nurses completed the FSL three times daily. Items for the adapted FSL were selected through forward analysis whereby the number of selected items was based on the area under the curve (AUC). Internal validity was assessed by cross-validation. 183 infants met the inclusion criteria. By forward analysis, eight equally-weighted items resulted in an AUC of 0.91. In cross-validation, the mean AUC was 0.89 for 8 items. The adapted FSL had a sensitivity of 97.7% and specificity of 37.0% at a cut-off of 1, and a sensitivity of 41.9% and specificity of 86.2% at a cut-off of 2. An adapted FSL with eight equally-weighted items has acceptable clinimetric properties and can serve as an easy-to-apply screening tool in infants exposed to anti-depressants during pregnancy.
Adduru, Viraj R; Michael, Andrew M; Helguera, Maria; Baum, Stefi A; Moore, Gregory J
2017-09-01
Purpose: To validate the use of thick-section clinically acquired magnetic resonance (MR) imaging data for estimating total brain volume (TBV), gray matter (GM) volume (GMV), and white matter (WM) volume (WMV) by using three widely used automated toolboxes: SPM (www.fil.ion.ucl.ac.uk/spm/), FreeSurfer (surfer.nmr.mgh.harvard.edu), and FSL (FMRIB Software Library; Oxford Centre for Functional MR Imaging of the Brain, Oxford, England, https://fsl.fmrib.ox.ac.uk/fsl). Materials and Methods: MR images from a clinical archive were used and data were deidentified. The three methods were applied to estimate brain volumes from thin-section research-quality brain MR images and routine thick-section clinical MR images acquired from the same 38 patients (age range, 1-71 years; mean age, 22 years; 11 women). By using these automated methods, TBV, GMV, and WMV were estimated. Thin- versus thick-section volume comparisons were made for each method by using intraclass correlation coefficients (ICCs). Results: SPM exhibited excellent ICCs (0.97, 0.85, and 0.83 for TBV, GMV, and WMV, respectively). FSL exhibited ICCs of 0.69, 0.51, and 0.60 for TBV, GMV, and WMV, respectively, which were lower than those with SPM. FreeSurfer exhibited an excellent ICC of 0.63 only for TBV. Application of SPM's voxel-based morphometry to the modulated thin-section images and interpolated thick-section images showed fair to excellent ICCs (0.37-0.98) for the majority of brain regions (88.47% [306,924 of 346,916 voxels] of WM and 80.35% [377,282 of 469,502 voxels] of GM). Conclusion: Thick-section clinical-quality MR images can be reliably used for computing quantitative brain metrics such as TBV, GMV, and WMV by using SPM.
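The agreement statistic used above is the intraclass correlation coefficient. A sketch of ICC(2,1) (two-way random effects, absolute agreement, single measure) follows; this is a standard formulation rather than necessarily the exact variant the authors computed, and the paired volumes are invented.

```python
# Hedged sketch: ICC(2,1) between thin- and thick-section volume estimates.
import numpy as np

def icc_2_1(x):
    """x: array of shape (n_subjects, k_raters); Shrout & Fleiss ICC(2,1)."""
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)
    col_means = x.mean(axis=0)
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical paired TBV estimates (mL): thin-section vs thick-section.
thin = np.array([1450.0, 1320.0, 1580.0, 1290.0, 1410.0])
thick = np.array([1462.0, 1335.0, 1560.0, 1301.0, 1422.0])
print(f"ICC(2,1) = {icc_2_1(np.column_stack([thin, thick])):.2f}")
```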
Jallouli, Raida; Parsiegla, Goetz; Carrière, Frédéric; Gargouri, Youssef; Bezzine, Sofiane
2017-01-01
The gene coding for a lipase of Fusarium solani, designated as FSL2, shows an open reading frame of 906 bp encoding a 301-amino acid polypeptide with a molecular mass of 30 kDa. Based on sequence similarity with other fungal lipases, FSL2 contains a catalytic triad consisting of Ser144, Asp198, and His256. FSL2 cDNA was subcloned into the pGAPZαA vector containing the Saccharomyces cerevisiae α-factor signal sequence, and this construct was used to transform Pichia pastoris and achieve high-level extracellular production of the FSL2 lipase. Maximum lipase activity was observed after 48 h. The optimum activity of the purified recombinant enzyme was measured at pH 8.0-9.0 and 37°C. FSL2 is remarkably stable at alkaline pH values up to 12 and at temperatures below 40°C. It has high catalytic efficiency towards triglycerides with short to long chain fatty acids, but with a marked preference for medium and long chain fatty acids. FSL2 activity is decreased at sodium taurodeoxycholate concentrations above the Critical Micelle Concentration (CMC) of this anionic detergent. However, lipase activity is enhanced by Ca2+ and inhibited by EDTA or Cu2+ and partially by Mg2+ or K+. In silico docking of medium chain triglycerides, monogalactolipids (MGDG), digalactolipids (DGDG) and long chain phospholipids in the active site of FSL2 reveals structural solutions.
Cook, A; Pfeiffer, L-M; Thiele, S; Coenen, V A; Döbrössy, M D
2017-10-01
Major Depressive Disorder (MDD) is a heterogeneous psychiatric disorder with broad symptomatic manifestations. The current study examined, for the first time, olfactory memory and discrimination in the Flinders Sensitive Line (FSL) rodent model of depression. Male FSL rats and controls were trained on an Olfactory Discrimination (OD) and a Social Interaction (SI) test. On the OD test, the FSL and controls performed similarly at the shortest inter-trial interval (5 min); however, with an extended delay of 30 min, the FSLs had a recall and odour discrimination deficit. At the longest delay (60 min) both groups performed poorly. The FSL rats (i) had a deficit in olfactory discrimination, suggesting impairment in olfactory memory and recall, and (ii) were less likely to socialize with unfamiliar rats. The data suggest that FSL animals have an impaired olfactory information processing capacity.
The galactolipase activity of Fusarium solani (phospho)lipase.
Jallouli, Raida; Othman, Houcemeddine; Amara, Sawsan; Parsiegla, Goetz; Carriere, Frédéric; Srairi-Abid, Najet; Gargouri, Youssef; Bezzine, Sofiane
2015-03-01
The purified (phospho)lipase of Fusarium solani (FSL) was known to be active on both triglycerides and phospholipids. This study aimed at assessing the potential of this enzyme in hydrolyzing galactolipids. FSL was found to hydrolyze synthetic medium-chain monogalactosyldiacylglycerol (4658 ± 146 U/mg on DiC8-MGDG) and digalactosyldiacylglycerol (3785 ± 83 U/mg on DiC8-DGDG), as well as natural long-chain monogalactosyldiacylglycerol extracted from leek leaves (991 ± 85 U/mg), at high rates. It is the microbial enzyme with the highest activity on galactolipids identified so far, with a level of activity comparable to that of pancreatic lipase-related protein 2. FSL maximum activity on galactolipids was measured at pH 8. The analysis of the hydrolysis product of natural MGDG from leek showed that FSL hydrolyzes preferentially the ester bond at the sn-1 position of galactolipids. To investigate the structure-activity relationships of FSL, a 3D model of this enzyme was built. In silico docking of medium-chain MGDG and DGDG and phospholipid in the active site of FSL reveals structural solutions which are in concordance with in vitro tests.
STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.
Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X
2009-08-01
This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.
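The parallel-processing pattern described above, one PBS job per subject, can be sketched as follows. The queue directives, script names and the pipeline command are hypothetical placeholders, not the actual STAMPS scripts.

```python
# Hedged sketch: generate and submit one PBS job per subject with qsub.
import subprocess
from pathlib import Path

subjects = ["sub01", "sub02", "sub03"]
for sub in subjects:
    script = Path(f"{sub}.pbs")
    script.write_text(
        "#!/bin/bash\n"
        "#PBS -l nodes=1:ppn=4,walltime=08:00:00\n"
        f"#PBS -N stamp_{sub}\n"
        f"run_stamp_pipeline.sh {sub}\n"   # placeholder for the per-subject pipeline
    )
    subprocess.run(["qsub", str(script)], check=True)
```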
Laser-assisted cataract surgery: benefits and barriers.
Hatch, Kathryn M; Talamo, Jonathan H
2014-01-01
The use of the femtosecond laser (FSL) in cataract surgery may represent the largest advancement in the field since the inception of phacoemulsification. The goal of this review is to outline the benefits of and barriers to this technology. There are several significant potential benefits of the FSL in cataract surgery over conventional manual cataract surgery: precise capsulotomy formation, clear corneal and limbal relaxing incision construction, lens fragmentation, and lens softening. Evidence suggests that refractive benefits include more precise effective lens position as well as reduced effective phacoemulsification time with the use of FSL compared with manual surgery. Patients with conditions such as Fuchs' endothelial dystrophy, pseudoexfoliation, history of trauma, or brunescent cataracts may particularly benefit from this technology. There are significant financial and logistical issues to consider prior to the purchase of an FSL, including the cost of the laser, charges to patients, and how the laser affects patient flow in the operating room. The FSL may significantly change the current approach to cataract surgery.
Gorgolewski, Krzysztof J; Varoquaux, Gael; Rivera, Gabriel; Schwartz, Yannick; Sochat, Vanessa V; Ghosh, Satrajit S; Maumet, Camille; Nichols, Thomas E; Poline, Jean-Baptiste; Yarkoni, Tal; Margulies, Daniel S; Poldrack, Russell A
2016-01-01
NeuroVault.org is dedicated to storing outputs of analyses in the form of statistical maps, parcellations and atlases, a unique strategy that contrasts with most neuroimaging repositories that store raw acquisition data or stereotaxic coordinates. Such maps are indispensable for performing meta-analyses, validating novel methodology, and deciding on precise outlines for regions of interest (ROIs). NeuroVault is open to maps derived from both healthy and clinical populations, as well as from various imaging modalities (sMRI, fMRI, EEG, MEG, PET, etc.). The repository uses modern web technologies such as interactive web-based visualization, cognitive decoding, and comparison with other maps to provide researchers with efficient, intuitive tools to improve the understanding of their results. Each dataset and map is assigned a permanent Universal Resource Locator (URL), and all of the data is accessible through a REST Application Programming Interface (API). Additionally, the repository supports the NIDM-Results standard and has the ability to parse outputs from popular FSL and SPM software packages to automatically extract relevant metadata. This ease of use, modern web-integration, and pioneering functionality holds promise to improve the workflow for making inferences about and sharing whole-brain statistical maps.
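A hedged sketch of querying the repository's REST API with the requests library; the endpoint path and JSON field names are assumptions based on the public API description above and may differ from the live service.

```python
# Hedged sketch: list a few NeuroVault collections via the REST API.
import requests

resp = requests.get("https://neurovault.org/api/collections/", params={"format": "json"})
resp.raise_for_status()
for collection in resp.json().get("results", [])[:5]:
    print(collection.get("id"), collection.get("name"))
```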
Hernández, Moisés; Guerrero, Ginés D.; Cecilia, José M.; García, José M.; Inuggi, Alberto; Jbabdi, Saad; Behrens, Timothy E. J.; Sotiropoulos, Stamatios N.
2013-01-01
With the performance of central processing units (CPUs) having effectively reached a limit, parallel processing offers an alternative for applications with high computational demands. Modern graphics processing units (GPUs) are massively parallel processors that can execute simultaneously thousands of light-weight processes. In this study, we propose and implement a parallel GPU-based design of a popular method that is used for the analysis of brain magnetic resonance imaging (MRI). More specifically, we are concerned with a model-based approach for extracting tissue structural information from diffusion-weighted (DW) MRI data. DW-MRI offers, through tractography approaches, the only way to study brain structural connectivity, non-invasively and in-vivo. We parallelise the Bayesian inference framework for the ball & stick model, as it is implemented in the tractography toolbox of the popular FSL software package (University of Oxford). For our implementation, we utilise the Compute Unified Device Architecture (CUDA) programming model. We show that the parameter estimation, performed through Markov Chain Monte Carlo (MCMC), is accelerated by at least two orders of magnitude, when comparing a single GPU with the respective sequential single-core CPU version. We also illustrate similar speed-up factors (up to 120x) when comparing a multi-GPU with a multi-CPU implementation. PMID:23658616
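The computation being accelerated, MCMC sampling of ball & stick parameters in each voxel, can be illustrated with a CPU-only toy in NumPy. This is not the FSL bedpostx code or the CUDA implementation; the acquisition scheme, noise model and flat priors are simplified assumptions.

```python
# Hedged toy: Metropolis MCMC for a single-voxel ball & stick model.
import numpy as np

rng = np.random.default_rng(0)

def stick_dir(theta, phi):
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def predict(params, bvals, bvecs):
    f, d, theta, phi = params
    dot = bvecs @ stick_dir(theta, phi)
    return (1 - f) * np.exp(-bvals * d) + f * np.exp(-bvals * d * dot ** 2)

# Synthetic acquisition: 64 directions at b=1000 plus known true parameters.
bvals = np.full(64, 1000.0)
bvecs = rng.normal(size=(64, 3))
bvecs /= np.linalg.norm(bvecs, axis=1, keepdims=True)
true = np.array([0.6, 1.5e-3, 1.0, 0.5])
signal = predict(true, bvals, bvecs) + rng.normal(0, 0.02, size=64)

def log_likelihood(params):
    f, d = params[0], params[1]
    if not (0 < f < 1) or d <= 0:
        return -np.inf
    resid = signal - predict(params, bvals, bvecs)
    return -0.5 * np.sum(resid ** 2) / 0.02 ** 2

# Metropolis sampling over (f, d, theta, phi).
current = np.array([0.5, 1.0e-3, 0.5, 0.5])
current_ll = log_likelihood(current)
step = np.array([0.05, 1e-4, 0.1, 0.1])
samples = []
for it in range(5000):
    proposal = current + rng.normal(0, step)
    prop_ll = log_likelihood(proposal)
    if np.log(rng.uniform()) < prop_ll - current_ll:
        current, current_ll = proposal, prop_ll
    if it >= 1000:                       # discard burn-in
        samples.append(current.copy())

samples = np.array(samples)
print("posterior mean f, d:", samples[:, :2].mean(axis=0))
```

The GPU version runs this kind of sampler for many voxels in parallel, which is where the reported two-orders-of-magnitude speed-up comes from.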
Accurate GM atrophy quantification in MS using lesion-filling with co-registered 2D lesion masks
Popescu, V.; Ran, N.C.G.; Barkhof, F.; Chard, D.T.; Wheeler-Kingshott, C.A.; Vrenken, H.
2014-01-01
Background: In multiple sclerosis (MS), brain atrophy quantification is affected by white matter lesions. LEAP and FSL-lesion_filling replace lesion voxels with white matter intensities; however, they require precise lesion identification on 3DT1-images. Aim: To determine whether 2DT2 lesion masks co-registered to 3DT1 images yield grey and white matter volumes comparable to precise lesion masks. Methods: 2DT2 lesion masks were linearly co-registered to 20 3DT1-images of MS patients, with nearest-neighbor (NNI) and tri-linear interpolation. As gold-standard, lesion masks were manually outlined on 3DT1-images. LEAP and FSL-lesion_filling were applied with each lesion mask. Grey (GM) and white matter (WM) volumes were quantified with FSL-FAST, and deep gray matter (DGM) volumes using FSL-FIRST. Volumes were compared between lesion mask types using paired Wilcoxon tests. Results: Lesion-filling with gold-standard lesion masks compared to native images reduced GM overestimation by 1.93 mL (p < .001) for LEAP, and 1.21 mL (p = .002) for FSL-lesion_filling. Similar effects were achieved with NNI lesion masks from 2DT2. Global WM underestimation was not significantly influenced. GM and WM volumes from NNI did not differ significantly from gold-standard. GM segmentation differed between lesion masks in the lesion area, and also elsewhere. Using the gold-standard, FSL-FAST quantified as GM on average 0.4% of the lesion area with LEAP and 24.5% with FSL-lesion_filling. Discussion: These results demonstrate that for global GM volumetry, precise lesion masks on 3DT1 images can be replaced by co-registered 2DT2 lesion masks. This makes lesion-filling a feasible method for GM atrophy measurements in MS. PMID:24567908
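The basic lesion-filling operation, replacing lesion voxels with normal-appearing white matter intensities, can be sketched in a few lines. This is a simplified stand-in for LEAP/FSL lesion_filling, with hypothetical filenames and no attempt to model local intensity structure.

```python
# Hedged sketch: fill lesion voxels with intensities sampled from
# normal-appearing white matter in a co-registered T1 image.
import nibabel as nib
import numpy as np

t1_img = nib.load("t1.nii.gz")
t1 = t1_img.get_fdata()
lesions = nib.load("lesion_mask_in_t1_space.nii.gz").get_fdata() > 0
wm = nib.load("wm_mask.nii.gz").get_fdata() > 0

rng = np.random.default_rng(0)
wm_intensities = t1[wm & ~lesions]
filled = t1.copy()
filled[lesions] = rng.choice(wm_intensities, size=int(lesions.sum()))

nib.save(nib.Nifti1Image(filled, t1_img.affine, t1_img.header), "t1_filled.nii.gz")
```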
2007-09-01
[Table excerpt; original structure not recoverable] Vegetation sampling records for site FSL201 list Distichlis spicata (saltgrass), Glycyrrhiza lepidota (wild licorice), Juncus balticus (Baltic rush), Sida leprosa (alkali mallow), Sporobolus, Atriplex canescens (fourwing saltbush), and a big sagebrush entry, with UTM coordinates (e.g., 367579 E, 4138377 N) and habitat classes including dryland nonalkaline scrub and high-ground-water alkaline meadow.
NLS Flight Simulation Laboratory (FSL) documentation
NASA Technical Reports Server (NTRS)
1995-01-01
The Flight Simulation Laboratory (FSL) Electronic Documentation System design consists of modification and utilization of the MSFC Integrated Engineering System (IES), translation of the existing FSL documentation to an electronic format, and generation of new drawings to represent the Engine Flight Simulation Laboratory design and implementation. The intent of the electronic documentation is to provide ease of access, local print/plot capabilities, as well as the ability to correct and/or modify the stored data by network users who are authorized to access this information.
Elevation of Il6 is associated with disturbed let-7 biogenesis in a genetic model of depression
Wei, Y B; Liu, J J; Villaescusa, J C; Åberg, E; Brené, S; Wegener, G; Mathé, A A; Lavebratt, C
2016-01-01
Elevation of the proinflammatory cytokine IL-6 has been implicated in depression; however, the mechanisms remain elusive. MicroRNAs (miRNAs) are small non-coding RNAs that inhibit gene expression post-transcriptionally. The lethal-7 (let-7) miRNA family was suggested to be involved in the inflammation process and IL-6 was shown to be one of its targets. In the present study, we report elevation of Il6 in the prefrontal cortex (PFC) of a genetic rat model of depression, the Flinders Sensitive Line (FSL) compared to the control Flinders Resistant Line. This elevation was associated with an overexpression of LIN28B and downregulation of let-7 miRNAs, the former an RNA-binding protein that selectively represses let-7 synthesis. Also DROSHA, a key enzyme in miRNA biogenesis was downregulated in FSL. Running was previously shown to have an antidepressant-like effect in the FSL rat. We found that running reduced Il6 levels and selectively increased let-7i and miR-98 expression in the PFC of FSL, although there were no differences in LIN28B and DROSHA expression. Pri-let-7i was upregulated in the running FSL group, which associated with increased histone H4 acetylation. In conclusion, the disturbance of let-7 family biogenesis may underlie increased proinflammatory markers in the depressed FSL rats while physical activity could reduce their expression, possibly through regulating primary miRNA expression via epigenetic mechanisms. PMID:27529677
Kanemaru, Kazuya; Nishi, Kyoko; Diksic, Mirko
2009-01-01
The neurotransmitter, serotonin, is involved in several brain functions, including both normal, physiological functions, and pathophysiological functions. Alterations in any of the normal parameters of serotonergic neurotransmission can produce several different psychiatric disorders, including major depression. In many instances, brain neurochemical variables are not able to be studied properly in humans, thus making the use of good animal models extremely valuable. One of these animal models is the Flinders Sensitive Line (FSL) of rats, which has face, predictive and constructive validities in relation to human depression. The objective of this study was to quantify the effect of the tryptophan hydroxylase (TPH) activation inhibitor, AGN-2979, on the FSL rats (rats with depression-like behaviour), and compare it to the effect on the Flinders Resistant Line (FRL) of rats used as the control rats. The effect was evaluated by measuring changes in regional serotonin synthesis in the vehicle treated rats (FSL-VEH and FRL-VEH) relative to those measured in the AGN-2979 treated rats (FSL-AGN and FRL-AGN). Regional serotonin synthesis was measured autoradiographically in more than thirty brain regions. The measurements were performed using α-[14C]methyl-L-tryptophan as the tracer. The results indicate that AGN-2979 did not produce a significant reduction of TPH activity in the AGN-2979 group relative to the vehicle group (a reduction would have been observed if there had been an activation of TPH by the experimental set up) in the FSL rats. On the other hand, there was a highly significant reduction of synthesis in the FRL rats treated by AGN-2979, relative to the vehicle group. Together, the results demonstrate that in the FSL rats, AGN-2979 does not affect serotonin synthesis. This suggests that there was no activation of TPH in the FSL rats during the experimental procedure, but such activation did occur in the FRL rats. Because of this finding, it could be hypothesised that TPH in the FSL rats cannot be easily activated. This may contribute to the development of depressive-like symptoms in the FSL rats (“depressed” rats), as they cannot easily modulate their need for elevated amounts of this neurotransmitter, and possibly other neurotransmitters. Further, because these rats represent a very good model of human depression, one can hypothesize that humans who do not have readily activated TPH may be more prone to develop depression. PMID:19463878
Weller, Daniel; Andrus, Alexis; Wiedmann, Martin; den Bakker, Henk C
2015-01-01
Sampling of seafood and dairy processing facilities in the north-eastern USA produced 18 isolates of Listeria spp. that could not be identified at the species level using traditional phenotypic and genotypic identification methods. Results of phenotypic and genotypic analyses suggested that the isolates represent two novel species with an average nucleotide BLAST identity of less than 92% with previously described species of the genus Listeria. Phylogenetic analyses based on whole genome sequences, 16S rRNA gene and sigB gene sequences confirmed that the isolates represented by type strain FSL M6-0635(T) and FSL A5-0209 cluster phylogenetically with Listeria cornellensis. Phylogenetic analyses also showed that the isolates represented by type strain FSL A5-0281(T) cluster phylogenetically with Listeria riparia. The name Listeria booriae sp. nov. is proposed for the species represented by type strain FSL A5-0281(T) (=DSM 28860(T)=LMG 28311(T)), and the name Listeria newyorkensis sp. nov. is proposed for the species represented by type strain FSL M6-0635(T) (=DSM 28861(T)=LMG 28310(T)). Phenotypic and genotypic analyses suggest that neither species is pathogenic. © 2015 IUMS.
A Scalable Framework For Segmenting Magnetic Resonance Images
Hore, Prodip; Goldgof, Dmitry B.; Gu, Yuhua; Maudsley, Andrew A.; Darkazanli, Ammar
2009-01-01
A fast, accurate and fully automatic method of segmenting magnetic resonance images of the human brain is introduced. The approach scales well, allowing fast segmentation of fine-resolution images. It is based on modifications of the soft clustering algorithm, fuzzy c-means, that enable it to scale to large data sets. Two types of modifications that create incremental versions of fuzzy c-means are discussed. They are much faster than fuzzy c-means for medium to extremely large data sets because they work on successive subsets of the data, and they are comparable in quality to applying fuzzy c-means to all of the data. The clustering algorithms, coupled with inhomogeneity correction and smoothing, are used to create a framework for automatically segmenting magnetic resonance images of the human brain. The framework is applied to a set of normal human brain volumes acquired from different magnetic resonance scanners using different head coils, acquisition parameters and field strengths. Results are compared to those from two widely used magnetic resonance image segmentation programs, Statistical Parametric Mapping and the FMRIB Software Library (FSL). The results are comparable to FSL while providing significant speed-up and better scalability to larger volumes of data. PMID:20046893
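The incremental clustering idea described above is compact enough to sketch. The following is a minimal single-pass weighted fuzzy c-means illustration on synthetic one-dimensional "intensities"; it is not the authors' implementation, and the function names, parameters and data are hypothetical.

```python
import numpy as np

def weighted_fcm(X, w, c, m=2.0, n_iter=50, eps=1e-5, seed=0):
    """Weighted fuzzy c-means: X is (n, d) data, w is a length-n weight vector."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), c, replace=False)]
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) + 1e-12
        u = 1.0 / (d2 ** (1.0 / (m - 1.0)))
        u /= u.sum(axis=1, keepdims=True)            # memberships; each row sums to 1
        um = (u ** m) * w[:, None]
        new_centers = (um.T @ X) / um.sum(axis=0)[:, None]
        done = np.linalg.norm(new_centers - centers) < eps
        centers = new_centers
        if done:
            break
    return centers, u

def single_pass_fcm(chunks, c, m=2.0):
    """Cluster data arriving in chunks, carrying forward c weighted centroids."""
    centers, weights = None, None
    for chunk in chunks:
        if centers is None:
            X, w = chunk, np.ones(len(chunk))
        else:                                        # prepend the condensed history
            X = np.vstack([centers, chunk])
            w = np.concatenate([weights, np.ones(len(chunk))])
        centers, u = weighted_fcm(X, w, c, m)
        weights = (u * w[:, None]).sum(axis=0)       # condense everything seen so far
    return centers

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(mu, 1.0, (3000, 1)) for mu in (0.0, 5.0, 10.0)])
    rng.shuffle(data)
    print(single_pass_fcm(np.array_split(data, 6), c=3))   # centers near 0, 5, 10
```

Each chunk is condensed into c weighted centroids before the next chunk arrives, which keeps memory and run time roughly proportional to the chunk size rather than to the full image volume.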
Charleer, Sara; Mathieu, Chantal; Nobels, Frank; Gillard, Pieter
2018-06-01
Nowadays, most Belgian patients with type 1 diabetes use flash glucose monitoring (FreeStyle Libre [FSL]; Abbott Diabetes Care, Alameda, California) to check their glucose values, but some patients find the sensor on the upper arm too visible. The aim of the present study was to compare the accuracy and precision of FSL sensors when placed on different sites. A total of 23 adults with type 1 diabetes used three FSL sensors simultaneously for 14 days on the upper arm, abdomen and upper thigh. FSL measurements were compared with capillary blood glucose (BG) measurements obtained with a built-in FSL BG meter. The aggregated mean absolute relative difference was 11.8 ± 12.0%, 18.5 ± 18.4% and 12.3 ± 13.8% for the arm, abdomen (P = .002 vs arm) and thigh (P = .5 vs arm), respectively. Results of Clarke error grid analysis for the arm and thigh were similar (zone A: 84.9% vs 84.5%; P = .6), while less accuracy was seen for the abdomen (zone A: 69.4%; P = .01). Apart from the first day, the accuracy of FSL sensors on the arm and thigh was more stable across the 14-day wear duration than accuracy of sensors on the abdomen, which deteriorated mainly during week 2 (P < .0005). The aggregated precision absolute relative difference was markedly lower for the arm/thigh (10.9 ± 11.9%) compared with the arm/abdomen (20.9 ± 22.8%; P = .002). Our results indicate that the accuracy and precision of FSL sensors placed on the upper thigh are similar to the upper arm, whereas the abdomen performed unacceptably poorly. © 2018 John Wiley & Sons Ltd.
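The accuracy and precision metrics quoted above are straightforward to compute from paired readings. The sketch below assumes that the mean absolute relative difference (MARD) is the absolute sensor-minus-reference difference divided by the reference blood glucose, and that the precision absolute relative difference (PARD) is the absolute between-sensor difference divided by the mean of the two sensors; the function names and numbers are illustrative, not study data.

```python
import numpy as np

def mard(sensor, reference):
    """Accuracy: mean ± SD of the absolute relative difference vs. reference BG, in %."""
    sensor, reference = np.asarray(sensor, float), np.asarray(reference, float)
    ard = np.abs(sensor - reference) / reference * 100.0
    return ard.mean(), ard.std()

def pard(sensor_a, sensor_b):
    """Precision: mean ± SD of the absolute relative difference between two sensors, in %."""
    a, b = np.asarray(sensor_a, float), np.asarray(sensor_b, float)
    ard = np.abs(a - b) / ((a + b) / 2.0) * 100.0
    return ard.mean(), ard.std()

# Synthetic paired readings (mg/dL), for illustration only.
bg  = np.array([90, 120, 150, 200, 80, 110])
arm = np.array([95, 118, 160, 190, 85, 105])
abd = np.array([110, 100, 175, 170, 95, 130])
print("MARD arm vs BG:      %.1f ± %.1f%%" % mard(arm, bg))
print("MARD abdomen vs BG:  %.1f ± %.1f%%" % mard(abd, bg))
print("PARD arm/abdomen:    %.1f ± %.1f%%" % pard(arm, abd))
```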
Wegener, Gregers; Finger, Beate C; Elfving, Betina; Keller, Kirsten; Liebenberg, Nico; Fischer, Christina W; Singewald, Nicolas; Slattery, David A; Neumann, Inga D; Mathé, Aleksander A
2012-04-01
Neuropeptide S (NPS) and its receptor (NPSR) have been implicated in the mediation of anxiolytic-like behaviour in rodents. However, little is known about the NPS system in depression-related behaviours, or about whether NPS also exerts anxiolytic effects in an animal model of psychopathology. Therefore, the aim of this work was to characterize the effects of NPS on depression- and anxiety-related parameters, using male and female rats in a well-validated animal model of depression: the Flinders Sensitive Line (FSL), their controls, the Flinders Resistant Line (FRL), and Sprague-Dawley (SD) rats. We found that FSL rats showed greater immobility in the forced swim test (FST) than FRL rats, confirming their phenotype. However, NPS did not affect depression-related behaviour in any rat line. No significant differences in baseline anxiety levels between the FSL and FRL strains were observed, but FSL and FRL rats displayed less anxiety-like behaviour compared to SD rats. NPS decreased anxiety-like behaviour on the elevated plus-maze in all strains. The expression of the NPSR in the amygdala, periventricular hypothalamic nucleus, and hippocampus was equal in all male strains, although a trend towards reduced expression within the amygdala was observed in FSL rats compared to SD rats. In conclusion, NPS had a marked anxiolytic effect in FSL, FRL and SD rats, but did not modify the depression-related behaviour in any strain, in spite of the significant innate differences between the strains. These findings suggest that NPS specifically modifies anxiety-related behaviour but cannot overcome or reverse a genetically mediated depression phenotype.
A practical guideline for intracranial volume estimation in patients with Alzheimer's disease
2015-01-01
Background Intracranial volume (ICV) is an important normalization measure used in morphometric analyses to correct for head size in studies of Alzheimer's disease (AD). Inaccurate ICV estimation could introduce bias in the outcome. The current study provides a decision aid for defining protocols for ICV estimation in patients with AD, in terms of the sampling frequencies that can be optimally used on volumetric MRI data and the type of software most suitable for estimating the ICV measure. Methods Two groups of 22 subjects were considered: adult controls (AC) and patients with AD. Reference measurements were obtained for each subject by manually tracing the intracranial cavity under visual inspection. The reliability of the reference measurements was assured through intra- and inter-variation analyses. Three well-known, publicly available software packages (FreeSurfer, FSL, and SPM) were examined for their ability to automatically estimate ICV across the groups. Results The analysis showed significant effects of estimation method, gender and the cognitive condition of the subject, as well as an interaction between method and cognitive condition, on the measured ICV. Sub-sampling studies showed, with 95% confidence, that to keep the accuracy of the interleaved slice-sampling protocol above 99%, the sampling period cannot exceed 20 millimeters for AC and 15 millimeters for AD. FreeSurfer showed promising estimates for both adult groups; however, SPM showed more consistency in its ICV estimation over the different phases of the study. Conclusions This study emphasizes the importance of selecting an appropriate protocol and sampling period for the manual estimation of ICV and of selecting suitable software for the automated estimation of ICV. The current study serves as an initial framework for establishing an appropriate protocol in both manual and automatic ICV estimation with different subject populations. PMID:25953026
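The interleaved slice-sampling protocol discussed above amounts to a Cavalieri-style estimate from every k-th traced slice. The minimal sketch below uses a synthetic, roughly ellipsoidal area profile, not the study's data, and all names are hypothetical; it only illustrates how the estimate drifts as the sampling period grows.

```python
import numpy as np

def icv_from_slices(slice_areas_mm2, slice_thickness_mm, sampling_period_mm):
    """Cavalieri-style ICV estimate (in cc) from a sub-sampled stack of traced slices."""
    step = max(1, int(round(sampling_period_mm / slice_thickness_mm)))
    sampled = slice_areas_mm2[::step]                        # interleaved sampling
    return sampled.sum() * slice_thickness_mm * step / 1000.0

# Synthetic stack of 1 mm slices with a roughly ellipsoidal intracranial area profile.
z = np.linspace(-80.0, 80.0, 161)
areas = np.clip(14000.0 * (1.0 - (z / 85.0) ** 2), 0.0, None)   # mm^2 per slice

full = icv_from_slices(areas, 1.0, 1.0)
for period in (5, 10, 15, 20, 25):
    est = icv_from_slices(areas, 1.0, period)
    print(f"period {period:2d} mm: ICV {est:7.1f} cc, {100.0 * est / full:5.1f}% of full-stack estimate")
```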
Kurkjian, Cathryn J; Guo, Hao; Montgomery, Nathan D; Cheng, Ning; Yuan, Hong; Merrill, Joseph R; Sempowski, Gregory D; Brickey, W June; Ting, Jenny P-Y
2017-12-11
Risks of radiation exposure from nuclear incidents and cancer radiotherapy are undeniable realities. These dangers urgently compel the development of agents for ameliorating radiation-induced injuries. Biologic pathways mediated by myeloid differentiation primary response gene 88 (MyD88), the common adaptor for toll-like receptor (TLR) and Interleukin-1 receptor signaling, are critical for radioprotection. Treating with agonists prior to radiation enhances survival by activating TLR signaling, whereas radiomitigating TLR-activating therapeutics given after exposure are less defined. We examine the radiomitigation capability of TLR agonists and identify one that is superior for its efficacy and reduced toxic consequences compared to other tested agonists. We demonstrate that the synthetic TLR2/6 ligand Fibroblast-stimulating lipopeptide (FSL-1) substantially prolongs survival in both male and female mice when administered 24 hours after radiation and shows MyD88-dependent function. FSL-1 treatment results in accelerated hematopoiesis in bone marrow, spleen and periphery, and augments systemic levels of hematopoiesis-stimulating factors. The ability of FSL-1 to stimulate hematopoiesis is critical, as hematopoietic dysfunction results from a range of ionizing radiation doses. The efficacy of a single FSL-1 dose for alleviating radiation injury while protecting against adverse effects reveals a viable radiation countermeasures agent.
2013-06-19
ISS036-E-009550 (19 June 2013) --- European Space Agency astronaut Luca Parmitano, Expedition 36 flight engineer, installs the Fundamental and Applied Studies of Emulsion Stability (FASES) experiment container into the Central Experiment Module (CEM) Lower of Fluid Science Laboratory (FSL) in the Columbus laboratory of the International Space Station.
Labunov, Vladimir; Prudnikava, Alena; Bushuk, Serguei; Filatov, Serguei; Shulitski, Boris; Tay, Beng Kang; Shaman, Yury; Basaev, Alexander
2013-09-03
Femtosecond lasers (FSL) are playing an increasingly important role in materials research, characterization, and modification. Due to their extremely short pulse width, interactions of FSL irradiation with solid surfaces attract special interest, and a number of unusual phenomena resulting in the formation of new materials are expected. Here, we report on a new nanostructure observed after the interaction of FSL irradiation with arrays of vertically aligned carbon nanotubes (CNTs) intercalated with iron phase catalyst nanoparticles. It was revealed that FSL ablation transforms the topmost layer of the CNT array into iron phase nanospheres (40 to 680 nm in diameter) located at the tips of conically shaped CNT bundles. In addition, smaller nanospheres (10 to 30 nm in diameter) are found beaded along the sides of these bundles. Some of the larger nanospheres are encapsulated in carbon shells, which are sometimes found to contain CNTs. A mechanism for the creation of such nanostructures is proposed. PMID:24004518
Navigating Native-Speaker Ideologies as FSL Teacher
ERIC Educational Resources Information Center
Wernicke, Meike
2017-01-01
Although a well-established domain of research in English language teaching, native-speaker ideologies have received little attention in French language education. This article reports on a study that examined the salience of "authentic French" in the identity construction of French as a second language (FSL) teachers in English-speaking…
Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.
Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed
2015-02-01
Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using an integrated Analytic Hierarchy Process (AHP) and TOPSIS approach. The experimental results showed that GNUmed and OpenEMR achieved better ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
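TOPSIS itself is a short, well-defined procedure once the criteria weights (for example, derived from AHP) are fixed. The sketch below is a generic illustration with hypothetical criteria, weights and scores; it is not the metric set or the results of this study.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives by closeness to the ideal solution.

    scores:  (n_alternatives, n_criteria) decision matrix
    weights: criteria weights (e.g. from AHP), summing to 1
    benefit: True for criteria to maximize, False for cost criteria
    """
    X = np.asarray(scores, float)
    V = X / np.linalg.norm(X, axis=0) * np.asarray(weights, float)  # normalize, then weight
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Hypothetical packages scored on usability, interoperability and setup effort (a cost criterion).
packages = ["PackageA", "PackageB", "PackageC"]
scores = [[8, 7, 2],
          [9, 6, 3],
          [5, 5, 4]]
closeness = topsis(scores, weights=[0.5, 0.3, 0.2], benefit=[True, True, False])
for name, c in sorted(zip(packages, closeness), key=lambda t: -t[1]):
    print(f"{name}: {c:.3f}")
```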
Packaging Software Assets for Reuse
NASA Astrophysics Data System (ADS)
Mattmann, C. A.; Marshall, J. J.; Downs, R. R.
2010-12-01
The reuse of existing software assets such as code, architecture, libraries, and modules in current software and systems development projects can provide many benefits, including reduced costs, in time and effort, and increased reliability. Many reusable assets are currently available in various online catalogs and repositories, usually broken down by disciplines such as programming language (Ibiblio for Maven/Java developers, PyPI for Python developers, CPAN for Perl developers, etc.). The way these assets are packaged for distribution can play a role in their reuse - an asset that is packaged simply and logically is typically easier to understand, install, and use, thereby increasing its reusability. A well-packaged asset has advantages in being more reusable and thus more likely to provide benefits through its reuse. This presentation will discuss various aspects of software asset packaging and how they can affect the reusability of the assets. The characteristics of well-packaged software will be described. A software packaging domain model will be introduced, and some existing packaging approaches examined. An example case study of a Reuse Enablement System (RES), currently being created by near-term Earth science decadal survey missions, will provide information about the use of the domain model. Awareness of these factors will help software developers package their reusable assets so that they can provide the most benefits for software reuse.
Online Synchronous Communication in the Second-Language Classroom
ERIC Educational Resources Information Center
Murphy, Elizabeth
2009-01-01
The study reported on in this paper used a framework of benefits, challenges and solutions to categorize data from a design experiment using synchronous online communication for learning French as a second language (FSL). Participants were 92 Grade 6, FSL students and four teachers from urban and rural areas of Newfoundland, Canada. Data…
Importance of Muscle Power Variables in Repeated and Single Sprint Performance in Soccer Players
López-Segovia, Manuel; Dellal, Alexandre; Chamari, Karim; González-Badillo, Juan José
2014-01-01
This study examined the relationship between lower-body power and repeated as well as single sprint performance in soccer players. The performance of nineteen male soccer players was examined. The first testing session included the countermovement jump (CMJL) and the progressive full squat (FSL), both with external loads. Power in the CMJL and FSL was measured with each load that was lifted. The second session included a protocol of 40-m repeated sprints with a long recovery period (2 min). The number of sprints executed until there was a 3% decrease in performance relative to the best 40-m sprint time was recorded as a repeated sprint index (RSI). The RSI was moderately associated with power output relative to body mass in the CMJL and FSL (r = 0.53/0.54, p ≤ 0.05). The most and least powerful players (determined by FSL) showed significant differences in the RSI (9.1 ± 4.2 vs. 6.5 ± 1.6) and 10-m sprint time (p ≤ 0.01). Repeated and single sprint performance are associated with relative lower-body power in soccer players. PMID:25031688
First-spike latency in Hodgkin's three classes of neurons.
Wang, Hengtong; Chen, Yueling; Chen, Yong
2013-07-07
We study the first-spike latency (FSL) in Hodgkin's three classes of neurons with the Morris-Lecar neuron model. It is found that all three classes of neurons can encode an external stimulus into FSLs. With DC inputs, the FSLs of all of the neurons decrease with input intensity. As the input current decreases toward threshold, class 1 neurons show an arbitrarily long FSL, whereas class 2 and 3 neurons exhibit FSLs bounded by a short limit. When the input current is sinusoidal, the amplitude, frequency and initial phase can be encoded by all three classes of neurons. The FSLs of all of the neurons decrease with the input amplitude and frequency. When the input frequency is too high, all of the neurons respond with infinite FSLs. As the initial phase increases, the FSL decreases, then jumps to a maximal value, and finally decreases linearly. With changes in the input parameters, the FSLs of the class 1 and 2 neurons exhibit similar properties. However, the FSL of class 3 neurons becomes slightly longer, and these neurons produce responses only over a narrow range of initial phases when input frequencies are low. Moreover, our results also show that the FSL and firing rate responses are mutually independent processes and that neurons can encode an external stimulus into different FSLs and firing rates simultaneously. This finding is consistent with the current theory of dual or multiple complementary coding mechanisms. Copyright © 2013 Elsevier Ltd. All rights reserved.
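The FSL measurement itself is easy to reproduce with any spiking-neuron model: integrate from rest with a step input and record the first threshold crossing. The sketch below uses the Morris-Lecar model with a commonly used Hopf-regime parameter set and an assumed 0 mV spike threshold; these parameters are illustrative and not necessarily the ones used for Hodgkin's three classes in this study.

```python
import numpy as np

# A commonly used Morris-Lecar parameter set (Hopf regime); units: mV, ms, uA/cm^2.
P = dict(C=20.0, gCa=4.4, gK=8.0, gL=2.0, VCa=120.0, VK=-84.0, VL=-60.0,
         V1=-1.2, V2=18.0, V3=2.0, V4=30.0, phi=0.04)

def first_spike_latency(I_amp, t_max=1000.0, dt=0.01, thresh=0.0):
    """Euler-integrate Morris-Lecar with a DC step at t = 0; return the FSL in ms."""
    V, w = -60.0, 0.0                                   # approximate resting state
    for step in range(int(t_max / dt)):
        minf = 0.5 * (1.0 + np.tanh((V - P["V1"]) / P["V2"]))
        winf = 0.5 * (1.0 + np.tanh((V - P["V3"]) / P["V4"]))
        tauw = 1.0 / np.cosh((V - P["V3"]) / (2.0 * P["V4"]))
        dV = (I_amp - P["gL"] * (V - P["VL"]) - P["gCa"] * minf * (V - P["VCa"])
              - P["gK"] * w * (V - P["VK"])) / P["C"]
        V_new = V + dt * dV
        w += dt * P["phi"] * (winf - w) / tauw
        if V < thresh <= V_new:                         # first upward threshold crossing
            return step * dt
        V = V_new
    return np.inf                                       # no spike within t_max

for I in (80, 90, 100, 120, 150):
    print(f"I = {I:3d} uA/cm^2  ->  FSL = {first_spike_latency(I):8.2f} ms")
```

With this parameter set the latency shortens as the step amplitude grows and becomes infinite below the firing threshold, in line with the qualitative DC-input behaviour described above.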
The antidepressant effect of running is associated with increased hippocampal cell proliferation.
Bjørnebekk, Astrid; Mathé, Aleksander A; Brené, Stefan
2005-09-01
A common trait of antidepressant drugs, electroconvulsive treatment and physical exercise is that they relieve depression and up-regulate neurotrophic factors as well as cell proliferation and neurogenesis in the hippocampus. In order to identify possible biological underpinnings of depression and the antidepressant effect of running, we analysed cell proliferation, the level of the neurotrophic factor BDNF in hippocampus and dynorphin in striatum/accumbens in 'depressed' Flinders Sensitive Line (FSL) rats and Flinders Resistant Line (FRL) rats with and without access to running-wheels. The FRL strain exhibited a higher daily running activity than the FSL strain. Wheel-running had an antidepressant effect in the 'depressed' FSL rats, as indicated by the forced swim test. In the hippocampus, cell proliferation was lower in the 'depressed' rats compared to the control FRL rats, but there was no difference in BDNF or dynorphin levels in striatum/accumbens. After 5 wk of running, cell proliferation increased in FSL but not in FRL rats. BDNF and dynorphin mRNA levels were increased in FRL rats but not to the same extent in the FSL rats; thus, increased BDNF and dynorphin levels were correlated with the running activity but not with the antidepressant effect of running. The only parameter that was associated with the basal level of 'depression' and with the antidepressant effect was cell proliferation in the hippocampus. Thus, suppression of cell proliferation in the hippocampus could constitute one of the mechanisms that underlie depression, and physical activity might be an efficient antidepressant.
Femtosecond laser cutting of multiple thin corneal stromal lamellae for endothelial bioengineering.
Bernard, Aurélien; He, Zhiguo; Forest, Fabien; Gauthier, Anne-Sophie; Peocʼh, Michel; Dumollard, Jean-Marc; Acquart, Sophie; Montard, Romain; Delbosc, Bernard; Gain, Philippe; Thuret, Gilles
2015-02-01
To assess the feasibility of cutting multiple thin stromal lamellae in human donor corneas using a commercial femtosecond laser (FSL) to provide cell carriers for future endothelial graft bioengineering. Eight edematous organ-cultured corneas not suitable for grafting for endothelial reasons were mounted on a Ziemer anterior chamber and cut with a Z6 FSL with 6 successive parallel cuts, from depth to surface. Target thickness of each lamella ranged from 100 to 150 μm depending on initial corneal thickness. Thickness was measured using anterior segment optical coherence tomography before and after cutting on mounted corneas, and on each stromal lamella after detachment. Scanning electron microscopy observation was performed on 4 lamellae and histological cross sections on 1 cornea before detachment. A median of 5 (minimum 3, maximum 7) lamellae was obtained per cornea. All lamellae still attached were the most posterior ones, suggesting that FSL was less efficient because of light scattering by edematous stroma. Cut precision and postdetachment swelling were correlated with anterior-posterior position within the cornea. Median lamella thickness was 127 μm (56-222 μm) before detachment and 196 μm (80-304 μm) after detachment. Surface state was consistent with previously reported FSL lamellar cuts during Descemet stripping automated endothelial keratoplasty. Up to 7 thin lamellae can be cut in stored corneas with an FSL. This method, once optimized primarily by using deswelled, more transparent corneas, could prove effective for recycling unsuitable donor corneas in corneal bioengineering processes.
Miller, Rachel A.; Beno, Sarah M.; Kent, David J.; Carroll, Laura M.; Martin, Nicole H.; Boor, Kathryn J.
2016-01-01
A facultatively anaerobic, spore-forming Bacillus strain, FSL W8-0169T, collected from raw milk stored in a silo at a dairy powder processing plant in the north-eastern USA was initially identified as a Bacillus cereus group species based on a partial sequence of the rpoB gene and 16S rRNA gene sequence. Analysis of core genome single nucleotide polymorphisms clustered this strain separately from known B. cereus group species. Pairwise average nucleotide identity BLAST values obtained for FSL W8-0169T compared to the type strains of existing B. cereus group species were <95% and predicted DNA–DNA hybridization values were <70%, suggesting that this strain represents a novel B. cereus group species. We characterized 10 additional strains with the same or closely related rpoB allelic type, by whole genome sequencing and phenotypic analyses. Phenotypic characterization identified a higher content of iso-C16:0 fatty acid and the combined inability to ferment sucrose or to hydrolyse arginine as the key characteristics differentiating FSL W8-0169T from other B. cereus group species. FSL W8-0169T is psychrotolerant, produces haemolysin BL and non-haemolytic enterotoxin, and is cytotoxic in a HeLa cell model. The name Bacillus wiedmannii sp. nov. is proposed for the novel species represented by the type strain FSL W8-0169T (=DSM 102050T=LMG 29269T). PMID:27520992
Computer Science and Technology: Introduction to Software Packages
1984-04-01
[Front-matter fragment: Table 5, Sources of Software Packages; Table 6, Reference Services Matrix; Table 7, Reference Matrix; List of Figures] ...consideration should be given to the acquisition of appropriate software packages to replace or upgrade existing services and to provide services not...Consequently, there are many companies that produce only software packages, and are committed to providing training, service, and support. These vendors...
Mouton, Moné; Harvey, Brian H; Cockeran, Marike; Brink, Christiaan B
2016-02-01
Methamphetamine (METH) is a psychostimulant and drug of abuse, commonly used early in life, including in childhood and adolescence. Adverse effects include psychosis, anxiety and mood disorders, as well as increased risk of developing a mental disorder later in life. The current study investigated the long-term effects of chronic METH exposure during pre-adolescence in stress-sensitive Flinders Sensitive Line (FSL) rats (a genetic model of depression) and control Flinders Resistant Line (FRL) rats. METH or vehicle control was administered twice daily from post-natal day 19 (PostND19) to PostND34, followed by behavioural testing either at PostND35 (early effects) or, to capture long-lasting effects after withdrawal, at PostND60 (early adulthood). Animals were evaluated for depressive-like behaviour, locomotor activity, social interaction and object recognition memory. METH reduced depressive-like behaviour in both FSL and FRL rats at PostND35, but enhanced this behaviour at PostND60. METH also reduced locomotor activity on PostND35 in both FSL and FRL rats, but was without effect at PostND60. Furthermore, METH significantly lowered social interaction behaviour (staying together) in both FRL and FSL rats at PostND35 and PostND60, whereas self-grooming time was significantly reduced only at PostND35. METH treatment enhanced exploration of the familiar vs. novel object in the novel object recognition test (nORT) in FSL and FRL rats on PostND35 and PostND60, indicative of reduced cognitive performance. Thus, early-life METH exposure induces social and cognitive deficits. Lastly, early-life exposure to METH may result in acute antidepressant-like effects immediately after chronic exposure, whereas long-term effects after withdrawal are depressogenic. The data also support a role for genetic predisposition, as seen in the FSL rats.
van der Kleij, Lisa A; de Bresser, Jeroen; Hendrikse, Jeroen; Siero, Jeroen C W; Petersen, Esben T; De Vis, Jill B
2018-01-01
In previous work we have developed a fast sequence that focusses on cerebrospinal fluid (CSF) based on the long T2 of CSF. By processing the data obtained with this CSF MRI sequence, brain parenchymal volume (BPV) and intracranial volume (ICV) can be automatically obtained. The aim of this study was to assess the precision of the BPV and ICV measurements of the CSF MRI sequence and to validate the CSF MRI sequence by comparison with 3D T1-based brain segmentation methods. Ten healthy volunteers (2 females; median age 28 years) were scanned (3T MRI) twice with repositioning in between. The scan protocol consisted of a low resolution (LR) CSF sequence (0:57min), a high resolution (HR) CSF sequence (3:21min) and a 3D T1-weighted sequence (6:47min). Data of the HR 3D-T1-weighted images were downsampled to obtain LR T1-weighted images (reconstructed imaging time: 1:59 min). Data of the CSF MRI sequences was automatically segmented using in-house software. The 3D T1-weighted images were segmented using FSL (5.0), SPM12 and FreeSurfer (5.3.0). The mean absolute differences for BPV and ICV between the first and second scan for CSF LR (BPV/ICV: 12±9/7±4cc) and CSF HR (5±5/4±2cc) were comparable to FSL HR (9±11/19±23cc), FSL LR (7±4/6±5cc), FreeSurfer HR (5±3/14±8cc), FreeSurfer LR (9±8/12±10cc), SPM HR (5±3/4±7cc), and SPM LR (5±4/5±3cc). The correlation between the volumes measured with the CSF sequences and those measured with FSL, FreeSurfer and SPM (HR and LR) was very good (all Pearson's correlation coefficients >0.83, R² = 0.67-0.97). The results from the downsampled data and the high-resolution data were similar. Both CSF MRI sequences have a precision comparable to, and a very good correlation with, established 3D T1-based automated segmentation methods for the segmentation of BPV and ICV. However, the short imaging time of the fast CSF MRI sequence is superior to that of the 3D T1 sequence on which segmentation with established methods is performed. PMID:29672584
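The precision and agreement figures reported above reduce to simple statistics over paired volumes. The sketch below, run on synthetic brain-parenchymal volumes, assumes precision is the mean ± SD of the absolute scan-rescan difference and agreement is the Pearson correlation (and its square) between methods; all names and numbers are illustrative only.

```python
import numpy as np

def scan_rescan_precision(vol_scan1, vol_scan2):
    """Mean ± SD of the absolute scan-rescan differences (same units as the volumes)."""
    d = np.abs(np.asarray(vol_scan1, float) - np.asarray(vol_scan2, float))
    return d.mean(), d.std()

def agreement(vol_method_a, vol_method_b):
    """Pearson r and R^2 between two segmentation methods across subjects."""
    r = np.corrcoef(vol_method_a, vol_method_b)[0, 1]
    return r, r ** 2

# Synthetic BPV values (cc) for 10 subjects: two scans of one method, one other method.
rng = np.random.default_rng(2)
truth = rng.normal(1150.0, 80.0, 10)
csf_scan1 = truth + rng.normal(0.0, 5.0, 10)
csf_scan2 = truth + rng.normal(0.0, 5.0, 10)
t1_method = truth + rng.normal(0.0, 8.0, 10)

print("CSF scan-rescan |diff|: %.1f ± %.1f cc" % scan_rescan_precision(csf_scan1, csf_scan2))
print("CSF vs 3D-T1 agreement: r = %.2f, R^2 = %.2f" % agreement(csf_scan1, t1_method))
```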
Design Optimization Toolkit: Users' Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro
The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient- and nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.
Can an Interactive Digital Game Help French Learners Improve Their Pronunciation?
ERIC Educational Resources Information Center
Cardoso, Walcir; Rueb, Avery; Grimshaw, Jennica
2017-01-01
This study examines the effects of the pedagogical use of an interactive mobile digital game, Prêt à Négocier (PàN), on improving learners' pronunciation of French as a Second Language (FSL), using three holistic measures: comprehensibility, fluency, and overall pronunciation. Two groups of FSL learners engaged in different types of game-playing…
Study Abroad as Professional Development for FSL Teachers
ERIC Educational Resources Information Center
Wernicke, Meike
2010-01-01
In July 2009, a group of over 80 FSL teachers from British Columbia (BC) participated in a two-week sojourn at the "Centre d'Approches vivantes des Langues et des Medias" (CAVILAM) in Vichy, France, as part of an initiative to address the critical shortage of qualified French language teachers in the province. After almost four decades…
Hierarchies of Authenticity in Study Abroad: French from Canada versus French from France?
ERIC Educational Resources Information Center
Wernicke, Meike
2016-01-01
For many decades, Francophone regions in Canada have provided language study exchanges for French as a second language (FSL) learners within their own country. At the same time, FSL students and teachers in Canada continue to orient to a native speaker standard associated with European French. This Eurocentric orientation manifested itself in a…
Selection of software for mechanical engineering undergraduates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheah, C. T.; Yin, C. S.; Halim, T.
A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. solvers for simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.
Chen, Hui; van Eijnatten, Maureen; Wolff, Jan; de Lange, Jan; van der Stelt, Paul F; Lobbezoo, Frank; Aarab, Ghizlane
2017-08-01
The aim of this study was to assess the reliability and accuracy of three different imaging software packages for three-dimensional analysis of the upper airway using CBCT images. To assess the reliability of the software packages, 15 NewTom 5G ® (QR Systems, Verona, Italy) CBCT data sets were randomly and retrospectively selected. Two observers measured the volume, minimum cross-sectional area and the length of the upper airway using Amira ® (Visage Imaging Inc., Carlsbad, CA), 3Diagnosys ® (3diemme, Cantu, Italy) and OnDemand3D ® (CyberMed, Seoul, Republic of Korea) software packages. The intra- and inter-observer reliability of the upper airway measurements were determined using intraclass correlation coefficients and Bland & Altman agreement tests. To assess the accuracy of the software packages, one NewTom 5G ® CBCT data set was used to print a three-dimensional anthropomorphic phantom with known dimensions to be used as the "gold standard". This phantom was subsequently scanned using a NewTom 5G ® scanner. Based on the CBCT data set of the phantom, one observer measured the volume, minimum cross-sectional area, and length of the upper airway using Amira ® , 3Diagnosys ® , and OnDemand3D ® , and compared these measurements with the gold standard. The intra- and inter-observer reliability of the measurements of the upper airway using the different software packages were excellent (intraclass correlation coefficient ≥0.75). There was excellent agreement between all three software packages in volume, minimum cross-sectional area and length measurements. All software packages underestimated the upper airway volume by -8.8% to -12.3%, the minimum cross-sectional area by -6.2% to -14.6%, and the length by -1.6% to -2.9%. All three software packages offered reliable volume, minimum cross-sectional area and length measurements of the upper airway. The length measurements of the upper airway were the most accurate results in all software packages. All software packages underestimated the upper airway dimensions of the anthropomorphic phantom.
Neurological Effects of Exposure to Non-Hypoxic Hypobaria
2014-04-16
...be at risk for subclinical brain injury, raising concern about the long-term impact in aircrew. Altitude chamber personnel are a second... [abbreviation-list fragment: flight surgeon; FSL BET, brain extraction tool; FSL FLIRT, FMRIB's linear image registration tool; IQ, intelligence quotient; IRB, Institutional Review...] ...population would potentially have similar risks and findings. Chronic brain injury in other neurological diseases is associated with lower...
Gómez-Galán, Marta; Femenía, Teresa; Åberg, Elin; Graae, Lisette; Van Eeckhaut, Ann; Smolders, Ilse; Brené, Stefan; Lindskog, Maria
2016-01-01
Stress, such as social isolation, is a well-known risk factor for depression, most probably in combination with predisposing genetic factors. Physical exercise, on the other hand, is depicted as a wonder-treatment that makes you healthier, happier and live longer. However, the published results on the effects of exercise are ambiguous, especially when it comes to neuropsychiatric disorders. Here we combine a paradigm of social isolation with a genetic rat model of depression, the Flinders Sensitive Line (FSL), already known to have glutamatergic synaptic alterations. Compared to group-housed FSL rats, we found that social isolation further affects synaptic plasticity and increases basal synaptic transmission in hippocampal CA1 pyramidal neurons. These functional synaptic alterations co-exist with changes in hippocampal protein expression levels: social isolation in FSL rats reduces expression of the glial glutamate transporter GLT-1 and increases expression of the GluA2 AMPA-receptor subunit. We further show that physical exercise in the form of voluntary running prevents the stress-induced synaptic effects but does not restore the endogenous mechanisms of depression already present in the FSL rat. PMID:27764188
NASA Technical Reports Server (NTRS)
1981-01-01
The software package evaluation was designed to analyze commercially available, field-proven production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; the request for information (RFI) document; RFI response rate and quality; the RFI evaluation process; and capabilities versus requirements.
ERIC Educational Resources Information Center
Faez, Farahnaz
2011-01-01
In this paper I examine similarities and differences between the required knowledge base of teachers of English as a second language (ESL) and French as a second language (FSL) for teaching in Kindergarten through Grade 12 programs in Canada. Drawing on knowledge base frameworks in language teacher education (Freeman and Johnson, 1998; Richards,…
ERIC Educational Resources Information Center
Naif, Ahmed H.; Saad, Noor Saazai Mat
2017-01-01
Adult Arab learners of Finnish as a second language (FSL) often encounter communication difficulties when dealing with official documents. They are also unable to help their children with their school homework. FSL proficiency is an essential requirement for obtaining employment and Finnish citizenship. The aim of this paper is to explore the use of…
de Hoop, Bartjan; Gietema, Hester; van Ginneken, Bram; Zanen, Pieter; Groenewegen, Gerard; Prokop, Mathias
2009-04-01
We compared interexamination variability of CT lung nodule volumetry with six currently available semi-automated software packages to determine the minimum change needed to detect the growth of solid lung nodules. We had ethics committee approval. To simulate a follow-up examination with zero growth, we performed two low-dose unenhanced CT scans in 20 patients referred for pulmonary metastases. Between examinations, patients got off and on the table. Volumes of all pulmonary nodules were determined on both examinations using six nodule evaluation software packages. Variability (upper limit of the 95% confidence interval of the Bland-Altman plot) was calculated for nodules for which segmentation was visually rated as adequate. We evaluated 214 nodules (mean diameter 10.9 mm, range 3.3 mm-30.0 mm). Software packages provided adequate segmentation in 71% to 86% of nodules (p < 0.001). In case of adequate segmentation, variability in volumetry between scans ranged from 16.4% to 22.3% for the various software packages. Variability with five to six software packages was significantly less for nodules ≥8 mm in diameter (range 12.9%-17.1%) than for nodules <8 mm (range 18.5%-25.6%). Segmented volumes of each package were compared to each of the other packages. Systematic volume differences were detected in 11/15 comparisons. This hampers comparison of nodule volumes between software packages.
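The "minimum change needed to detect growth" used here is the upper 95% limit of agreement of the relative volume differences between two zero-growth examinations. A minimal sketch with synthetic volumes follows; it assumes the relative difference is taken against the mean of the two measurements, and every name and number is hypothetical.

```python
import numpy as np

def volumetry_variability(vol_scan1, vol_scan2):
    """Upper 95% limit of agreement of the relative volume differences (%).

    Growth smaller than this limit cannot be distinguished from measurement
    variability between two same-day, zero-growth examinations.
    """
    v1, v2 = np.asarray(vol_scan1, float), np.asarray(vol_scan2, float)
    rel_diff = (v2 - v1) / ((v1 + v2) / 2.0) * 100.0      # % difference per nodule
    return rel_diff.mean() + 1.96 * rel_diff.std(ddof=1)

# Synthetic paired volumes (mm^3) of the same 200 nodules segmented on both scans.
rng = np.random.default_rng(7)
true_vol = rng.uniform(50.0, 5000.0, 200)
scan1 = true_vol * (1.0 + rng.normal(0.0, 0.08, 200))
scan2 = true_vol * (1.0 + rng.normal(0.0, 0.08, 200))
print(f"Minimum detectable growth: {volumetry_variability(scan1, scan2):.1f}%")
```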
Software and package applicating for network meta-analysis: A usage-based comparative study.
Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao
2017-12-21
To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used them to work through a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten software tools were included, covering both programming and non-programming software. They were developed mainly on the basis of Bayesian or frequentist theory. Most are easy to operate and master, calculate accurately, or produce excellent graphics. However, no single tool combined accurate calculation with superior graphing; this could only be achieved by combining two or more tools. This study suggests that users should choose the appropriate software according to their programming background, operational habits, and financial resources. A combination of BUGS and R (or Stata) can then be considered for performing the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, William Eugene
These slides describe different strategies for installing Python software. Although I am a big fan of Python software development, robust strategies for software installation remain a challenge. This talk describes several different installation scenarios. The Good: the user has administrative privileges - installing on Windows with an installer executable, installing with a Linux application utility, installing a Python package from the PyPI repository, and installing a Python package from source. The Bad: the user does not have administrative privileges - using a virtual environment to isolate package installations, and using an installer executable on Windows with a virtual environment. The Ugly: the user needs to install an extension package from source - installing a Python extension package from source, and PyCoinInstall - managing builds for Python extension packages. The last item, referring to PyCoinInstall, describes a utility being developed for the COIN-OR software, which is used within the operations research community. COIN-OR includes a variety of Python and C++ software packages, and this script uses a simple plug-in system to support the management of package builds and installation.
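For the "no administrative privileges" scenario, the usual isolation mechanism in current Python is the standard-library venv module together with pip. The sketch below is a generic illustration, not a reproduction of the slides' workflow; the environment path and the installed package are arbitrary, and running it really does create an environment and perform an installation.

```python
import subprocess
import sys
import venv
from pathlib import Path

# Create an isolated environment under the user's home directory (no admin rights needed).
env_dir = Path.home() / "my-isolated-env"
venv.create(env_dir, with_pip=True)

# Install a package from PyPI into that environment only, leaving the system Python untouched.
env_python = env_dir / ("Scripts/python.exe" if sys.platform == "win32" else "bin/python")
subprocess.check_call([str(env_python), "-m", "pip", "install", "numpy"])
```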
Bjørnebekk, Astrid; Mathé, Aleksander A; Brené, Stefan
2006-02-01
Physical activity has a documented beneficial effect in the treatment of depression. Recently, we found an antidepressant-like effect of running in an animal model of depression, the Flinders Sensitive Line (FSL), and demonstrated that it was associated with increased hippocampal cell proliferation. In this study, we analyzed levels of mRNAs encoding neuropeptide Y (NPY) and the opioid peptides dynorphin and enkephalin in hippocampus and correlated these with cell proliferation in the FSL and in the 'nondepressed' Flinders Resistant Line (FRL) strain, with or without access to running wheels. Running increased NPY mRNA in the dentate gyrus and the CA4 region in FSL, but not in FRL, rats. The NPY mRNA increase was correlated with increased cell proliferation in the subgranular zone of the dentate gyrus. Baseline dynorphin and enkephalin mRNA levels in the dentate gyrus were lower in the FSL than in the FRL strain. Running had no effect on dynorphin and enkephalin mRNAs in the FSL strain, but it decreased dynorphin mRNA, and there was a trend toward increased enkephalin mRNA, in the FRL rats. Thus, it would appear that the CNS effects of running are different in 'depressed' and control animals: modification of NPY, a peptide associated with depression and anxiety, in depressed animals, versus effects on opioids, associated with the reward systems, in healthy controls. Our data support the hypothesis that NPY neurotransmission in the hippocampus is malfunctioning in depression and that antidepressive treatment, in this case wheel running, will normalize it. In addition, we also show that the increase in NPY after running is correlated with increased cell proliferation, which is associated with an antidepressive-like effect.
Cho, Kang-Woo; Yoon, Min-Hyuk; Song, Kyung-Guen; Ahn, Kyu-Hong
2011-01-01
The effects of antecedent dry days (ADD) on nitrogen removal efficiency were investigated in soil infiltration systems with three distinguishable layers: a mulch layer (ML), a coarse soil layer (CSL) and a fine soil layer (FSL). Two sets of lab-scale columns with a loamy CSL (C1) and a sandy CSL (C2) were dosed with synthetic run-off carrying a chemical oxygen demand of 100 mg L(-1) and total nitrogen of 13 mg L(-1). The intermittent dosing cycle was stepwise adjusted to 5, 10 and 20 days. The influent ammonium and organic nitrogen were adsorbed over the entire depth in C1, but predominantly in the FSL in C2. In both columns, the effluent ammonium concentration increased while the organic nitrogen concentration decreased as ADD increased from 5 to 20 days. The effluent of C1 always showed a nitrate concentration exceeding that of the influent, caused by nitrification, by increasing amounts as ADD increased. However, the wash-out of nitrate in C1 was not distinct in terms of mass, since the effluent flow rate was only 25% of the influent. In contrast, efficient reduction (>95%) of nitrate loading was observed in C2 under ADD of 5 and 10 days, because of insignificant nitrification in the CSL and denitrification in the FSL. However, for the ADD of 20 days, a significant nitrate wash-out appeared in C2 as well, possibly because of re-aeration caused by the decreasing water content in the FSL. Consequently, the total nitrogen load escaping with the effluent was always smaller in C2, supporting the effectiveness of a sandy CSL over a loamy CSL for nitrogen removal under various ADDs.
Validation of thermal effects of LED package by using Elmer finite element simulation method
NASA Astrophysics Data System (ADS)
Leng, Lai Siang; Retnasamy, Vithyacharan; Mohamad Shahimin, Mukhzeer; Sauli, Zaliman; Taniselass, Steven; Bin Ab Aziz, Muhamad Hafiz; Vairavan, Rajendaran; Kirtsaeng, Supap
2017-02-01
The overall performance of a light-emitting diode (LED) package is critically affected by heat. In this study, the open-source software Elmer FEM was utilized to perform a thermal analysis of the LED package. To carry out a complete simulation study, Salome and ParaView were introduced as pre- and post-processors. The thermal behaviour of the LED package was evaluated with this software. The result was validated against commercially licensed software on the basis of previous work. The percentage difference between the two simulation results is less than 5%, which is tolerable and comparable.
NDE Software Developed at NASA Glenn Research Center
NASA Technical Reports Server (NTRS)
Roth, Donald J.; Martin, Richard E.; Rauser, Richard W.; Nichols, Charles; Bonacuse, Peter J.
2014-01-01
NASA Glenn Research Center has developed several important Nondestructive Evaluation (NDE) related software packages for different projects in the last 10 years. Three of the software packages have been created with commercial-grade user interfaces and are available to United States entities for download on the NASA Technology Transfer and Partnership Office server (https://sr.grc.nasa.gov/). This article provides brief overviews of the software packages.
NLM microcomputer-based tutorials (for microcomputers). Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perkins, M.
1990-04-01
The package consists of TOXLEARN--a microcomputer-based training package for TOXLINE (Toxicology Information Online), CHEMLEARN-a microcomputer-based training package for CHEMLINE (Chemical Information Online), MEDTUTOR--a microcomputer-based training package for MEDLINE (Medical Information Online), and ELHILL LEARN--a microcomputer-based training package for the ELHILL search and retrieval software that supports the above-mentioned databases...Software Description: The programs were developed under PILOTplus using the NLM LEARN Programmer. They run on IBM-PC, XT, AT, PS/2, and fully compatible computers. The programs require 512K RAM memory, one disk drive, and DOS 2.0 or higher. The software supports most monochrome, color graphics, enhanced color graphics, or visual graphics displays.
NMRbox: A Resource for Biomolecular NMR Computation.
Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C
2017-04-25
Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.
Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory
NASA Technical Reports Server (NTRS)
Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.
1994-01-01
As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.
Head motion during MRI acquisition reduces gray matter volume and thickness estimates.
Reuter, Martin; Tisdall, M Dylan; Qureshi, Abid; Buckner, Randy L; van der Kouwe, André J W; Fischl, Bruce
2015-02-15
Imaging biomarkers derived from magnetic resonance imaging (MRI) data are used to quantify normal development, disease, and the effects of disease-modifying therapies. However, motion during image acquisition introduces image artifacts that, in turn, affect derived markers. A systematic effect can be problematic since factors of interest like age, disease, and treatment are often correlated with both a structural change and the amount of head motion in the scanner, confounding the ability to distinguish biology from artifact. Here we evaluate the effect of head motion during image acquisition on morphometric estimates of structures in the human brain using several popular image analysis software packages (FreeSurfer 5.3, VBM8 SPM, and FSL Siena 5.0.7). Within-session repeated T1-weighted MRIs were collected on 12 healthy volunteers while performing different motion tasks, including two still scans. We show that volume and thickness estimates of the cortical gray matter are biased by head motion with an average apparent volume loss of roughly 0.7%/mm/min of subject motion. Effects vary across regions and remain significant after excluding scans that fail a rigorous quality check. In view of these results, the interpretation of reported morphometric effects of movement disorders or other conditions with increased motion tendency may need to be revisited: effects may be overestimated when not controlling for head motion. Furthermore, drug studies with hypnotic, sedative, tranquilizing, or neuromuscular-blocking substances may contain spurious "effects" of reduced atrophy or brain growth simply because they affect motion distinct from true effects of the disease or therapeutic process. Copyright © 2014 Elsevier Inc. All rights reserved.
pyam: Python Implementation of YaM
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan
2012-01-01
pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multiple platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code-base is large. One side benefit is the enforcement of a strong module-level encapsulation of a module's functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open source SVN software version control package. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.
Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research
2011-01-01
open-source BMI software solutions are currently available, we feel that the Craniux software package fills a specific need in the realm of BMI...data, such as cortical source imaging using EEG or MEG recordings. It is with these characteristics in mind that we feel the Craniux software package...S. Adee, "Dean Kamen's 'luke arm' prosthesis readies for clinical trials," IEEE Spectrum, February 2008, http://spectrum.ieee.org/biomedical
Astronomical Software Directory Service
NASA Technical Reports Server (NTRS)
Hanisch, R. J.; Payne, H.; Hayes, J.
1998-01-01
This is the final report on the development of the Astronomical Software Directory Service (ASDS), a distributable, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching.
Quantitative evaluation of software packages for single-molecule localization microscopy.
Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael
2015-08-01
The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.
Browndye: A Software Package for Brownian Dynamics
McCammon, J. Andrew
2010-01-01
A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems. PMID:21132109
Development of a software package for solid-angle calculations using the Monte Carlo method
NASA Astrophysics Data System (ADS)
Zhang, Jie; Chen, Xiulian; Zhang, Changsheng; Li, Gang; Xu, Jiayun; Sun, Guangai
2014-02-01
Solid-angle calculations, which are often complicated, play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The proposed software calculates solid angles using the Monte Carlo method, into which a new type of variance reduction technique is integrated. The package, developed with the Microsoft Foundation Classes (MFC) in Microsoft Visual C++, has a graphical user interface in which the visualization function is implemented with OpenGL. One advantage of the proposed software package is that it can calculate, without difficulty, the solid angle subtended by a detector of various geometric shapes (e.g., cylinder, square prism, regular triangular prism or regular hexagonal prism) to a point, circular or cylindrical source. The results obtained from the proposed software package were compared with those from previous studies and with Geant4 calculations. The comparison shows that the proposed software package produces accurate solid-angle values at a greater computation speed than Geant4.
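To illustrate the Monte Carlo approach (without the paper's variance reduction technique), the following Python sketch estimates the solid angle subtended by a circular disk detector on the axis of a point source and compares it with the analytic on-axis formula Ω = 2π(1 − d/√(d² + r²)).

```python
import math
import random

def solid_angle_mc(radius, distance, n_samples=200_000, seed=1):
    """Estimate the solid angle subtended by a circular disk detector of
    given radius, centred on the z-axis at `distance` from a point source
    at the origin, by sampling directions uniformly over the unit sphere."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        # Uniform direction on the sphere: cos(theta) ~ U(-1, 1), phi ~ U(0, 2*pi)
        cos_t = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        x, y, z = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
        if z <= 0:
            continue  # ray points away from the detector plane
        scale = distance / z  # intersection with the plane z = distance
        if (x * scale) ** 2 + (y * scale) ** 2 <= radius ** 2:
            hits += 1
    return 4.0 * math.pi * hits / n_samples

radius, distance = 2.0, 5.0
analytic = 2.0 * math.pi * (1.0 - distance / math.hypot(distance, radius))
print(solid_angle_mc(radius, distance), analytic)
```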
Can I Trust This Software Package? An Exercise in Validation of Computational Results
ERIC Educational Resources Information Center
Shacham, Mordechai; Brauner, Neima; Ashurst, W. Robert; Cutlip, Michael B.
2008-01-01
Mathematical software packages such as Polymath, MATLAB, and Mathcad are currently widely used for engineering problem solving. Applications of several of these packages to typical chemical engineering problems have been demonstrated by Cutlip, et al. The main characteristic of these packages is that they provide a "problem-solving environment…
International Inventory of Software Packages in the Information Field.
ERIC Educational Resources Information Center
Keren, Carl, Ed.; Sered, Irina, Ed.
Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…
ATLAS software configuration and build tool optimisation
NASA Astrophysics Data System (ADS)
Rybkin, Grigory; Atlas Collaboration
2014-06-01
The ATLAS software code base comprises over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and is used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (a group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and the project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build-time performance, which was optimised through several approaches: reducing the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introducing more fine-grained build parallelism at the package task level, i.e., compiling dependent applications and libraries in parallel; optimising the code of the CMT commands used for building; and introducing package-level build parallelism, i.e., building independent packages in parallel. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimising CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build and environment setup times by several times, increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.
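As a toy illustration of package-level build parallelism (not CMT itself; the package names and build step are hypothetical), the following Python sketch topologically orders a small dependency graph and "builds" independent packages concurrently.

```python
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED
from graphlib import TopologicalSorter
import time

# Hypothetical packages mapped to the packages they depend on.
dependencies = {
    "EventReco": {"CoreUtils", "Geometry"},
    "Geometry": {"CoreUtils"},
    "Analysis": {"EventReco"},
    "CoreUtils": set(),
}

def build(package: str) -> str:
    time.sleep(0.1)  # stand-in for compiling the package
    return package

sorter = TopologicalSorter(dependencies)
sorter.prepare()
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {}
    while sorter.is_active():
        # Submit every package whose dependencies are already built.
        for pkg in sorter.get_ready():
            futures[pool.submit(build, pkg)] = pkg
        done, _ = wait(futures, return_when=FIRST_COMPLETED)
        for fut in done:
            sorter.done(futures.pop(fut))
            print("built", fut.result())
```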
Glioblastoma Segmentation: Comparison of Three Different Software Packages.
Fyllingen, Even Hovig; Stensjøen, Anne Line; Berntsen, Erik Magnus; Solheim, Ole; Reinertsen, Ingerid
2016-01-01
To facilitate a more widespread use of volumetric tumor segmentation in clinical studies, there is an urgent need for reliable, user-friendly segmentation software. The aim of this study was therefore to compare three different software packages for semi-automatic brain tumor segmentation of glioblastoma; namely BrainVoyagerTM QX, ITK-Snap and 3D Slicer, and to make data available for future reference. Pre-operative, contrast enhanced T1-weighted 1.5 or 3 Tesla Magnetic Resonance Imaging (MRI) scans were obtained in 20 consecutive patients who underwent surgery for glioblastoma. MRI scans were segmented twice in each software package by two investigators. Intra-rater, inter-rater and between-software agreement was compared by using differences of means with 95% limits of agreement (LoA), Dice's similarity coefficients (DSC) and Hausdorff distance (HD). Time expenditure of segmentations was measured using a stopwatch. Eighteen tumors were included in the analyses. Inter-rater agreement was highest for BrainVoyager with difference of means of 0.19 mL and 95% LoA from -2.42 mL to 2.81 mL. Between-software agreement and 95% LoA were very similar for the different software packages. Intra-rater, inter-rater and between-software DSC were ≥ 0.93 in all analyses. Time expenditure was approximately 41 min per segmentation in BrainVoyager, and 18 min per segmentation in both 3D Slicer and ITK-Snap. Our main findings were that there is a high agreement within and between the software packages in terms of small intra-rater, inter-rater and between-software differences of means and high Dice's similarity coefficients. Time expenditure was highest for BrainVoyager, but all software packages were relatively time-consuming, which may limit usability in an everyday clinical setting.
Evaluation of copy number variation detection for a SNP array platform
2014-01-01
Background Copy Number Variations (CNVs) are usually inferred from Single Nucleotide Polymorphism (SNP) arrays using software packages based on particular algorithms. However, there is no clear understanding of the performance of these software packages, so it is difficult to select one or several of them for CNV detection on a SNP array platform. We selected four publicly available software packages designed for CNV calling from Affymetrix SNP arrays: Birdsuite, dChip, Genotyping Console (GTC) and PennCNV. A publicly available dataset generated by Array-based Comparative Genomic Hybridization (CGH), with a resolution of 24 million probes per sample, was considered the "gold standard". Against this CGH-based dataset, the success rate, average stability rate, sensitivity, consistency and reproducibility of the four software packages were assessed. In addition, we compared the efficiency of detecting CNVs jointly by two, three or all of the software packages with that of a single software package. Results Simply in terms of the number of detected CNVs, Birdsuite detected the most and GTC the fewest. Birdsuite and dChip showed a clear detection bias, and GTC seemed inferior because of the small number of CNVs it detected. We then investigated the detection consistency between each software package and the other three; the consistency of dChip was the lowest and that of GTC the highest. Compared with the CNVs detected by CGH, GTC called the most matching CNVs, with PennCNV-Affy ranked second, while in the non-overlapping group GTC called the fewest CNVs. With regard to the reproducibility of CNV calling, larger CNVs were usually replicated better; PennCNV-Affy showed the best consistency and Birdsuite the poorest. Conclusion PennCNV outperformed the other three packages in the sensitivity and specificity of CNV calling. Each calling method has its own limitations and advantages for different data analyses. Therefore, optimized calling methods might be identified by using multiple algorithms to evaluate the concordance and discordance of SNP array-based CNV calling. PMID:24555668
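One simple way to operationalise the "matching" comparison sketched above is reciprocal overlap between called and reference intervals. The Python sketch below uses a hypothetical 50% reciprocal-overlap rule and invented coordinates; it is not the evaluation code used in the study.

```python
def reciprocal_overlap(a, b):
    """Fraction of reciprocal overlap between two CNV calls given as
    (chromosome, start, end) tuples; 0.0 if on different chromosomes."""
    if a[0] != b[0]:
        return 0.0
    overlap = min(a[2], b[2]) - max(a[1], b[1])
    if overlap <= 0:
        return 0.0
    return min(overlap / (a[2] - a[1]), overlap / (b[2] - b[1]))

def matching_calls(calls, reference, threshold=0.5):
    """Count calls that reciprocally overlap some reference CNV by >= threshold."""
    return sum(
        any(reciprocal_overlap(c, r) >= threshold for r in reference)
        for c in calls
    )

# Toy example: software calls compared against a CGH-style reference set.
reference = [("chr1", 1_000_000, 1_200_000), ("chr2", 500_000, 650_000)]
calls = [("chr1", 1_050_000, 1_180_000), ("chr3", 10_000, 20_000)]
print(matching_calls(calls, reference))  # -> 1
```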
Sahraneshin Samani, Fazel; Moore, Jodene K; Khosravani, Pardis; Ebrahimi, Marzieh
2014-08-01
Flow cytometers designed to analyze large particles are enabling new applications in biology. Data analysis is a critical component of the flow cytometry (FCM) process. In this article we compare the features of four free software packages: WinMDI, Cyflogic, Flowing Software, and Cytobank.
Development of a Nevada Statewide Database for Safety Analyst Software
DOT National Transportation Integrated Search
2017-02-02
Safety Analyst is a software package developed by the Federal Highway Administration (FHWA) and twenty-seven participating state and local agencies including the Nevada Department of Transportation (NDOT). The software package implemented many of the...
den Bakker, Henk C; Warchocki, Steven; Wright, Emily M; Allred, Adam F; Ahlstrom, Christina; Manuel, Clyde S; Stasiewicz, Matthew J; Burrell, Angela; Roof, Sherry; Strawn, Laura K; Fortes, Esther; Nightingale, Kendra K; Kephart, Daniel; Wiedmann, Martin
2014-06-01
Sampling of agricultural and natural environments in two US states (Colorado and Florida) yielded 18 Listeria-like isolates that could not be assigned to previously described species using traditional methods. Using whole-genome sequencing and traditional phenotypic methods, we identified five novel species, each with a genome-wide average BLAST nucleotide identity (ANIb) of less than 85% to currently described species. Phylogenetic analysis based on 16S rRNA gene sequences and amino acid sequences of 31 conserved loci showed the existence of four well-supported clades within the genus Listeria; (i) a clade representing Listeria monocytogenes, L. marthii, L. innocua, L. welshimeri, L. seeligeri and L. ivanovii, which we refer to as Listeria sensu stricto, (ii) a clade consisting of Listeria fleischmannii and two newly described species, Listeria aquatica sp. nov. (type strain FSL S10-1188(T) = DSM 26686(T) = LMG 28120(T) = BEI NR-42633(T)) and Listeria floridensis sp. nov. (type strain FSL S10-1187(T) = DSM 26687(T) = LMG 28121(T) = BEI NR-42632(T)), (iii) a clade consisting of Listeria rocourtiae, L. weihenstephanensis and three novel species, Listeria cornellensis sp. nov. (type strain TTU A1-0210(T) = FSL F6-0969(T) = DSM 26689(T) = LMG 28123(T) = BEI NR-42630(T)), Listeria grandensis sp. nov. (type strain TTU A1-0212(T) = FSL F6-0971(T) = DSM 26688(T) = LMG 28122(T) = BEI NR-42631(T)) and Listeria riparia sp. nov. (type strain FSL S10-1204(T) = DSM 26685(T) = LMG 28119(T) = BEI NR- 42634(T)) and (iv) a clade containing Listeria grayi. Genomic and phenotypic data suggest that the novel species are non-pathogenic. © 2014 IUMS.
Bos, Judith T; van der Velden, Koos; van der Gulden, Joost W J
2012-01-01
Objectives To investigate differences in associations between sick leave and aspects of health, psychosocial workload, family life and work–family interference between four age groups (<36, 36–45, 46–55 and 55+ years). Design A cross-sectional study; a questionnaire was sent to the home addresses of all employees of a university. Setting A Dutch university. Participants 1843 employees returned the questionnaire (net response: 49.1%). The age distribution was as follows: <36: 32%; 36–45: 26%; 46–55: 27% and 55+: 12%. Primary outcomes Frequent sick leave (FSL, ≥3 times in the past 12 months) and prolonged sick leave (PSL, >2 weeks in total in the past 12 months). Differences between the age groups in independent variables and outcomes were investigated. Logistic regression analysis was used to calculate associations between various variables and the sick leave outcomes. Interaction terms were included to detect differences between the age groups. Results Age differences were found for many work- and family-related characteristics but not in the mean scores for health-related aspects. Presence of chronic disease was reported more frequently with increasing age. The 55+ age group had almost two times less chance of FSL, but 1.6 times more chance of PSL than the <36 age group. Age moderates the associations between career opportunities, partner's contribution in domestic tasks and sex, and FSL. Job security and pay, support from supervisor, challenging work and being breadwinner have different associations with PSL. However, life events in private lives and perceived health complaints are important in all age groups. FSL and PSL have some determinants in common, but there are differences between the outcomes as well. Conclusions Age should be treated as a variable of interest instead of a control variable. Employers and occupational physicians need to be aware that each phase in life has specific difficulties that can lead to FSL and PSL. PMID:22855622
ERIC Educational Resources Information Center
Pollard, Jim
This report reviews eight IBM-compatible software packages that are available to secondary schools to teach computer-aided drafting (CAD). Software packages to be considered were selected following reviews of CAD periodicals, computers in education periodicals, advertisements, and recommendations of teachers. The packages were then rated by…
QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.
Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei
2014-01-01
Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
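The WormLifespan idea of counting moving worms from two time-lapse images can be illustrated with a frame-difference sketch in Python (synthetic images and thresholds; not the QuantWorm implementation, which is written in Java).

```python
import numpy as np
from scipy import ndimage

# Two synthetic time-lapse frames: pixels that change between frames are
# thresholded and grouped into connected blobs, each counted as a moving object.
rng = np.random.default_rng(7)
frame1 = rng.normal(100, 2, (200, 200))
frame2 = frame1.copy()
frame2[50:60, 50:70] += 40    # a "worm" that moved into this region
frame2[120:128, 30:55] += 40  # a second moving "worm"

moving = np.abs(frame2 - frame1) > 20
labels, n_moving = ndimage.label(moving)
print("moving objects detected:", n_moving)  # -> 2
```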
Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Lívia Almeida Bueno; Freitas, Deborah Queiroz
2015-01-01
This article aimed at comparing the accuracy of linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with the OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the greatest, with the XoranCAT (+0.25 mm). However, there was no statistical significant difference between the measurements obtained with the different software packages and the gold standard (p> 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
Differential maneuvering simulator data reduction and analysis software
NASA Technical Reports Server (NTRS)
Beasley, G. P.; Sigman, R. S.
1972-01-01
A multielement data reduction and analysis software package has been developed for use with the Langley differential maneuvering simulator (DMS). This package, which has several independent elements, was developed to support all phases of DMS aircraft simulation studies with a variety of both graphical and tabular information. The overall software package is considered unique because of the number, diversity, and sophistication of the element programs available for use in a single study. The purpose of this paper is to discuss the overall DMS data reduction and analysis package by reviewing the development of the various elements of the software, showing typical results that can be obtained, and discussing how each element can be used.
Large Scale Software Building with CMake in ATLAS
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.
Western aeronautical test range real-time graphics software package MAGIC
NASA Technical Reports Server (NTRS)
Malone, Jacqueline C.; Moore, Archie L.
1988-01-01
The master graphics interactive console (MAGIC) software package used on the Western Aeronautical Test Range (WATR) of the NASA Ames Research Center is described. MAGIC is a resident real-time research tool available to flight researchers-scientists in the NASA mission control centers of the WATR at the Dryden Flight Research Facility at Edwards, California. The hardware configuration and capabilities of the real-time software package are also discussed.
CheMentor Software System by H. A. Peoples
NASA Astrophysics Data System (ADS)
Reid, Brian P.
1997-09-01
CheMentor Software System H. A. Peoples. Computerized Learning Enhancements: http://www.ecis.com/~clehap; email: clehap@ecis.com; 1996 - 1997. CheMentor is a series of software packages for introductory-level chemistry, which includes Practice Items (I), Stoichiometry (I), Calculating Chemical Formulae, and the CheMentor Toolkit. The first three packages provide practice problems for students and various types of help to solve them; the Toolkit includes "calculators" for determining chemical quantities as well as the Practice Items (I) set of problems. The set of software packages is designed so that each individual product acts as a module of a common CheMentor program. As the name CheMentor implies, the software is designed as a "mentor" for students learning introductory chemistry concepts and problems. The typical use of the software would be by individual students (or perhaps small groups) as an adjunct to lectures. CheMentor is a HyperCard application and the modules are HyperCard stacks. The requirements to run the packages include a Macintosh computer with at least 1 MB of RAM, a hard drive with several MB of available space depending upon the packages selected (10 MB were required for all the packages reviewed here), and the Mac operating system 6.0.5 or later.
ERIC Educational Resources Information Center
Thompson, Douglas E.
2013-01-01
In today's complex music software packages, many features can remain unexplored and unused. Software plug-ins--available in most every music software package, yet easily overlooked in the software's basic operations--are one such feature. In this article, I introduce readers to plug-ins and offer tips for purchasing plug-ins I have…
Bonekamp, S; Ghosh, P; Crawford, S; Solga, S F; Horska, A; Brancati, F L; Diehl, A M; Smith, S; Clark, J M
2008-01-01
To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Feature evaluation and test-retest reliability of softwares (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. A random sample of 15 obese adults with type 2 diabetes. Axial T1-weighted spin echo images centered at vertebral bodies of L2-L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. Intra-class correlation coefficient was used to obtain test-retest reliability. Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test-retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility. Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages.
Bonekamp, S; Ghosh, P; Crawford, S; Solga, SF; Horska, A; Brancati, FL; Diehl, AM; Smith, S; Clark, JM
2009-01-01
Objective To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Design Feature evaluation and test–retest reliability of softwares (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. Subjects A random sample of 15 obese adults with type 2 diabetes. Measurements Axial T1-weighted spin echo images centered at vertebral bodies of L2–L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA), and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. Intra-class correlation coefficient was used to obtain test–retest reliability. Results Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test–retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Conclusion Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility. Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages. PMID:17700582
Introduction to Software Packages. [Final Report.
ERIC Educational Resources Information Center
Frankel, Sheila, Ed.; And Others
This document provides an introduction to applications computer software packages that support functional managers in government and encourages the use of such packages as an alternative to in-house development. A review of current application areas includes budget/project management, financial management/accounting, payroll, personnel,…
NASA Technical Reports Server (NTRS)
Klumpp, A. R.
1994-01-01
Ten families of subprograms are bundled together for the General-Purpose Ada Packages. The families bring to Ada many features from HAL/S, PL/I, FORTRAN, and other languages. These families are: string subprograms (INDEX, TRIM, LOAD, etc.); scalar subprograms (MAX, MIN, REM, etc.); array subprograms (MAX, MIN, PROD, SUM, GET, and PUT); numerical subprograms (EXP, CUBIC, etc.); service subprograms (DATE_TIME function, etc.); Linear Algebra II; Runge-Kutta integrators; and three text I/O families of packages. In two cases, a family consists of a single non-generic package. In all other cases, a family comprises a generic package and its instances for a selected group of scalar types. All generic packages are designed to be easily instantiated for the types declared in the user facility. The linear algebra package is LINRAG2. This package includes subprograms supplementing those in NPO-17985, An Ada Linear Algebra Package Modeled After HAL/S (LINRAG). Please note that LINRAG2 cannot be compiled without LINRAG. Most packages have widespread applicability, although some are oriented for avionics applications. All are designed to facilitate writing new software in Ada. Several of the packages use conventions introduced by other programming languages. A package of string subprograms is based on HAL/S (a language designed for the avionics software in the Space Shuttle) and PL/I. Packages of scalar and array subprograms are taken from HAL/S or generalized current Ada subprograms. A package of Runge-Kutta integrators is patterned after a built-in MAC (MIT Algebraic Compiler) integrator. Those packages modeled after HAL/S make it easy to translate existing HAL/S software to Ada. The General-Purpose Ada Packages program source code is available on two 360K 5.25" MS-DOS format diskettes. The software was developed using VAX Ada v1.5 under DEC VMS v4.5. It should be portable to any validated Ada compiler and it should execute either interactively or in batch. The largest package requires 205K of main memory on a DEC VAX running VMS. The software was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T
2012-08-01
InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides evidence of further validation of InVivoStat and should strengthen users' confidence in this new software package.
RSEIS and RFOC: Seismic Analysis in R
NASA Astrophysics Data System (ADS)
Lees, J. M.
2015-12-01
Open software is essential for reproducible scientific exchange. R packages provide a platform for developing seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analyses is currently available in the free software platform R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. Packages related to seismic analysis currently available on CRAN (Comprehensive R Archive Network, http://www.r-project.org/) include RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These cover signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.
Laboratory Connections: Review of Two Commercial Interfacing Packages.
ERIC Educational Resources Information Center
Powers, Michael H.
1989-01-01
Evaluates two Apple II interfacing packages designed to measure pH: (1) "Experiments in Chemistry" by HRM Software and (2) "Voltage Plotter III" by Vernier Software. Provides characteristics and screen dumps of each package. Reports both systems are suitable for high school or beginning college laboratories. (MVL)
Ramakrishnan, Girija; Sen, Bhaswati; Johnson, Richard
2012-01-01
Francisella tularensis subsp. tularensis is a highly infectious bacterium causing acute disease in mammalian hosts. Mechanisms for the acquisition of iron within the iron-limiting host environment are likely to be critical for survival of this intracellular pathogen. FslE (FTT0025) and FupA (FTT0918) are paralogous proteins that are predicted to form β-barrels in the outer membrane of virulent strain Schu S4 and are unique to Francisella species. Previous studies have implicated both FupA, initially identified as a virulence factor and FslE, encoded by the siderophore biosynthetic operon, in iron acquisition. Using single and double mutants, we demonstrated that these paralogs function in concert to promote growth under iron limitation. We used a 55Fe transport assay to demonstrate that FslE is involved in siderophore-mediated ferric iron uptake, whereas FupA facilitates high affinity ferrous iron uptake. Optimal replication within J774A.1 macrophage-like cells required at least one of these uptake systems to be functional. In a mouse model of tularemia, the ΔfupA mutant was attenuated, but the ΔfslE ΔfupA mutant was significantly more attenuated, implying that the two systems of iron acquisition function synergistically to promote virulence. These studies highlight the importance of specific iron acquisition functions, particularly that of ferrous iron, for virulence of F. tularensis in the mammalian host. PMID:22661710
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This document consists of 80 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Education Laboratory. Set 15 consists of 27 packages; set 16 consists of 53 packages. Each software review lists producer, time and place of evaluation,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.
2012-01-15
The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.
Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe
2017-10-01
To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.
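For reference, the Bland-Altman statistics used above (bias and 95% limits of agreement) can be computed in a few lines of Python; the perfusion values below are invented for illustration.

```python
import numpy as np

def bland_altman(x, y):
    """Bland-Altman agreement statistics for paired measurements x and y:
    mean difference (bias) and 95% limits of agreement."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    diff = x - y
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical arterial-perfusion values (mL/min/100 g) from software A and B.
ap_soft_a = [28.1, 31.4, 25.9, 30.2, 27.5, 33.0]
ap_soft_b = [26.8, 33.1, 24.7, 31.5, 28.9, 30.6]
bias, loa = bland_altman(ap_soft_a, ap_soft_b)
print(f"bias = {bias:.2f}, 95% LoA = ({loa[0]:.2f}, {loa[1]:.2f})")
```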
A Comparison of Authoring Software for Developing Mathematics Self-Learning Software Packages.
ERIC Educational Resources Information Center
Suen, Che-yin; Pok, Yang-ming
Four years ago, the authors started to develop a self-paced mathematics learning software called NPMaths by using an authoring package called Tencore. However, NPMaths had some weak points. A development team was hence formed to develop similar software called Mathematics On Line. This time the team used another development language called…
Technology Assessment Software Package: Final Report.
ERIC Educational Resources Information Center
Hutinger, Patricia L.
This final report describes the Technology Assessment Software Package (TASP) Project, which produced developmentally appropriate technology assessment software for children from 18 months through 8 years of age who have moderate to severe disabilities that interfere with their interaction with people, objects, tasks, and events in their…
The Hidden Cost of Buying a Computer.
ERIC Educational Resources Information Center
Johnson, Michael
1983-01-01
In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)
Software for Managing Personal Files.
ERIC Educational Resources Information Center
Lundeen, Gerald
1989-01-01
Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…
Software design for analysis of multichannel intracardial and body surface electrocardiograms.
Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A
2002-11-01
Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.
1986 Petroleum Software Directory. [800 mini, micro and mainframe computer software packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
Pennwell's 1986 Petroleum Software Directory is a complete listing of software created specifically for the petroleum industry. Details are provided on over 800 mini, micro and mainframe computer software packages from more than 250 different companies. An accountant can locate programs to automate bookkeeping functions in large oil and gas production firms. A pipeline engineer will find programs designed to calculate line flow and wellbore pressure drop.
Diagnostic evaluation of three cardiac software packages using a consecutive group of patients
2011-01-01
Purpose The aim of this study was to compare the diagnostic performance of the three software packages 4DMSPECT (4DM), Emory Cardiac Toolbox (ECTb), and Cedars Quantitative Perfusion SPECT (QPS) for quantification of myocardial perfusion scintigram (MPS) using a large group of consecutive patients. Methods We studied 1,052 consecutive patients who underwent 2-day stress/rest 99mTc-sestamibi MPS studies. The reference/gold-standard classifications for the MPS studies were obtained from three physicians, with more than 25 years each of experience in nuclear cardiology, who re-evaluated all MPS images. Automatic processing was carried out using 4DM, ECTb, and QPS software packages. Total stress defect extent (TDE) and summed stress score (SSS) based on a 17-segment model were obtained from the software packages. Receiver-operating characteristic (ROC) analysis was performed. Results A total of 734 patients were classified as normal and the remaining 318 were classified as having infarction and/or ischemia. The performance of the software packages calculated as the area under the SSS ROC curve were 0.87 for 4DM, 0.80 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.03; other differences p < 0.0001). The area under the TDE ROC curve were 0.87 for 4DM, 0.82 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.0005; other differences p < 0.0001). Conclusion There are considerable differences in performance between the three software packages with 4DM showing the best performance and ECTb the worst. These differences in performance should be taken in consideration when software packages are used in clinical routine or in clinical studies. PMID:22214226
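The area-under-the-ROC-curve comparison can be reproduced in principle with standard tools; a minimal Python sketch using scikit-learn and invented SSS values follows (the study's own ROC analysis and data are not shown here).

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical data: reference classification (1 = infarction and/or ischemia,
# 0 = normal) and summed stress scores from one software package.
reference = np.array([0, 0, 1, 1, 0, 1, 0, 1, 0, 1])
sss = np.array([1, 0, 8, 2, 3, 11, 0, 4, 2, 9])

print("Area under the SSS ROC curve:", roc_auc_score(reference, sss))
```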
A Study of Visualization for Mathematics Education
NASA Technical Reports Server (NTRS)
Daugherty, Sarah C.
2008-01-01
Graphical representations such as figures, illustrations, and diagrams play a critical role in mathematics and they are equally important in mathematics education. However, graphical representations in mathematics textbooks are static, i.e., they are used to illustrate only a specific example or a limited set of examples. By using computer software to visualize mathematical principles, there is virtually no limit to the number of specific cases and examples that can be demonstrated. However, we have not seen widespread adoption of visualization software in mathematics education. There are currently a number of software packages that provide visualization of mathematics for research and also software packages specifically developed for mathematics education. We conducted a survey of mathematics visualization software packages, summarized their features and user bases, and analyzed their limitations. In this survey, we focused on evaluating the software packages for their use with mathematical subjects adopted by institutions of secondary education in the United States (middle schools and high schools), including algebra, geometry, trigonometry, and calculus. We found that cost, complexity, and lack of flexibility are the major factors that hinder the widespread use of mathematics visualization software in education.
Dill: an algorithm and a symbolic software package for doing classical supersymmetry calculations
NASA Astrophysics Data System (ADS)
Luc̆ić, Vladan
1995-11-01
An algorithm is presented that formalizes the different steps in a classical supersymmetric (SUSY) calculation. Based on the algorithm, Dill, a symbolic software package that can perform the calculations, is developed in the Mathematica programming language. While the algorithm is quite general, the package is created for the 4-D, N = 1 model. Nevertheless, with little modification, the package could be used for other SUSY models. The package has been tested and some of the results are presented.
An Ada Linear-Algebra Software Package Modeled After HAL/S
NASA Technical Reports Server (NTRS)
Klumpp, Allan R.; Lawson, Charles L.
1990-01-01
New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPAK solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.
Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W.; Xia, Yinglin; Tu, Xin M.
2011-01-01
Summary The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. PMID:21671252
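As one concrete example of how such procedures are invoked, the Python sketch below simulates correlated binary responses via a random intercept and fits a marginal (GEE) logistic model with statsmodels; this is an illustrative sketch of one of several possible approaches, not the procedures compared in the report.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulate correlated binary responses: a random intercept per subject
# induces within-subject correlation (invented data, not the report's).
rng = np.random.default_rng(0)
n_subjects, n_obs = 100, 5
subject = np.repeat(np.arange(n_subjects), n_obs)
x = rng.normal(size=n_subjects * n_obs)
b = np.repeat(rng.normal(scale=1.0, size=n_subjects), n_obs)
p = 1.0 / (1.0 + np.exp(-(-0.5 + 0.8 * x + b)))
y = rng.binomial(1, p)
df = pd.DataFrame({"y": y, "x": x, "subject": subject})

# Marginal (GEE) fit with an exchangeable working correlation; mixed-model
# (GLMM) fits of the same data in other packages may give different estimates
# because subject-specific and population-averaged effects differ.
gee = sm.GEE.from_formula(
    "y ~ x", groups="subject", data=df,
    family=sm.families.Binomial(), cov_struct=sm.cov_struct.Exchangeable(),
)
print(gee.fit().summary())
```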
NDAS Hardware Translation Layer Development
NASA Technical Reports Server (NTRS)
Nazaretian, Ryan N.; Holladay, Wendy T.
2011-01-01
The NASA Data Acquisition System (NDAS) project aims to replace all DAS software for NASA's rocket testing facilities. A software-hardware translation layer is required so the software can properly communicate with the hardware. Since the hardware at each test stand varies, drivers must be written for each stand. These drivers act more like plugins for the software: if the software is being used at E3, it should point to the E3 driver package; if it is being used at B2, it should point to the B2 driver package. The driver packages should also contain hardware drivers that are universal to the DAS system. For example, since A1, A2, and B2 all use Preston 8300AU signal conditioners, the driver for those three stands should be the same and updated collectively.
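A minimal Python sketch of this driver-package selection pattern follows; the class and mapping are hypothetical stand-ins, not the NDAS drivers themselves.

```python
# Hypothetical hardware-translation plugin layer: the application picks a
# driver package per test stand, while drivers for shared hardware are reused.

class Preston8300AU:
    """Driver shared by stands that use the Preston 8300AU signal conditioner."""
    def read_channel(self, channel: int) -> float:
        return 0.0  # placeholder for real hardware I/O

DRIVER_PACKAGES = {
    "A1": {"signal_conditioner": Preston8300AU},
    "A2": {"signal_conditioner": Preston8300AU},
    "B2": {"signal_conditioner": Preston8300AU},
    "E3": {"signal_conditioner": Preston8300AU},  # E3 would swap in its own drivers
}

def load_drivers(stand: str) -> dict:
    """Return instantiated drivers for the requested test stand."""
    return {name: cls() for name, cls in DRIVER_PACKAGES[stand].items()}

drivers = load_drivers("B2")
print(drivers["signal_conditioner"].read_channel(1))
```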
Academic Web Authoring Mulitmedia Development and Course Management Tools
ERIC Educational Resources Information Center
Halloran, Margaret E.
2005-01-01
Course management software enables faculty members to learn one software package for web-based curriculum, assessment, synchronous and asynchronous discussions, collaborative work, multimedia and interactive resource development. There are as many as 109 different course management software packages on the market and several studies have evaluated…
Kasiri, Keyvan; Kazemi, Kamran; Dehghani, Mohammad Javad; Helfroush, Mohammad Sadegh
2013-01-01
In this paper, we present a new semi-automatic brain tissue segmentation method based on a hybrid hierarchical approach that combines a brain atlas as a priori information and a least-square support vector machine (LS-SVM). The method consists of three steps. In the first two steps, the skull is removed and the cerebrospinal fluid (CSF) is extracted. These two steps are performed using the toolbox FMRIB's automated segmentation tool integrated in the FSL software (FSL-FAST) developed in Oxford Centre for functional MRI of the brain (FMRIB). Then, in the third step, the LS-SVM is used to segment grey matter (GM) and white matter (WM). The training samples for LS-SVM are selected from the registered brain atlas. The voxel intensities and spatial positions are selected as the two feature groups for training and test. SVM as a powerful discriminator is able to handle nonlinear classification problems; however, it cannot provide posterior probability. Thus, we use a sigmoid function to map the SVM output into probabilities. The proposed method is used to segment CSF, GM and WM from the simulated magnetic resonance imaging (MRI) using Brainweb MRI simulator and real data provided by Internet Brain Segmentation Repository. The semi-automatically segmented brain tissues were evaluated by comparing to the corresponding ground truth. The Dice and Jaccard similarity coefficients, sensitivity and specificity were calculated for the quantitative validation of the results. The quantitative results show that the proposed method segments brain tissues accurately with respect to corresponding ground truth. PMID:24696800
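A rough analogue of the third step can be sketched with an ordinary SVM in scikit-learn (the paper uses an LS-SVM): intensity plus spatial-position features, atlas-style training labels, and Platt's sigmoid mapping decision values to posterior probabilities via probability=True. All numbers below are synthetic.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 500
# Hypothetical atlas-derived training samples: [intensity, x, y, z] per voxel.
gm = np.column_stack([rng.normal(0.55, 0.05, n), rng.uniform(0, 1, (n, 3))])
wm = np.column_stack([rng.normal(0.75, 0.05, n), rng.uniform(0, 1, (n, 3))])
X = np.vstack([gm, wm])
y = np.array([0] * n + [1] * n)  # 0 = grey matter, 1 = white matter

# probability=True enables Platt's sigmoid calibration of the SVM outputs.
clf = SVC(kernel="rbf", probability=True).fit(X, y)
test_voxels = np.array([[0.57, 0.4, 0.5, 0.5], [0.78, 0.6, 0.5, 0.5]])
print(clf.predict_proba(test_voxels))  # columns: P(GM), P(WM) per voxel
```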
Erickson, Collin B; Ankenman, Bruce E; Sanchez, Susan M
2018-06-01
This data article provides the summary data from tests comparing various Gaussian process software packages. Each spreadsheet represents a single function or type of function using a particular input sample size. In each spreadsheet, a row gives the results for a particular replication using a single package. Within each spreadsheet there are the results from eight Gaussian process model-fitting packages on five replicates of the surface. There is also one spreadsheet comparing the results from two packages performing stochastic kriging. These data enable comparisons between the packages to determine which package will give users the best results.
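Each spreadsheet row corresponds to one package fitted to one sampled surface; the Python sketch below shows what a single such replicate might look like using scikit-learn's Gaussian process regressor on an invented 2-D test function (not one of the benchmark surfaces).

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# One hypothetical "replicate": fit a GP to a sampled surface and score it
# on held-out points, the kind of per-row result the spreadsheets hold.
rng = np.random.default_rng(3)
f = lambda x: np.sin(3 * x[:, 0]) + 0.5 * np.cos(5 * x[:, 1])
X_train, X_test = rng.uniform(0, 1, (40, 2)), rng.uniform(0, 1, (200, 2))

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.2, 0.2]),
                              normalize_y=True).fit(X_train, f(X_train))
rmse = np.sqrt(np.mean((gp.predict(X_test) - f(X_test)) ** 2))
print(f"RMSE on held-out points: {rmse:.4f}")
```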
Johansson, Jarkko; Alakurtti, Kati; Joutsa, Juho; Tohka, Jussi; Ruotsalainen, Ulla; Rinne, Juha O
2016-10-01
The striatum is the primary target in regional 11C-raclopride PET studies, and despite its small volume, it contains several functional and anatomical subregions. The outcome of the quantitative dopamine receptor study using 11C-raclopride PET depends heavily on the quality of the region-of-interest (ROI) definition of these subregions. The aim of this study was to evaluate subregional analysis techniques because new approaches have emerged, but have not yet been compared directly. In this paper, we compared manual ROI delineation with several automatic methods. The automatic methods used either direct clustering of the PET image or individualization of chosen brain atlases on the basis of MRI or PET image normalization. State-of-the-art normalization methods and atlases were applied, including those provided in the FreeSurfer, Statistical Parametric Mapping 8, and FSL software packages. Evaluation of the automatic methods was based on voxel-wise congruity with the manual delineations and the test-retest variability and reliability of the outcome measures using data from seven healthy male participants who were scanned twice with 11C-raclopride PET on the same day. The results show that both manual and automatic methods can be used to define striatal subregions. Although most of the methods performed well with respect to the test-retest variability and reliability of binding potential, the smallest average test-retest variability and SEM were obtained using a connectivity-based atlas and PET normalization (test-retest variability = 4.5%, SEM = 0.17). The current state-of-the-art automatic ROI methods can be considered good alternatives for subjective and laborious manual segmentation in 11C-raclopride PET studies.
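The test-retest variability and SEM quoted above are simple functions of paired scan values; the Python sketch below computes them for invented binding-potential values (the SEM here uses one common definition, the standard deviation of the differences divided by √2, which may differ from the paper's exact formula).

```python
import numpy as np

# Hypothetical test-retest binding-potential values for seven subjects.
bp_test = np.array([2.45, 2.61, 2.30, 2.78, 2.52, 2.40, 2.66])
bp_retest = np.array([2.52, 2.55, 2.41, 2.70, 2.49, 2.51, 2.60])

diff = bp_test - bp_retest
trv = 100 * np.abs(diff) / ((bp_test + bp_retest) / 2)  # % variability per subject
sem = diff.std(ddof=1) / np.sqrt(2)  # one common form of the standard error of measurement

print(f"mean test-retest variability: {trv.mean():.1f}%")
print(f"SEM: {sem:.3f}")
```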
Modeling and MBL: Software Tools for Science.
ERIC Educational Resources Information Center
Tinker, Robert F.
Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…
A Characteristics Approach to the Evaluation of Economics Software Packages.
ERIC Educational Resources Information Center
Lumsden, Keith; Scott, Alex
1988-01-01
Utilizes Bloom's Taxonomy to identify elements of teacher and student interest. Depicts the way in which these interests are developed into characteristics for use in analytically evaluating software. Illustrates the use of this evaluating technique by appraising the much used software package "Running the British Economy." (KO)
Scientific Software: How to Find What You Need and Get What You Pay for.
ERIC Educational Resources Information Center
Gabaldon, Diana J.
1984-01-01
Provides examples of software for the sciences, including: packages for pathology/toxicology laboratories (costing over $15,000), DNA sequencing, and data acquisition/analysis; general-purpose software for scientific uses; and "custom" packages, including a program to maintain a listing of "Escherichia coli" strains and a…
Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages
ERIC Educational Resources Information Center
Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro
2017-01-01
Computer Assisted Qualitative Data Analysis Software (CAQDAS) packages are tools that help researchers to develop qualitative research projects. These software packages help users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…
Interactive Visualization of Assessment Data: The Software Package Mondrian
ERIC Educational Resources Information Center
Unlu, Ali; Sargin, Anatol
2009-01-01
Mondrian is state-of-the-art statistical data visualization software featuring modern interactive visualization techniques for a wide range of data types. This article reviews the capabilities, functionality, and interactive properties of this software package. Key features of Mondrian are illustrated with data from the Programme for International…
An Overview of Software for Conducting Dimensionality Assessment in Multidimensional Models
ERIC Educational Resources Information Center
Svetina, Dubravka; Levy, Roy
2012-01-01
An overview of popular software packages for conducting dimensionality assessment in multidimensional models is presented. Specifically, five popular software packages are described in terms of their capabilities to conduct dimensionality assessment with respect to the nature of analysis (exploratory or confirmatory), types of data (dichotomous,…
Cognitive Foundry v. 3.0 (OSS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basilico, Justin; Dixon, Kevin; McClain, Jonathan
2009-11-18
The Cognitive Foundry is a unified collection of tools designed for research and applications that use cognitive modeling, machine learning, or pattern recognition. The software library contains design patterns, interface definitions, and default implementations of reusable software components and algorithms designed to support a wide variety of research and development needs. The library contains three main software packages: the Common package that contains basic utilities and linear algebraic methods, the Cognitive Framework package that contains tools to assist in implementing and analyzing theories of cognition, and the Machine Learning package that provides general algorithms and methods for populating Cognitive Framework components from domain-relevant data.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This document consists of 170 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Educational Laboratory. Set 11 consists of 37 packages. Set 12 consists of 34 packages. A special unnumbered set, entitled LIBRA Reviews, treats 99 packages…
DRUGDOG 3.0: U.S. Navy Random Urinalysis Software Package
1994-03-15
DRUGDOG 3.0: U.S. Navy Random Urinalysis Software Package. Master's thesis by Dale E. Wilson, Naval Postgraduate School, Monterey, California, 15 March 1994.
Using an architectural approach to integrate heterogeneous, distributed software components
NASA Technical Reports Server (NTRS)
Callahan, John R.; Purtilo, James M.
1995-01-01
Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.
Improving the quality of EHR recording in primary care: a data quality feedback tool.
van der Bij, Sjoukje; Khan, Nasra; Ten Veen, Petra; de Bakker, Dinny H; Verheij, Robert A
2017-01-01
Electronic health record (EHR) data are used to exchange information among health care providers. For this purpose, the quality of the data is essential. We developed a data quality feedback tool that evaluates differences in EHR data quality among practices and software packages as part of a larger intervention. The tool was applied in 92 practices in the Netherlands using different software packages. Practices received data quality feedback in 2010 and 2012. We observed large differences in the quality of recording. For example, the percentage of episodes of care that had a meaningful diagnostic code ranged from 30% to 100%. Differences were highly related to the software package. A year after the first measurement, the quality of recording had improved significantly and differences decreased, with 67% of the physicians indicating that they had actively changed their recording habits based on the results of the first measurement. About 80% found the feedback helpful in pinpointing recording problems. One of the software vendors made changes in functionality as a result of the feedback. Our EHR data quality feedback tool is capable of highlighting differences among practices and software packages. As such, it also stimulates improvements. As substantial variability in recording is related to the software package, our study strengthens the evidence that data quality can be improved substantially by standardizing the functionalities of EHR software packages.
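As a minimal illustration of the kind of indicator described above (the percentage of episodes of care with a meaningful diagnostic code, broken down by practice and by software package), the following Python/pandas sketch uses hypothetical column names and data; it is not the actual feedback tool.

import pandas as pd

# Hypothetical episode-level extract: one row per episode of care.
episodes = pd.DataFrame({
    "practice_id":         [1, 1, 2, 2, 2, 3, 3],
    "ehr_package":         ["A", "A", "B", "B", "B", "A", "A"],
    "has_meaningful_code": [True, False, True, True, True, False, True],
})

# Percentage of episodes with a meaningful diagnostic code, per practice and per package.
by_practice = episodes.groupby("practice_id")["has_meaningful_code"].mean().mul(100)
by_package = episodes.groupby("ehr_package")["has_meaningful_code"].mean().mul(100)
print(by_practice)
print(by_package)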
Oberholzer, Inge; Möller, Marisa; Holland, Brendan; Dean, Olivia M; Berk, Michael; Harvey, Brian H
2018-04-01
There is abundant evidence for both disorganized redox balance and cognitive deficits in major depressive disorder (MDD). Garcinia mangostana Linn (GM) has anti-oxidant activity. We studied the antidepressant-like and pro-cognitive effects of raw GM rind in Flinders Sensitive Line (FSL) rats, a genetic model of depression, following acute and chronic treatment compared to a reference antidepressant, imipramine (IMI). The chemical composition of the GM extract was analysed for levels of α- and γ-mangostin. The acute dose-dependent effects of GM (50, 150 and 200 mg/kg po), IMI (20 mg/kg po) and vehicle were determined in the forced swim test (FST) in FSL rats, versus Flinders Resistant Line (FRL) control rats. Locomotor testing was conducted using the open field test (OFT). Using the most effective dose above coupled with behavioral testing in the FST and cognitive assessment in the novel object recognition test (nORT), a fixed dose 14-day treatment study of GM was performed and compared to IMI- (20 mg/kg/day) and vehicle-treated animals. Chronic treated animals were also assessed with respect to frontal cortex and hippocampal monoamine levels and accumulation of malondialdehyde. FSL rats showed significant cognitive deficits and depressive-like behavior, with disordered cortico-hippocampal 5-hydroxyindole acetic acid (5-HIAA) and noradrenaline (NA), as well as elevated hippocampal lipid peroxidation. Acute and chronic IMI treatment evoked pronounced antidepressant-like effects. Raw GM extract contained 117 mg/g and 11 mg/g α- and γ-mangostin, respectively, with acute GM demonstrating antidepressant-like effects at 50 mg/kg/day. Chronic GM (50 mg/kg/d) displayed significant antidepressant- and pro-cognitive effects, while demonstrating parity with IMI. Both behavioral and monoamine assessments suggest a more prominent serotonergic action for GM as opposed to a noradrenergic action for IMI, while both IMI and GM reversed hippocampal lipid peroxidation in FSL animals. Concluding, FSL rats present with cognitive deficits and depressive-like behaviors that are reversed by acute and chronic GM treatment, similar to that of IMI.
NASA Astrophysics Data System (ADS)
Calhoun, William R., III
One of the most recent advancements in laser technology is the development of ultrashort pulsed femtosecond lasers (FSLs). FSLs are improving many fields due to their unique extreme precision, low energy and ablation characteristics. In the area of laser medicine, ophthalmic surgeries have seen very promising developments. Some of the most commonly performed surgical operations in the world, including laser-assisted in-situ keratomileusis (LASIK), lens replacement (cataract surgery), and keratoplasty (cornea transplant), now employ FSLs for their unique abilities that lead to improved clinical outcome and patient satisfaction. The application of FSLs in medical therapeutics is a recent development, and although they offer many benefits, FSLs also stimulate nonlinear optical effects (NOEs), many of which were insignificant with previously developed lasers. NOEs can change the laser characteristics during propagation through a medium, which can subsequently introduce unique safety concerns for the surrounding tissues. Traditional approaches for characterizing optical effects, laser performance, safety and efficacy do not properly account for NOEs, and there remains a lack of data that describe NOEs in clinically relevant procedures and tissues. As FSL technology continues to expand towards new applications, FSL-induced NOEs need to be better understood in order to ensure safety as FSL medical devices and applications continue to evolve at a rapid pace. In order to improve the understanding of FSL-tissue interactions related to NOEs stimulated during laser beam propagation through corneal tissue, research investigations were conducted to evaluate corneal optical properties and determine how corneal tissue properties, including corneal layer, collagen orientation and collagen crosslinking, and laser parameters, including pulse energy, repetition rate and numerical aperture, affect second- and third-harmonic generation (HG) intensity, duration and efficiency. The results of these studies revealed that all laser parameters and tissue properties had a substantial influence on HG. The dynamic relationship between optical breakdown and HG was responsible for many observed changes in HG metrics. The results also demonstrated that the new generation of therapeutic FSLs has the potential to generate hazardous effects if not carefully controlled. Finally, recommendations are made to optimize current and guide future FSL applications.
[Femtosecond laser in cataract surgery. A critical appraisal].
Menapace, R M; Dick, H B
2014-01-01
The use of femtosecond lasers (FSL) is increasingly spreading in cataract surgery. Potential advantages over standard manual cataract surgery are the superior precision of corneal incisions and capsular openings as well as the reduction of ultrasound energy for lens nucleus work-up. Exact positioning and dimensioning of the anterior capsular opening should help reduce decentration and tilt of the intraocular lens (IOL) optics and thus achieve better target refraction. Together with the possibility to correct low-grade corneal astigmatism by precise arcuate incision, FSL technology is expected to convert cataract surgery from a purely curative into a refractive procedure. Apart from the authors' own experiences, this review article critically analyses the pertinent literature published so far as well as congress presentations and personal reports of other FSL surgeons. The advantages and disadvantages are scrutinized with regard to their impact on the surgical and refractive results and compared with those experienced by the authors with manual cataract surgery over several decades. Economic and healthcare political aspects are also addressed. The use of FSL surgery improves the precision and reproducibility of corneal incisions and the capsular opening and reduces the amount of ultrasound energy required for lens nucleus work-up. However, the clinical benefits must be put into perspective due to the subsequent surgical manipulation of the incisions (during lens emulsification, aspiration and IOL injection), the lack of any possibility of visualizing the crystalline lens equator as the reference for correct capsulotomy centration, and the relative significance of ultrasound energy consumption for corneal endothelial trauma. This is of particular relevance against the background of the significantly higher costs. Conversely, tears of the anterior capsule edge, which, apart from interfering with correct IOL positioning, may entail serious complications, presently occur more frequently with all FSL instruments. From the economic and healthcare political viewpoint, thought should be given to the possible acquisition of the cataract surgical business by the industry or investors, as cataract surgery is a high-volume standardized procedure with enormous future potential. This could fundamentally change our currently decentralized and individualized structures and subsequently the stream of patients, and make surgeons largely dependent or superfluous.
Information Metacatalog for a Grid
NASA Technical Reports Server (NTRS)
Kolano, Paul
2007-01-01
SWIM is a Software Information Metacatalog that gathers detailed information about the software components and packages installed on a grid resource. Information is currently gathered for Executable and Linking Format (ELF) executables and shared libraries, Java classes, shell scripts, and Perl and Python modules. SWIM is built on top of the POUR framework, which is described in the preceding article. SWIM consists of a set of Perl modules for extracting software information from a system, an XML schema defining the format of data that can be added by users, and a POUR XML configuration file that describes how these elements are used to generate periodic, on-demand, and user-specified information. Periodic software information is derived mainly from the package managers used on each system. SWIM collects information from native package managers in FreeBSD, Solaris, and IRIX as well as the RPM, Perl, and Python package managers on multiple platforms. Because not all software is available or installed in package form, SWIM also crawls the set of relevant paths from the File System Hierarchy Standard that defines the standard file system structure used by all major UNIX distributions. Using these two techniques, the vast majority of software installed on a system can be located. SWIM computes the same information gathered by the periodic routines for specific files on specific hosts, and locates software on a system given only its name and type.
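SWIM itself is a set of Perl modules; the following Python sketch only illustrates the second technique described above, crawling standard File System Hierarchy Standard locations and recognizing ELF files by their magic number. The path list is a hypothetical subset.

import os

FHS_PATHS = ["/bin", "/usr/bin", "/usr/local/bin"]  # hypothetical subset of standard locations

def is_elf(path):
    # An ELF executable or shared library starts with the 4-byte magic number 0x7f 'E' 'L' 'F'.
    try:
        with open(path, "rb") as f:
            return f.read(4) == b"\x7fELF"
    except OSError:
        return False

found = []
for root in FHS_PATHS:
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            if is_elf(full):
                found.append(full)

print(f"{len(found)} ELF executables/libraries located")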
The Package-Based Development Process in the Flight Dynamics Division
NASA Technical Reports Server (NTRS)
Parra, Amalia; Seaman, Carolyn; Basili, Victor; Kraft, Stephen; Condon, Steven; Burke, Steven; Yakimovich, Daniil
1997-01-01
The Software Engineering Laboratory (SEL) has been operating for more than two decades in the Flight Dynamics Division (FDD) and has adapted to the constant movement of the software development environment. The SEL's Quality Improvement Paradigm (QIP) shows that process improvement is an iterative process. Understanding, Assessing and Packaging are the three steps that are followed in this cyclical paradigm. As the improvement process cycles back to the first step, after having packaged some experience, the level of understanding will be greater. In the past, products resulting from the packaging step have been large process documents, guidebooks, and training programs. As the technical world moves toward more modularized software, we have made a move toward more modularized software development process documentation; as such, the products of the packaging step are becoming smaller and more frequent. In this manner, the QIP takes on a spiral approach rather than a waterfall approach. This paper describes the state of the FDD in the area of software development processes, as revealed through the understanding and assessing activities conducted by the COTS study team. The insights presented include: (1) a characterization of a typical FDD Commercial Off the Shelf (COTS) intensive software development life-cycle process, (2) lessons learned through the COTS study interviews, and (3) a description of changes in the SEL due to the changing and accelerating nature of software development in the FDD.
Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project
NASA Astrophysics Data System (ADS)
Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo
2017-04-01
The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al., 2014), which was released as a free, open-source software project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as its starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al., 2016, WRR), provides the baseline code and a number of reference results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity carried out on the package. To keep track of every single change, the package is published on its own GitHub repository, geotopmodel.github.io/geotop/, under the GPL v3.0 license. A continuous integration mechanism based on Travis-CI has been enabled on the repository for the master and main development branches. The use of the CMake configuration tool and of a suite of tests (easily managed with the ctest tool) greatly reduces the burden of installation and allows us to enhance portability across different compilers and operating-system platforms. The package is also complemented by several software tools which provide web-based visualization of results based on R packages, in particular "shiny" (Chang et al., 2016), "geotopbricks" and "geotopOptim2" (Cordano et al., 2016), which allow rapid and efficient scientific validation of new examples and tests. The software re-engineering activities are still under development. However, our first results are promising enough to eventually reach a robust and stable software project that manages in a flexible way a complex state-of-the-art hydrological model like GEOtop and integrates it into wider workflows.
NASA Astrophysics Data System (ADS)
Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan
2015-04-01
Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, bringing the compendium of risk software tools to more than 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks for the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.
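As an illustration of the multi-criteria scoring approach mentioned above, the following minimal Python sketch ranks packages by a weighted sum of criterion scores; the package names, criteria, weights, and scores are hypothetical, and the real review used over 100 criteria.

import numpy as np

packages = ["ModelA", "ModelB", "ModelC"]
criteria = ["documentation", "hazard coverage", "user-defined exposure", "active support"]
weights = np.array([0.2, 0.4, 0.3, 0.1])          # criterion weights, summing to 1
scores = np.array([[4, 3, 2, 5],                  # each row: one package scored 1-5 per criterion
                   [5, 4, 4, 3],
                   [3, 5, 3, 4]], dtype=float)

weighted = scores @ weights                        # weighted-sum MCDA score per package
for name, s in sorted(zip(packages, weighted), key=lambda t: -t[1]):
    print(f"{name}: {s:.2f}")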
Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W; Xia, Yinglin; Zhu, Liang; Tu, Xin M
2011-09-10
The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice.
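One reason different procedures disagree is that population-averaged (marginal) and subject-specific (conditional) formulations estimate different quantities for binary outcomes. As an illustration only, not one of the procedures compared in the report above, the following Python sketch fits a population-averaged logistic model to simulated correlated binary responses using statsmodels GEE with an exchangeable working correlation.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulate clustered binary data: a subject-level random effect induces within-subject correlation.
rng = np.random.default_rng(0)
n_subjects, n_visits = 50, 4
subject = np.repeat(np.arange(n_subjects), n_visits)
x = rng.normal(size=n_subjects * n_visits)
u = np.repeat(rng.normal(scale=1.0, size=n_subjects), n_visits)
p = 1 / (1 + np.exp(-(-0.5 + 1.0 * x + u)))
y = rng.binomial(1, p)
df = pd.DataFrame({"y": y, "x": x, "subject": subject})

# Marginal (population-averaged) logistic model via GEE with an exchangeable working correlation.
gee = smf.gee("y ~ x", groups="subject", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(gee.summary())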
NASA Astrophysics Data System (ADS)
Calhoun, William R.; Ilev, Ilko K.
2016-03-01
Some of the most commonly performed surgical operations in the world, including laser-assisted in-situ keratomileusis (LASIK), lens replacement (e.g. cataract surgery), and keratoplasty (cornea transplant), now employ therapeutic infrared femtosecond lasers (FSLs) for their extreme precision, low energy delivered into tissue and advanced ablation characteristics. Although the widely exploited applications of FSLs in medical therapeutics offer significant benefits, FSLs must generate very high intensities in order to achieve optical breakdown, the predominant tissue ablative mechanism, which can also stimulate nonlinear optical effects such as harmonic generation, an effect that generates coherent visible and UV light in the case of second- (SHG) and third-harmonic generation (THG), respectively. In order to improve the understanding of HG in corneal tissue, the effects of FSL polarization and pulse energy were investigated. FSL-stimulated SHG intensity in corneal tissue was measured as the laser polarization was rotated 360 degrees. Further, the pulse energy at the SHG wavelength was measured for single FSL pulses as the pulse energy at the fundamental wavelength was varied through a range of clinically relevant values. The results of this study revealed that SHG intensity oscillated with laser polarization, having a variation greater than 20%. This relationship appears to be due to the intrinsic anisotropy of collagen fibril hyperpolarizability, not to tissue birefringence. SHG pulse energy measurements showed an increase in SHG pulse energy with increasing FSL pulse energy; however, conversion efficiency decreased. This may be related to the dynamic relationship between optical breakdown leading to tissue destruction and HG evolution.
General-Purpose Ada Software Packages
NASA Technical Reports Server (NTRS)
Klumpp, Allan R.
1991-01-01
Collection of subprograms brings to Ada many features from other programming languages. All generic packages designed to be easily instantiated for types declared in user's facility. Most packages have widespread applicability, although some oriented for avionics applications. All designed to facilitate writing new software in Ada. Written on IBM/AT personal computer running under PC DOS, v.3.1.
Advance Directives and Do Not Resuscitate Orders
... a form. Call a lawyer. Use a computer software package for legal documents. Advance directives and living ... you write by yourself or with a computer software package should follow your state laws. You may ...
Nested Cohort - R software package
NestedCohort is an R software package for fitting Kaplan-Meier and Cox Models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.
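NestedCohort is an R package, and the case-cohort weighting it implements is not reproduced here; as a rough full-cohort analogue in Python, the sketch below uses the lifelines library (with hypothetical data and column names) to fit a Kaplan-Meier curve and a Cox proportional hazards model.

import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical full-cohort data: follow-up time, event indicator, and one covariate.
df = pd.DataFrame({
    "time":     [5, 8, 12, 3, 9, 15, 7, 11],
    "event":    [1, 0, 1, 1, 0, 1, 0, 1],
    "exposure": [0, 1, 0, 1, 1, 0, 0, 1],
})

# Kaplan-Meier estimate of the survival function.
kmf = KaplanMeierFitter()
kmf.fit(df["time"], event_observed=df["event"])
print(kmf.survival_function_)

# Cox model for the covariate effect.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()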
An Integrated Software Package to Enable Predictive Simulation Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang
The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance-computing (HPC) techniques, but also a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that integrates HPC applications and a web-based visualization tool based on a middleware framework. This framework can support the data communication between different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in a live mode. Test results validate the effectiveness and usability of the integrated software package.
Software packager user's guide
NASA Technical Reports Server (NTRS)
Callahan, John R.
1995-01-01
Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
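The package specification language itself is not shown in the abstract above, so the following toy Python sketch, with hypothetical component names, commands, and stub-generation step, only illustrates the general idea of deriving makefile rules automatically from declarative component descriptions.

# Toy illustration (not the actual package specification language or packager tool):
# derive makefile rules from declarative component descriptions.
components = {
    "client.o": {"source": "client.c", "command": "cc -c"},
    "server.o": {"source": "server.c", "command": "cc -c"},
    "stubs.c":  {"source": "app.idl",  "command": "stubgen"},  # hypothetical stub generator
}

rules = ["all: " + " ".join(components), ""]
for target, spec in components.items():
    rules.append(f"{target}: {spec['source']}")
    rules.append(f"\t{spec['command']} {spec['source']}")
    rules.append("")

with open("Makefile.generated", "w") as fh:
    fh.write("\n".join(rules))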
ERIC Educational Resources Information Center
Weaver, Dave, Ed.
This document consists of 30 microcomputer software package evaluations prepared for the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Educational Laboratory (NWREL). The concise, single-sheet resume describing and evaluating each software package includes source, cost, ability level,…
ERIC Educational Resources Information Center
Rhein, Deborah; Alibrandi, Mary; Lyons, Mary; Sammons, Janice; Doyle, Luther
This bibliography, developed by Project RIMES (Reading Instructional Methods of Efficacy with Students) lists 80 software packages for teaching early reading and spelling to students at risk for reading and spelling failure. The software packages are presented alphabetically by title. Entries usually include a grade level indicator, a brief…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salama, A.; Mikhail, M.
Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation, (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, evaluation of density and size partition characteristics and attrition curves, and (3) generation of graphics output. The Separation ChARacteristics Estimation (SCARE) software packages were developed to balance raw density or size separation data. The cases of density and size separation data are considered. The generated balanced data can take the balanced or normalized forms. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The computer software packages described in this paper are valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).
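As a small illustration of the partition characteristics mentioned above, the following Python sketch computes partition numbers (the percentage of each relative-density fraction of the feed reporting to the clean-coal product) from hypothetical float-sink data; it is not part of COPS or SCARE.

import numpy as np

# Hypothetical float-sink data: mass (kg) of each relative-density fraction in the feed,
# and the mass of that fraction recovered in the clean-coal product.
density_midpoints = np.array([1.30, 1.40, 1.50, 1.60, 1.80])
feed_mass    = np.array([40.0, 25.0, 15.0, 10.0, 10.0])
product_mass = np.array([39.0, 23.0,  9.0,  2.0,  0.5])

partition_numbers = 100.0 * product_mass / feed_mass  # % of each fraction reporting to product
for d, p in zip(density_midpoints, partition_numbers):
    print(f"RD {d:.2f}: {p:5.1f} % to clean coal")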
Software Library for Bruker TopSpin NMR Data Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
A software library for parsing and manipulating frequency-domain data files that have been processed using the Bruker TopSpin NMR software package. In the context of NMR, the term "processed" indicates that the end-user of the Bruker TopSpin NMR software package has (a) Fourier transformed the raw, time-domain data (the Free Induction Decay) into the frequency-domain and (b) has extracted the list of NMR peaks.
Investigation into the development of computer aided design software for space based sensors
NASA Technical Reports Server (NTRS)
Pender, C. W.; Clark, W. L.
1987-01-01
The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package will be referred to as SCAD and is directed toward the preliminary phase of the design of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis will be directed toward the development of a shell containing menus, smart defaults, and interfaces, which can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned chiefly as the selection and integration of appropriate building blocks. The phase one development activities have included: the selection of hardware which will be used with SCAD; the determination of the scope of SCAD; the preliminary evaluation of a number of software packages for applicability to SCAD; the determination of a method for achieving required capabilities where voids exist; and the establishment of a strategy for binding the software modules into an easy-to-use tool kit.
Software Review. Macintosh Laboratory Automation: Three Software Packages.
ERIC Educational Resources Information Center
Jezl, Barbara Ann
1990-01-01
Reviewed are "LABTECH NOTEBOOK,""LabVIEW," and "Parameter Manager pmPLUS/pmTALK." Each package is described including functions, uses, hardware, and costs. Advantages and disadvantages of this type of laboratory approach are discussed. (CW)
White, Gary C.; Hines, J.E.
2004-01-01
The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease of use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user-discussion groups to share knowledge and ideas. (A starting point for wildlife-related applications is http://www.phidot.org.)
Steyn, Rachelle; Boniaszczuk, John; Geldenhuys, Theodore
2014-01-01
To determine how two software packages, supplied by Siemens and Hermes, for processing gated blood pool (GBP) studies should be used in our department and whether the use of different cameras for the acquisition of raw data influences the results. The study had two components. For the first component, 200 studies were acquired on a General Electric (GE) camera and processed three times by three operators using the Siemens and Hermes software packages. For the second part, 200 studies were acquired on two different cameras (GE and Siemens). The matched pairs of raw data were processed by one operator using the Siemens and Hermes software packages. The Siemens method consistently gave estimates that were 4.3% higher than the Hermes method (p < 0.001). The differences were not associated with any particular level of left ventricular ejection fraction (LVEF). There was no difference in the estimates of LVEF obtained by the three operators (p = 0.1794). The reproducibility of estimates was good. In 95% of patients, using the Siemens method, the SD of the three estimates of LVEF by operator 1 was ≤ 1.7, operator 2 was ≤ 2.1 and operator 3 was ≤ 1.3. The corresponding values for the Hermes method were ≤ 2.5, ≤ 2.0 and ≤ 2.1. There was no difference in the results of matched pairs of data acquired on different cameras (p = 0.4933). CONCLUSION: Software packages for processing GBP studies are not interchangeable. The report should include the name and version of the software package used. Wherever possible, the same package should be used for serial studies. If this is not possible, the report should include the limits of agreement of the different packages. Data acquisition on different cameras did not influence the results.
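The limits of agreement recommended above can be computed with a Bland-Altman style calculation; the following Python sketch uses hypothetical paired LVEF estimates, not the study data.

import numpy as np

def limits_of_agreement(a, b):
    # Bland-Altman bias and 95% limits of agreement between two sets of paired estimates.
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired LVEF estimates (%) from two packages for the same studies.
package_a = np.array([58.0, 62.0, 49.0, 71.0, 55.0])
package_b = np.array([54.0, 57.5, 45.0, 66.5, 51.0])
bias, lower, upper = limits_of_agreement(package_a, package_b)
print(f"bias = {bias:.1f}%, 95% limits of agreement = [{lower:.1f}%, {upper:.1f}%]")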
ELAS: A powerful, general purpose image processing package
NASA Technical Reports Server (NTRS)
Walters, David; Rickman, Douglas
1991-01-01
ELAS is a software package which has been utilized as an image processing tool for more than a decade. It has been the source of several commercial packages. Now available on UNIX workstations it is a very powerful, flexible set of software. Applications at Stennis Space Center have included a very wide range of areas including medicine, forestry, geology, ecological modeling, and sonar imagery. It remains one of the most powerful image processing packages available, either commercially or in the public domain.
On the release of cppxfel for processing X-ray free-electron laser images.
Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K; Stuart, David Ian
2016-06-01
As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++, which showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to be also useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.
Analyzing longitudinal data with the linear mixed models procedure in SPSS.
West, Brady T
2009-09-01
Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.
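As an analogous sketch in Python rather than SPSS, the statsmodels MixedLM procedure fits a comparable LMM with a random intercept and random slope to simulated longitudinal data; all variable names and values below are hypothetical.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a longitudinal data set: 30 subjects measured at 4 waves.
rng = np.random.default_rng(1)
n_subj, n_waves = 30, 4
subject = np.repeat(np.arange(n_subj), n_waves)
time = np.tile(np.arange(n_waves), n_subj)
intercepts = np.repeat(rng.normal(50, 5, n_subj), n_waves)   # subject-specific intercepts
slopes = np.repeat(rng.normal(2, 0.5, n_subj), n_waves)      # subject-specific slopes
score = intercepts + slopes * time + rng.normal(0, 2, n_subj * n_waves)
df = pd.DataFrame({"subject": subject, "time": time, "score": score})

# Linear mixed model with a random intercept and a random slope for time.
lmm = smf.mixedlm("score ~ time", data=df, groups=df["subject"], re_formula="~time").fit()
print(lmm.summary())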
Office Computer Software: A Comprehensive Review of Software Programs.
ERIC Educational Resources Information Center
Secretary, 1992
1992-01-01
Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)
Tondare, Vipin N; Villarrubia, John S; Vladár, András E
2017-10-01
Three-dimensional (3D) reconstruction of a sample surface from scanning electron microscope (SEM) images taken at two perspectives has been known for decades. Nowadays, there exist several commercially available stereophotogrammetry software packages. For testing these software packages, in this study we used Monte Carlo simulated SEM images of virtual samples. A virtual sample is a model in a computer, and its true dimensions are known exactly, which is impossible for real SEM samples due to measurement uncertainty. The simulated SEM images can be used for algorithm testing, development, and validation. We tested two stereophotogrammetry software packages and compared their reconstructed 3D models with the known geometry of the virtual samples used to create the simulated SEM images. Both packages performed relatively well with simulated SEM images of a sample with a rough surface. However, in a sample containing nearly uniform and therefore low-contrast zones, the height reconstruction error was ≈46%. The present stereophotogrammetry software packages need further improvement before they can be used reliably with SEM images with uniform zones.
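A height-reconstruction error of the kind reported above can be expressed as a relative difference against the known virtual-sample geometry; the following minimal Python sketch uses hypothetical step heights.

import numpy as np

def height_error_percent(reconstructed, true):
    # Relative height-reconstruction error (%) against the known virtual-sample geometry.
    reconstructed, true = np.asarray(reconstructed, float), np.asarray(true, float)
    return 100.0 * np.abs(reconstructed - true) / np.abs(true)

# Hypothetical step heights (nm): known values of the virtual sample vs. the 3D reconstruction.
true_heights = np.array([100.0, 100.0, 100.0])
reconstructed = np.array([97.0, 54.0, 102.0])
print(height_error_percent(reconstructed, true_heights))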
Bernard, Aurélien; He, Zhiguo; Gauthier, Anne Sophie; Trone, Marie Caroline; Baubeau, Emmanuel; Forest, Fabien; Dumollard, Jean Marc; Peocʼh, Michel; Thuret, Gilles; Gain, Philippe
2015-02-01
Stromal surface quality of endothelial lamellae cut for endothelial keratoplasty with a femtosecond laser (FSL) with epithelial applanation remains disappointing. Applanation of the endothelial side of the cornea, mounted inverted on an artificial chamber, has therefore been proposed to improve cut quality. We compared lamellar quality after FSL cutting using epithelial versus endothelial applanation. Lamellae were cut with an FSL from organ-cultured corneas. After randomization, 7 were cut with epithelial applanation and 7 with endothelial applanation. Lamellae of 50-, 75-, and 100-μm thickness were targeted. Thickness was measured by optical coherence tomography before and immediately after cutting. Viable endothelial cell density was quantified immediately after cutting using triple labeling with Hoechst/ethidium/calcein-AM coupled with image analysis with ImageJ. The stromal surface was evaluated by 9 masked observers using semiquantitative scoring of scanning electronic microscopy images. Histology of 2 samples was also analyzed before lamellar detachment. Precision (difference in target/actual thickness) and thickness regularity [coefficient of variation (CV) of 10 measurements] were significantly better with endothelial applanation (precision: 18 μm; range, 10-30; CV: 11%; range, 8-12) than with epithelial applanation (precision: 84 μm; range, 54-107; P = 0.002; CV: 24%; range, 13-47; P = 0.001). Endothelial applanation provided thinner lamellae. However, viable endothelial cell density was significantly lower after endothelial applanation (1183 cells/mm2; range, 787-1725 versus 1688 cells/mm2; range, 1288-2025; P = 0.018). FSL cutting of endothelial lamellae using endothelial applanation provides thinner more regular grafts with more predictable thickness than with conventional epithelial applanation but strongly reduces the pool of viable endothelial cells.
Swanepoel, Tanya; Harvey, Brian H; Harden, Lois M; Laburn, Helen P; Mitchell, Duncan
2011-11-01
To investigate potential consequences for learning and memory, we have simulated the effects of Mycoplasma infection, in rats, by administering fibroblast-stimulating lipopeptide-1 (FSL-1), a pyrogenic moiety of Mycoplasma salivarium. We measured the effects on body temperature, cage activity, food intake, and on spatial learning and memory in a Morris Water Maze. Male Sprague-Dawley rats had radio transponders implanted to measure abdominal temperature and cage activity. After recovery, rats were assigned randomly to receive intraperitoneal (I.P.) injections of FSL-1 (500 or 1000 μg kg(-1) in 1 ml kg(-1) phosphate-buffered saline; PBS) or vehicle (PBS, 1 ml kg(-1)). Body mass and food intake were measured daily. Training in the Maze commenced 18 h after injections and continued daily for four days. Spatial memory was assessed on the fifth day. In other rats, we measured concentrations of brain pro-inflammatory cytokines, interleukin (IL)-1β and IL-6, at 3 and 18 h after injections. FSL-1 administration induced a dose-dependent fever (∼1°C) for two days, lethargy (∼78%) for four days, anorexia (∼65%) for three days and body mass stunting (∼6%) for at least four days. Eighteen hours after FSL-1 administration, when concentrations of IL-1β, but not those of IL-6, were elevated in both the hypothalamus and the hippocampus, and when rats were febrile, lethargic and anorexic, learning in the Maze was unaffected. There also was no memory impairment. Our results support emerging evidence that impaired learning and memory is not inevitable during simulated infection.
du Jardin, Kristian Gaarn; Liebenberg, Nico; Müller, Heidi Kaastrup; Elfving, Betina; Sanchez, Connie; Wegener, Gregers
2016-07-01
The mechanisms mediating ketamine's antidepressant effect have only been partly resolved. Recent preclinical reports implicate serotonin (5-hydroxytryptamine; 5-HT) in the antidepressant-like action of ketamine. Vortioxetine is a multimodal-acting antidepressant that is hypothesized to exert its therapeutic activity through 5-HT reuptake inhibition and modulation of several 5-HT receptors. The objective of this study was to evaluate the therapeutic-like profiles of S-ketamine, vortioxetine, and the serotonin reuptake inhibitor fluoxetine in response to manipulation of 5-HT tone. Flinders Sensitive Line (FSL) rats, a genetic model of depression, were depleted of 5-HT by repeated administration of 4-chloro-DL-phenylalanine methyl ester HCl (pCPA). Using pCPA-pretreated and control FSL rats, we investigated the acute and sustained effects of S-ketamine (15 mg/kg), fluoxetine (10 mg/kg), or vortioxetine (10 mg/kg) on recognition memory and depression-like behavior in the object recognition task (ORT) and forced swim test (FST), respectively. The behavioral phenotype of FSL rats was unaffected by 5-HT depletion. Vortioxetine, but not fluoxetine or S-ketamine, acutely ameliorated the memory deficits of FSL rats in the ORT irrespective of 5-HT tone. No sustained effects were observed in the ORT. In the FST, all three drugs demonstrated acute antidepressant-like activity but only S-ketamine had sustained effects. Unlike vortioxetine, the antidepressant-like responses of fluoxetine and S-ketamine were abolished by 5-HT depletion. These observations suggest that the acute and sustained antidepressant-like effects of S-ketamine depend on endogenous stimulation of 5-HT receptors. In contrast, the acute therapeutic-like effects of vortioxetine on memory and depression-like behavior may be mediated by direct activity at 5-HT receptors.
MOPEX: a software package for astronomical image processing and visualization
NASA Astrophysics Data System (ADS)
Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley
2006-06-01
We present MOPEX - a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, and point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows the control over the image processing and display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback on the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though it was originally designed for the Spitzer Space Telescope mission, many of its functions are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities. The software package has been developed by a small group of software developers and scientists at the Spitzer Science Center. It is available for distribution at the Spitzer Science Center web page.
User Documentation for Multiple Software Releases
NASA Technical Reports Server (NTRS)
Humphrey, R.
1982-01-01
In proposed solution to problems of frequent software releases and updates, documentation would be divided into smaller packages, each of which contains data relating to only one of several software components. Changes would not affect entire document. Concept would improve dissemination of information regarding changes and would improve quality of data supporting packages. Would help to insure both timeliness and more thorough scrutiny of changes.
Versatile Software Package For Near Real-Time Analysis of Experimental Data
NASA Technical Reports Server (NTRS)
Wieseman, Carol D.; Hoadley, Sherwood T.
1998-01-01
This paper provides an overview of a versatile software package developed for time- and frequency-domain analyses of experimental wind-tunnel data. This package, originally developed for analyzing data in the NASA Langley Transonic Dynamics Tunnel (TDT), is applicable for analyzing any time-domain data. A Matlab-based software package, TDT-analyzer, provides a compendium of commonly-required dynamic analysis functions in a user-friendly interactive and batch processing environment. TDT-analyzer has been used extensively to provide on-line near real-time and post-test examination and reduction of measured data acquired during wind tunnel tests of aeroelastically-scaled models of aircraft and rotorcraft as well as a flight test of the NASA High Alpha Research Vehicle (HARV) F-18. The package provides near real-time results in an informative and timely manner far exceeding prior methods of data reduction at the TDT.
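As a generic example of the frequency-domain analysis such a package provides (not TDT-analyzer's own Matlab code), the following Python sketch estimates the power spectral density of a synthetic time-domain channel with Welch's method; the sample rate and signal content are hypothetical.

import numpy as np
from scipy.signal import welch

fs = 1000.0                                   # sample rate, Hz (hypothetical)
t = np.arange(0, 10.0, 1.0 / fs)
# Synthetic "measured" channel: a 12 Hz structural mode plus broadband noise.
x = np.sin(2 * np.pi * 12.0 * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)

f, pxx = welch(x, fs=fs, nperseg=2048)        # frequency-domain estimate (power spectral density)
print("dominant frequency (Hz):", f[np.argmax(pxx)])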
Lin4Neuro: a customized Linux distribution ready for neuroimaging analysis.
Nemoto, Kiyotaka; Dan, Ippeita; Rorden, Christopher; Ohnishi, Takashi; Tsuzuki, Daisuke; Okamoto, Masako; Yamashita, Fumio; Asada, Takashi
2011-01-25
A variety of neuroimaging software packages have been released from various laboratories worldwide, and many researchers use these packages in combination. Though most of these software packages are freely available, some people find them difficult to install and configure because they are mostly based on UNIX-like operating systems. We developed a live USB-bootable Linux package named "Lin4Neuro." This system includes popular neuroimaging analysis tools. The user interface is customized so that even Windows users can use it intuitively. The boot time of this system was only around 40 seconds. We performed a benchmark test of inhomogeneity correction on 10 subjects of three-dimensional T1-weighted MRI scans. The processing speed of USB-booted Lin4Neuro was as fast as that of the package installed on the hard disk drive. We also installed Lin4Neuro on a virtualization software package that emulates the Linux environment on a Windows-based operating system. Although the processing speed was slower than that under other conditions, it remained comparable. With Lin4Neuro in one's hand, one can access neuroimaging software packages easily, and immediately focus on analyzing data. Lin4Neuro can be a good primer for beginners of neuroimaging analysis or students who are interested in neuroimaging analysis. It also provides a practical means of sharing analysis environments across sites.
Code of Federal Regulations, 2013 CFR
2013-10-01
... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...
Code of Federal Regulations, 2014 CFR
2014-10-01
... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...
Code of Federal Regulations, 2012 CFR
2012-10-01
... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...
Code of Federal Regulations, 2010 CFR
2010-10-01
... fully evaluate evidence, all spreadsheets must be fully accessible and manipulable. Electronic databases... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...
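The data exchange described in these regulations can be illustrated with a short Python sketch using the pyodbc library; the DSN name and table below are placeholders, not references to any actual agency database, and an ODBC driver/data source must already be configured on the host:

```python
# Pull rows out of one database package through an ODBC data source so
# that another package can use them (exported here as CSV).
import csv
import pyodbc

conn = pyodbc.connect("DSN=EvidenceDB")        # hypothetical data source name
cursor = conn.cursor()
cursor.execute("SELECT * FROM submissions")    # hypothetical table

with open("submissions_export.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow([column[0] for column in cursor.description])
    writer.writerows(cursor.fetchall())
conn.close()
```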
Environmental databases and other computerized information tools
NASA Technical Reports Server (NTRS)
Clark-Ingram, Marceia
1995-01-01
Increasing environmental legislation has brought about the development of many new environmental databases and software application packages to aid in the quest for environmental compliance. These databases and software packages are useful tools and applicable to a wide range of environmental areas from atmospheric modeling to materials replacement technology. The great abundance of such products and services can be very overwhelming when trying to identify the tools which best meet specific needs. This paper will discuss the types of environmental databases and software packages available. This discussion will also encompass the affected environmental areas of concern, product capabilities, and hardware requirements for product utilization.
ERIC Educational Resources Information Center
Radcliffe, George; And Others
1988-01-01
Reviews three software packages: (1) a package containing 68 programs covering general topics in chemistry; (2) a package dealing with acid-base titration curves that allows variables to be changed; and (3) a chemistry tutorial and drill package. (MVL)
Assessing the Effects of Software Platforms on Volumetric Segmentation of Glioblastoma
Dunn, William D.; Aerts, Hugo J.W.L.; Cooper, Lee A.; Holder, Chad A.; Hwang, Scott N.; Jaffe, Carle C.; Brat, Daniel J.; Jain, Rajan; Flanders, Adam E.; Zinn, Pascal O.; Colen, Rivka R.; Gutman, David A.
2017-01-01
Background Radiological assessments of biologically relevant regions in glioblastoma have been associated with genotypic characteristics, implying a potential role in personalized medicine. Here, we assess the reproducibility and association with survival of two volumetric segmentation platforms and explore how methodology could impact subsequent interpretation and analysis. Methods Post-contrast T1- and T2-weighted FLAIR MR images of 67 TCGA patients were segmented into five distinct compartments (necrosis, contrast-enhancement, FLAIR, post contrast abnormal, and total abnormal tumor volumes) by two quantitative image segmentation platforms - 3D Slicer and a method based on Velocity AI and FSL. We investigated the internal consistency of each platform by correlation statistics, association with survival, and concordance with consensus neuroradiologist ratings using ordinal logistic regression. Results We found high correlations between the two platforms for FLAIR, post contrast abnormal, and total abnormal tumor volumes (Spearman's r(67) = 0.952, 0.959, and 0.969 respectively). Only modest agreement was observed for necrosis and contrast-enhancement volumes (r(67) = 0.693 and 0.773 respectively), likely arising from differences in manual and automated segmentation methods of these regions by 3D Slicer and Velocity AI/FSL, respectively. Survival analysis based on AUC revealed significant predictive power of both platforms for the following volumes: contrast-enhancement, post contrast abnormal, and total abnormal tumor volumes. Finally, ordinal logistic regression demonstrated correspondence to manual ratings for several features. Conclusion Tumor volume measurements from both volumetric platforms produced highly concordant and reproducible estimates across platforms for general features. As automated or semi-automated volumetric measurements replace manual linear or area measurements, it will become increasingly important to keep in mind that measurement differences between segmentation platforms for more detailed features could influence downstream survival or radiogenomic analyses. PMID:29600296
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaenko, Alexander; Windus, Theresa L.; Sosonkina, Masha
2012-10-19
The design and development of scientific software components to provide an interface to the effective fragment potential (EFP) methods are reported. Multiscale modeling of physical and chemical phenomena demands the merging of software packages developed by research groups in significantly different fields. Componentization offers an efficient way to realize new high performance scientific methods by combining the best models available in different software packages without a need for package readaptation after the initial componentization is complete. The EFP method is an efficient electronic structure theory based model potential that is suitable for predictive modeling of intermolecular interactions in large molecular systems, such as liquids, proteins, atmospheric aerosols, and nanoparticles, with an accuracy that is comparable to that of correlated ab initio methods. The developed components make the EFP functionality accessible for any scientific component-aware software package. The performance of the component is demonstrated on a protein interaction model, and its accuracy is compared with results obtained with coupled cluster methods.
NASA Technical Reports Server (NTRS)
Wolf, Stephen W. D.
1988-01-01
The Wall Adjustment Strategy (WAS) software provides successful on-line control of the 2-D flexible walled test section of the Langley 0.3-m Transonic Cryogenic Tunnel. This software package allows the level of operator intervention to be regulated as necessary for research and production type 2-D testing using and Adaptive Wall Test Section (AWTS). The software is designed to accept modification for future requirements, such as 3-D testing, with a minimum of complexity. The WAS software described is an attempt to provide a user friendly package which could be used to control any flexible walled AWTS. Control system constraints influence the details of data transfer, not the data type. Then this entire software package could be used in different control systems, if suitable interface software is available. A complete overview of the software highlights the data flow paths, the modular architecture of the software and the various operating and analysis modes available. A detailed description of the software modules includes listings of the code. A user's manual is provided to explain task generation, operating environment, user options and what to expect at execution.
Ellefsen, Karl J.
2017-06-27
MapMark4 is a software package that implements the probability calculations in three-part mineral resource assessments. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of this user’s guide are bundled together in R’s unit of shareable code, which is called a “package.” This user’s guide includes step-by-step instructions showing how the functions are used to carry out the probability calculations. The calculations are demonstrated using test data, which are included in the package.
Biocompatibility of the micro-patterned NiTi surface produced by femtosecond laser
NASA Astrophysics Data System (ADS)
Liang, Chunyong; Wang, Hongshui; Yang, Jianjun; Li, Baoe; Yang, Yang; Li, Haipeng
2012-11-01
Biocompatibility of the micro-patterned NiTi surface produced by femtosecond laser (FSL) was studied in this work. The surface characteristics of the laser-treated NiTi alloys were investigated by scanning electron microscopy (SEM), atomic force microscopy (AFM), X-ray diffractometry (XRD) and X-ray photoelectron spectroscopy (XPS). The biocompatibility was evaluated by an in vitro cell culture test. The results showed that grooves and ripples covered by nanoparticles were formed on the sample surfaces, and that the Ni/Ti ratio on the alloy surface increased with increasing laser energy. The crystal structure was not changed by laser treatment. The cell culture test proved that the micro-patterns induced by FSL were beneficial in improving the biocompatibility of the NiTi alloys: the growth of osteoblasts was oriented along the grooves; a large number of synapses and filopodia were formed owing to the ripples, holes and nanoparticles on the alloy surface; and the proliferation rate and alkaline phosphatase (ALP) content of the cells increased after FSL treatment. However, owing to the toxicity of Ni ions to cell growth, the NiTi alloy surface should not be treated with a laser fluence of more than 3.82 J/cm2 if the ideal biocompatibility is to be obtained.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Microsoft Open Database Connectivity (ODBC) standard. ODBC is a Windows technology that allows a database software package to import data from a database created using a different software package. We currently...-compatible format. All databases must be supported with adequate documentation on data attributes, SQL...
Scout 2008 Version 1.0 User Guide
The Scout 2008 version 1.0 software package provides a wide variety of classical and robust statistical methods that are not typically available in other commercial software packages. A major part of Scout deals with classical, robust, and resistant univariate and multivariate ou...
INTERFACING SAS TO ORACLE IN THE UNIX ENVIRONMENT
SAS is an EPA standard data and statistical analysis software package while ORACLE is EPA's standard data base management system software package. ORACLE has the advantage over SAS in data retrieval and storage capabilities but has limited data and statistical analysis capability....
’Pushing a Big Rock Up a Steep Hill’: Acquisition Lessons Learned from DoD Applications Storefront
2014-04-30
...automated delivery of software patches, web applications, widgets, and mobile application packages. The envisioned application store will deliver software from a central... mobile technologies, hoping to enhance warfighter situational awareness and access to information. Unfortunately, the Defense Acquisition System has not
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This document consists of 68 microcomputer software package evaluations prepared by MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Education Laboratory. There are 26 packages in set 13 and 42 in set 14. Each software review lists producer, time and place of evaluation, cost, ability level,…
PIV Data Validation Software Package
NASA Technical Reports Server (NTRS)
Blackshire, James L.
1997-01-01
A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities including (1) removal of spurious vector data, (2) filtering, smoothing, and interpolating of PIV data, and (3) calculations of out-of-plane vorticity, ensemble statistics, and turbulence statistics information. The software runs on an IBM PC/AT host computer working either under Microsoft Windows 3.1 or Windows 95 operating systems.
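A generic Python sketch of two of the capabilities listed above (spurious-vector flagging against a local median, and out-of-plane vorticity by central differences); this illustrates the techniques only and is not the NASA package, and the threshold and grid spacings are assumptions:

```python
# Flag PIV vectors far from their 3x3 neighborhood median, and compute
# out-of-plane vorticity on a regular grid.
import numpy as np
from scipy.ndimage import median_filter

def flag_spurious(u, v, threshold=2.0):
    """Return a boolean mask of vectors far from their local median."""
    residual = np.hypot(u - median_filter(u, size=3),
                        v - median_filter(v, size=3))
    return residual > threshold

def vorticity(u, v, dx, dy):
    """Out-of-plane vorticity dv/dx - du/dy on a regular grid."""
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    return dvdx - dudy
```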
AN ADA LINEAR ALGEBRA PACKAGE MODELED AFTER HAL/S
NASA Technical Reports Server (NTRS)
Klumpp, A. R.
1994-01-01
This package extends the Ada programming language to include linear algebra capabilities similar to those of the HAL/S programming language. The package is designed for avionics applications such as Space Station flight software. In addition to the HAL/S built-in functions, the package incorporates the quaternion functions used in the Shuttle and Galileo projects, and routines from LINPACK that solve systems of equations involving general square matrices. Language conventions in this package follow those of HAL/S to the maximum extent practical and minimize the effort required for writing new avionics software and translating existing software into Ada. Valid numeric types in this package include scalar, vector, matrix, and quaternion declarations. (Quaternions are four-component vectors used in representing motion between two coordinate frames). Single precision and double precision floating point arithmetic are available in addition to the standard double precision integer manipulation. Infix operators are used instead of function calls to define dot products, cross products, quaternion products, and mixed scalar-vector, scalar-matrix, and vector-matrix products. The package contains two generic programs: one for floating point, and one for integer. The actual component type is passed as a formal parameter to the generic linear algebra package. The procedures for solving systems of linear equations defined by general matrices include GEFA, GECO, GESL, and GIDI. The HAL/S functions include ABVAL, UNIT, TRACE, DET, INVERSE, TRANSPOSE, GET, PUT, FETCH, PLACE, and IDENTITY. This package is written in Ada (Version 1.2) for batch execution and is machine independent. The linear algebra software depends on nothing outside the Ada language except for a call to a square root function for floating point scalars (such as SQRT in the DEC VAX MATHLIB library). This program was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
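A small Python sketch of the quaternion product mentioned above (the Ada package expresses it as an infix operator); the (x, y, z, w) component ordering with w as the scalar part is an assumption and may differ from the HAL/S convention:

```python
# Hamilton product of two quaternions stored as (x, y, z, w).
import numpy as np

def quat_mul(q1, q2):
    x1, y1, z1, w1 = q1
    x2, y2, z2, w2 = q2
    return np.array([
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
    ])

identity = np.array([0.0, 0.0, 0.0, 1.0])
q = np.array([0.0, 0.0, np.sin(np.pi / 4), np.cos(np.pi / 4)])  # 90 deg about z
print(quat_mul(q, identity))   # composing with the identity leaves q unchanged
```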
SIMA: Python software for analysis of dynamic fluorescence imaging data.
Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila
2014-01-01
Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
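A schematic Python sketch of the ROI signal-extraction step described above; this is not SIMA's API, only an illustration of averaging a fluorescence movie over binary ROI masks to obtain one trace per region:

```python
# Average a (frames, rows, cols) movie over each binary ROI mask.
import numpy as np

def extract_signals(movie, roi_masks):
    """Return an array of shape (n_rois, n_frames) of mean ROI traces."""
    return np.array([movie[:, mask].mean(axis=1) for mask in roi_masks])

frames = np.random.rand(100, 64, 64)           # stand-in for imaging data
mask = np.zeros((64, 64), dtype=bool)
mask[10:20, 10:20] = True
traces = extract_signals(frames, [mask])
print(traces.shape)                            # (1, 100)
```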
New generation of exploration tools: interactive modeling software and microcomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krajewski, S.A.
1986-08-01
Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.
Component-based integration of chemistry and optimization software.
Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L
2004-11-15
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
Diagnosis diagrams for passing signals on an automatic block signaling railway section
NASA Astrophysics Data System (ADS)
Spunei, E.; Piroi, I.; Chioncel, C. P.; Piroi, F.
2018-01-01
This work presents a diagnosis method for railway traffic security installations. More specifically, the authors present a series of diagnosis charts for passing signals on a railway block equipped with an automatic block signaling installation. These charts are based on the operational electrical schematics and are subsequently used to develop a diagnosis software package. The software package developed in this way contributes substantially to reducing failure detection and remedy times for these types of installation faults. The use of the software package eliminates wrong decisions in the fault detection process, decisions that may result in longer remedy times and, sometimes, in railway traffic events.
Orbit determination for ISRO satellite missions
NASA Astrophysics Data System (ADS)
Rao, Ch. Sreehari; Sinha, S. K.
Indian Space Research Organisation (ISRO) has been successful in using the in-house developed orbit determination and prediction software for satellite missions of Bhaskara, Rohini and APPLE. Considering the requirements of satellite missions, software packages are developed, tested and their accuracies are assessed. Orbit determination packages developed are SOIP, for low earth orbits of Bhaskara and Rohini missions, ORIGIN and ODPM, for orbits related to all phases of geo-stationary missions and SEGNIP, for drift and geo-stationary orbits. Software is tested and qualified using tracking data of SIGNE-3, D5-B, OTS, SYMPHONIE satellites with the help of software available with CNES, ESA and DFVLR. The results match well with those available from these agencies. These packages have supported orbit determination successfully throughout the mission life for all ISRO satellite missions.
Prototyping with Data Dictionaries for Requirements Analysis.
1985-03-01
statistical packages and software for screen layout. These items work at a higher level than another category of prototyping tool, program generators... Program generators are software packages which, when given specifications, produce source listings, usually in a high-order language such as COBOL...with users and this will not happen if he must stop to develop a detailed program. [Ref. 241] Hardware as well as software should be considered in
Electronic and software subsystems for an autonomous roving vehicle. M.S. Thesis
NASA Technical Reports Server (NTRS)
Doig, G. A.
1980-01-01
The complete electronics packaging which controls the Mars roving vehicle is described in order to provide a broad overview of the systems that are part of that package. Some software debugging tools are also discussed. Particular emphasis is given to those systems that are controlled by the microprocessor. These include the laser mast, the telemetry system, the command link prime interface board, and the prime software.
The Shock and Vibration Digest. Volume 17, Number 4
1985-04-01
software packages for engineering computations which were specifically written for use on microcomputers. Software packages related to shock and vibration are available for both experimental and analytical applications. Typical software ... designed to be easy to use from the outset, and this design philosophy is largely responsible for their increasing popularity; this same design philosophy appears to have been carried over to the design of today's ...
Hierarchical Petascale Simulation Framework For Stress Corrosion Cracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grama, Ananth
2013-12-18
A number of major accomplishments resulted from the project. These include: • Data Structures, Algorithms, and Numerical Methods for Reactive Molecular Dynamics. We have developed a range of novel data structures, algorithms, and solvers (amortized ILU, Spike) for use with ReaxFF and charge equilibration. • Parallel Formulations of ReactiveMD (Purdue Reactive Molecular Dynamics Package, PuReMD, PuReMD-GPU, and PG-PuReMD) for Messaging, GPU, and GPU Cluster Platforms. We have developed efficient serial, parallel (MPI), GPU (Cuda), and GPU Cluster (MPI/Cuda) implementations. Our implementations have been demonstrated to be significantly better than the state of the art, both in terms of performance and scalability. • Comprehensive Validation in the Context of Diverse Applications. We have demonstrated the use of our software in diverse systems, including silica-water, silicon-germanium nanorods, and as part of other projects, extended it to applications ranging from explosives (RDX) to lipid bilayers (biomembranes under oxidative stress). • Open Source Software Packages for Reactive Molecular Dynamics. All versions of our software have been released over the public domain. There are over 100 major research groups worldwide using our software. • Implementation into the Department of Energy LAMMPS Software Package. We have also integrated our software into the Department of Energy LAMMPS software package.
Developing a Virtual Physics World
ERIC Educational Resources Information Center
Wegener, Margaret; McIntyre, Timothy J.; McGrath, Dominic; Savage, Craig M.; Williamson, Michael
2012-01-01
In this article, the successful implementation of a development cycle for a physics teaching package based on game-like virtual reality software is reported. The cycle involved several iterations of evaluating students' use of the package followed by instructional and software development. The evaluation used a variety of techniques, including…
Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package
ERIC Educational Resources Information Center
Ibrahim, Dogan
2009-01-01
The teaching of scientific subjects usually require laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…
Description of the IV + V System Software Package.
ERIC Educational Resources Information Center
Microcomputers for Information Management: An International Journal for Library and Information Services, 1984
1984-01-01
Describes the IV + V System, a software package designed by the Institut fur Maschinelle Dokumentation for the United Nations General Information Programme and UNISIST to support automation of local information and documentation services. Principle program features and functions outlined include input/output, databank, text image, output, and…
User’s guide for GcClust—An R package for clustering of regional geochemical data
Ellefsen, Karl J.; Smith, David B.
2016-04-08
GcClust is a software package developed by the U.S. Geological Survey for statistical clustering of regional geochemical data, and similar data such as regional mineralogical data. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of the user’s guide are bundled together in R’s unit of sharable code, which is called a “package.” The user’s guide includes step-by-step instructions showing how the functions are used to cluster data and to evaluate the clustering results. These functions are demonstrated in this report using test data, which are included in the package.
InterFace: A software package for face image warping, averaging, and principal components analysis.
Kramer, Robin S S; Jenkins, Rob; Burton, A Mike
2017-12-01
We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
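A hedged illustration of PCA on a set of aligned face images (the "face space" idea described above), using scikit-learn in Python rather than the MATLAB-based InterFace package; the image dimensions and random data are placeholders for real, pre-warped face images:

```python
# Build a small "face space" with PCA and reconstruct a face from it.
import numpy as np
from sklearn.decomposition import PCA

n_faces, height, width = 50, 64, 64
faces = np.random.rand(n_faces, height, width)       # stand-in for aligned faces

X = faces.reshape(n_faces, -1)                        # one row per face
pca = PCA(n_components=10)
scores = pca.fit_transform(X)                         # coordinates in face space
eigenfaces = pca.components_.reshape(-1, height, width)

# Reconstruct the first face from its 10 components.
approx = pca.inverse_transform(scores[:1]).reshape(height, width)
print(scores.shape, eigenfaces.shape, approx.shape)
```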
Evolution of a modular software network
Fortuna, Miguel A.; Bonachela, Juan A.; Levin, Simon A.
2011-01-01
“Evolution behaves like a tinkerer” (François Jacob, Science, 1977). Software systems provide a singular opportunity to understand biological processes using concepts from network theory. The Debian GNU/Linux operating system allows us to explore the evolution of a complex network in a unique way. The modular design detected during its growth is based on the reuse of existing code in order to minimize costs during programming. The increase of modularity experienced by the system over time has not counterbalanced the increase in incompatibilities between software packages within modules. This negative effect is far from being a failure of design. A random process of package installation shows that the higher the modularity, the larger the fraction of packages working properly in a local computer. The decrease in the relative number of conflicts between packages from different modules avoids a failure in the functionality of one package spreading throughout the entire system. Some potential analogies with the evolutionary and ecological processes determining the structure of ecological networks of interacting species are discussed. PMID:22106260
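A toy Python simulation in the spirit of the random package-installation process described above (an illustration, not the authors' code); the conflict probabilities are assumptions, with conflicts made more likely between packages that share a module, consistent with the abstract's observation that incompatibilities concentrate within modules:

```python
# Randomly install packages and measure the fraction that remain
# conflict-free, for low versus high modularity.
import random

def install_fraction(n_pkgs=200, n_modules=10, p_within=0.10, p_between=0.01,
                     n_install=50, seed=1):
    rng = random.Random(seed)
    module = [rng.randrange(n_modules) for _ in range(n_pkgs)]
    chosen = rng.sample(range(n_pkgs), n_install)
    ok = 0
    for i in chosen:
        # Rough sketch: draw a conflict outcome per pair on the fly.
        conflict = any(
            rng.random() < (p_within if module[i] == module[j] else p_between)
            for j in chosen if j != i
        )
        ok += not conflict
    return ok / n_install

print(install_fraction(n_modules=2))    # low modularity: fewer packages work
print(install_fraction(n_modules=20))   # high modularity: more packages work
```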
Image analysis software versus direct anthropometry for breast measurements.
Quieregatto, Paulo Rogério; Hochman, Bernardo; Furtado, Fabianne; Machado, Aline Fernanda Perez; Sabino Neto, Miguel; Ferreira, Lydia Masako
2014-10-01
To compare breast measurements performed using the software packages ImageTool®, AutoCAD® and Adobe Photoshop® with direct anthropometric measurements. Points were marked on the breasts and arms of 40 volunteer women aged between 18 and 60 years. When connecting the points, seven linear segments and one angular measurement on each half of the body, and one medial segment common to both body halves were defined. The volunteers were photographed in a standardized manner. Photogrammetric measurements were performed by three independent observers using the three software packages and compared to direct anthropometric measurements made with calipers and a protractor. Measurements obtained with AutoCAD® were the most reproducible and those made with ImageTool® were the most similar to direct anthropometry, while measurements with Adobe Photoshop® showed the largest differences. Except for angular measurements, significant differences were found between measurements of line segments made using the three software packages and those obtained by direct anthropometry. AutoCAD® provided the highest precision and intermediate accuracy; ImageTool® had the highest accuracy and lowest precision; and Adobe Photoshop® showed intermediate precision and the worst accuracy among the three software packages.
PyPedal, an open source software package for pedigree analysis
USDA-ARS?s Scientific Manuscript database
The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its used in a number of settings for large and small populations. After substantia...
A Simple Interactive Software Package for Plotting, Animating, and Calculating
ERIC Educational Resources Information Center
Engelhardt, Larry
2012-01-01
We introduce a new open source (free) software package that provides a simple, highly interactive interface for carrying out certain mathematical tasks that are commonly encountered in physics. These tasks include plotting and animating functions, solving systems of coupled algebraic equations, and basic calculus (differentiating and integrating…
A Software Development Approach for Computer Assisted Language Learning
ERIC Educational Resources Information Center
Cushion, Steve
2005-01-01
Over the last 5 years we have developed, produced, tested, and evaluated an authoring software package to produce web-based, interactive, audio-enhanced language-learning material. That authoring package has been used to produce language-learning material in French, Spanish, German, Arabic, and Tamil. We are currently working on increasing…
Analysis of Variance: What Is Your Statistical Software Actually Doing?
ERIC Educational Resources Information Center
Li, Jian; Lomax, Richard G.
2011-01-01
Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…
A Multi-User Microcomputer System for Small Libraries.
ERIC Educational Resources Information Center
Leggate, Peter
1988-01-01
Describes the development of Bookshelf, a multi-user microcomputer system for small libraries that uses an integrated software package. The discussion covers the design parameters of the package, which were based on a survey of seven small libraries, and some characteristics of the software. (three notes with references) (CLB)
Microcomputer Software Programs for Vocational Education.
ERIC Educational Resources Information Center
Rodenstein, Judith, Ed.; Lambert, Roger, Ed.
Over 200 microcomputer software packages applicable to vocational education are listed. Most of the programs are available for the Apple, TRS-80, and Commodore microcomputers. The packages have been reviewed, but have not been formally evaluated. Titles of the programs with names and addresses of the distributors are provided. Telephone numbers…
Software, Copyright, and Site-License Agreements: Publishers' Perspective of Library Practice.
ERIC Educational Resources Information Center
Happer, Stephanie K.
Thirty-one academic publishers of stand-alone software and book/disk packages were surveyed to determine whether publishers have addressed the copyright issues inherent in circulating these packages within the library environment. Twenty-two questionnaires were returned, providing a 71% return rate. There were 18 usable questionnaires. Publishers…
Datson, D J; Carter, N G
1988-10-01
The use of personal computers in accountancy and business generally has been stimulated by the availability of flexible software packages. We describe the implementation of a commercial software package designed for interfacing with laboratory instruments and highlight the ease with which it can be implemented, without the need for specialist computer programming staff.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piburn, Jesse
2016-04-22
Access to the World Bank Data API through the R language was previously limited to one existing package, which is limited in its abilities. The software provides access to all of the features of the World Bank API in one software package for the R language and provides functions for searching and downloading data from the World Bank API.
Propensity Score Analysis in R: A Software Review
ERIC Educational Resources Information Center
Keller, Bryan; Tipton, Elizabeth
2016-01-01
In this article, we review four software packages for implementing propensity score analysis in R: "Matching, MatchIt, PSAgraphics," and "twang." After briefly discussing essential elements for propensity score analysis, we apply each package to a data set from the Early Childhood Longitudinal Study in order to estimate the…
Software engineering and data management for automated payload experiment tool
NASA Technical Reports Server (NTRS)
Maddux, Gary A.; Provancha, Anna; Chattam, David
1994-01-01
The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.
Advanced fingerprint verification software
NASA Astrophysics Data System (ADS)
Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.
2016-05-01
We have developed a fingerprint software package that can be used in a wide range of applications from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door-locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low quality fingerprints.
Jumelle, Clotilde; Hamri, Alina; Egaud, Gregory; Mauclair, Cyril; Reynaud, Stephanie; Dumas, Virginie; Pereira, Sandrine; Garcin, Thibaud; Gain, Philippe; Thuret, Gilles
2017-01-01
Corneal lamellar cutting with a blade or femtosecond laser (FSL) is commonly used during refractive surgery and corneal grafts. Surface roughness of the cutting plane influences postoperative visual acuity but is difficult to assess reliably. For the first time, we compared chromatic confocal microscopy (CCM) with scanning electron microscopy, atomic force microscopy (AFM) and focus-variation microscopy (FVM) to characterize surfaces of variable roughness after FSL cutting. The small area allowed by AFM hinders conclusive roughness analysis, especially with irregular cuts. FVM does not always differentiate between smooth and rough surfaces. Finally, CCM allows analysis of large surfaces and differentiates between surface states. PMID:29188095
ERIC Educational Resources Information Center
Borman, Stuart A.
1985-01-01
Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)
PB-AM: An open-source, fully analytical linear poisson-boltzmann solver
DOE Office of Scientific and Technical Information (OSTI.GOV)
Felberg, Lisa E.; Brookes, David H.; Yap, Eng-Hui
2016-11-02
We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized Poisson-Boltzmann equation. The PB-AM software package includes the generation of output files appropriate for visualization using VMD, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and offers two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators and students who are more familiar with the APBS framework.
NASA Astrophysics Data System (ADS)
Witzel, Gunther; Lu, Jessica R.; Ghez, Andrea M.; Martinez, Gregory D.; Fitzgerald, Michael P.; Britton, Matthew; Sitarski, Breann N.; Do, Tuan; Campbell, Randall D.; Service, Maxwell; Matthews, Keith; Morris, Mark R.; Becklin, E. E.; Wizinowich, Peter L.; Ragland, Sam; Doppmann, Greg; Neyman, Chris; Lyke, James; Kassis, Marc; Rizzi, Luca; Lilley, Scott; Rampy, Rachel
2016-07-01
General relativity can be tested in the strong gravity regime by monitoring stars orbiting the supermassive black hole at the Galactic Center with adaptive optics. However, the limiting source of uncertainty is the spatial PSF variability due to atmospheric anisoplanatism and instrumental aberrations. The Galactic Center Group at UCLA has completed a project developing algorithms to predict PSF variability for Keck AO images. We have created a new software package (AIROPA), based on modified versions of StarFinder and Arroyo, that takes atmospheric turbulence profiles, instrumental aberration maps, and images as inputs and delivers improved photometry and astrometry on crowded fields. This software package will be made publicly available soon.
WannierTools: An open-source software package for novel topological materials
NASA Astrophysics Data System (ADS)
Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.
2018-03-01
We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. This code works in the tight-binding framework, which can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can compute the surface state spectrum, which is detected by angle-resolved photoemission spectroscopy (ARPES) and in scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
Use of symbolic computation in robotics education
NASA Technical Reports Server (NTRS)
Vira, Naren; Tunstel, Edward
1992-01-01
An application of symbolic computation in robotics education is described. A software package is presented which combines generality, user interaction, and user-friendliness with the systematic usage of symbolic computation and artificial intelligence techniques. The software utilizes MACSYMA, a LISP-based symbolic algebra language, to automatically generate closed-form expressions representing forward and inverse kinematics solutions, the Jacobian transformation matrices, robot pose error-compensation models equations, and Lagrange dynamics formulation for N degree-of-freedom, open chain robotic manipulators. The goal of such a package is to aid faculty and students in the robotics course by removing burdensome tasks of mathematical manipulations. The software package has been successfully tested for its accuracy using commercially available robots.
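An analogous illustration in Python using SymPy in place of MACSYMA: symbolic forward kinematics and the Jacobian of a planar two-degree-of-freedom arm, a simplified stand-in for the N-degree-of-freedom manipulators the package handles:

```python
# Derive closed-form forward kinematics and the Jacobian symbolically.
import sympy as sp

q1, q2, l1, l2 = sp.symbols("q1 q2 l1 l2")    # joint angles and link lengths

x = l1 * sp.cos(q1) + l2 * sp.cos(q1 + q2)    # end-effector position
y = l1 * sp.sin(q1) + l2 * sp.sin(q1 + q2)

jacobian = sp.Matrix([x, y]).jacobian([q1, q2])
print(sp.simplify(jacobian))
```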
Wang, Anliang; Yan, Xiaolong; Wei, Zhijun
2018-04-27
This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution is concentrated on facilitating extensibility and interoperability of the software through decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to apply in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. Contact: wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.
A User-Friendly Software Package for HIFU Simulation
NASA Astrophysics Data System (ADS)
Soneson, Joshua E.
2009-04-01
A freely-distributed, MATLAB (The Mathworks, Inc., Natick, MA)-based software package for simulating axisymmetric high-intensity focused ultrasound (HIFU) beams and their heating effects is discussed. The package (HIFU_Simulator) consists of a propagation module which solves the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation and a heating module which solves Pennes' bioheat transfer (BHT) equation. The pressure, intensity, heating rate, temperature, and thermal dose fields are computed and plotted, and the output is released to the MATLAB workspace for further user analysis or postprocessing.
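A minimal one-dimensional explicit finite-difference step for Pennes' bioheat equation, the heating-module model named above; this is a schematic Python sketch with assumed tissue constants and heating rate, not code from HIFU_Simulator:

```python
# Explicit time-marching of the 1-D Pennes bioheat equation with an
# assumed acoustic heating source in the middle of the domain.
import numpy as np

rho, c, k = 1000.0, 4180.0, 0.6        # density, specific heat, conductivity
w_b, c_b = 0.5, 3800.0                 # perfusion rate (kg/m^3/s), blood heat capacity
T_a = 37.0                             # arterial temperature, deg C
dx, dt = 1e-3, 0.01                    # grid spacing (m), time step (s)

n = 101
T = np.full(n, 37.0)                   # initial tissue temperature
Q = np.zeros(n)
Q[45:55] = 5e6                         # assumed heating rate, W/m^3

for _ in range(1000):                  # march 10 s forward in time
    lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
    T_new = T + dt / (rho * c) * (k * lap - w_b * c_b * (T - T_a) + Q)
    T_new[0] = T_new[-1] = 37.0        # fixed-temperature boundaries
    T = T_new

print(f"peak temperature after 10 s: {T.max():.1f} deg C")
```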
Comparison of requirements and capabilities of major multipurpose software packages.
Igo, Robert P; Schnell, Audrey H
2012-01-01
The aim of this chapter is to introduce the reader to commonly used software packages and illustrate their input requirements, analysis options, strengths, and limitations. We focus on packages that perform more than one function and include a program for quality control, linkage, and association analyses. Additional inclusion criteria were (1) programs that are free to academic users and (2) currently supported, maintained, and developed. Using those criteria, we chose to review three programs: Statistical Analysis for Genetic Epidemiology (S.A.G.E.), PLINK, and Merlin. We will describe the required input format and analysis options. We will not go into detail about every possible program in the packages, but we will give an overview of the packages' requirements and capabilities.
Software package for modeling spin-orbit motion in storage rings
NASA Astrophysics Data System (ADS)
Zyuzin, D. V.
2015-12-01
A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.
Reference datasets for bioequivalence trials in a two-group parallel design.
Fuglsang, Anders; Schütz, Helmut; Labes, Detlew
2015-03-01
In order to help companies qualify and validate the software used to evaluate bioequivalence trials with two parallel treatment groups, this work aims to define datasets with known results. This paper puts a total of 11 datasets into the public domain along with a proposed consensus obtained via evaluations from six different software packages (R, SAS, WinNonlin, OpenOffice Calc, Kinetica, EquivTest). Insofar as possible, datasets were evaluated with and without the assumption of equal variances for the construction of a 90% confidence interval. Not all software packages provide functionality for the assumption of unequal variances (EquivTest, Kinetica), and not all packages can handle datasets with more than 1000 subjects per group (WinNonlin). Where results could be obtained across all packages, one showed questionable results when datasets contained unequal group sizes (Kinetica). A proposal is made for the results that should be used as validation targets.
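A worked Python sketch of the computation these reference datasets target: a 90% confidence interval for the ratio of geometric means of two parallel groups on log-transformed data, without assuming equal variances (Welch-Satterthwaite degrees of freedom); the example numbers are made up:

```python
# 90% CI for the test/reference ratio of geometric means, Welch approach.
import numpy as np
from scipy import stats

test = np.log([95.0, 110.0, 102.0, 88.0, 130.0, 97.0])       # hypothetical AUCs
ref = np.log([100.0, 105.0, 99.0, 120.0, 92.0, 108.0])

diff = test.mean() - ref.mean()
se = np.sqrt(test.var(ddof=1) / test.size + ref.var(ddof=1) / ref.size)
df = se**4 / ((test.var(ddof=1) / test.size)**2 / (test.size - 1)
              + (ref.var(ddof=1) / ref.size)**2 / (ref.size - 1))
t_crit = stats.t.ppf(0.95, df)                                 # two-sided 90% CI
lower, upper = np.exp(diff - t_crit * se), np.exp(diff + t_crit * se)
print(f"90% CI for the ratio: {100 * lower:.2f}% - {100 * upper:.2f}%")
```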
Painting a picture across the landscape with ModelMap
Brian Cooke; Elizabeth Freeman; Gretchen Moisen; Tracey Frescino
2017-01-01
Scientists and statisticians working for the Rocky Mountain Research Station have created a software package that simplifies and automates many of the processes needed for converting models into maps. This software package, called ModelMap, has helped a variety of specialists and land managers to quickly convert data into easily understood graphical images. The...
"FluSpec": A Simulated Experiment in Fluorescence Spectroscopy
ERIC Educational Resources Information Center
Bigger, Stephen W.; Bigger, Andrew S.; Ghiggino, Kenneth P.
2014-01-01
The "FluSpec" educational software package is a fully contained tutorial on the technique of fluorescence spectroscopy as well as a simulator on which experiments can be performed. The procedure for each of the experiments is also contained within the package along with example analyses of results that are obtained using the software.
Comparative Analyses of MIRT Models and Software (BMIRT and flexMIRT)
ERIC Educational Resources Information Center
Yavuz, Guler; Hambleton, Ronald K.
2017-01-01
Application of MIRT modeling procedures is dependent on the quality of parameter estimates provided by the estimation software and techniques used. This study investigated model parameter recovery of two popular MIRT packages, BMIRT and flexMIRT, under some common measurement conditions. These packages were specifically selected to investigate the…
Macintosh Computer Classroom and Laboratory Security: Preventing Unwanted Changes to the System.
ERIC Educational Resources Information Center
Senn, Gary J.; Smyth, Thomas J. C.
Because of the graphical interface and "openness" of the operating system, Macintosh computers are susceptible to undesirable changes by the user. This presentation discusses the advantages and disadvantages of software packages that offer protection for the Macintosh system. The two basic forms of software security packages include a…
Virginia Transit Performance Evaluation Package (VATPEP).
DOT National Transportation Integrated Search
1987-01-01
The Virginia Transit Performance Evaluation Package (VATPEP), a computer software package, is documented. This is the computerized version of the methodology used by the Virginia Department of Transportation to evaluate the performance of public tran...
Cooperative Work and Sustainable Scientific Software Practices in R
NASA Astrophysics Data System (ADS)
Weber, N.
2013-12-01
Most scientific software projects are dependent on the work of many diverse people, institutions and organizations. Incentivizing these actors to cooperatively develop software that is both reliable and sustainable is complicated by the fact that the reward structures of these various actors greatly differ: research scientists want results from a software or model run in order to publish papers, produce new data, or test a hypothesis; software engineers and research centers want compilable, well documented code that is refactorable, reusable and reproducible in future research scenarios. While much research has been done on incentives and motivations for participating in open source software projects or cyberinfrastructure development, little work has been done on what motivates or incentivizes developers to maintain scientific software projects beyond their original application. This poster will present early results of research into the incentives and motivation for cooperative scientific software development. In particular, this work focuses on motivations for the maintenance and repair of libraries on the software platform R. Our work here uses a sample of R packages that were created by research centers, or are specific to earth, environmental and climate science applications. We first mined 'check' logs from the Comprehensive R Archive Network (CRAN) to determine the amount of time a package has existed, the number of versions it has gone through over this time, the number of releases, and finally the contact information for each official package 'maintainer'. We then sent a survey to each official maintainer, asking them questions about what role they played in developing the original package, and what their motivations were for sustaining the project over time. We will present early results from this mining and our survey of R maintainers.
NASA Technical Reports Server (NTRS)
Jumper, Judith K.
1994-01-01
The Laser Velocimeter Data Acquisition System (LVDAS) in the Langley 14- by 22-Foot Tunnel is controlled by a comprehensive software package. The software package was designed to control the data acquisition process during wind tunnel tests which employ a laser velocimeter measurement system. This report provides detailed explanations on how to configure and operate the LVDAS system to acquire laser velocimeter and static wind tunnel data.
Software engineering the mixed model for genome-wide association studies on large samples.
Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J
2009-11-01
Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
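A compact Python sketch of the core association test in such a mixed model: generalized least squares with covariance V = sigma_g^2 K + sigma_e^2 I, where K is a kinship matrix. The variance components and kinship matrix are taken as given here (and the data are simulated), whereas real packages estimate them, for example by REML:

```python
# GLS estimate of a SNP effect under a kinship-structured covariance.
import numpy as np

rng = np.random.default_rng(42)
n = 200
K = np.eye(n)                                   # stand-in kinship matrix
sigma_g2, sigma_e2 = 0.4, 0.6                   # assumed variance components

snp = rng.integers(0, 3, n).astype(float)       # genotypes coded 0/1/2
X = np.column_stack([np.ones(n), snp])          # intercept + SNP
y = 0.3 * snp + rng.normal(0, 1, n)             # simulated phenotype

V = sigma_g2 * K + sigma_e2 * np.eye(n)
Vinv = np.linalg.inv(V)
beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
se = np.sqrt(np.diag(np.linalg.inv(X.T @ Vinv @ X)))
print(f"SNP effect {beta[1]:.3f} (SE {se[1]:.3f})")
```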
Space-Shuttle Emulator Software
NASA Technical Reports Server (NTRS)
Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram;
2007-01-01
A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.
Development of high performance scientific components for interoperability of computing packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulabani, Teena Pratap
2008-01-01
Three major high performance quantum chemistry computational packages, NWChem, GAMESS, and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. A chemistry algorithm is hard to develop as well as being a time-consuming process; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.
Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld
2016-08-01
There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge due to source code changes over time and dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory to find existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org and the source code, under GPL license, is available at https://github.com/algorun. Contact: laubenbacher@uchc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
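As an illustration of how such a containerized algorithm might be driven programmatically, here is a minimal Python sketch; the endpoint path, payload field, and port are assumptions made for illustration and are not taken from the AlgoRun documentation.

    # Minimal sketch of calling an AlgoRun-style container over a RESTful API.
    # The endpoint path ("/v1/run"), payload field ("input"), and port are
    # hypothetical placeholders; consult the actual AlgoRun API documentation.
    import requests

    def run_packaged_algorithm(host: str, input_text: str) -> str:
        """POST input data to a locally running algorithm container and return its output."""
        response = requests.post(f"{host}/v1/run", data={"input": input_text}, timeout=300)
        response.raise_for_status()
        return response.text

    if __name__ == "__main__":
        # Assumes an AlgoRun-packaged container was started locally beforehand,
        # e.g. with `docker run -p 8765:8765 <image>` (port is illustrative).
        print(run_packaged_algorithm("http://localhost:8765", ">seq1\nACGT"))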
Tanpitukpongse, Teerath P.; Mazurowski, Maciej A.; Ikhena, John; Petrella, Jeffrey R.
2016-01-01
Background and Purpose To assess prognostic efficacy of individual versus combined regional volumetrics in two commercially-available brain volumetric software packages for predicting conversion of patients with mild cognitive impairment to Alzheimer's disease. Materials and Methods Data was obtained through the Alzheimer's Disease Neuroimaging Initiative. 192 subjects (mean age 74.8 years, 39% female) diagnosed with mild cognitive impairment at baseline were studied. All had T1WI MRI sequences at baseline and 3-year clinical follow-up. Analysis was performed with NeuroQuant® and Neuroreader™. Receiver operating characteristic curves assessing the prognostic efficacy of each software package were generated using a univariable approach employing individual regional brain volumes, as well as two multivariable approaches (multiple regression and random forest), combining multiple volumes. Results On univariable analysis of 11 NeuroQuant® and 11 Neuroreader™ regional volumes, hippocampal volume had the highest area under the curve for both software packages (0.69 NeuroQuant®, 0.68 Neuroreader™), and was not significantly different (p > 0.05) between packages. Multivariable analysis did not increase the area under the curve for either package (0.63 logistic regression, 0.60 random forest NeuroQuant®; 0.65 logistic regression, 0.62 random forest Neuroreader™). Conclusion Of the multiple regional volume measures available in FDA-cleared brain volumetric software packages, hippocampal volume remains the best single predictor of conversion of mild cognitive impairment to Alzheimer's disease at 3-year follow-up. Combining volumetrics did not add additional prognostic efficacy. Therefore, future prognostic studies in MCI, combining such tools with demographic and other biomarker measures, are justified in using hippocampal volume as the only volumetric biomarker. PMID:28057634
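To make the univariable ROC approach concrete, a minimal scikit-learn sketch follows; the variable names and toy values are illustrative and are not data from the ADNI cohort.

    # AUC of a single regional volume (hippocampus) for predicting MCI-to-AD conversion,
    # mirroring the univariable ROC analysis described above. Toy data only.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    converted = np.array([1, 0, 0, 1, 0, 1, 0, 0])          # 1 = converted within 3 years
    hippocampal_volume = np.array([0.21, 0.35, 0.33, 0.19,  # e.g. % of intracranial volume
                                   0.30, 0.22, 0.28, 0.31])

    # Negate the volume so that larger scores correspond to higher conversion risk.
    auc = roc_auc_score(converted, -hippocampal_volume)
    print(f"AUC for hippocampal volume: {auc:.2f}")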
Comparison of Perfusion CT Software to Predict the Final Infarct Volume After Thrombectomy.
Austein, Friederike; Riedel, Christian; Kerby, Tina; Meyne, Johannes; Binder, Andreas; Lindner, Thomas; Huhndorf, Monika; Wodarg, Fritz; Jansen, Olav
2016-09-01
Computed tomographic perfusion represents an interesting physiological imaging modality to select patients for reperfusion therapy in acute ischemic stroke. The purpose of our study was to determine the accuracy of different commercial perfusion CT software packages (Philips (A), Siemens (B), and RAPID (C)) in predicting the final infarct volume (FIV) after mechanical thrombectomy. Single-institutional computed tomographic perfusion data from 147 mechanically recanalized acute ischemic stroke patients were postprocessed. Ischemic core and FIV were compared with regard to thrombolysis in cerebral infarction (TICI) score and time interval to reperfusion. FIV was measured at follow-up imaging between days 1 and 8 after stroke. In 118 successfully recanalized patients (TICI 2b/3), a moderately to strongly positive correlation was observed between ischemic core and FIV. The highest accuracy and best correlation were seen in early and fully recanalized patients (Pearson r for A=0.42, B=0.64, and C=0.83; P<0.001). Bland-Altman plots and boxplots demonstrate smaller ranges in package C than in A and B. Significant differences were found between the packages with regard to over- and underestimation of the ischemic core. Package A, compared with B and C, estimated more than twice as many patients as having a malignant stroke profile (P<0.001). Package C best predicted hypoperfusion volume in nonsuccessfully recanalized patients. Our study demonstrates the best accuracy and closest approximation between the results of a fully automated software package (RAPID) and FIV, especially in early and fully recanalized patients. Furthermore, this software package overestimated the FIV to a significantly lower degree and estimated a malignant mismatch profile less often than the other software. © 2016 American Heart Association, Inc.
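The agreement statistics used in this comparison (Pearson correlation and Bland-Altman bias with limits of agreement) can be reproduced in a few lines of Python; the arrays below are illustrative placeholders, not study data.

    # Pearson correlation between CTP-estimated ischemic core and final infarct volume (FIV),
    # plus Bland-Altman bias and 95% limits of agreement. Placeholder values only.
    import numpy as np
    from scipy.stats import pearsonr

    core_ml = np.array([12.0, 35.0, 8.0, 60.0, 22.0, 45.0])   # ischemic core (ml), one package
    fiv_ml = np.array([15.0, 40.0, 5.0, 75.0, 20.0, 50.0])    # final infarct volume (ml)

    r, p_value = pearsonr(core_ml, fiv_ml)

    diff = core_ml - fiv_ml
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)                             # Bland-Altman limits of agreement

    print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")
    print(f"Bland-Altman bias = {bias:.1f} ml, limits of agreement = +/-{loa:.1f} ml")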
User's Guide for MapIMG 2: Map Image Re-projection Software Package
Finn, Michael P.; Trent, Jason R.; Buehler, Robert A.
2006-01-01
BACKGROUND Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in commercial software packages, but implementation with data other than points requires specific adaptation of the transformation equations or prior preparation of the data to allow the transformation to succeed. It seems that some of these packages use the U.S. Geological Survey's (USGS) General Cartographic Transformation Package (GCTP) or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003a). Usery and others (2003b) compiled and tabulated the accuracy of categorical areas in projected raster datasets of global extent. Based on the shortcomings identified in these studies, geographers and applications programmers at the USGS expanded and evolved a USGS software package, MapIMG, for raster map projection transformation (Finn and Trent, 2004). Daniel R. Steinwand of Science Applications International Corporation, National Center for Earth Resources Observation and Science, originally developed MapIMG for the USGS, basing it on GCTP. Through previous and continuing efforts at the USGS' National Geospatial Technical Operations Center, this program has been transformed from an application based on command line input into a software package based on a graphical user interface for Windows, Linux, and other UNIX machines.
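For readers unfamiliar with the point-to-point transformation step that GCTP-style packages provide, here is a minimal sketch using pyproj rather than the USGS code; as the text notes, correct raster reprojection additionally requires resampling and treatment of categorical areas beyond this per-point operation.

    # Point-to-point map projection transformation (geographic WGS84 to Mollweide),
    # illustrated with pyproj; this is not the USGS GCTP or MapIMG implementation.
    from pyproj import Transformer

    transformer = Transformer.from_crs("EPSG:4326", "+proj=moll +lon_0=0", always_xy=True)
    x, y = transformer.transform(-77.0, 38.9)   # longitude, latitude of an example point
    print(f"Projected coordinates: x = {x:.1f} m, y = {y:.1f} m")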
Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang
2016-12-23
A gene regulatory network (GRN) represents the interactions of genes inside a cell or tissue, in which vertexes and edges stand for genes and their regulatory interactions, respectively. Reconstruction of gene regulatory networks, in particular genome-scale networks, is essential for the comparative exploration of different species and the mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive; they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but have difficulty constructing GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interaction is measured by the conditional mutual information measurement using a parallel computing framework (so the package is named CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to that of most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application to a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/.
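To make the conditional-mutual-information measurement concrete, the sketch below estimates I(X;Y|Z) from discretized values with a plain plug-in estimator; it illustrates the quantity only and is not the CMIP implementation or its parallel framework.

    # Plug-in estimate of conditional mutual information I(X;Y|Z) from binned samples.
    import numpy as np

    def conditional_mutual_information(x, y, z, bins=5):
        """Estimate I(X;Y|Z) in nats from 1-D samples by histogram binning."""
        xb = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
        yb = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
        zb = np.digitize(z, np.histogram_bin_edges(z, bins)[1:-1])
        joint = np.zeros((bins, bins, bins))
        for i, j, k in zip(xb, yb, zb):
            joint[i, j, k] += 1
        joint /= joint.sum()
        p_xz = joint.sum(axis=1)       # p(x, z)
        p_yz = joint.sum(axis=0)       # p(y, z)
        p_z = joint.sum(axis=(0, 1))   # p(z)
        cmi = 0.0
        for i in range(bins):
            for j in range(bins):
                for k in range(bins):
                    p = joint[i, j, k]
                    if p > 0:
                        cmi += p * np.log(p * p_z[k] / (p_xz[i, k] * p_yz[j, k]))
        return cmi

    rng = np.random.default_rng(0)
    z = rng.normal(size=2000)
    x = z + rng.normal(size=2000)   # x and y are associated mainly through z
    y = z + rng.normal(size=2000)
    print(f"I(X;Y|Z) ~= {conditional_mutual_information(x, y, z):.3f}")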
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coltrin, M.E.; Kee, R.J.; Rupley, F.M.
1991-07-01
Heterogeneous reaction at the interface between a solid surface and adjacent gas is central to many chemical processes. Our purpose for developing the software package SURFACE CHEMKIN was motivated by our need to understand the complex surface chemistry in chemical vapor deposition systems involving silicon, silicon nitride, and gallium arsenide. However, we have developed the approach and implemented the software in a general setting. Thus, we expect it will find use in such diverse applications as chemical vapor deposition, chemical etching, combustion of solids, and catalytic processes, and for a wide range of chemical systems. We believe that it provides a powerful capability to help model, understand, and optimize important industrial and research chemical processes. The SURFACE CHEMKIN software is designed to work in conjunction with the CHEMKIN-2 software, which handles the chemical kinetics in the gas phase. It may also be used in conjunction with the Transport Property Package, which provides information about molecular diffusion. Thus, these three packages provide a foundation on which a user can build applications software to analyze gas-phase and heterogeneous chemistry in flowing systems. These packages should not be considered "programs" in the ordinary sense. That is, they are not designed to accept input, solve a particular problem, and report the answer. Instead, they are software tools intended to help a user work efficiently with large systems of chemical reactions and develop Fortran representations of systems of equations that define a particular problem. It is up to the user to solve the problem and interpret the answer. 11 refs., 15 figs., 5 tabs.
Comparison of four software packages for CT lung volumetry in healthy individuals.
Nemec, Stefan F; Molinari, Francesco; Dufresne, Valerie; Gosset, Natacha; Silva, Mario; Bankier, Alexander A
2015-06-01
To compare CT lung volumetry (CTLV) measurements provided by different software packages, and to provide normative data for lung densitometric measurements in healthy individuals. This retrospective study included 51 chest CTs of 17 volunteers (eight men and nine women; mean age, 30 ± 6 years), who underwent spirometrically monitored CT at total lung capacity (TLC), functional residual capacity (FRC), and mean inspiratory capacity (MIC). Volumetric differences assessed by four commercial software packages were compared with analysis of variance (ANOVA) for repeated measurements and benchmarked against the threshold for acceptable variability between spirometric measurements. Mean lung density (MLD) and parenchymal heterogeneity (MLD-SD) were also compared with ANOVA. Volumetric differences ranged from 12 to 213 ml (0.20 % to 6.45 %). Although 16/18 comparisons (among four software packages at TLC, MIC, and FRC) were statistically significant (P < 0.001 to P = 0.004), only 3/18 comparisons, one at MIC and two at FRC, exceeded the spirometry variability threshold. MLD and MLD-SD significantly increased with decreasing volumes, and were significantly larger in lower compared to upper lobes (P < 0.001). Lung volumetric differences provided by different software packages are small. These differences should not be interpreted based on statistical significance alone, but together with absolute volumetric differences. • Volumetric differences, assessed by different CTLV software, are small but statistically significant. • Volumetric differences are smaller at TLC than at MIC and FRC. • Volumetric differences rarely exceed spirometric repeatability thresholds at MIC and FRC. • Differences between CTLV measurements should be interpreted based on comparison of absolute differences. • MLD increases with decreasing volumes, and is larger in lower compared to upper lobes.
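A minimal sketch of the comparison logic, checking pairwise inter-package volume differences against a repeatability threshold, is given below; the volumes, package names, and threshold are placeholders rather than values from the study.

    # Pairwise absolute differences between software packages at one lung volume,
    # compared against an acceptable repeatability threshold. Placeholder values only.
    from itertools import combinations

    volumes_ml = {          # TLC measured on the same CT by four hypothetical packages
        "pkg_A": 6250.0,
        "pkg_B": 6310.0,
        "pkg_C": 6180.0,
        "pkg_D": 6295.0,
    }
    threshold_ml = 150.0    # illustrative spirometric repeatability threshold

    for (name1, v1), (name2, v2) in combinations(volumes_ml.items(), 2):
        diff = abs(v1 - v2)
        flag = "exceeds" if diff > threshold_ml else "within"
        print(f"{name1} vs {name2}: {diff:.0f} ml ({flag} threshold)")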
Flight simulation software at NASA Dryden Flight Research Center
NASA Technical Reports Server (NTRS)
Norlin, Ken A.
1995-01-01
The NASA Dryden Flight Research Center has developed a versatile simulation software package that is applicable to a broad range of fixed-wing aircraft. This package has evolved in support of a variety of flight research programs. The structure is designed to be flexible enough for use in batch-mode, real-time pilot-in-the-loop, and flight hardware-in-the-loop simulation. Current simulations operate on UNIX-based platforms and are coded with a FORTRAN shell and C support routines. This paper discusses the features of the simulation software design and some basic model development techniques. The key capabilities that have been included in the simulation are described. The NASA Dryden simulation software is in use at other NASA centers, within industry, and at several universities. The straightforward but flexible design of this well-validated package makes it especially useful in an engineering environment.
The Ettention software package.
Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp
2016-02-01
We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building blocks for tomographic reconstruction algorithms. The well-known block-iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.
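The Kaczmarz iteration underlying the block-iterative method mentioned above fits in a few lines; the sketch below shows the classical serial form for a generic linear system A x = b and is not Ettention's GPU implementation.

    # Classical Kaczmarz (row-action) iteration for A x ~= b, the building block of
    # ART-style tomographic reconstruction. Serial, illustrative version only.
    import numpy as np

    def kaczmarz(A, b, sweeps=50, relax=1.0):
        x = np.zeros(A.shape[1])
        row_norms_sq = (A ** 2).sum(axis=1)
        for _ in range(sweeps):
            for i in range(A.shape[0]):
                if row_norms_sq[i] == 0:
                    continue
                residual = b[i] - A[i] @ x
                x += relax * (residual / row_norms_sq[i]) * A[i]
        return x

    rng = np.random.default_rng(1)
    A = rng.normal(size=(200, 50))   # stand-in for a projection (system) matrix
    x_true = rng.normal(size=50)     # stand-in for the unknown volume
    b = A @ x_true                   # simulated projection data
    print("max reconstruction error:", np.abs(kaczmarz(A, b) - x_true).max())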
PB-AM: An open-source, fully analytical linear poisson-boltzmann solver.
Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa
2017-06-05
We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using visual molecular dynamics, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and offers two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators, and students that are more familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.
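For context, the linearized Poisson-Boltzmann equation that PB-AM solves analytically can be stated in standard notation (assumed here, not quoted from the paper):

\[ \nabla^2 \phi(\mathbf{r}) = \kappa^2 \phi(\mathbf{r}) \quad \text{in the solvent}, \qquad \kappa^2 = \frac{2 N_A e^2 I}{\epsilon_s \epsilon_0 k_B T}, \]

where \(\phi\) is the electrostatic potential, \(\kappa\) the inverse Debye screening length, I the ionic strength, and \(\epsilon_s\) the solvent dielectric constant; inside each non-overlapping spherical cavity the potential instead satisfies the Poisson equation for the fixed molecular charges.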
NASA Astrophysics Data System (ADS)
Pandey, Palak; Kunte, Pravin D.
2016-10-01
This study presents an easy, modular, user-friendly, and flexible software package for processing Landsat 7 ETM+ and Landsat 8 OLI-TIRS data to estimate suspended particulate matter concentrations in coastal waters. This package includes (1) an algorithm developed using the freely downloadable SCILAB package, (2) ERDAS models for iterative processing of Landsat images, and (3) an ArcMAP tool for plotting and map making. Using the SCILAB package, a module is written for geometric corrections, radiometric corrections, and obtaining normalized water-leaving reflectance from Landsat 8 OLI-TIRS and Landsat 7 ETM+ data. Using ERDAS models, a sequence of modules is developed for iterative processing of Landsat images and estimating suspended particulate matter concentrations. Processed images are used to prepare suspended sediment concentration maps. The applicability of this software package is demonstrated by estimating and plotting seasonal suspended sediment concentration maps off the Bengal delta. The software is flexible enough to accommodate other remotely sensed data, such as Ocean Color Monitor (OCM) data, Indian Remote Sensing (IRS) data, MODIS data, etc., by replacing a few parameters in the algorithm, for estimating suspended sediment concentration in coastal waters.
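As a reminder of the radiometric step such a workflow starts from, the standard Landsat 8 OLI rescaling of quantized pixel values to top-of-atmosphere reflectance is (per the usual USGS handbook convention; the package's full correction chain is not reproduced here)

\[ \rho_{\lambda}' = M_{\rho} \, Q_{\mathrm{cal}} + A_{\rho}, \qquad \rho_{\lambda} = \frac{\rho_{\lambda}'}{\sin(\theta_{SE})}, \]

where \(Q_{\mathrm{cal}}\) is the quantized digital number, \(M_{\rho}\) and \(A_{\rho}\) are the band-specific reflectance rescaling factors from the scene metadata, and \(\theta_{SE}\) is the sun elevation angle.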
Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1
NASA Technical Reports Server (NTRS)
Schlosser, E. H.
1980-01-01
The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.
A Microcomputer-Based Software Package for Eye-Monitoring Research. Technical Report No. 434.
ERIC Educational Resources Information Center
McConkie, George W.; And Others
A software package is described that collects and reduces eye behavior data (eye position and pupil size) using an IBM-PC compatible computer. Written in C language for speed and portability, it includes several features: (1) data can be simultaneously collected from other sources (such as electroencephalography and electromyography); (2)…
Overview of Current Activities in Combustion Instability
2015-10-02
and avoid liquid rocket engine combustion stability problems. Approach: (1) Develop a SOA combustion stability software package called Stable... Phase II will invest in Multifidelity Tools and Methodologies – CSTD will develop a SOA combustion stability software package called Stable Combustion
Sigma 2 Graphic Display Software Program Description
NASA Technical Reports Server (NTRS)
Johnson, B. T.
1973-01-01
A general purpose, user oriented graphic support package was implemented. A comprehensive description of the two software components comprising this package is given: Display Librarian and Display Controller. These programs have been implemented in FORTRAN on the XDS Sigma 2 Computer Facility. This facility consists of an XDS Sigma 2 general purpose computer coupled to a Computek Display Terminal.
Textbook Software versus Professional Software: Which Is Better for Instructional Purposes?
ERIC Educational Resources Information Center
Snell, Meggan; Yatsenko, Olga
2002-01-01
Compares textbook software with professional packages such as Peachtree for teaching accounting, in terms of cost, availability, ease of teaching and learning, and applicability. Makes suggestions for choosing accounting software. (SK)
Object-oriented design of medical imaging software.
Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R
1994-01-01
A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.
StreamThermal: A software package for calculating thermal metrics from stream temperature data
Tsang, Yin-Phan; Infante, Dana M.; Stewart, Jana S.; Wang, Lizhu; Tingly, Ralph; Thornbrugh, Darren; Cooper, Arthur; Wesley, Daniel
2016-01-01
Improving quality and better availability of continuous stream temperature data allow natural resource managers, particularly in fisheries, to understand associations between different characteristics of stream thermal regimes and stream fishes. However, there has been no convenient tool to efficiently characterize the multiple metrics reflecting stream thermal regimes from this increasing amount of data. This article describes a software program packaged as a library in R to facilitate this process. With this freely available package, users can quickly summarize metrics that describe five categories of stream thermal regimes: magnitude, variability, frequency, timing, and rate of change. The installation and usage instructions for this package, the definitions of the calculated thermal metrics, and the output format of the package are described, along with an application showing its utility for multiple metrics. We believe this package can be widely utilized by interested stakeholders and will greatly assist further studies in fisheries.
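To illustrate the kind of summaries involved, the short pandas sketch below computes one magnitude, one variability, and one timing metric from an hourly temperature series; the metric names and data are illustrative and do not follow StreamThermal's own definitions or R interface.

    # Example thermal-regime summaries from a synthetic hourly stream temperature series.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    idx = pd.date_range("2015-01-01", "2015-12-31 23:00", freq="h")
    temp_c = 12 + 8 * np.sin(2 * np.pi * (idx.dayofyear - 100) / 365) + rng.normal(0, 1, len(idx))
    series = pd.Series(temp_c, index=idx)

    daily_mean = series.resample("D").mean()
    daily_range = series.resample("D").max() - series.resample("D").min()

    metrics = {
        "july_mean_C": daily_mean[daily_mean.index.month == 7].mean(),   # magnitude
        "max_daily_range_C": daily_range.max(),                          # variability
        "warmest_day_of_year": int(daily_mean.idxmax().dayofyear),       # timing
    }
    print(metrics)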
Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F
1997-12-01
Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome area.
''Do-it-yourself'' software program calculates boiler efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1984-03-01
An easy-to-use software package is described which runs on the IBM Personal Computer. The package calculates boiler efficiency, an important parameter of operating costs and equipment wellbeing. The program stores inputs and calculated results for 20 sets of boiler operating data, called cases. Cases can be displayed and modified on the CRT screen through multiple display pages or copied to a printer. All intermediate calculations are performed by this package. They include: steam enthalpy; water enthalpy; air humidity; gas, oil, coal, and wood heat capacity; and radiation losses.
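For orientation, a direct (input-output) efficiency calculation of the kind such a program automates looks like the sketch below; the enthalpy and fuel figures are illustrative placeholders, and the package described above computes the enthalpies and loss terms itself.

    # Minimal input-output (direct method) boiler efficiency calculation. Placeholder values.
    def boiler_efficiency_direct(steam_flow_kg_h, h_steam_kj_kg, h_feedwater_kj_kg,
                                 fuel_flow_kg_h, fuel_hhv_kj_kg):
        """Efficiency (%) = heat absorbed by the steam / heat input from the fuel."""
        heat_out = steam_flow_kg_h * (h_steam_kj_kg - h_feedwater_kj_kg)
        heat_in = fuel_flow_kg_h * fuel_hhv_kj_kg
        return 100.0 * heat_out / heat_in

    print(f"{boiler_efficiency_direct(50_000, 2780.0, 420.0, 3_600, 42_000.0):.1f} %")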
Network Meta-Analysis Using R: A Review of Currently Available Automated Packages
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687
NASA Astrophysics Data System (ADS)
Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip
2017-10-01
Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be extendable to multi-channel data represented by data cubes in Stokes I, Q, and U.
Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package
NASA Astrophysics Data System (ADS)
Cheng, L.; AghaKouchak, A.; Gilleland, E.
2013-12-01
Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and non-stationary assumptions. The Nonstationary Extreme Value Analysis (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which utilizes the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach and has the advantage of simplicity, speed of calculation, and convergence over conventional MCMC. NEVA also offers the confidence interval and uncertainty bounds of estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization, and visualization, explicitly designed to facilitate analysis of extremes in geosciences. The generalized input and output files of this software package make it attractive for users from across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
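For reference, the generalized extreme value (GEV) distribution that such return-level estimates rest on, with a simple nonstationary extension in which the location parameter varies with a covariate (a standard formulation, not necessarily NEVA's exact parameterization), is

\[ F(x;\mu,\sigma,\xi) = \exp\!\left\{ -\left[ 1 + \xi\,\frac{x-\mu}{\sigma} \right]^{-1/\xi} \right\}, \qquad \mu(t) = \mu_0 + \mu_1 t, \]

valid where \(1 + \xi (x-\mu)/\sigma > 0\); the return level for return period T is the quantile \(x_T = F^{-1}(1 - 1/T)\), which under nonstationarity becomes time dependent.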
Automated data collection in single particle electron microscopy
Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget
2016-01-01
Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944
Defense AT and L. Volume 42, Number 1
2013-02-01
The U.S. Army late last year began equipping brigade combat teams with its first package of radios, satellite systems, software applications... Army’s first package of radios, satellite systems, software applications, smartphone-like devices, and other network components that provide integrated... satellite communications, intelligence, mission command applications, and the integration of C4ISR equipment onto various vehicle platforms.
ERIC Educational Resources Information Center
Gagne, Phill; Furlow, Carolyn; Ross, Terris
2009-01-01
In item response theory (IRT) simulation research, it is often necessary to use one software package for data generation and a second software package to conduct the IRT analysis. Because this can substantially slow down the simulation process, it is sometimes offered as a justification for using very few replications. This article provides…
Software for Teaching about AIDS & Sex: A Critical Review of Products. A MicroSIFT Report.
ERIC Educational Resources Information Center
Weaver, Dave
This document contains critical reviews of 10 microcomputer software packages and two interactive videodisc products designed for use in teaching about Acquired Immune Deficiency Syndrome (AIDS) and sex at the secondary school level and above. Each package was reviewed by one or two secondary school health teachers and by a staff member from the…
User's manual for the VAX-Gerber link software package. Revision 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isobe, G.W.
1985-10-01
This manual provides a user the information necessary to run the VAX-Gerber link software package. It is expected that the user already knows how to login to the VAX, and is familiar with the Gerber Photo Plotter. It is also highly desirable that the user be familiar with the full screen editor on the VAX, EDT.
Sampling and sensitivity analyses tools (SaSAT) for computational modelling
Hoare, Alexander; Regan, David G; Wilson, David P
2008-01-01
SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
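Two of the steps SaSAT automates, Latin hypercube sampling of parameter space and a rank-correlation sensitivity measure, can be sketched as follows; this uses scipy (version 1.7 or later for scipy.stats.qmc) with a toy model and parameter names, and is not SaSAT itself.

    # Latin hypercube sampling of a 3-parameter space and Spearman rank correlations
    # between each parameter and a toy model output.
    import numpy as np
    from scipy.stats import qmc, spearmanr

    sampler = qmc.LatinHypercube(d=3, seed=0)
    unit_samples = sampler.random(n=200)                      # samples in [0, 1]^3
    lower, upper = [0.1, 0.0, 1.0], [0.5, 2.0, 10.0]          # illustrative parameter ranges
    params = qmc.scale(unit_samples, lower, upper)            # columns: beta, gamma, tau

    output = params[:, 0] * params[:, 2] / (1.0 + params[:, 1])   # toy model response

    for name, column in zip(["beta", "gamma", "tau"], params.T):
        rho, _ = spearmanr(column, output)
        print(f"Spearman rank correlation, {name} vs output: {rho:+.2f}")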
cit: hypothesis testing software for mediation analysis in genomic applications.
Millstein, Joshua; Chen, Gary K; Breton, Carrie V
2016-08-01
The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in data/models accommodated by the software and (iii) low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and optionally permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). Contact: joshua.millstein@usc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Software engineering and data management for automated payload experiment tool
NASA Technical Reports Server (NTRS)
Maddux, Gary A.; Provancha, Anna; Chattam, David
1994-01-01
The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.
PsyToolkit: a software package for programming psychological experiments using Linux.
Stoet, Gijsbert
2010-11-01
PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.
NASA Astrophysics Data System (ADS)
Wi, S.; Ray, P. A.; Brown, C.
2015-12-01
A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
Software package for modeling spin–orbit motion in storage rings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zyuzin, D. V., E-mail: d.zyuzin@fz-juelich.de
2015-12-15
A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6–10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12–10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin–orbit dynamics.
ERIC Educational Resources Information Center
Levy, Roy
2010-01-01
SEMModComp, a software package for conducting likelihood ratio tests for mean and covariance structure modeling is described. The package is written in R and freely available for download or on request.
The design, deployment, and testing of kriging models in GEOframe with SIK-0.9.8
NASA Astrophysics Data System (ADS)
Bancheri, Marialaura; Serafin, Francesco; Bottazzi, Michele; Abera, Wuletawu; Formetta, Giuseppe; Rigon, Riccardo
2018-06-01
This work presents a software package for the interpolation of climatological variables, such as temperature and precipitation, using kriging techniques. The purposes of the paper are (1) to present a geostatistical software that is easy to use and easy to plug in to a hydrological model; (2) to provide a practical example of an accurately designed software from the perspective of reproducible research; and (3) to demonstrate the goodness of the results of the software and so have a reliable alternative to other, more traditional tools. A total of 11 types of theoretical semivariograms and four types of kriging were implemented and gathered into Object Modeling System-compliant components. The package provides real-time optimization for semivariogram and kriging parameters. The software was tested using a year's worth of hourly temperature readings and a rain storm event (11 h) recorded in 2008 and retrieved from 97 meteorological stations in the Isarco River basin, Italy. For both the variables, good interpolation results were obtained and then compared to the results from the R package gstat.
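As background to the semivariogram fitting and kriging weights mentioned above, the empirical semivariogram and the ordinary kriging predictor can be written in standard geostatistical notation (assumed here, not the package's internal formulation) as

\[ \hat{\gamma}(h) = \frac{1}{2\,|N(h)|} \sum_{(i,j)\in N(h)} \bigl( z(\mathbf{s}_i) - z(\mathbf{s}_j) \bigr)^2, \qquad \hat{Z}(\mathbf{s}_0) = \sum_{i=1}^{n} \lambda_i\, z(\mathbf{s}_i), \quad \sum_{i=1}^{n} \lambda_i = 1, \]

where N(h) is the set of station pairs separated by lag h and the weights \(\lambda_i\) solve the kriging system built from the fitted theoretical semivariogram.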
Advanced Software Development Workstation Project
NASA Technical Reports Server (NTRS)
Lee, Daniel
1989-01-01
The Advanced Software Development Workstation Project, funded by Johnson Space Center, is investigating knowledge-based techniques for software reuse in NASA software development projects. Two prototypes have been demonstrated and a third is now in development. The approach is to build a foundation that provides passive reuse support, add a layer that uses domain-independent programming knowledge, add a layer that supports the acquisition of domain-specific programming knowledge to provide active support, and enhance maintainability and modifiability through an object-oriented approach. The development of new application software would use specification-by-reformulation, based on a cognitive theory of retrieval from very long-term memory in humans, and using an Ada code library and an object base. Current tasks include enhancements to the knowledge representation of Ada packages and abstract data types, extensions to support Ada package instantiation knowledge acquisition, integration with Ada compilers and relational databases, enhancements to the graphical user interface, and demonstration of the system with a NASA contractor-developed trajectory simulation package. Future work will focus on investigating issues involving scale-up and integration.
The STARLINK software collection
NASA Astrophysics Data System (ADS)
Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.
1993-12-01
A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.
1993-05-01
A limitation of the software package would not allow the program to run over 2359 to 0001 UT (frequencies: 18.1, 19.0, 21.4, and 24.0 kHz). ...Capability (LWPC), a software package developed at NOSC (FERGUSON et al 1989) and adapted by us to the Macintosh personal computer. We find that this software works very well. Our investigations are to evaluate and devise geophysical models to be used with LWPC in assessing VLF communications.
Use of the Femtosecond Lasers in Ophthalmology
NASA Astrophysics Data System (ADS)
Roszkowska, Anna M.; Urso, Mario; Signorino, Alberto; Aragona, Pasquale
2018-01-01
Femtosecond laser (FSL) is an infrared laser with a wavelength of 1053 nm. FS lasers work by producing photodisruption or photoionization of optically transparent tissue such as the cornea. Currently, FS lasers have a wide range of applications in ophthalmic surgery. They are used above all in corneal surgery, in refractive procedures and keratoplasty, and recently in cataract surgery. The use of the FSL in corneal refractive surgery includes LASIK flap creation, astigmatic keratotomy, Femtosecond Lenticule Extraction (FLEx), Small Incision Lenticule Extraction (SMILE), and channel creation for implantation of intrastromal corneal rings. As to corneal grafting, FS lasers are used in laser-assisted anterior and posterior lamellar keratoplasty and customized trephination in penetrating keratoplasty. FS Laser Assisted Cataract Surgery (FLACS) includes capsulorrhexis and nuclear fragmentation, which enhance the safety and efficacy of the procedure.
Duarte, Gabriela Frois; Rosado, Alexandre Soares; Seldin, Lucy; de Araujo, Welington; van Elsas, Jan Dirk
2001-01-01
The selective effects of sulfur-containing hydrocarbons, with respect to changes in bacterial community structure and selection of desulfurizing organisms and genes, were studied in soil. Samples taken from a polluted field soil (A) along a concentration gradient of sulfurous oil and from soil microcosms treated with dibenzothiophene (DBT)-containing petroleum (FSL soil) were analyzed. Analyses included plate counts of total bacteria and of DBT utilizers, molecular community profiling via soil DNA-based PCR-denaturing gradient gel electrophoresis (PCR-DGGE), and detection of genes that encode enzymes involved in the desulfurization of hydrocarbons, i.e., dszA, dszB, and dszC.Data obtained from the A soil showed no discriminating effects of oil levels on the culturable bacterial numbers on either medium used. Generally, counts of DBT degraders were 10- to 100-fold lower than the total culturable counts. However, PCR-DGGE showed that the numbers of bands detected in the molecular community profiles decreased with increasing oil content of the soil. Analysis of the sequences of three prominent bands of the profiles generated with the highly polluted soil samples suggested that the underlying organisms were related to Actinomyces sp., Arthrobacter sp., and a bacterium of uncertain affiliation. dszA, dszB, and dszC genes were present in all A soil samples, whereas a range of unpolluted soils gave negative results in this analysis. Results from the study of FSL soil revealed minor effects of the petroleum-DBT treatment on culturable bacterial numbers and clear effects on the DBT-utilizing communities. The molecular community profiles were largely stable over time in the untreated soil, whereas they showed a progressive change over time following treatment with DBT-containing petroleum. Direct PCR assessment revealed the presence of dszB-related signals in the untreated FSL soil and the apparent selection of dszA- and dszC-related sequences by the petroleum-DBT treatment. PCR-DGGE applied to sequential enrichment cultures in DBT-containing sulfur-free basal salts medium prepared from the A and treated FSL soils revealed the selection of up to 10 distinct bands. Sequencing a subset of these bands provided evidence for the presence of organisms related to Pseudomonas putida, a Pseudomonas sp., Stenotrophomonas maltophilia, and Rhodococcus erythropolis. Several of 52 colonies obtained from the A and FSL soils on agar plates with DBT as the sole sulfur source produced bands that matched the migration of bands selected in the enrichment cultures. Evidence for the presence of dszB in 12 strains was obtained, whereas dszA and dszC genes were found in only 7 and 6 strains, respectively. Most of the strains carrying dszA or dszC were classified as R. erythropolis related, and all revealed the capacity to desulfurize DBT. A comparison of 37 dszA sequences, obtained via PCR from the A and FSL soils, from enrichments of these soils, and from isolates, revealed the great similarity of all sequences to the canonical (R. erythropolis strain IGTS8) dszA sequence and a large degree of internal conservation. The 37 sequences recovered were grouped in three clusters. One group, consisting of 30 sequences, was minimally 98% related to the IGTS8 sequence, a second group of 2 sequences was slightly different, and a third group of 5 sequences was 95% similar. 
The first two groups contained sequences obtained from both soil types and enrichment cultures (including isolates), but the last consisted of sequences obtained directly from the polluted A soil. PMID:11229891
The GRIDView Visualization Package
NASA Astrophysics Data System (ADS)
Kent, B. R.
2011-07-01
Large three-dimensional data cubes, catalogs, and spectral line archives are increasingly important elements of the data discovery process in astronomy. Visualization of large data volumes is of vital importance for the success of large spectral line surveys. Examples of data reduction utilizing the GRIDView software package are shown. The package allows users to manipulate data cubes, extract spectral profiles, and measure line properties. The package and included graphical user interfaces (GUIs) are designed with pipeline infrastructure in mind. The software has been used with great success analyzing spectral line and continuum data sets obtained from large radio survey collaborations. The tools are also important for multi-wavelength cross-correlation studies and incorporate Virtual Observatory client applications for overlaying database information in real time as cubes are examined by users.
NASA Astrophysics Data System (ADS)
Sardina, V.
2012-12-01
The US Tsunami Warning Centers (TWCs) have traditionally generated their tsunami message products primarily as blocks of text tagged with headers that identify them on each particular communications (comms) circuit. Each warning center has a primary area of responsibility (AOR) within which it has an authoritative role regarding parameters such as earthquake location and magnitude. This means that when a major tsunamigenic event occurs, the other warning centers need to quickly access the earthquake parameters issued by the authoritative warning center before issuing their message products intended for customers in their own AOR. Thus, within the operational context of the TWCs, the scientists on duty have an operational need to access the information contained in the message products issued by other warning centers as quickly as possible. As a solution to this operational problem, we designed and implemented a C++ software package that allows scanning for and parsing the entire suite of tsunami message products issued by the Pacific Tsunami Warning Center (PTWC), the West Coast and Alaska Tsunami Warning Center (WCATWC), and the Japan Meteorological Agency (JMA). The scanning and parsing classes composing the resulting C++ software package allow parsing both the non-official message products (observatory messages) routinely issued by the TWCs and all official tsunami message products such as tsunami advisories, watches, and warnings. This software package currently allows scientists on duty at the PTWC to automatically retrieve the parameters contained in tsunami messages issued by WCATWC, JMA, or PTWC itself. Extending the capabilities of these classes would make it possible to generate XML- and CAP-compliant versions of the TWCs' message products until new messaging software natively adds these capabilities. Customers who receive the TWCs' tsunami message products could also use the package to automatically retrieve information from messages sent via any text-based communications circuit currently used by the TWCs to disseminate their tsunami message products.
21 CFR 801.50 - Labeling requirements for stand-alone software.
Code of Federal Regulations, 2014 CFR
2014-04-01
§ 801.50 Labeling requirements for stand-alone software. (a) Stand-alone software that is not distributed... in packaged form, stand-alone software regulated as a medical device must provide its unique device...
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.
McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S
2015-10-20
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
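A few lines are enough to exercise the capabilities listed above. The sketch below uses the MDTraj calls md.load, md.rmsd and md.compute_dssp, which are part of the package's documented API; the file names are placeholders.

```python
import mdtraj as md

# Load a trajectory together with its topology (file names are placeholders).
traj = md.load("trajectory.xtc", top="topology.pdb")

# Minimal RMSD of every frame to the first frame.
rmsd = md.rmsd(traj, traj, frame=0)

# Secondary-structure assignment (DSSP) for each residue in each frame.
dssp = md.compute_dssp(traj)

print(traj)                       # summary of frames and atoms
print(rmsd[:5], dssp.shape)       # first few RMSD values, (n_frames, n_residues)
```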
Community-driven computational biology with Debian Linux.
Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles
2010-12-21
The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories
McGibbon, Robert T.; Beauchamp, Kyle A.; Harrigan, Matthew P.; Klein, Christoph; Swails, Jason M.; Hernández, Carlos X.; Schwantes, Christian R.; Wang, Lee-Ping; Lane, Thomas J.; Pande, Vijay S.
2015-01-01
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. PMID:26488642
Tanpitukpongse, T P; Mazurowski, M A; Ikhena, J; Petrella, J R
2017-03-01
Alzheimer disease is a prevalent neurodegenerative disease. Computer assessment of brain atrophy patterns can help predict conversion to Alzheimer disease. Our aim was to assess the prognostic efficacy of individual-versus-combined regional volumetrics in 2 commercially available brain volumetric software packages for predicting conversion of patients with mild cognitive impairment to Alzheimer disease. Data were obtained through the Alzheimer's Disease Neuroimaging Initiative. One hundred ninety-two subjects (mean age, 74.8 years; 39% female) diagnosed with mild cognitive impairment at baseline were studied. All had T1-weighted MR imaging sequences at baseline and 3-year clinical follow-up. Analysis was performed with NeuroQuant and Neuroreader. Receiver operating characteristic curves assessing the prognostic efficacy of each software package were generated by using a univariable approach using individual regional brain volumes and 2 multivariable approaches (multiple regression and random forest), combining multiple volumes. On univariable analysis of 11 NeuroQuant and 11 Neuroreader regional volumes, hippocampal volume had the highest area under the curve for both software packages (0.69, NeuroQuant; 0.68, Neuroreader) and was not significantly different ( P > .05) between packages. Multivariable analysis did not increase the area under the curve for either package (0.63, logistic regression; 0.60, random forest NeuroQuant; 0.65, logistic regression; 0.62, random forest Neuroreader). Of the multiple regional volume measures available in FDA-cleared brain volumetric software packages, hippocampal volume remains the best single predictor of conversion of mild cognitive impairment to Alzheimer disease at 3-year follow-up. Combining volumetrics did not add additional prognostic efficacy. Therefore, future prognostic studies in mild cognitive impairment, combining such tools with demographic and other biomarker measures, are justified in using hippocampal volume as the only volumetric biomarker. © 2017 by American Journal of Neuroradiology.
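As a rough illustration of the univariable versus multivariable comparison reported above, the sketch below computes an ROC AUC for a single regional volume and for a cross-validated logistic regression combining several volumes. The data are synthetic stand-ins, not ADNI or vendor outputs, and the variable names are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 192  # same cohort size as the study, but purely synthetic data

# Synthetic regional volumes (columns) and conversion labels.
volumes = rng.normal(size=(n, 11))
converted = (volumes[:, 0] * 0.8 + rng.normal(size=n) > 0).astype(int)

# Univariable: a single region (column 0 stands in for hippocampal volume).
auc_single = roc_auc_score(converted, volumes[:, 0])

# Multivariable: logistic regression over all regions, cross-validated.
probs = cross_val_predict(LogisticRegression(max_iter=1000), volumes,
                          converted, cv=5, method="predict_proba")[:, 1]
auc_multi = roc_auc_score(converted, probs)

print(f"single-region AUC: {auc_single:.2f}, combined AUC: {auc_multi:.2f}")
```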
DEVELOPMENT OF A PORTABLE SOFTWARE LANGUAGE FOR PHYSIOLOGICALLY-BASED PHARMACOKINETIC (PBPK) MODELS
The PBPK modeling community has had a long-standing problem with modeling software compatibility. The numerous software packages used for PBPK models are, at best, minimally compatible. This creates problems ranging from model obsolescence due to software support discontinuation...
1988-03-01
PACKAGE BODY) TLCSC P661 (CATALOG #P106-0) This package contains the CAMP parts required to do the waypoint steering portion of navigation. The... 3.3.4.1.6 PROCESSING The following describes the processing performed by this part: package body WaypointSteering is package body ...Steering_Vector_Operations is separate; package body Steering_Vector_Operations_with_Arcsin is separate; procedure Compute_Turn_Angle_and_Direction (UnitNormal C
Rastgou, Fereydoon; Shojaeifard, Maryam; Amin, Ahmad; Ghaedian, Tahereh; Firoozabadi, Hasan; Malek, Hadi; Yaghoobi, Nahid; Bitarafan-Rajabi, Ahmad; Haghjoo, Majid; Amouzadeh, Hedieh; Barati, Hossein
2014-12-01
Recently, the phase analysis of gated single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) has become feasible via several software packages for the evaluation of left ventricular mechanical dyssynchrony. We compared two quantitative software packages, quantitative gated SPECT (QGS) and Emory cardiac toolbox (ECTb), with tissue Doppler imaging (TDI) as the conventional method for the evaluation of left ventricular mechanical dyssynchrony. Thirty-one patients with severe heart failure (ejection fraction ≤35%) and regular heart rhythm, who referred for gated-SPECT MPI, were enrolled. TDI was performed within 3 days after MPI. Dyssynchrony parameters derived from gated-SPECT MPI were analyzed by QGS and ECTb and were compared with the Yu index and septal-lateral wall delay measured by TDI. QGS and ECTb showed a good correlation for assessment of phase histogram bandwidth (PHB) and phase standard deviation (PSD) (r = 0.664 and r = 0.731, P < .001, respectively). However, the mean value of PHB and PSD by ECTb was significantly higher than that of QGS. No significant correlation was found between ECTb and QGS and the Yu index. Nevertheless, PHB, PSD, and entropy derived from QGS revealed a significant (r = 0.424, r = 0.478, r = 0.543, respectively; P < .02) correlation with septal-lateral wall delay. Despite a good correlation between QGS and ECTb software packages, different normal cut-off values of PSD and PHB should be defined for each software package. There was only a modest correlation between phase analysis of gated-SPECT MPI and TDI data, especially in the population of heart failure patients with both narrow and wide QRS complex.
An image-processing software package: UU and Fig for optical metrology applications
NASA Astrophysics Data System (ADS)
Chen, Lujie
2013-06-01
Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], Fourier transform [2], digital image correlation [3], and camera calibration [4], in which image processing is a critical and indispensable component. While it is not difficult to obtain a wide variety of image-processing programs from the internet, few cater to the relatively specialized area of optical metrology. This paper introduces an image-processing software package, UU (data processing) and Fig (data rendering), that incorporates many useful functions for processing optical metrological data. The cross-platform programs UU and Fig are developed based on wxWidgets. At the time of writing, the package has been tested on Windows, Linux and Mac OS. The user interface is designed to offer precise control of the underlying processing procedures in a scientific manner. The data input/output mechanism is designed to accommodate diverse file formats and to facilitate interaction with other independent programs. In terms of robustness, although the software was initially developed for personal use, it is comparable in stability and accuracy to most commercial software of a similar nature. In addition to functions for optical metrology, the software package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, data fitting, 3D image processing, vector image processing, precision device control (rotary stages, PZT stages, etc.), point cloud to surface reconstruction, volume rendering, and batch processing. The software package is currently used in a number of universities for teaching and research.
MOSAIC: Software for creating mosaics from collections of images
NASA Technical Reports Server (NTRS)
Varosi, F.; Gezari, D. Y.
1992-01-01
We have developed a powerful, versatile image processing and analysis software package called MOSAIC, designed specifically for the manipulation of digital astronomical image data obtained with (but not limited to) two-dimensional array detectors. The software package is implemented using the Interactive Data Language (IDL), and incorporates new methods for processing, calibration, analysis, and visualization of astronomical image data, stressing effective methods for the creation of mosaic images from collections of individual exposures, while at the same time preserving the photometric integrity of the original data. Since IDL is available on many computers, the MOSAIC software runs on most UNIX and VAX workstations with the X-Windows or Sun View graphics interface.
R-Based Software for the Integration of Pathway Data into Bioinformatic Algorithms
Kramer, Frank; Bayerlová, Michaela; Beißbarth, Tim
2014-01-01
Putting new findings into the context of available literature knowledge is one approach to deal with the surge of high-throughput data results. Furthermore, prior knowledge can increase the performance and stability of bioinformatic algorithms, for example, methods for network reconstruction. In this review, we examine software packages for the statistical computing framework R, which enable the integration of pathway data for further bioinformatic analyses. Different approaches to integrate and visualize pathway data are identified and packages are stratified concerning their features according to a number of different aspects: data import strategies, the extent of available data, dependencies on external tools, integration with further analysis steps and visualization options are considered. A total of 12 packages integrating pathway data are reviewed in this manuscript. These are supplemented by five R-specific packages for visualization and six connector packages, which provide access to external tools. PMID:24833336
FTOOLS: A FITS Data Processing and Analysis Software Package
NASA Astrophysics Data System (ADS)
Blackburn, J. K.
FTOOLS, a highly modular collection of over 110 utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Science Archive Research Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a Boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and display of spectra or light curves. The collection provides both generic processing and analysis utilities and utilities specific to the high energy astrophysics data sets used for the ASCA, ROSAT, GRO, and XTE missions. A core set of FTOOLS providing support for generic FITS data processing, FITS image analysis and timing analysis can easily be split out of the full software package for users not needing the high energy astrophysics mission utilities. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self documenting through the IRAF help facility and a stand alone help task. Software is written in ANSI C and Fortran to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
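Operations of the kind FTOOLS performs, such as listing file contents and extracting columns or row subsets from a FITS table, can be sketched in Python with astropy, which is used here only as a stand-in and is not part of FTOOLS; the file and column names are placeholders.

```python
from astropy.io import fits

# Open a FITS file and show its HDU structure (cf. an FTOOLS listing task).
with fits.open("events.fits") as hdul:   # placeholder file name
    hdul.info()

    table = hdul[1].data                 # first binary-table extension
    energies = table["ENERGY"]           # extract one column (placeholder column name)

    # Select a subset of rows with a Boolean expression, as FTOOLS tasks do.
    bright = table[energies > 2.0]
    print(len(table), "rows in total,", len(bright), "selected")
```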
ERIC Educational Resources Information Center
Risley, John, Ed.
1988-01-01
Compares the features of the sonic rangers available from HRM Software, MICROMEASUREMENTS, NAGAWTIS Software Research, and PASCO Scientific for demonstrations and experiments in mechanics. Presents the advantages of the sonic rangers and the typical graphics displayed by each software package. (YP)
Mesoscale and severe storms (Mass) data management and analysis system
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.; Dickerson, M.
1984-01-01
Progress on the Mesoscale and Severe Storms (MASS) data management and analysis system is described. An interactive atmospheric data base management software package to convert four types of data (Sounding, Single Level, Grid, Image) into standard random access formats is implemented and integrated with the MASS AVE80 Series general purpose plotting and graphics display data analysis software package. An interactive analysis and display graphics software package (AVE80) to analyze large volumes of conventional and satellite derived meteorological data is enhanced to provide imaging/color graphics display utilizing color video hardware integrated into the MASS computer system. Local and remote smart-terminal capability is provided by installing APPLE III computer systems within individual scientist offices and integrated with the MASS system, thus providing color video display, graphics, and characters display of the four data types.
Reference Gene Validation for RT-qPCR, a Note on Different Available Software Packages
De Spiegelaere, Ward; Dern-Wieloch, Jutta; Weigel, Roswitha; Schumacher, Valérie; Schorle, Hubert; Nettersheim, Daniel; Bergmann, Martin; Brehm, Ralph; Kliesch, Sabine; Vandekerckhove, Linos; Fink, Cornelia
2015-01-01
Background An appropriate normalization strategy is crucial for data analysis from real time reverse transcription polymerase chain reactions (RT-qPCR). It is widely supported to identify and validate stable reference genes, since no single biological gene is stably expressed between cell types or within cells under different conditions. Different algorithms exist to validate optimal reference genes for normalization. Applying human cells, we here compare the three main methods to the online available RefFinder tool that integrates these algorithms along with R-based software packages which include the NormFinder and GeNorm algorithms. Results 14 candidate reference genes were assessed by RT-qPCR in two sample sets, i.e. a set of samples of human testicular tissue containing carcinoma in situ (CIS), and a set of samples from the human adult Sertoli cell line (FS1) either cultured alone or in co-culture with the seminoma like cell line (TCam-2) or with equine bone marrow derived mesenchymal stem cells (eBM-MSC). Expression stabilities of the reference genes were evaluated using geNorm, NormFinder, and BestKeeper. Similar results were obtained by the three approaches for the most and least stably expressed genes. The R-based packages NormqPCR, SLqPCR and the NormFinder for R script gave identical gene rankings. Interestingly, different outputs were obtained between the original software packages and the RefFinder tool, which is based on raw Cq values for input. When the raw data were reanalysed assuming 100% efficiency for all genes, then the outputs of the original software packages were similar to the RefFinder software, indicating that RefFinder outputs may be biased because PCR efficiencies are not taken into account. Conclusions This report shows that assay efficiency is an important parameter for reference gene validation. New software tools that incorporate these algorithms should be carefully validated prior to use. PMID:25825906
Reference gene validation for RT-qPCR, a note on different available software packages.
De Spiegelaere, Ward; Dern-Wieloch, Jutta; Weigel, Roswitha; Schumacher, Valérie; Schorle, Hubert; Nettersheim, Daniel; Bergmann, Martin; Brehm, Ralph; Kliesch, Sabine; Vandekerckhove, Linos; Fink, Cornelia
2015-01-01
An appropriate normalization strategy is crucial for data analysis from real time reverse transcription polymerase chain reactions (RT-qPCR). It is widely supported to identify and validate stable reference genes, since no single biological gene is stably expressed between cell types or within cells under different conditions. Different algorithms exist to validate optimal reference genes for normalization. Applying human cells, we here compare the three main methods to the online available RefFinder tool that integrates these algorithms along with R-based software packages which include the NormFinder and GeNorm algorithms. 14 candidate reference genes were assessed by RT-qPCR in two sample sets, i.e. a set of samples of human testicular tissue containing carcinoma in situ (CIS), and a set of samples from the human adult Sertoli cell line (FS1) either cultured alone or in co-culture with the seminoma like cell line (TCam-2) or with equine bone marrow derived mesenchymal stem cells (eBM-MSC). Expression stabilities of the reference genes were evaluated using geNorm, NormFinder, and BestKeeper. Similar results were obtained by the three approaches for the most and least stably expressed genes. The R-based packages NormqPCR, SLqPCR and the NormFinder for R script gave identical gene rankings. Interestingly, different outputs were obtained between the original software packages and the RefFinder tool, which is based on raw Cq values for input. When the raw data were reanalysed assuming 100% efficiency for all genes, then the outputs of the original software packages were similar to the RefFinder software, indicating that RefFinder outputs may be biased because PCR efficiencies are not taken into account. This report shows that assay efficiency is an important parameter for reference gene validation. New software tools that incorporate these algorithms should be carefully validated prior to use.
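The efficiency point made above can be made concrete with a small sketch: relative quantities computed from Cq values change noticeably when the amplification efficiency is assumed to be 100% instead of a measured value. The numbers below are illustrative only and are not taken from the study.

```python
def relative_quantity(cq, cq_min, efficiency):
    """Relative quantity from a Cq value, given the PCR efficiency.

    efficiency is a fraction (1.0 = 100%), so the amplification
    factor per cycle is (1 + efficiency).
    """
    return (1.0 + efficiency) ** (cq_min - cq)

cq_values = [22.0, 24.5, 27.0]          # illustrative Cq values for one gene
cq_min = min(cq_values)

for eff in (1.00, 0.85):                # assumed 100% vs. measured 85% efficiency
    quantities = [relative_quantity(cq, cq_min, eff) for cq in cq_values]
    print(f"efficiency {eff:.0%}:", [round(q, 3) for q in quantities])
```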
Attributes and Behaviors of Performance-Centered Systems.
ERIC Educational Resources Information Center
Gery, Gloria
1995-01-01
Examines attributes, characteristics, and behaviors of performance-centered software packages that are emerging in the consumer software marketplace and compares them with large-scale systems software being designed by internal information systems staffs and vendors of large-scale software designed for financial, manufacturing, processing, and…
WGCNA: an R package for weighted correlation network analysis.
Langfelder, Peter; Horvath, Steve
2008-12-29
Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.
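The core construction, a weighted adjacency obtained by soft-thresholding the correlation matrix followed by module detection, can be sketched outside R. The snippet below is a conceptual Python analogue on random data, not a reimplementation of the WGCNA package, and it substitutes plain hierarchical clustering for WGCNA's topological-overlap dissimilarity and dynamic tree cutting.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
expr = rng.normal(size=(50, 200))        # 50 samples x 200 genes (synthetic)

# Unsigned weighted adjacency: |Pearson correlation| raised to a soft-threshold power.
beta = 6
corr = np.corrcoef(expr, rowvar=False)
adjacency = np.abs(corr) ** beta

# Dissimilarity plus average-linkage clustering stands in for module detection.
dissimilarity = 1.0 - adjacency
np.fill_diagonal(dissimilarity, 0.0)
tree = linkage(squareform(dissimilarity, checks=False), method="average")
modules = fcluster(tree, t=4, criterion="maxclust")
print("genes per module:", np.bincount(modules)[1:])
```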
WGCNA: an R package for weighted correlation network analysis
Langfelder, Peter; Horvath, Steve
2008-01-01
Background Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. Results The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. Conclusion The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at . PMID:19114008
Yavorska, Olena O; Burgess, Stephen
2017-12-01
MendelianRandomization is a software package for the R open-source software environment that performs Mendelian randomization analyses using summarized data. The core functionality is to implement the inverse-variance weighted, MR-Egger and weighted median methods for multiple genetic variants. Several options are available to the user, such as the use of robust regression, fixed- or random-effects models and the penalization of weights for genetic variants with heterogeneous causal estimates. Extensions to these methods, such as allowing for variants to be correlated, can be chosen if appropriate. Graphical commands allow summarized data to be displayed in an interactive graph, or the plotting of causal estimates from multiple methods, for comparison. Although the main method of data entry is directly by the user, there is also an option for allowing summarized data to be incorporated from the PhenoScanner database of genotype-phenotype associations. We hope to develop this feature in future versions of the package. The R software environment is available for download from [https://www.r-project.org/]. The MendelianRandomization package can be downloaded from the Comprehensive R Archive Network (CRAN) within R, or directly from [https://cran.r-project.org/web/packages/MendelianRandomization/]. Both R and the MendelianRandomization package are released under GNU General Public Licenses (GPL-2|GPL-3). © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
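For orientation, the inverse-variance weighted estimate mentioned above reduces, in its fixed-effect form, to a weighted ratio of the variant-outcome and variant-exposure associations. The sketch below implements that standard formula in Python with made-up summary statistics; it is not the package's own code and does not use its interface.

```python
import numpy as np

# Made-up summarized data for a handful of genetic variants.
beta_exposure = np.array([0.12, 0.08, 0.15, 0.05, 0.10])      # variant-exposure betas
beta_outcome = np.array([0.030, 0.018, 0.040, 0.010, 0.027])  # variant-outcome betas
se_outcome = np.array([0.010, 0.012, 0.011, 0.009, 0.010])    # SEs of outcome betas

# Fixed-effect inverse-variance weighted (IVW) causal estimate.
weights = beta_exposure**2 / se_outcome**2
theta_ivw = np.sum(beta_outcome * beta_exposure / se_outcome**2) / np.sum(weights)
se_ivw = np.sqrt(1.0 / np.sum(weights))

print(f"IVW estimate: {theta_ivw:.3f} +/- {se_ivw:.3f}")
```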
DOE Office of Scientific and Technical Information (OSTI.GOV)
MANN, F.M.
Data package supporting the 2001 Immobilized Low-Activity Waste Performance Analysis. Geology, hydrology, geochemistry, facility, waste form, and dosimetry data based on recent investigation are provided. Verification and benchmarking packages for selected software codes are provided.
An Integrated Research Program for the Modeling, Analysis and Control of Aerospace Systems
1992-03-03
Fabiano, Jr. - Brown University; Mitchell Feigenbaum - Rockefeller University; Elena Fernandez - Instituto de Desarrollo Tecnológico para la Industria... system. The system runs under DEC Ultrix; we have installed the GKS graphics system and language compilers (FORTRAN and C). The DELIGHT.MIMO software, which links a sophisticated non-smooth optimization package to some linear system software, is on the system. The package was kindly furnished by
An Integrated Research Program for the Modeling, Analysis and Control of Aerospace Systems
1992-03-03
Mitchell Feigenbaum - Rockefeller University; Elena Fernandez - Instituto de Desarrollo Tecnológico para la Industria Química; Wilfred M. Greenlee... Ultrix; we have installed the GKS graphics system and language compilers (FORTRAN and C). The DELIGHT.MIMO software, which links a sophisticated non-smooth optimization package to some linear system software, is on the system. The package was kindly furnished by Professor E. Polak, Electrical and
Advanced Simulation in Undergraduate Pilot Training: Automatic Instructional System
1975-10-01
an addressable reel-to-reel audio tape recorder, a random access audio memory drum, and an interactive software package which permits the user to... audio memory drum, and an interactive software package which permits the user to develop preprogrammed exercises. Figure 2 illustrates overall... Data Recording System consists of two elements: an overlay program which performs the real-time sampling of specified variables and stores data to disc
Evaluation of Agricultural Accounting Software. Improved Decision Making. Third Edition.
ERIC Educational Resources Information Center
Lovell, Ashley C., Comp.
Following a discussion of the evaluation criteria for choosing accounting software, this guide contains reviews of 27 accounting software programs that could be used by farm or ranch business managers. The information in the reviews was provided by the software vendors and covers the following points for each software package: general features,…
Sustaining Software-Intensive Systems
2006-05-01
2.2 Multi-Service Operational Test and Evaluation; 2.3 Stable Software Baseline... or equivalent document • completed Multi-Service Operational Test and Evaluation (MOT&E) for the potential production software package (or OT&E if... not multi-service) • stable software production baseline • complete and current software documentation • Authority to Operate (ATO) for an
Nuclear Data Online Services at Peking University
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, T.S.; Guo, Z.Y.; Ye, W.G.
2005-05-24
The Institute of Heavy Ion Physics at Peking University has developed a new nuclear data online services software package. Through the web site (http://ndos.nst.pku.edu.cn), it offers online access to main relational nuclear databases: five evaluated neutron libraries (BROND, CENDL, ENDF, JEF, JENDL), the ENSDF library, the EXFOR library, the IAEA photonuclear library and the charged particle data of the FENDL library. This software allows the comparison and graphic representations of the different data sets. The computer programs of this package are based on the Linux implementation of PHP and the MySQL software.
Nuclear Data Online Services at Peking University
NASA Astrophysics Data System (ADS)
Fan, T. S.; Guo, Z. Y.; Ye, W. G.; Liu, W. L.; Liu, T. J.; Liu, C. X.; Chen, J. X.; Tang, G. Y.; Shi, Z. M.; Huang, X. L.; Chen, J. E.
2005-05-01
The Institute of Heavy Ion Physics at Peking University has developed a new nuclear data online services software package. Through the web site (http://ndos.nst.pku.edu.cn), it offers online access to main relational nuclear databases: five evaluated neutron libraries (BROND, CENDL, ENDF, JEF, JENDL), the ENSDF library, the EXFOR library, the IAEA photonuclear library and the charged particle data of the FENDL library. This software allows the comparison and graphic representations of the different data sets. The computer programs of this package are based on the Linux implementation of PHP and the MySQL software.
ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package
NASA Astrophysics Data System (ADS)
Jaggi, S.
1993-02-01
The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics and Space Administration (NASA). To perform system design trade-offs and analysis and to establish system parameters, ASDL has developed a software package for the analytical simulation of sensor systems. This package, called Analytical Tools for Thermal InfraRed Engineering (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as the Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), and Noise Equivalent Temperature Difference (NETD). This paper describes the uses of the package and the physics used to derive the performance parameters.
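The relations among the figures of merit mentioned above can be illustrated with a toy calculation: for a band-integrated blackbody radiance L(T), SNR is roughly L/NER and NETD is roughly NER/(dL/dT). The noise-equivalent radiance below is an assumed illustrative value, and the sketch is unrelated to ATTIRE's internals.

```python
import numpy as np

# Physical constants (SI units).
h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def band_radiance(temp, lam_lo=8e-6, lam_hi=12e-6, n=2000):
    """Band-integrated blackbody radiance [W m^-2 sr^-1] between lam_lo and lam_hi."""
    lam = np.linspace(lam_lo, lam_hi, n)
    spectral = 2 * h * c**2 / lam**5 / np.expm1(h * c / (lam * k * temp))
    return spectral.sum() * (lam[1] - lam[0])   # simple rectangle-rule integration

T_scene = 300.0     # scene temperature [K]
NER = 0.05          # assumed noise-equivalent radiance [W m^-2 sr^-1] (illustrative)

L = band_radiance(T_scene)
dT = 0.01
dL_dT = (band_radiance(T_scene + dT) - band_radiance(T_scene - dT)) / (2 * dT)

snr = L / NER                   # signal-to-noise ratio for this scene radiance
netd = NER / dL_dT              # noise-equivalent temperature difference [K]
print(f"band radiance {L:.1f} W/m^2/sr, SNR ~ {snr:.0f}, NETD ~ {netd*1000:.0f} mK")
```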
Carter, Faustin Wirkus; Khaire, Trupti S.; Novosad, Valentyn; ...
2016-11-07
We present "scraps" (SuperConducting Analysis and Plotting Software), a Python package designed to aid in the analysis and visualization of large amounts of superconducting resonator data, specifically complex transmission as a function of frequency, acquired at many different temperatures and driving powers. The package includes a least-squares fitting engine as well as a Monte-Carlo Markov Chain sampler for sampling the posterior distribution given priors, marginalizing over nuisance parameters, and estimating covariances. A set of plotting tools for generating publication-quality figures is also provided in the package. Lastly, we discuss the functionality of the software and provide some examples of itsmore » utility on data collected from a niobium-nitride coplanar waveguide resonator fabricated at Argonne National Laboratory.« less
tsiR: An R package for time-series Susceptible-Infected-Recovered models of epidemics.
Becker, Alexander D; Grenfell, Bryan T
2017-01-01
tsiR is an open source software package implemented in the R programming language designed to analyze infectious disease time-series data. The software extends a well-studied and widely-applied algorithm, the time-series Susceptible-Infected-Recovered (TSIR) model, to infer parameters from incidence data, such as contact seasonality, and to forward simulate the underlying mechanistic model. The tsiR package aggregates a number of different fitting features previously described in the literature in a user-friendly way, providing support for their broader adoption in infectious disease research. Also included in tsiR are a number of diagnostic tools to assess the fit of the TSIR model. This package should be useful for researchers analyzing incidence data for fully-immunizing infectious diseases.
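For context, the TSIR recursion that the package fits and simulates is compact enough to write out directly. The sketch below forward-simulates it with arbitrary illustrative parameters (seasonal transmission beta_t, mixing exponent alpha, births B_t) and is not the tsiR code itself.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative parameters (not fitted values).
n_steps = 26 * 20                 # biweekly steps over 20 years
pop = 3_300_000
births = np.full(n_steps, 2400)   # births per biweek
alpha = 0.97                      # heterogeneous-mixing exponent
beta = 30 * (1 + 0.25 * np.cos(2 * np.pi * np.arange(n_steps) / 26))  # seasonal contact rate

S = np.empty(n_steps); I = np.empty(n_steps)
S[0], I[0] = 0.06 * pop, 200

# TSIR recursion: E[I_{t+1}] = beta_t * S_t * I_t**alpha / N,  S_{t+1} = S_t + B_t - I_{t+1}
for t in range(n_steps - 1):
    expected = beta[t] * S[t] * I[t] ** alpha / pop
    I[t + 1] = rng.poisson(max(expected, 0.0))
    S[t + 1] = max(S[t] + births[t] - I[t + 1], 0.0)

print("mean biweekly incidence:", round(float(I.mean()), 1))
```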
Aspects on Transfer of Aided - Design Files
NASA Astrophysics Data System (ADS)
Goanta, A. M.; Anghelache, D. G.
2016-08-01
At this stage of development of hardware and software, each company that makes design software packages has a certain type of file, created and customized over time to distinguish that company from its competitors. Thus, the DWG files belonging to AutoCAD, IPT/IAM belonging to Inventor, PAR/ASM to Solid Edge, PRT to NX, and so on are widely known today. Behind every type of file there is a mathematical model which is common to several types of files. A specific aspect of computer-aided design is that all software packages work with both individual parts and assemblies, but their approaches differ in that some use the same type of file for each part and for the assembly (PRT), while others use different types of files (IPT/IAM, PAR/ASM, etc.). Another aspect of computer-aided design is the transfer of files between companies that use different software packages, or even the same software package in different versions. Each of these situations generates distinct issues. Thus, to allow a file to be read, at least partially, by software other than the one in which it was created, transfer files of the STEP and IGES type are used.
Shi, Xu; Barnes, Robert O; Chen, Li; Shajahan-Haq, Ayesha N; Hilakivi-Clarke, Leena; Clarke, Robert; Wang, Yue; Xuan, Jianhua
2015-07-15
Identification of protein interaction subnetworks is an important step to help us understand complex molecular mechanisms in cancer. In this paper, we develop a BMRF-Net package, implemented in Java and C++, to identify protein interaction subnetworks based on a bagging Markov random field (BMRF) framework. By integrating gene expression data and protein-protein interaction data, this software tool can be used to identify biologically meaningful subnetworks. A user friendly graphic user interface is developed as a Cytoscape plugin for the BMRF-Net software to deal with the input/output interface. The detailed structure of the identified networks can be visualized in Cytoscape conveniently. The BMRF-Net package has been applied to breast cancer data to identify significant subnetworks related to breast cancer recurrence. The BMRF-Net package is available at http://sourceforge.net/projects/bmrfcjava/. The package is tested under Ubuntu 12.04 (64-bit), Java 7, glibc 2.15 and Cytoscape 3.1.0. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Tubiana, Luca; Polles, Guido; Orlandini, Enzo; Micheletti, Cristian
2018-06-07
The KymoKnot software package and web server identifies and locates physical knots or proper knots in a series of polymer conformations. It is mainly intended as an analysis tool for trajectories of linear or circular polymers, but it can be used on single instances too, e.g. protein structures in PDB format. A key element of the software package is the so-called minimally interfering chain closure algorithm that is used to detect physical knots in open chains and to locate the knotted region in both open and closed chains. The web server offers a user-friendly graphical interface that identifies the knot type and highlights the knotted region on each frame of the trajectory, which the user can visualize interactively from various viewpoints. The dynamical evolution of the knotted region along the chain contour is presented as a kymograph. All data can be downloaded in text format. The KymoKnot package is licensed under the BSD 3-Clause licence. The server is publicly available at http://kymoknot.sissa.it/kymoknot/interactive.php .
Development and Use of an Open-Source, User-Friendly Package to Simulate Voltammetry Experiments
ERIC Educational Resources Information Center
Wang, Shuo; Wang, Jing; Gao, Yanjing
2017-01-01
An open-source electrochemistry simulation package has been developed that simulates the electrode processes of four reaction mechanisms and two typical electroanalysis techniques: cyclic voltammetry and chronoamperometry. Unlike other open-source simulation software, this package balances the features with ease of learning and implementation and…
Increasing Accessibility by Pooling Digital Resources
ERIC Educational Resources Information Center
Cushion, Steve
2004-01-01
There are now many CALL authoring packages that can create interactive websites and a large number of language teachers are writing materials for the whole range of such packages. Currently, each product stores its data in different formats thus hindering interoperability, pooling of digital resources and moving between software packages based in…
Multiple-Group Analysis Using the sem Package in the R System
ERIC Educational Resources Information Center
Evermann, Joerg
2010-01-01
Multiple-group analysis in covariance-based structural equation modeling (SEM) is an important technique to ensure the invariance of latent construct measurements and the validity of theoretical models across different subpopulations. However, not all SEM software packages provide multiple-group analysis capabilities. The sem package for the R…
LANZ: Software solving the large sparse symmetric generalized eigenproblem
NASA Technical Reports Server (NTRS)
Jones, Mark T.; Patrick, Merrell L.
1990-01-01
A package, LANZ, for solving the large symmetric generalized eigenproblem is described. The package was tested on four different architectures: Convex 200, CRAY Y-MP, Sun-3, and Sun-4. The package uses Lanczos' method and is based on recent research into solving the generalized eigenproblem.
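In modern Python, the same class of problem, a large sparse symmetric generalized eigenproblem K x = lambda M x, is typically handled with a Lanczos/Arnoldi routine such as scipy.sparse.linalg.eigsh. The sketch below is a generic example on a synthetic matrix pair, not a port of LANZ.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n = 2000
# A sparse symmetric positive-definite "stiffness" matrix K (1D Laplacian)...
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
K = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")
# ...and a diagonal positive-definite "mass" matrix M.
M = sp.diags(1.0 + 0.5 * np.linspace(0.0, 1.0, n), format="csc")

# Five smallest eigenpairs of K x = lambda M x via shift-invert Lanczos iterations.
vals, vecs = eigsh(K, k=5, M=M, sigma=0.0, which="LM")
print("five smallest generalized eigenvalues:", np.round(np.sort(vals), 6))
```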
Quantification of phase retardation in corneal tissues using a femtosecond laser
NASA Astrophysics Data System (ADS)
Calhoun, William R.; Beylin, Alexander; Weiblinger, Richard; Ilev, Ilko
2013-03-01
The use of femtosecond lasers (FSL) in ophthalmic procedures, such as LASIK, lens replacement (cataract surgery), as well as several other treatments, is growing rapidly. The treatment effect is based on photo ablation of ocular tissues by a series of ultra-short laser pulses. However, the laser beam characteristics change dynamically due to interactions with birefringent corneal tissue, which may affect the outcome of the laser treatment. To better understand the effect the cornea has on the laser characteristics, we developed a system for measuring retardation and validated it with precise, standard phase retarders. Then we measured the phase retardation of FSLs through bovine corneas and found that there is a considerable, location dependent, variation in retardation values. This information can potentially help optimize FSL parameters to make their application in ophthalmic procedures safer and more effective.
7α-Hydroxycholesterol Elicits TLR6-Mediated Expression of IL-23 in Monocytic Cells.
Seo, Hyun Chul; Kim, Sun-Mi; Eo, Seong-Kug; Rhim, Byung-Yong; Kim, Koanhoi
2015-01-01
We investigated the question of whether 7-oxygenated cholesterol derivatives could affect inflammatory and/or immune responses in atherosclerosis by examining their effects on expression of IL-23 in monocytic cells. 7α-Hydroxycholesterol (7αOHChol) induced transcription of the TLR6 gene and elevated the level of cell surface TLR6 protein in THP-1 monocytic cells. Addition of an agonist of TLR6, FSL-1, to TLR6-expressing cells by treatment with 7αOHChol resulted in enhanced production of IL-23 and transcription of genes encoding the IL-23 subunit α (p19) and the IL-12 subunit β (p40). However, treatment with 7-ketocholesterol (7K) and 7β-hydroxycholesterol (7βOHChol) did not affect TLR6 expression, and addition of FSL-1 to cells treated with either 7K or 7βOHChol did not influence transcription of the genes. Pharmacological inhibition of ERK, Akt, or PI3K resulted in attenuated transcription of TLR6 induced by 7αOHChol as well as secretion of IL-23 enhanced by 7αOHChol plus FSL-1. Inhibition of p38 MAPK or JNK resulted in attenuated secretion of IL-23. These results indicate that a certain type of 7-oxygenated cholesterol like 7αOHChol can elicit TLR6-mediated expression of IL-23 by monocytic cells via PI3K/Akt and MAPKs pathways.
7α-Hydroxycholesterol Elicits TLR6-Mediated Expression of IL-23 in Monocytic Cells
Seo, Hyun Chul; Kim, Sun-Mi; Eo, Seong-Kug; Rhim, Byung-Yong; Kim, Koanhoi
2015-01-01
We investigated the question of whether 7-oxygenated cholesterol derivatives could affect inflammatory and/or immune responses in atherosclerosis by examining their effects on expression of IL-23 in monocytic cells. 7α-Hydroxycholesterol (7αOHChol) induced transcription of the TLR6 gene and elevated the level of cell surface TLR6 protein in THP-1 monocytic cells. Addition of an agonist of TLR6, FSL-1, to TLR6-expressing cells by treatment with 7αOHChol resulted in enhanced production of IL-23 and transcription of genes encoding the IL-23 subunit α (p19) and the IL-12 subunit β (p40). However, treatment with 7-ketocholesterol (7K) and 7β-hydroxycholesterol (7βOHChol) did not affect TLR6 expression, and addition of FSL-1 to cells treated with either 7K or 7βOHChol did not influence transcription of the genes. Pharmacological inhibition of ERK, Akt, or PI3K resulted in attenuated transcription of TLR6 induced by 7αOHChol as well as secretion of IL-23 enhanced by 7αOHChol plus FSL-1. Inhibition of p38 MAPK or JNK resulted in attenuated secretion of IL-23. These results indicate that a certain type of 7-oxygenated cholesterol like 7αOHChol can elicit TLR6-mediated expression of IL-23 by monocytic cells via PI3K/Akt and MAPKs pathways. PMID:25593648
Educational Software for Illustration of Drainage, Evapotranspiration, and Crop Yield.
ERIC Educational Resources Information Center
Khan, A. H.; And Others
1996-01-01
Describes a study that developed a software package for illustrating drainage, evapotranspiration, and crop yield as influenced by water conditions. The software is a tool for depicting water's influence on crop production in western Kansas. (DDR)
Modelling robotic systems with DADS
NASA Technical Reports Server (NTRS)
Churchill, L. W.; Sharf, I.
1993-01-01
With the appearance of general off-the-shelf software packages for the simulation of mechanical systems, modelling and simulation of mechanisms has become an easier task. The authors have recently used one such package, DADS, to model the dynamics of rigid- and flexible-link robotic manipulators. In this paper, we present an overview of our learning experiences with DADS, in the hope that it will shorten the learning process for others interested in this software.
NASA Technical Reports Server (NTRS)
1997-01-01
DARcorporation developed a General Aviation CAD package through a Small Business Innovation Research contract from Langley Research Center. This affordable, user-friendly preliminary design system for General Aviation aircraft runs on the popular 486 IBM-compatible personal computers. Individuals taking the home-built approach, small manufacturers of General Aviation airplanes, as well as students and others interested in the analysis and design of aircraft are possible users of the package. The software can cut design and development time in half.
A streamlined Python framework for AT-TPC data analysis
NASA Astrophysics Data System (ADS)
Taylor, J. Z.; Bradt, J.; Bazin, D.; Kuchera, M. P.
2017-09-01
User-friendly data analysis software has been developed for the Active-Target Time Projection Chamber (AT-TPC) experiment at the National Superconducting Cyclotron Laboratory at Michigan State University. The AT-TPC, commissioned in 2014, is a gas-filled detector that acts as both the detector and target for high-efficiency detection of low-intensity, exotic nuclear reactions. The pytpc framework is a Python package for analyzing AT-TPC data. The package was developed for the analysis of 46Ar(p, p) data. The existing software was used to analyze data produced by the 40Ar(p, p) experiment that ran in August, 2015. Usage of the package was documented in an analysis manual both to improve analysis steps and aid in the work of future AT-TPC users. Software features and analysis methods in the pytpc framework will be presented along with the 40Ar results.
NASA Astrophysics Data System (ADS)
Niu, Yingli; Li, Wenqiang; Peng, Qian; Geng, Hua; Yi, Yuanping; Wang, Linjun; Nan, Guangjun; Wang, Dong; Shuai, Zhigang
2018-04-01
MOlecular MAterials Property Prediction Package (MOMAP) is a software toolkit for molecular materials property prediction. It focuses on luminescent properties and charge mobility properties. This article contains a brief descriptive introduction to the key features, theoretical models and algorithms of the software, together with examples that illustrate its performance. First, we present the theoretical models and algorithms for calculating molecular luminescent properties, which include the excited-state radiative/non-radiative decay rate constants and the optical spectra. Then, a multi-scale simulation approach and its algorithm for molecular charge mobility are described. This approach is based on a hopping model combined with kinetic Monte Carlo and molecular dynamics simulations, and it is especially applicable to a large category of organic semiconductors whose inter-molecular electronic coupling is much smaller than the intra-molecular charge reorganisation energy.
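The hopping picture described above is often summarized by a Marcus-type transfer rate combined with a kinetic Monte Carlo walk. The fragment below sketches that combination in one dimension with invented parameters (coupling, reorganisation energy, lattice spacing); it is unrelated to MOMAP's own implementation.

```python
import numpy as np

# Marcus hopping rate between neighbouring sites (SI units; parameters invented).
hbar = 1.0546e-34; kB = 1.381e-23
T = 300.0
V = 0.010 * 1.602e-19          # electronic coupling, 10 meV
lam = 0.300 * 1.602e-19        # reorganisation energy, 300 meV

rate = (2 * np.pi / hbar) * V**2 / np.sqrt(4 * np.pi * lam * kB * T) \
       * np.exp(-lam / (4 * kB * T))       # zero driving force

# 1D kinetic Monte Carlo: hop left or right with equal rate, track displacement.
rng = np.random.default_rng(4)
a = 5e-10                                  # lattice spacing, 5 angstrom
n_hops, x, t = 100_000, 0.0, 0.0
for _ in range(n_hops):
    t += rng.exponential(1.0 / (2 * rate)) # waiting time before either hop fires
    x += a if rng.random() < 0.5 else -a

# Crude single-trajectory diffusion estimate; in practice many walkers are averaged.
D = x**2 / (2 * t)
mobility = 1.602e-19 * D / (kB * T)        # Einstein relation
print(f"hop rate ~ {rate:.2e} s^-1, mobility ~ {mobility*1e4:.2f} cm^2/Vs")
```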
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Jin, Shuangshuang; Chen, Yousu
This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACKTM high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.
KiT: a MATLAB package for kinetochore tracking.
Armond, Jonathan W; Vladimirou, Elina; McAinsh, Andrew D; Burroughs, Nigel J
2016-06-15
During mitosis, chromosomes are attached to the mitotic spindle via large protein complexes called kinetochores. The motion of kinetochores throughout mitosis is intricate and automated quantitative tracking of their motion has already revealed many surprising facets of their behaviour. Here, we present 'KiT' (Kinetochore Tracking)-an easy-to-use, open-source software package for tracking kinetochores from live-cell fluorescent movies. KiT supports 2D, 3D and multi-colour movies, quantification of fluorescence, integrated deconvolution, parallel execution and multiple algorithms for particle localization. KiT is free, open-source software implemented in MATLAB and runs on all MATLAB supported platforms. KiT can be downloaded as a package from http://www.mechanochemistry.org/mcainsh/software.php The source repository is available at https://bitbucket.org/jarmond/kit and under continuing development. Supplementary data are available at Bioinformatics online. jonathan.armond@warwick.ac.uk. © The Author 2016. Published by Oxford University Press.
MPTinR: analysis of multinomial processing tree models in R.
Singmann, Henrik; Kellen, David
2013-06-01
We introduce MPTinR, a software package developed for the analysis of multinomial processing tree (MPT) models. MPT models represent a prominent class of cognitive measurement models for categorical data with applications in a wide variety of fields. MPTinR is the first software for the analysis of MPT models in the statistical programming language R, providing a modeling framework that is more flexible than standalone software packages. MPTinR also introduces important features such as (1) the ability to calculate the Fisher information approximation measure of model complexity for MPT models, (2) the ability to fit models for categorical data outside the MPT model class, such as signal detection models, (3) a function for model selection across a set of nested and nonnested candidate models (using several model selection indices), and (4) multicore fitting. MPTinR is available from the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/MPTinR/ .
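To make the model class concrete, the sketch below fits a minimal one-high-threshold MPT model (parameters r for recognition and g for guessing) to invented old/new recognition counts by maximum likelihood. It only illustrates what an MPT likelihood looks like and does not use MPTinR's interface.

```python
import numpy as np
from scipy.optimize import minimize

# Invented counts: old items (hits, misses) and new items (false alarms, correct rejections).
old_counts = np.array([75, 25])
new_counts = np.array([20, 80])

def neg_log_lik(params):
    r, g = params
    # One-high-threshold model category probabilities.
    p_old = np.array([r + (1 - r) * g, (1 - r) * (1 - g)])   # hit, miss
    p_new = np.array([g, 1 - g])                             # false alarm, correct rejection
    eps = 1e-12
    return -(np.sum(old_counts * np.log(p_old + eps)) +
             np.sum(new_counts * np.log(p_new + eps)))

result = minimize(neg_log_lik, x0=[0.5, 0.5], bounds=[(0.001, 0.999)] * 2,
                  method="L-BFGS-B")
r_hat, g_hat = result.x
print(f"recognition r = {r_hat:.3f}, guessing g = {g_hat:.3f}")
```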
ERIC Educational Resources Information Center
Science Teacher, 1988
1988-01-01
Reviews four software packages available for IBM PC or Apple II. Includes "Graphical Analysis III"; "Space Max: Space Station Construction Simulation"; "Guesstimation"; and "Genetic Engineering Toolbox." Focuses on each packages' strengths in a high school context. (CW)
Volumetric neuroimage analysis extensions for the MIPAV software package.
Bazin, Pierre-Louis; Cuzzocreo, Jennifer L; Yassa, Michael A; Gandler, William; McAuliffe, Matthew J; Bassett, Susan S; Pham, Dzung L
2007-09-15
We describe a new collection of publicly available software tools for performing quantitative neuroimage analysis. The tools perform semi-automatic brain extraction, tissue classification, Talairach alignment, and atlas-based measurements within a user-friendly graphical environment. They are implemented as plug-ins for MIPAV, a freely available medical image processing software package from the National Institutes of Health. Because the plug-ins and MIPAV are implemented in Java, both can be utilized on nearly any operating system platform. In addition to the software plug-ins, we have also released a digital version of the Talairach atlas that can be used to perform regional volumetric analyses. Several studies are conducted applying the new tools to simulated and real neuroimaging data sets.
[Microcomputer control of a LED stimulus display device].
Ohmoto, S; Kikuchi, T; Kumada, T
1987-02-01
A visual stimulus display system controlled by a microcomputer was constructed at low cost. The system consists of a LED stimulus display device, a microcomputer, two interface boards, a pointing device (a "mouse") and two kinds of software. The first software package is written in BASIC. Its functions are: to construct stimulus patterns using the mouse, to construct letter patterns (alphabet, digit, symbols and Japanese letters--kanji, hiragana, katakana), to modify the patterns, to store the patterns on a floppy disc, to translate the patterns into integer data which are used to display the patterns in the second software. The second software package, written in BASIC and machine language, controls display of a sequence of stimulus patterns in predetermined time schedules in visual experiments.
Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter
2012-09-01
Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena such as culture heterogeneity. In this context, computational image processing for the analysis of single-cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, called TLM-Tracker, allows for flexible and user-friendly segmentation, tracking and lineage analysis of microbial cells in time-lapse movies. The software package, including a manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.
LHCb Build and Deployment Infrastructure for run 2
NASA Astrophysics Data System (ADS)
Clemencic, M.; Couturier, B.
2015-12-01
After the successful Run 1 of the LHC, the LHCb Core Software team has taken advantage of the long shutdown to consolidate and improve its build and deployment infrastructure. Several of the related projects have already been presented, such as the build system using Jenkins and the LHCb performance and regression testing infrastructure. Some components are completely new, like the Software Configuration Database (using the graph database Neo4j) or the new package installation using RPM packages. Furthermore, all these parts are integrated to allow easier and quicker releases of the LHCb software stack, thereby reducing the risk of operational errors. Integration and regression tests are also now easier to implement, allowing further improvement of the software checks.
MathBrowser: Web-Enabled Mathematical Software with Application to the Chemistry Curriculum, v 1.0
NASA Astrophysics Data System (ADS)
Goldsmith, Jack G.
1997-10-01
MathSoft: Cambridge, MA, 1996; free via ftp from www.mathsoft.com. The movement to provide computer-based applications in chemistry has come to focus on three main areas: software aimed at specific applications (drawing, simulation, data analysis, etc.), multimedia applications designed to assist in the presentation of conceptual information, and packages to be used in conjunction with a particular textbook at a specific point in the chemistry curriculum. The result is a situation where no single software package devoted to problem solving can be used across a large segment of the curriculum. Adoption of World Wide Web (WWW) technology by a manufacturer of mathematical software, however, has produced software that offers an attractive means of providing a problem-solving resource to students in courses from the freshman through the senior level.
MINDS: A microcomputer interactive data system for 8086-based controllers
NASA Technical Reports Server (NTRS)
Soeder, J. F.
1985-01-01
A microcomputer interactive data system (MINDS) software package for the 8086 family of microcomputers is described. To enhance program understandability and ease of code maintenance, the software is written in PL/M-86, Intel Corporation's high-level system implementation language. The MINDS software is intended to run in residence with real-time digital control software to provide displays of steady-state and transient data. In addition, the MINDS package provides classic monitor capabilities along with extended provisions for debugging an executing control system. The software uses the CP/M-86 operating system developed by Digital Research, Inc., to provide program load capabilities along with a uniform file structure for data and table storage. Finally, a library of input and output subroutines to be used with consoles equipped with PL/M-86 and assembly language is described.
Long-term Preservation of Data Analysis Capabilities
NASA Astrophysics Data System (ADS)
Gabriel, C.; Arviset, C.; Ibarra, A.; Pollock, A.
2015-09-01
While the long-term preservation of scientific data obtained by large astrophysics missions is ensured through science archives, the issue of data analysis software preservation has hardly been addressed. Efforts by large data centres have contributed so far to maintain some instrument or mission-specific data reduction packages on top of high-level general purpose data analysis software. However, it is always difficult to keep software alive without support and maintenance once the active phase of a mission is over. This is especially difficult in the budgetary model followed by space agencies. We discuss the importance of extending the lifetime of dedicated data analysis packages and review diverse strategies under development at ESA using new paradigms such as Virtual Machines, Cloud Computing, and Software as a Service for making possible full availability of data analysis and calibration software for decades at minimal cost.
Software Tools for Development on the Peregrine System | High-Performance Computing | NREL
Tools to build and manage software at the source code level on the Peregrine system. Cross-Platform Make and SCons: the "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.
Progress in the Development of a Prototype Reuse Enablement System
NASA Astrophysics Data System (ADS)
Marshall, J. J.; Downs, R. R.; Gilliam, L. J.; Wolfe, R. E.
2008-12-01
An important part of promoting software reuse is to ensure that reusable software assets are readily available to the software developers who want to use them. Through dialogs with the community, the NASA Earth Science Data Systems Software Reuse Working Group has learned that the lack of a centralized, domain-specific software repository or catalog system addressing the needs of the Earth science community is a major barrier to software reuse within the community. The Working Group has proposed the creation of such a reuse enablement system, which would provide capabilities for contributing and obtaining reusable software, to remove this barrier. The Working Group has recommended the development of a Reuse Enablement System to NASA and has performed a trade study to review systems with similar capabilities and to identify potential platforms for the proposed system. This was followed by an architecture study to determine an expeditious and cost-effective solution for this system. A number of software packages and systems were examined, both by creating prototypes and by examining existing systems that use the same software packages and systems. Based on the results of the architecture study, the Working Group developed a prototype of the proposed system using the recommended software package, through an iterative process of identifying needed capabilities and improving the system to provide those capabilities. Policies for the operation and maintenance of the system are being established, and the identification of system policies has also contributed to the development process. Additionally, a test plan is being developed for formal testing of the prototype, to ensure that it meets all of the requirements previously developed by the Working Group. This poster summarizes the results of our work to date, focusing on the most recent activities.
FTOOLS: A general package of software to manipulate FITS files
NASA Astrophysics Data System (ADS)
Blackburn, J. K.; Shaw, R. A.; Payne, H. E.; Hayes, J. J. E.; Heasarc
1999-12-01
FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Research Archive Center) at NASA's Goddard Space Flight Center. The FTOOLS package contains many utility programs which perform modular tasks on any FITS image or table, as well as higher-level analysis programs designed specifically for data from current and past high energy astrophysics missions. The utility programs for FITS tables are especially rich and powerful, and provide functions for presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual FTOOLS programs can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. FTOOLS development began in 1991 and has produced the main set of data analysis software for the current ASCA and RXTE space missions and for other archival sets of X-ray and gamma-ray data. The FTOOLS software package is supported on most UNIX platforms and on Windows machines. The user interface is controlled by standard parameter files that are very similar to those used by IRAF. The package is self documenting through a stand alone help task called fhelp. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
A Freeware Path to Neutron Computed Tomography
NASA Astrophysics Data System (ADS)
Schillinger, Burkhard; Craft, Aaron E.
Neutron computed tomography has become a routine method at many neutron sources due to the availability of digital detection systems, powerful computers and advanced software. The commercial packages Octopus by Inside Matters and VGStudio by Volume Graphics have been established as a quasi-standard for high-end computed tomography. However, these packages require a stiff investment and are available to the users only on-site at the imaging facility to do their data processing. There is a demand from users to have image processing software at home to do further data processing; in addition, neutron computed tomography is now being introduced even at smaller and older reactors. Operators need to show a first working tomography setup before they can obtain a budget to build an advanced tomography system. Several packages are available on the web for free; however, these have been developed for X-rays or synchrotron radiation and are not immediately useable for neutron computed tomography. Three reconstruction packages and three 3D-viewers have been identified and used even for Gigabyte datasets. This paper is not a scientific publication in the classic sense, but is intended as a review to provide searchable help to make the described packages usable for the tomography community. It presents the necessary additional preprocessing in ImageJ, some workarounds for bugs in the software, and undocumented or badly documented parameters that need to be adapted for neutron computed tomography. The result is a slightly complicated, but surprisingly high-quality path to neutron computed tomography images in 3D, but not a replacement for the even more powerful commercial software mentioned above.
Parallel computation and the Basis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, G.R.
1992-12-16
A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communication costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
Parallel computation and the basis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, G.R.
1993-05-01
A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communications costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
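The master-and-slaves paradigm with domain decomposition described above can be sketched, purely as an analogy, with Python's multiprocessing module instead of Basis/PVM; the 1-D smoothing stencil and worker count are invented for the illustration, and halo exchange between subdomains is omitted.

    # Illustrative master/worker domain decomposition (analogy only; PROTOPAR itself
    # is built on Basis and PVM message passing, not on Python multiprocessing).
    import numpy as np
    from multiprocessing import Pool

    def work_on_subdomain(chunk):
        """Hypothetical per-subdomain computation: one 1-D averaging pass."""
        out = chunk.copy()
        out[1:-1] = 0.5 * (chunk[:-2] + chunk[2:])
        return out

    def master(field, n_workers=4):
        """Split the global field, farm subdomains out to workers, reassemble."""
        subdomains = np.array_split(field, n_workers)
        with Pool(n_workers) as pool:
            results = pool.map(work_on_subdomain, subdomains)
        return np.concatenate(results)

    if __name__ == "__main__":
        print(master(np.random.rand(1000))[:5])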
Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong
2017-01-01
Background: Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line–based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. Results: We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. Conclusions: As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. PMID:28327936
Community-driven computational biology with Debian Linux
2010-01-01
Background The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984
WinTRAX: A raytracing software package for the design of multipole focusing systems
NASA Astrophysics Data System (ADS)
Grime, G. W.
2013-07-01
The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.
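As a generic illustration of the numerical strategy used by such raytracing codes (not WinTRAX's actual C++ implementation), a paraxial ray can be stepped through an ideal magnetic quadrupole, which focuses in one plane and defocuses in the other; the normalized strength k, element length and step count below are made-up values.

    # Generic paraxial raytrace through an ideal quadrupole (illustration only).
    # k is the normalized quadrupole strength [m^-2]; x'' = -k x, y'' = +k y.
    def trace_quadrupole(x, xp, y, yp, k=2.0, length=0.2, steps=2000):
        dz = length / steps
        for _ in range(steps):
            x += xp * dz
            y += yp * dz
            xp += -k * x * dz
            yp += +k * y * dz
        return x, xp, y, yp

    # A 1 mm off-axis ray entering parallel to the optical axis:
    print(trace_quadrupole(1e-3, 0.0, 1e-3, 0.0))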
NASA Technical Reports Server (NTRS)
Thompson, David S.; Soni, Bharat K.
2000-01-01
An integrated software package, ICEG2D, was developed to automate computational fluid dynamics (CFD) simulations for single-element airfoils with ice accretion. ICEG2D is designed to automatically perform three primary functions: (1) generating a grid-ready surface definition based on the geometrical characteristics of the iced airfoil surface, (2) generating a high-quality grid using the generated surface point distribution, and (3) generating the input and restart files needed to run the general-purpose CFD solver NPARC. ICEG2D can be executed in batch mode using a script file or in an interactive mode by entering directives from a command line. This report summarizes activities completed in the first year of a three-year research and development program to address issues related to CFD simulations for aircraft components with ice accretion. Specifically, this document describes the technology employed in the software, the installation procedure, and the operation of the software package. Validation of the geometry and grid generation modules of ICEG2D is also discussed.
PINT, A Modern Software Package for Pulsar Timing
NASA Astrophysics Data System (ADS)
Luo, Jing; Ransom, Scott M.; Demorest, Paul; Ray, Paul S.; Stovall, Kevin; Jenet, Fredrick; Ellis, Justin; van Haasteren, Rutger; Bachetti, Matteo; NANOGrav PINT developer team
2018-01-01
Pulsar timing, first developed decades ago, has provided an extremely wide range of knowledge about our universe. It has been responsible for many important discoveries, such as the discovery of the first exoplanet and the orbital period decay of double neutron star systems. Currently pulsar timing is the leading technique for detecting low-frequency (about 10^-9 Hz) gravitational waves (GW), using an array of pulsars as the detectors. To achieve this goal, high-precision pulsar timing data, at about the nanosecond level, are required. Most high-precision pulsar timing data are analyzed using the widely adopted software TEMPO/TEMPO2. However, for a robust and believable GW detection, it is important to have independent software that can cross-check the results. In this poster we present the new-generation pulsar timing software PINT. This package will provide a robust system to cross-check high-precision timing results, completely independent of TEMPO and TEMPO2. In addition, PINT is designed to be a package that is easy to extend and modify, through use of a flexible code architecture and a modern programming language, Python, with modern technology and libraries.
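As a toy illustration of the bookkeeping at the heart of pulsar timing (not PINT's API, which adds clock corrections, solar-system ephemerides and extended-precision arithmetic), residuals can be pictured as the distance between the rotational phase predicted by a simple spin-down model and the nearest integer turn; the F0/F1 values below are invented.

    # Toy timing-residual calculation (conceptual sketch only, not PINT).
    import numpy as np

    F0 = 218.0        # spin frequency [Hz]   (invented example value)
    F1 = -4.0e-16     # spin-down rate [Hz/s] (invented example value)
    PEPOCH = 0.0      # reference epoch [s]

    def predicted_phase(t):
        dt = t - PEPOCH
        return F0 * dt + 0.5 * F1 * dt**2

    def residuals_us(toas):
        """Phase residuals converted to microseconds."""
        phase = predicted_phase(toas)
        frac = phase - np.round(phase)      # distance to the nearest integer turn
        return frac / F0 * 1e6

    toas = np.sort(np.random.uniform(0.0, 86400.0, 50))   # one day of fake TOAs [s]
    print(residuals_us(toas)[:5])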
2015-01-01
Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening. PMID:25151852
Pevzner, Yuri; Frugier, Emilie; Schalk, Vinushka; Caflisch, Amedeo; Woodcock, H Lee
2014-09-22
Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening.
Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee
2017-04-01
Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.
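A minimal sketch of how such a VNC-enabled container is typically launched and then opened in a browser is given below; the image name "example/guidock-demo" and port 6080 are placeholders, not the actual GUIdock-VNC artifacts.

    # Launch a hypothetical VNC-enabled container and open its web interface.
    import subprocess
    import webbrowser

    IMAGE = "example/guidock-demo"   # placeholder image name
    PORT = 6080                      # noVNC web interfaces are commonly served here

    def run_container():
        # docker run -d --rm -p HOST:CONTAINER IMAGE
        return subprocess.check_output(
            ["docker", "run", "-d", "--rm", "-p", f"{PORT}:{PORT}", IMAGE],
            text=True,
        ).strip()

    if __name__ == "__main__":
        container_id = run_container()
        print("started container", container_id)
        webbrowser.open(f"http://localhost:{PORT}")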
Microcomputer Software Packages--Choose with Caution.
ERIC Educational Resources Information Center
Naumer, Janet Noll
1983-01-01
Briefly discusses types of software available for library and media center operations and library instruction, suggests three sources of software reviews, and describes almost 50 specific application programs available for bibliographic management, cataloging, circulation, inventory and purchasing, readability, and teaching library skills in…
ConcreteWorks v3 training/user manual (P1) : ConcreteWorks software (P2).
DOT National Transportation Integrated Search
2017-04-01
ConcreteWorks is designed to be a user-friendly software package that can help concrete professionals optimize concrete mixture proportioning, perform a concrete thermal analysis, and increase the chloride diffusion service life. The software pac...
SuperLab LT: Evaluation and Uses in Teaching Experimental Psychology
ERIC Educational Resources Information Center
Ragozzine, Frank
2002-01-01
I describe and evaluate SuperLab LT (Chase & Abboud, 1990), a software package that enables students to replicate classic experiments in cognitive psychology. I also discuss the package with respect to its uses in teaching an undergraduate course in Experimental Psychology. Although the package has minor flaws, SuperLab LT provides numerous…
Quality Assurance Information for R Packages "aqfig" and "M3"
R packages "aqfig" and "M3" are optional modules for use with R statistical software (http://www.r-project.org). Package "aqfig" contains functions to aid users in the preparation of publication-quality figures for the display of air quality and other environmental data (e.g., le...
Diagnostic Testing Package DX v 2.0 Technical Specification. Methodology Project.
ERIC Educational Resources Information Center
McArthur, David
This paper contains the technical specifications, schematic diagrams, and program printout for a computer software package for the development and administration of diagnostic tests. The second version of the Diagnostic Testing Package DX consists of a PASCAL-based set of modules located in two main programs: (1) EDITTEST creates, modifies, and…
Transit safety retrofit package development : applications requirements document.
DOT National Transportation Integrated Search
2014-05-01
This Application Requirements Document for the Transit Safety Retrofit Package (TRP) Development captures the system, hardware and software requirements towards fulfilling the technical objectives stated within the contract. To achieve the objective ...
Mass decomposition of galaxies using DECA software package
NASA Astrophysics Data System (ADS)
Mosenkov, A. V.
2014-01-01
The new DECA software package, which is designed to perform photometric analysis of images of disk and elliptical galaxies having a regular structure, is presented. DECA is written in the interpreted language Python and combines the capabilities of several widely used packages for astronomical data processing, such as IRAF, SExtractor, and the GALFIT code, which is used to perform two-dimensional decomposition of galaxy images into several photometric components (bulge+disk). DECA has the advantage that it can be applied to large samples of galaxies with different orientations with respect to the line of sight (including edge-on galaxies) and requires minimal human intervention. Examples of using the package to study a sample of simulated galaxy images and a sample of real objects are shown to demonstrate that DECA can be a reliable tool for the study of the structure of galaxies.
ERIC Educational Resources Information Center
Kimball, Jeffrey P.; And Others
1987-01-01
Describes a variety of computer software. The packages reviewed include a variety of simulations, a spreadsheet, a printer driver, and an alternative operating system for IBM PCs and compatibles. (BSR)
Development of the FITS tools package for multiple software environments
NASA Technical Reports Server (NTRS)
Pence, W. D.; Blackburn, J. K.
1992-01-01
The HEASARC is developing a package of general purpose software for analyzing data files in FITS format. This paper describes the design philosophy which makes the software both machine-independent (it runs on VAXs, Suns, and DEC-stations) and software environment-independent. Currently the software can be compiled and linked to produce IRAF tasks, or alternatively, the same source code can be used to generate stand-alone tasks using one of two implementations of a user-parameter interface library. The machine independence of the software is achieved by writing the source code in ANSI standard Fortran or C, using the machine-independent FITSIO subroutine interface for all data file I/O, and using a standard user-parameter subroutine interface for all user I/O. The latter interface is based on the Fortran IRAF Parameter File interface developed at STScI. The IRAF tasks are built by linking to the IRAF implementation of this parameter interface library. Two other implementations of this parameter interface library, which have no IRAF dependencies, are now available which can be used to generate stand-alone executable tasks. These stand-alone tasks can simply be executed from the machine operating system prompt either by supplying all the task parameters on the command line or by entering the task name after which the user will be prompted for any required parameters. A first release of this FTOOLS package is now publicly available. The currently available tasks are described, along with instructions on how to obtain a copy of the software.
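For readers working in Python today, the same kinds of modular FITS-table operations can be sketched with astropy.io.fits, a separate and more recent package shown here only as an analogy to the FTOOLS tasks described above; the file name and the ENERGY column are placeholders.

    # Sketch of basic FITS-table operations with astropy.io.fits (an analogy to the
    # FTOOLS tasks above, not part of the FTOOLS package itself).
    from astropy.io import fits

    FILENAME = "events.fits"        # placeholder file name
    with fits.open(FILENAME) as hdul:
        hdul.info()                 # roughly analogous to listing file contents
        table = hdul[1].data        # first binary-table extension
        # Row selection by a boolean expression ("ENERGY" is a placeholder column):
        selected = table[table["ENERGY"] > 2.0]
        print(len(table), "rows in,", len(selected), "rows kept")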
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valassi, A.; Clemencic, M.; Dykstra, D.
The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.
Echelle Data Reduction Cookbook
NASA Astrophysics Data System (ADS)
Clayton, Martin
This document is the first version of the Starlink Echelle Data Reduction Cookbook. It contains scripts and procedures developed by regular or heavy users of the existing software packages. These scripts are generally of two types: templates, which readers may be able to modify to suit their particular needs, and utilities, which carry out a particular common task and can probably be used 'off-the-shelf'. In the nature of this subject the recipes given are quite strongly tied to the software packages, rather than being science-data led. The major part of this document is divided into two sections dealing with scripts to be used with IRAF and with Starlink software (SUN/1).
Exploring Convergent Evolution to Provide a Foundation for Protein Engineering
2009-02-26
...the DivergentSet and MotifCluster Algorithms. Using support from this grant, we developed two software packages that provide key infrastructure for... The software package we developed, MotifCluster, provides a novel way of detecting distantly related homologs, one of the key aims of the proposal. Unlike...
YAMM - Yet Another Menu Manager
NASA Technical Reports Server (NTRS)
Mazer, Alan S.; Weidner, Richard J.
1991-01-01
Yet Another Menu Manager (YAMM) is an application-independent menuing software package designed to remove much of the difficulty, and save much of the time, inherent in implementing the front ends of large software packages. It provides a complete menuing front end for a wide variety of applications, with provisions for independence from specific terminal types, configurations that meet specific user needs, and dynamic creation of menu trees. YAMM consists of two parts: a description of the menu configuration and a body of application code. It is written in C.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This document consists of 24 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Educational Laboratory. Each software review lists source, cost, ability level, subject, topic, medium of transfer, required hardware, required software,…
Computer Software: Copyright and Licensing Considerations for Schools and Libraries. ERIC Digest.
ERIC Educational Resources Information Center
Reed, Mary Hutchings
This digest notes that the terms and conditions of computer software package license agreements control the use of software in schools and libraries, and examines the implications of computer software license agreements for classroom use and for library lending policies. Guidelines are provided for interpreting the Copyright Act, and insuring the…
Software Literacy and Student Learning in the Tertiary Environment: Powerpoint and Beyond
ERIC Educational Resources Information Center
Khoo, Elaine; Hight, Craig; Cowie, Bronwen; Torrens, Rob; Ferrarelli, Lisabeth
2014-01-01
In this paper, we explore the relationship between student success in acquiring software literacy and students' broader engagement and understanding of knowledge across different disciplines. We report on the first phase of a project that examines software literacies associated with Microsoft PowerPoint as a common software package encountered and…
ERIC Educational Resources Information Center
Davies, Denise M.
1985-01-01
Discusses design, development, and use of a database to provide organization and access to a computer software collection at the University of Hawaii School of Library Studies. Field specifications, samples of report forms, and a description of the physical organization of the software collection are included. (MBR)
Software requirements flow-down and preliminary software design for the G-CLEF spectrograph
NASA Astrophysics Data System (ADS)
Evans, Ian N.; Budynkiewicz, Jamie A.; DePonte Evans, Janet; Miller, Joseph B.; Onyuksel, Cem; Paxson, Charles; Plummer, David A.
2016-08-01
The Giant Magellan Telescope (GMT)-Consortium Large Earth Finder (G-CLEF) is a fiber-fed, precision radial velocity (PRV) optical echelle spectrograph that will be the first light instrument on the GMT. The G-CLEF instrument device control subsystem (IDCS) provides software control of the instrument hardware, including the active feedback loops that are required to meet the G-CLEF PRV stability requirements. The IDCS is also tasked with providing operational support packages that include data reduction pipelines and proposal preparation tools. A formal, but ultimately pragmatic, approach is being used to establish a complete and correct set of requirements for both the G-CLEF device control and operational support packages. The device control packages must integrate tightly with the state-machine driven software and controls reference architecture designed by the GMT Organization. A model-based systems engineering methodology is being used to develop a preliminary design that meets these requirements. Through this process we have identified some lessons that have general applicability to the development of software for ground-based instrumentation. For example, tasking an individual with overall responsibility for science/software/hardware integration is a key step to ensuring effective integration between these elements. An operational concept document that includes detailed routine and non-routine operational sequences should be prepared in parallel with the hardware design process to tie together these elements and identify any gaps. Appropriate time-phasing of the hardware and software design phases is important, but revisions to driving requirements that impact software requirements and preliminary design are inevitable. Such revisions must be carefully managed to ensure efficient use of resources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
RIECK, C.A.
1999-02-23
This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization.
ERIC Educational Resources Information Center
Journal of Chemical Education, 1989
1989-01-01
Presented are reviews of two computer software packages for Apple II computers: "Organic Spectroscopy" and "Videodisc Display Program" for use with "The Periodic Table Videodisc." A sample spectrograph from "Organic Spectroscopy" is included. (CW)
ERIC Educational Resources Information Center
Wulfson, Stephen, Ed.
1990-01-01
Reviewed are six computer software packages including "Lunar Greenhouse,""Dyno-Quest,""How Weather Works,""Animal Trackers,""Personal Science Laboratory," and "The Skeletal and Muscular Systems." Availability, functional, and hardware requirements are discussed. (CW)
ERIC Educational Resources Information Center
Classroom Computer Learning, 1988
1988-01-01
Provides reviews of three software packages including "MusicShapes,""For Comment," and "Colortrope," which were developed for music, writing, and science, respectively. Includes information on grade levels, publishers, hardware needed, and cost. (TW)
"Software Tools" to Improve Student Writing.
ERIC Educational Resources Information Center
Oates, Rita Haugh
1987-01-01
Reviews several software packages that analyze text readability, check for spelling and style problems, offer desktop publishing capabilities, teach interviewing skills, and teach grammar using a computer game. (SRT)
ERIC Educational Resources Information Center
Wulfson, Stephen, Ed.
1988-01-01
Reviews seven instructional software packages covering a variety of topics. Includes: "Science Square-Off"; "The Desert"; "Science Courseware: Physical Science"; "Odell Lake"; "Safety First"; "An Experience in Artificial Intelligence"; and "Master Mapper." (TW)
Selecting Really Excellent Software for Young Adults.
ERIC Educational Resources Information Center
Polly, Jean Armour
1985-01-01
This article discusses criteria of a good computer software package to aid the public librarian in the building, weeding, and maintenance of a software collection for young adults. Highlights include manuals or documentation; bells, whistles, and color; and the true test of time. (EJS)
Front-End/Gateway Software: Availability and Usefulness.
ERIC Educational Resources Information Center
Kesselman, Martin
1985-01-01
Reviews features of front-end software packages (interface between user and online system)--database selection, search strategy development, saving and downloading, hardware and software requirements, training and documentation, online systems and database accession, and costs--and discusses gateway services (user searches through intermediary…
Design and implementation of a software package to control a network of robotic observatories
NASA Astrophysics Data System (ADS)
Tuparev, G.; Nicolova, I.; Zlatanov, B.; Mihova, D.; Popova, I.; Hessman, F. V.
2006-09-01
We present a description of a reusable software package able to control a large, heterogeneous network of fully and semi-robotic observatories, initially developed to run the MONET network of two 1.2 m telescopes. Special attention is given to the design of a robust, long-term observation scheduler which also allows the trading of observation time and facilities within various networks. The handling of the "Phase I&II" project-development process, the time accounting between complex organizational structures, and usability issues for making the package accessible not only to professional astronomers but also to amateurs and high-school students are discussed. A simple RTML-based solution to link multiple networks is demonstrated.
A software package for interactive motor unit potential classification using fuzzy k-NN classifier.
Rasheed, Sarbast; Stashuk, Daniel; Kamel, Mohamed
2008-01-01
We present an interactive software package for implementing the supervised classification task during electromyographic (EMG) signal decomposition process using a fuzzy k-NN classifier and utilizing the MATLAB high-level programming language and its interactive environment. The method employs an assertion-based classification that takes into account a combination of motor unit potential (MUP) shapes and two modes of use of motor unit firing pattern information: the passive and the active modes. The developed package consists of several graphical user interfaces used to detect individual MUP waveforms from a raw EMG signal, extract relevant features, and classify the MUPs into motor unit potential trains (MUPTs) using assertion-based classifiers.
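A compact sketch of the fuzzy k-NN decision rule underlying such a classifier is shown below (a generic formulation, not the package's MATLAB code): each unlabeled MUP receives a membership value in every motor unit class, weighted by inverse distance to its k nearest labeled neighbours. The feature dimensionality and data are invented.

    # Generic fuzzy k-NN membership computation (illustration only).
    import numpy as np

    def fuzzy_knn_memberships(X_train, y_train, x, k=5, m=2.0, n_classes=3):
        """Return class membership values for a single feature vector x."""
        d = np.linalg.norm(X_train - x, axis=1)
        idx = np.argsort(d)[:k]                              # k nearest labeled MUPs
        w = 1.0 / np.maximum(d[idx], 1e-12) ** (2.0 / (m - 1.0))
        memberships = np.zeros(n_classes)
        for j, weight in zip(idx, w):
            memberships[y_train[j]] += weight
        return memberships / memberships.sum()

    # Made-up data: 20 labeled MUPs with 4 features each, 3 candidate motor units.
    X = np.random.rand(20, 4)
    y = np.random.randint(0, 3, size=20)
    print(fuzzy_knn_memberships(X, y, np.random.rand(4)))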
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, Faustin Wirkus; Khaire, Trupti S.; Novosad, Valentyn
We present "scraps" (SuperConducting Analysis and Plotting Software), a Python package designed to aid in the analysis and visualization of large amounts of superconducting resonator data, specifically complex transmission as a function of frequency, acquired at many different temperatures and driving powers. The package includes a least-squares fitting engine as well as a Monte-Carlo Markov Chain sampler for sampling the posterior distribution given priors, marginalizing over nuisance parameters, and estimating covariances. A set of plotting tools for generating publication-quality figures is also provided in the package. Lastly, we discuss the functionality of the software and provide some examples of its utility on data collected from a niobium-nitride coplanar waveguide resonator fabricated at Argonne National Laboratory.
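As a generic illustration of the least-squares stage of such an analysis (this is not scraps' own API), the complex transmission of a notch-type resonator can be fitted with scipy; the simplified S21 model and all starting values are assumptions made for the sketch.

    # Generic complex-transmission fit for a notch-type resonator (illustration only;
    # scraps provides its own models, MCMC sampling and plotting tools).
    import numpy as np
    from scipy.optimize import least_squares

    def s21_model(f, f0, Q, Qc):
        """Simplified notch-resonator transmission."""
        return 1.0 - (Q / Qc) / (1.0 + 2.0j * Q * (f - f0) / f0)

    def residuals(params, f, data):
        f0, Q, Qc = params
        diff = s21_model(f, f0, Q, Qc) - data
        return np.concatenate([diff.real, diff.imag])   # stack real and imaginary parts

    # Synthetic data around an assumed 5 GHz resonance:
    f = np.linspace(4.999e9, 5.001e9, 401)
    data = s21_model(f, 5.0e9, 2.0e4, 4.0e4)
    data = data + 0.001 * (np.random.randn(f.size) + 1j * np.random.randn(f.size))

    fit = least_squares(residuals, x0=[5.0e9, 1.0e4, 3.0e4],
                        args=(f, data), x_scale=[1e9, 1e4, 1e4])
    print("fitted f0, Q, Qc:", fit.x)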
ERIC Educational Resources Information Center
Sneider, Cary; DeVore, Edna
1986-01-01
Reviews software packages under these headings: (1) simulations of experiments; (2) space flight simulators; (3) planetariums; (4) space adventure games; and (5) drill and practice packages (designed for testing purposes or for helping students learn basic astronomy vocabulary). (JN)
SIRU utilization. Volume 2: Software description and program documentation
NASA Technical Reports Server (NTRS)
Oehrle, J.; Whittredge, R.
1973-01-01
A complete description of the additional analysis, development and evaluation provided for the SIRU system as identified in the requirements for the SIRU utilization program is presented. The SIRU configuration is a modular inertial subsystem with hardware and software features that achieve fault tolerant operational capabilities. The SIRU redundant hardware design is formulated about a six gyro and six accelerometer instrument module package. The modules are mounted in this package so that their measurement input axes form a unique symmetrical pattern that corresponds to the array of perpendiculars to the faces of a regular dodecahedron. This six axes array provides redundant independent sensing and the symmetry enables the formulation of an optimal software redundant data processing structure with self-contained fault detection and isolation (FDI) capabilities. Documentation of the additional software and software modifications required to implement the utilization capabilities includes assembly listings and flow charts
Astronomical Software Directory Service
NASA Astrophysics Data System (ADS)
Hanisch, Robert J.; Payne, Harry; Hayes, Jeffrey
1997-01-01
With the support of NASA's Astrophysics Data Program (NRA 92-OSSA-15), we have developed the Astronomical Software Directory Service (ASDS): a distributed, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching. Users are performing about 400 searches per month. A new aspect of our service is the inclusion of telescope and instrumentation manuals, which prompted us to change the name to the Astronomical Software and Documentation Service. ASDS was originally conceived to serve two purposes: to provide a useful Internet service in an area of expertise of the investigators (astronomical software), and as a research project to investigate various architectures for searching through a set of documents distributed across the Internet. Two of the co-investigators were then installing and maintaining astronomical software as their primary job responsibility. We felt that a service which incorporated our experience in this area would be more useful than a straightforward listing of software packages. The original concept was for a service based on the client/server model, which would function as a directory/referral service rather than as an archive. For performing the searches, we began our investigation with a decision to evaluate the Isite software from the Center for Networked Information Discovery and Retrieval (CNIDR). This software was intended as a replacement for Wide-Area Information Service (WAIS), a client/server technology for performing full-text searches through a set of documents. Isite had some additional features that we considered attractive, and we enjoyed the cooperation of the Isite developers, who were happy to have ASDS as a demonstration project. We ended up staying with the software throughout the project, making modifications to take advantage of new features as they came along, as well as influencing the software development. The Web interface to the search engine is provided by a gateway program written in C++ by a consultant to the project (A. Warnock).
Hippocampal Morphology in a Rat Model of Depression: The Effects of Physical Activity
Sierakowiak, Adam; Mattsson, Anna; Gómez-Galán, Marta; Feminía, Teresa; Graae, Lisette; Aski, Sahar Nikkhou; Damberg, Peter; Lindskog, Mia; Brené, Stefan; Åberg, Elin
2015-01-01
Accumulating in vivo and ex vivo evidence shows that humans suffering from depression have decreased hippocampal volume and altered spine density. Moreover, physical activity has an antidepressant effect in humans and in animal models, but to what extent physical activity can affect hippocampal volume and spine numbers in a model for depression is not known. In this study we analyzed whether physical activity affects hippocampal volume and spine density by analyzing a rodent genetic model of depression, Flinders Sensitive Line Rats (FSL), with Magnetic Resonance Imaging (MRI) and ex vivo Golgi staining. We found that physical activity in the form of voluntary wheel running during 5 weeks increased hippocampal volume. Moreover, runners also had larger numbers of thin spines in the dentate gyrus. Our findings support that voluntary wheel running, which is antidepressive in FSL rats, is associated with increased hippocampal volume and spine numbers. PMID:25674191
Hippocampal morphology in a rat model of depression: the effects of physical activity.
Sierakowiak, Adam; Mattsson, Anna; Gómez-Galán, Marta; Feminía, Teresa; Graae, Lisette; Aski, Sahar Nikkhou; Damberg, Peter; Lindskog, Mia; Brené, Stefan; Åberg, Elin
2014-01-01
Accumulating in vivo and ex vivo evidence shows that humans suffering from depression have decreased hippocampal volume and altered spine density. Moreover, physical activity has an antidepressant effect in humans and in animal models, but to what extent physical activity can affect hippocampal volume and spine numbers in a model for depression is not known. In this study we analyzed whether physical activity affects hippocampal volume and spine density by analyzing a rodent genetic model of depression, Flinders Sensitive Line Rats (FSL), with Magnetic Resonance Imaging (MRI) and ex vivo Golgi staining. We found that physical activity in the form of voluntary wheel running during 5 weeks increased hippocampal volume. Moreover, runners also had larger numbers of thin spines in the dentate gyrus. Our findings support that voluntary wheel running, which is antidepressive in FSL rats, is associated with increased hippocampal volume and spine numbers.
NASA Technical Reports Server (NTRS)
Tucker, T. K.
1989-01-01
Presented here are the results obtained from performance evaluation of a pair of Sigma Tau Standards Corporation Model VLBA-112 active hydrogen maser frequency standards. These masers were manufactured for the National Radio Astronomy Observatory (NRAO) for use on the Very Long Baseline Array (VLBA) project and were furnished to the Jet Propulsion Laboratory (JPL) for the purpose of these tests. Tests on the two masers were performed in the JPL Frequency Standards Laboratory (FSL) and included the characterization of output frequency stability versus environmental factors such as temperature, humidity, magnetic field, and barometric pressure. The performance tests also included the determination of phase noise and Allan variance using both FSL and Sigma Tau masers as references. All tests were conducted under controlled laboratory conditions, with only the desired environmental and operational parameters varied to determine sensitivity to external environment.
ERIC Educational Resources Information Center
Mathematics and Computer Education, 1988
1988-01-01
Presents reviews of six software packages. Includes (1) "Plain Vanilla Statistics"; (2) "MathCAD 2.0"; (3) "GrFx"; (4) "Trigonometry"; (5) "Algebra II"; (6) "Algebra Drill and Practice I, II, and III." (PK)
ERIC Educational Resources Information Center
Dwyer, Donna; And Others
1989-01-01
Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)
Seismology software: state of the practice
NASA Astrophysics Data System (ADS)
Smith, W. Spencer; Zeng, Zheng; Carette, Jacques
2018-05-01
We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.
Seismology software: state of the practice
NASA Astrophysics Data System (ADS)
Smith, W. Spencer; Zeng, Zheng; Carette, Jacques
2018-02-01
We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.
Buying in to bioinformatics: an introduction to commercial sequence analysis software
2015-01-01
Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. If you are just beginning your foray into molecular sequence analysis or an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry detached world of bioinformatics. PMID:25183247
Buying in to bioinformatics: an introduction to commercial sequence analysis software.
Smith, David Roy
2015-07-01
Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. If you are just beginning your foray into molecular sequence analysis or an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry detached world of bioinformatics. © The Author 2014. Published by Oxford University Press.
Roos, Malgorzata; Stawarczyk, Bogna
2012-07-01
This study evaluated and compared Weibull parameters of resin bond strength values using six different general-purpose statistical software packages for the two-parameter Weibull distribution. Two hundred human teeth were randomly divided into 4 groups (n=50), prepared and bonded on dentin according to the manufacturers' instructions using the following resin cements: (i) Variolink (VAN, conventional resin cement), (ii) Panavia21 (PAN, conventional resin cement), (iii) RelyX Unicem (RXU, self-adhesive resin cement) and (iv) G-Cem (GCM, self-adhesive resin cement). Subsequently, all specimens were stored in water for 24 h at 37°C. Shear bond strength was measured and the data were analyzed using the Anderson-Darling goodness-of-fit test (MINITAB 16) and two-parameter Weibull statistics with the following statistical software packages: Excel 2011, SPSS 19, MINITAB 16, R 2.12.1, SAS 9.1.3 and STATA 11.2 (p≤0.05). Additionally, the three-parameter Weibull was fitted using MINITAB 16. Two-parameter Weibull estimates calculated with MINITAB and STATA can be compared using an omnibus test and 95% CIs. In SAS, only 95% CIs were directly obtained from the output. R provided no estimates of 95% CIs. In both SAS and R the global comparison of the characteristic bond strength among groups is provided by means of the Weibull regression. EXCEL and SPSS provided no default information about 95% CIs and no significance test for the comparison of Weibull parameters among the groups. In summary, the conventional resin cement VAN showed the highest Weibull modulus and characteristic bond strength. There are discrepancies in the Weibull statistics depending on the software package and the estimation method. The information content in the default output provided by the software packages differs to a very high extent. Copyright © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
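For readers who prefer an open scripting route, a two-parameter Weibull fit of bond-strength data can also be obtained in Python with scipy by fixing the location parameter at zero, so that the shape corresponds to the Weibull modulus and the scale to the characteristic strength; this is an extra illustration, not one of the six packages compared in the study, and the strength values are invented.

    # Two-parameter Weibull fit of bond-strength data with scipy (extra illustration).
    import numpy as np
    from scipy.stats import weibull_min

    strengths = np.array([8.2, 9.1, 10.4, 11.0, 11.8, 12.5, 13.3, 14.1])  # MPa, made up

    # floc=0 fixes the location parameter, i.e. a two-parameter Weibull fit.
    shape, loc, scale = weibull_min.fit(strengths, floc=0)
    print(f"Weibull modulus (shape) m = {shape:.2f}")
    print(f"Characteristic strength (scale) = {scale:.2f} MPa")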
Introducing Python tools for magnetotellurics: MTpy
NASA Astrophysics Data System (ADS)
Krieger, L.; Peacock, J.; Inverarity, K.; Thiel, S.; Robertson, K.
2013-12-01
Within the framework of geophysical exploration techniques, the magnetotelluric method (MT) is relatively immature: it is still not as widely spread as other geophysical methods like seismology, and its processing schemes and data formats are not thoroughly standardized. As a result, the file handling and processing software within the academic community is mainly based on a loose collection of codes, which are sometimes highly adapted to the respective local specifications. Although tools for the estimation of the frequency-dependent MT transfer function, as well as inversion and modelling codes, are available, the standards and software for handling MT data are generally not unified throughout the community. To overcome problems that arise from missing standards, and to simplify the general handling of MT data, we have developed the software package "MTpy", which allows the handling, processing, and imaging of magnetotelluric data sets. It is written in Python and the code is open-source. The setup of this package follows the modular approach of successful software packages like GMT or Obspy. It contains sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides pure Python classes and functions, MTpy provides wrappers and convenience scripts to call external software, e.g. modelling and inversion codes. Even though still under development, MTpy already contains ca. 250 functions that work on raw and preprocessed data. However, as our aim is not to produce a static collection of software, we rather introduce MTpy as a flexible framework, which will be dynamically extended in the future. It then has the potential to help standardise processing procedures and at the same time be a versatile supplement for existing algorithms. We introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing utilising MTpy on an example data set collected over a geothermal exploration site in South Australia.
[Figure caption: Workflow of MT data processing. Within the structural diagram, the MTpy sub-packages are shown in red (time series data processing), green (handling of EDI files and impedance tensor data), yellow (connection to modelling/inversion algorithms), black (impedance tensor interpretation, e.g. Phase Tensor calculations), and blue (generation of visual representations, e.g. pseudo sections or resistivity models).]
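One of the elementary quantities such MT toolkits derive from the impedance tensor is the apparent resistivity and phase of each tensor element, via the standard relation rho_a = |Z|^2 / (omega * mu_0). The sketch below applies that textbook formula directly and is a generic illustration, not MTpy's API; the impedance value and frequency are invented.

    # Apparent resistivity and phase from a single impedance element
    # (standard MT relation; not MTpy's own interface).
    import numpy as np

    MU0 = 4.0e-7 * np.pi   # vacuum permeability [H/m]

    def apparent_resistivity_phase(z, freq):
        """z: complex impedance element in SI units (E in V/m over H in A/m)."""
        omega = 2.0 * np.pi * freq
        rho_a = np.abs(z) ** 2 / (omega * MU0)   # apparent resistivity [ohm m]
        phase = np.degrees(np.angle(z))          # impedance phase [degrees]
        return rho_a, phase

    # Made-up off-diagonal impedance at 0.1 Hz:
    print(apparent_resistivity_phase(0.05 + 0.04j, freq=0.1))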
Mathematical and Statistical Software Index. Final Report.
ERIC Educational Resources Information Center
Black, Doris E., Comp.
Brief descriptions are provided of general-purpose mathematical and statistical software, including 27 "stand-alone" programs, three subroutine systems, and two nationally recognized statistical packages, which are available in the Air Force Human Resources Laboratory (AFHRL) software library. This index was created to enable researchers…
24 CFR 908.104 - Requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... contracts with a service bureau to provide the system, the software must be periodically updated to.... Housing agencies that currently use automated software packages to transmit Forms HUD-50058 and HUD-50058... software required to develop and maintain an in-house automated data processing system (ADP) used to...
FTOOLS: A FITS Data Processing and Analysis Software Package
NASA Astrophysics Data System (ADS)
Blackburn, J. Kent; Greene, Emily A.; Pence, William
1993-05-01
FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Research Archive Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging of tables, binning of values in a column, or selection of subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and display of spectra or light curves. The collection provides both generic processing and analysis utilities and utilities common to high energy astrophysics data sets. The FTOOLS software package is designed to be both compatible with IRAF and completely stand-alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self-documenting through the IRAF help facility and a stand-alone help task. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
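FTOOLS tasks are invoked from the command line or from IRAF; purely to illustrate the kind of row selection such a utility performs, here is a sketch using astropy.io.fits (not part of FTOOLS). The file name and column name are hypothetical.

    from astropy.io import fits

    # Select rows of a binary table with a boolean expression and write them out,
    # roughly what an FTOOLS row-filtering task does.
    with fits.open("events.fits") as hdul:
        table = hdul[1].data                          # first binary-table extension
        mask = (table["ENERGY"] > 2.0) & (table["ENERGY"] < 10.0)
        fits.BinTableHDU(data=table[mask]).writeto("filtered_events.fits", overwrite=True)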
Fragman: an R package for fragment analysis.
Covarrubias-Pazaran, Giovanny; Diaz-Garcia, Luis; Schlautman, Brandon; Salazar, Walter; Zalapa, Juan
2016-04-21
Determination of microsatellite lengths or other DNA fragment types is an important initial component of many genetic studies such as mutation detection, linkage and quantitative trait loci (QTL) mapping, genetic diversity, pedigree analysis, and detection of heterozygosity. A handful of commercial and freely available software programs exist for fragment analysis; however, most of them are platform dependent and lack high-throughput applicability. We present the R package Fragman to serve as a freely available and platform-independent resource for automatic scoring of DNA fragment lengths in diversity panels and biparental populations. The program analyzes DNA fragment lengths generated by Applied Biosystems® (ABI) instruments, either manually or automatically, by providing panels or bins. The package contains additional tools for converting the allele calls to GenAlEx, JoinMap® and OneMap software formats, which are mainly used for genetic diversity analyses and for generating linkage maps in plant and animal populations. Easy plotting functions and multiplexing-friendly capabilities are some of the strengths of this R package. Fragment analysis using a unique set of cranberry (Vaccinium macrocarpon) genotypes based on microsatellite markers is used to highlight the capabilities of Fragman. Fragman is a valuable new tool for genetic analysis. The package produces results equivalent to other popular software for fragment analysis while possessing unique advantages and the possibility of automation for high-throughput experiments by exploiting the power of R.
Wide-Field Imaging Telescope-0 (WIT0) with automatic observing system
NASA Astrophysics Data System (ADS)
Ji, Tae-Geun; Byeon, Seoyeon; Lee, Hye-In; Park, Woojin; Lee, Sang-Yun; Hwang, Sungyong; Choi, Changsu; Gibson, Coyne Andrew; Kuehne, John W.; Prochaska, Travis; Marshall, Jennifer L.; Im, Myungshin; Pak, Soojong
2018-01-01
We introduce Wide-Field Imaging Telescope-0 (WIT0) with an automatic observing system. It was developed for monitoring the variability of many sources at a time, e.g. young stellar objects and active galactic nuclei, and can also locate transient sources such as supernovae or gamma-ray bursts. In 2017 February, we installed the wide-field 10-inch telescope (Takahashi CCA-250) as a piggyback system on the 30-inch telescope at the McDonald Observatory in Texas, US. The 10-inch telescope has a 2.35 × 2.35 deg field of view with a 4k × 4k CCD camera (FLI ML16803). To improve the observational efficiency of the system, we developed new automatic observing software, KAOS30 (KHU Automatic Observing Software for the McDonald 30-inch telescope), written in Visual C++ for the Windows operating system. The software consists of four control packages: the Telescope Control Package (TCP), the Data Acquisition Package (DAP), the Auto Focus Package (AFP), and the Script Mode Package (SMP). Since it also supports instruments that use the ASCOM driver, additional hardware installation is greatly simplified. We commissioned KAOS30 in 2017 August and are in the process of testing it. Based on the WIT0 experience, we will extend KAOS30 to control multiple telescopes in future projects.
Ground-Based GPS Sensing of Azimuthal Variations in Precipitable Water Vapor
NASA Technical Reports Server (NTRS)
Kroger, P. M.; Bar-Sever, Y. E.
1997-01-01
Current models for troposphere delay employed by GPS software packages map the total zenith delay to the line-of-sight delay of the individual satellite-receiver link under the assumption of azimuthal homogeneity. This could be a poor approximation for many sites, in particular, those located at an ocean front or next to a mountain range. We have modified the GIPSY-OASIS II software package to include a simple non-symmetric mapping function (MacMillan, 1995) which introduces two new parameters.
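The abstract does not give the exact functional form added to GIPSY-OASIS II; as a rough sketch of what a two-parameter, azimuth-dependent gradient term looks like, the following uses one common convention (a 1/sin(e) symmetric mapping and a 1/(sin(e)tan(e)+c) gradient mapping with c = 0.0032). All functional forms and coefficients here are assumptions for illustration, not the MacMillan (1995) implementation itself.

    import numpy as np

    def slant_delay(elev_deg, azim_deg, zenith_delay, g_north, g_east, c=0.0032):
        """Symmetric zenith-delay mapping plus a two-parameter horizontal-gradient term.
        The 1/sin(e) symmetric mapping, the gradient mapping 1/(sin(e)*tan(e) + c) and
        c = 0.0032 follow one common convention and are assumptions, not the exact
        GIPSY-OASIS II form."""
        e = np.radians(elev_deg)
        a = np.radians(azim_deg)
        m_sym = 1.0 / np.sin(e)
        m_grad = 1.0 / (np.sin(e) * np.tan(e) + c)
        return m_sym * zenith_delay + m_grad * (g_north * np.cos(a) + g_east * np.sin(a))

    # Example: 2.4 m total zenith delay, small north/east gradients, satellite at 15 deg elevation.
    print(slant_delay(15.0, 45.0, 2.4, g_north=0.001, g_east=-0.0005))

The two gradient parameters (G_north, G_east) are the "two new parameters" introduced by such a non-symmetric mapping.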
Development of an Ada package library
NASA Technical Reports Server (NTRS)
Burton, Bruce; Broido, Michael
1986-01-01
A usable prototype Ada package library was developed and is currently being evaluated for use in large software development efforts. The library system is comprised of an Ada-oriented design language used to facilitate the collection of reuse information, a relational data base to store reuse information, a set of reusable Ada components and tools, and a set of guidelines governing the system's use. The prototyping exercise is discussed and the lessons learned from it have led to the definition of a comprehensive tool set to facilitate software reuse.
Orchestrating high-throughput genomic analysis with Bioconductor
Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin
2015-01-01
Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503
Integrated software package STAMP for minor planets
NASA Technical Reports Server (NTRS)
Kochetova, O. M.; Shor, Viktor A.
1992-01-01
The integrated software package STAMP allows rapid and exact reproduction of the tables of the yearbook 'Ephemerides of Minor Planets' and also solves the typical problems connected with the use of the yearbook. STAMP is described here. The yearbook 'Ephemerides of Minor Planets' (EMP) is a publication used in many astronomical institutions around the world. It contains all the necessary information on the orbits of the numbered minor planets; astronomical coordinates are also provided for each planet during its suitable observation period.
Noiseless Vlasov-Poisson simulations with linearly transformed particles
Pinto, Martin C.; Sonnendrucker, Eric; Friedman, Alex; ...
2014-06-25
We introduce a deterministic discrete-particle simulation approach, the Linearly-Transformed Particle-In-Cell (LTPIC) method, that employs linear deformations of the particles to reduce the noise traditionally associated with particle schemes. Formally, transforming the particles is justified by local first order expansions of the characteristic flow in phase space. In practice the method amounts to using deformation matrices within the particle shape functions; these matrices are updated via local evaluations of the forward numerical flow. Because it is necessary to periodically remap the particles on a regular grid to avoid excessively deforming their shapes, the method can be seen as a development of Denavit's Forward Semi-Lagrangian (FSL) scheme (Denavit, 1972 [8]). However, it has recently been established (Campos Pinto, 2012 [20]) that the underlying Linearly-Transformed Particle scheme converges for abstract transport problems, with no need to remap the particles; deforming the particles can thus be seen as a way to significantly lower the remapping frequency needed in the FSL schemes, and hence the associated numerical diffusion. To couple the method with electrostatic field solvers, two specific charge deposition schemes are examined, and their performance compared with that of the standard deposition method. Finally, numerical 1d1v simulations involving benchmark test cases and halo formation in an initially mismatched thermal sheet beam demonstrate some advantages of our LTPIC scheme over the classical PIC and FSL methods. These benchmark test cases also indicate that, for numerical choices involving similar computational effort, the LTPIC method is capable of accuracy comparable to or exceeding that of state-of-the-art, high-resolution Vlasov schemes.
Mathé, Aleksander A; Husum, Henriette; El Khoury, Aram; Jiménez-Vasquez, Patricia; Gruber, Susanne H M; Wörtwein, Gitta; Nikisch, Georg; Baumann, Pierre; Agren, Hans; Andersson, Weronica; Södergren, Asa; Angelucci, Francesco
2007-09-10
Dysregulation of the monoaminergic systems is likely a sufficient but not a necessary cause of depression. A wealth of data indicates that neuropeptides, e.g., NPY, CRH, somatostatin, tachykinins and CGRP, play a role in affective disorders and alcohol use/abuse. This paper focuses on NPY in the etiology and pathophysiology of depression. Decreased NPY peptide and mRNA were found in the hippocampus of both genetic rat models of depression, e.g., the FSL strain, and environmental models, e.g., the chronic mild stress and early life maternal separation paradigms. Rat models of alcoholism also show altered NPY. Furthermore, NPY is also reduced in CSF of depressed patients. Antidepressive treatments tested so far (lithium, topiramate, SSRIs, ECT and ECS, wheel running) increase NPY selectively in rat hippocampus and in human CSF. Moreover, NPY given icv to rats has antidepressive effects, which are antagonized by NPY-Y1 blockers. The data support our hypothesis that NPY system dysregulation constitutes one of the biological underpinnings of depression and that one common mechanism of action of antidepressive treatment modalities may be effects on NPY and its receptors. In a novel paradigm, early life maternal separation was superimposed on "depressed" FSL and control rats, and behavioral and brain neurochemistry changes were observed in adulthood. The consequences were more deleterious in the genetically vulnerable FSL rats. Early antidepressive treatment modulated the adult sequelae. Consequently, if these data are confirmed, the ethical and medical question that will be asked is whether it is permissible and advisable to consider prophylactically treating persons at risk.
Barr, Katie; Korchagina, Elena; Ryzhov, Ivan; Bovin, Nicolai; Henry, Stephen
2014-10-01
Monoclonal (MoAb) reagents are routinely used and are usually very reliable for the serologic determination of ABO blood types. However, the fine specificity and cross-reactivity of these reagents are often unknown, particularly against synthetic antigens used in some diagnostic assays. If nonserologic assays or very sensitive techniques other than those specifically prescribed by the manufacturer are used, then there is a risk of incorrect interpretation of results. Forty-seven MoAbs and two polyclonal ABO reagents were tested against red blood cell (RBC) kodecytes prepared with A trisaccharide, A Type 1, A Type 2, A Type 3, A Type 4, B trisaccharide, B Type 1, B Type 2, acquired B trisaccharide, and Le(a) trisaccharide function-spacer-lipid (FSL) constructs. Natural RBCs were tested in parallel. In addition these FSL constructs were printed onto paper with a desktop inkjet printer and used in a novel immunoassay that identifies reactivity through the appearance of alphanumeric characters. Mapping of MoAbs with kodecytes and printed FSL constructs revealed a series of broad recognition patterns. All ABO MoAbs tested were reactive with the RBC dominant Type 2 ABO antigens. Unexpectedly some anti-A reagents were reactive against the B Type 1 antigen, while others were poorly reactive with trisaccharide antigens. All ABO MoAbs detect the RBC dominant Type 2 ABO antigens; however, some reagents may show minor reactivity with inappropriate blood group antigens, which needs to be considered when using these reagents in alternative or highly sensitive analytic systems. © 2014 AABB.
Real time flight simulation methodology
NASA Technical Reports Server (NTRS)
Parrish, E. A.; Cook, G.; Mcvey, E. S.
1977-01-01
Substitutional methods for digitization, input signal-dependent integrator approximations, and digital autopilot design were developed. The software framework of a simulator design package is described. Included are subroutines for iterative designs of simulation models and a rudimentary graphics package.
Chemkin-II: A Fortran chemical kinetics package for the analysis of gas-phase chemical kinetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kee, R.J.; Rupley, F.M.; Miller, J.A.
1989-09-01
This document is the user's manual for the second-generation Chemkin package. Chemkin is a software package whose purpose is to facilitate the formation, solution, and interpretation of problems involving elementary gas-phase chemical kinetics. It provides an especially flexible and powerful tool for incorporating complex chemical kinetics into simulations of fluid dynamics. The package consists of two major software components: an Interpreter and a Gas-Phase Subroutine Library. The Interpreter is a program that reads a symbolic description of an elementary, user-specified chemical reaction mechanism. One output from the Interpreter is a data file that forms a link to the Gas-Phase Subroutine Library. This library is a collection of about 100 highly modular Fortran subroutines that may be called to return information on the equation of state, thermodynamic properties, and chemical production rates.
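The chemical production rates returned by the Gas-Phase Subroutine Library are built from modified Arrhenius rate coefficients. A minimal sketch of that ingredient follows, in Python rather than Fortran; the reaction parameters are illustrative only.

    import numpy as np

    R_CAL = 1.987204  # gas constant in cal/(mol K), a unit convention commonly used with Chemkin

    def arrhenius(T, A, b, Ea):
        """Modified Arrhenius rate coefficient k(T) = A * T**b * exp(-Ea / (R*T))."""
        return A * T**b * np.exp(-Ea / (R_CAL * T))

    # Illustrative parameters only (roughly of the form used for H + O2 -> OH + O).
    T = np.linspace(1000.0, 2500.0, 4)   # K
    print(arrhenius(T, A=3.5e16, b=-0.7, Ea=17000.0))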
Reference datasets for 2-treatment, 2-sequence, 2-period bioequivalence studies.
Schütz, Helmut; Labes, Detlew; Fuglsang, Anders
2014-11-01
It is difficult to validate statistical software used to assess bioequivalence since very few datasets with known results are in the public domain, and the few that are published are of moderate size and balanced. The purpose of this paper is therefore to introduce reference datasets of varying complexity in terms of dataset size and characteristics (balance, range, outlier presence, residual error distribution) for 2-treatment, 2-period, 2-sequence bioequivalence studies and to report their point estimates and 90% confidence intervals which companies can use to validate their installations. The results for these datasets were calculated using the commercial packages EquivTest, Kinetica, SAS and WinNonlin, and the non-commercial package R. The results of three of these packages mostly agree, but imbalance between sequences seems to provoke questionable results with one package, which illustrates well the need for proper software validation.
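For readers who want to see the shape of the calculation being validated, here is a minimal sketch of a fixed-effects analysis of a 2-treatment, 2-sequence, 2-period study on the log scale, with the 90% confidence interval back-transformed to the ratio scale. It assumes a hypothetical long-format CSV file and is a simplified stand-in for the full crossover models used by the packages named above.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format dataset: one row per subject and period, with
    # columns subject, period, treatment ('T' or 'R') and auc.
    df = pd.read_csv("be_dataset.csv")
    df["log_auc"] = np.log(df["auc"])

    # Fixed-effects ANOVA on the log scale; the subject dummies absorb the sequence effect.
    fit = smf.ols("log_auc ~ C(period) + C(treatment, Treatment('R')) + C(subject)",
                  data=df).fit()
    term = "C(treatment, Treatment('R'))[T.T]"
    est = fit.params[term]
    lo, hi = fit.conf_int(alpha=0.10).loc[term]

    print(f"point estimate (T/R): {100 * np.exp(est):.2f}%")
    print(f"90% CI: {100 * np.exp(lo):.2f}% - {100 * np.exp(hi):.2f}%")

Running such a script against the reference datasets and comparing the point estimate and 90% CI to the published values is exactly the kind of installation check the paper advocates.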
Molded underfill (MUF) encapsulation for flip-chip package: A numerical investigation
NASA Astrophysics Data System (ADS)
Azmi, M. A.; Abdullah, M. K.; Abdullah, M. Z.; Ariff, Z. M.; Saad, Abdullah Aziz; Hamid, M. F.; Ismail, M. A.
2017-07-01
This paper presents a numerical simulation of epoxy molding compound (EMC) filling in multi-flip-chip packages during the encapsulation process. Both an empty mold cavity and a group of flip-chip packages were considered in order to study the flow profile of the EMC. SOLIDWORKS software was used for three-dimensional modeling, and the model was imported into the fluid analysis software ANSYS FLUENT. The volume of fluid (VOF) technique was used for capturing the flow-front profiles, and the Power Law model was applied as the rheology model. The numerical results were compared with previous experimental work and showed good agreement, validating the model. The predicted flow front was observed and analyzed at different filling times. Void formation in the package was captured visually, and the number of flip chips was found to be one factor contributing to void formation.
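The Power Law rheology model referred to above relates viscosity to shear rate; a minimal sketch follows, with a consistency index and power-law index chosen purely for illustration rather than taken from the EMC characterized in the paper.

    import numpy as np

    def power_law_viscosity(shear_rate, K, n):
        """Ostwald-de Waele power-law model: eta = K * gamma_dot**(n - 1).
        n < 1 gives the shear-thinning behaviour typical of molding compounds."""
        return K * np.power(shear_rate, n - 1.0)

    shear_rates = np.logspace(0, 4, 5)                       # 1 to 10**4 1/s
    print(power_law_viscosity(shear_rates, K=300.0, n=0.6))  # Pa*s, illustrative values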
ERIC Educational Resources Information Center
Podany, Zita
This guide lists 19 software packages considered to be worthy of further consideration by other reviewing agencies and schools by a group of 17 computer coordinators from educational software preview centers and evaluation agencies. The following software is listed: (1) ASK-IT, an authoring tool; (2) Balance of the Planet, an environmental…
ERIC Educational Resources Information Center
Podany, Zita
This guide lists 21 software packages considered to be worthy of further consideration by other reviewing agencies and schools by a group of 12 computer coordinators from educational software preview centers and evaluation agencies. These software products have been selected as not being likely to appear in the reviews produced by major software…
The Value of Open Source Software Tools in Qualitative Research
ERIC Educational Resources Information Center
Greenberg, Gary
2011-01-01
In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…
Gunalan, Kabilar; Chaturvedi, Ashutosh; Howell, Bryan; Duchin, Yuval; Lempka, Scott F; Patriat, Remi; Sapiro, Guillermo; Harel, Noam; McIntyre, Cameron C
2017-01-01
Deep brain stimulation (DBS) is an established clinical therapy and computational models have played an important role in advancing the technology. Patient-specific DBS models are now common tools in both academic and industrial research, as well as clinical software systems. However, the exact methodology for creating patient-specific DBS models can vary substantially and important technical details are often missing from published reports. Provide a detailed description of the assembly workflow and parameterization of a patient-specific DBS pathway-activation model (PAM) and predict the response of the hyperdirect pathway to clinical stimulation. Integration of multiple software tools (e.g. COMSOL, MATLAB, FSL, NEURON, Python) enables the creation and visualization of a DBS PAM. An example DBS PAM was developed using 7T magnetic resonance imaging data from a single unilaterally implanted patient with Parkinson's disease (PD). This detailed description implements our best computational practices and most elaborate parameterization steps, as defined from over a decade of technical evolution. Pathway recruitment curves and strength-duration relationships highlight the non-linear response of axons to changes in the DBS parameter settings. Parameterization of patient-specific DBS models can be highly detailed and constrained, thereby providing confidence in the simulation predictions, but at the expense of time demanding technical implementation steps. DBS PAMs represent new tools for investigating possible correlations between brain pathway activation patterns and clinical symptom modulation.
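The strength-duration relationships mentioned here are often summarized with a Lapicque/Weiss-type expression; the sketch below uses that closed form with illustrative rheobase and chronaxie values, whereas the PAM described in the paper obtains thresholds from multi-compartment axon simulations.

    import numpy as np

    def threshold_current(pulse_width_us, rheobase_ma, chronaxie_us):
        """Lapicque/Weiss-type strength-duration relation: I_th = I_rh * (1 + t_ch / PW)."""
        return rheobase_ma * (1.0 + chronaxie_us / pulse_width_us)

    pulse_widths = np.array([30.0, 60.0, 90.0, 120.0, 450.0])   # microseconds
    print(threshold_current(pulse_widths, rheobase_ma=1.5, chronaxie_us=150.0))

The non-linear rise of threshold at short pulse widths is the behaviour the recruitment and strength-duration curves in the paper quantify for the hyperdirect pathway.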
Design and validation of Segment--freely available software for cardiovascular image analysis.
Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan
2010-01-11
Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. Segment is a well-validated comprehensive software package for cardiovascular image analysis. It is freely available for research purposes provided that relevant original research publications related to the software are cited.
High pressure single-crystal micro X-ray diffraction analysis with GSE_ADA/RSV software
NASA Astrophysics Data System (ADS)
Dera, Przemyslaw; Zhuravlev, Kirill; Prakapenka, Vitali; Rivers, Mark L.; Finkelstein, Gregory J.; Grubor-Urosevic, Ognjen; Tschauner, Oliver; Clark, Simon M.; Downs, Robert T.
2013-08-01
GSE_ADA/RSV is a free software package for custom analysis of single-crystal micro X-ray diffraction (SCμXRD) data, developed with particular emphasis on data from samples enclosed in diamond anvil cells and subject to high pressure conditions. The package has been in extensive use at the high pressure beamlines of Advanced Photon Source (APS), Argonne National Laboratory and Advanced Light Source (ALS), Lawrence Berkeley National Laboratory. The software is optimized for processing of wide-rotation images and includes a variety of peak intensity corrections and peak filtering features, which are custom-designed to make processing of high pressure SCμXRD easier and more reliable.
Gopalaswamy, Arjun M.; Royle, J. Andrew; Hines, James E.; Singh, Pallavi; Jathanna, Devcharan; Kumar, N. Samba; Karanth, K. Ullas
2012-01-01
1. The advent of spatially explicit capture-recapture models is changing the way ecologists analyse capture-recapture data. However, the advantages offered by these new models are not fully exploited because they can be difficult to implement. 2. To address this need, we developed a user-friendly software package, created within the R programming environment, called SPACECAP. This package implements Bayesian spatially explicit hierarchical models to analyse spatial capture-recapture data. 3. Given that a large number of field biologists prefer software with graphical user interfaces for analysing their data, SPACECAP is particularly useful as a tool to increase the adoption of Bayesian spatially explicit capture-recapture methods in practice.
NASA Technical Reports Server (NTRS)
Vu, Duc; Sandor, Michael; Agarwal, Shri
2005-01-01
CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any data-base-processing software.
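CMeST's exact color rules are not given in the abstract; as a generic illustration of the underlying idea (flagging pixels whose color indicates delamination and reporting an area fraction), here is a sketch using Pillow and NumPy with an arbitrary red-dominant threshold.

    import numpy as np
    from PIL import Image

    def delaminated_fraction(path):
        """Fraction of pixels whose colour falls in an (assumed) delamination band.
        The red-dominant threshold below is arbitrary and purely illustrative."""
        rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        delaminated = (r > 150) & (r > 1.5 * g) & (r > 1.5 * b)
        return delaminated.mean()

    print(f"delaminated area fraction: {delaminated_fraction('csam_package.png'):.3%}")

Applying the same rule to a baseline image and a later image of the same package gives the kind of objective comparison over time described above.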
Implementation of building information modeling in Malaysian construction industry
NASA Astrophysics Data System (ADS)
Memon, Aftab Hameed; Rahman, Ismail Abdul; Harman, Nur Melly Edora
2014-10-01
This study has assessed the implementation level of Building Information Modeling (BIM) in the construction industry of Malaysia. It also investigated several computer software packages facilitating BIM and challenges affecting its implementation. Data collection for this study was carried out using questionnaire survey among the construction practitioners. 95 completed forms of questionnaire received against 150 distributed questionnaire sets from consultant, contractor and client organizations were analyzed statistically. Analysis findings indicated that the level of implementation of BIM in the construction industry of Malaysia is very low. Average index method employed to assess the effectiveness of various software packages of BIM highlighted that Bentley construction, AutoCAD and ArchiCAD are three most popular and effective software packages. Major challenges to BIM implementation are it requires enhanced collaboration, add work to a designer, interoperability and needs enhanced collaboration. For improving the level of implementing BIM in Malaysian industry, it is recommended that a flexible training program of BIM for all practitioners must be created.
Paintdakhi, Ahmad; Parry, Bradley; Campos, Manuel; Irnov, Irnov; Elf, Johan; Surovtsev, Ivan; Jacobs-Wagner, Christine
2016-01-01
Summary With the realization that bacteria display phenotypic variability among cells and exhibit complex subcellular organization critical for cellular function and behavior, microscopy has re-emerged as a primary tool in bacterial research during the last decade. However, the bottleneck in today’s single-cell studies is quantitative image analysis of cells and fluorescent signals. Here, we address current limitations through the development of Oufti, a stand-alone, open-source software package for automated measurements of microbial cells and fluorescence signals from microscopy images. Oufti provides computational solutions for tracking touching cells in confluent samples, handles various cell morphologies, offers algorithms for quantitative analysis of both diffraction and non-diffraction-limited fluorescence signals, and is scalable for high-throughput analysis of massive datasets, all with subpixel precision. All functionalities are integrated in a single package. The graphical user interface, which includes interactive modules for segmentation, image analysis, and post-processing analysis, makes the software broadly accessible to users irrespective of their computational skills. PMID:26538279
User's Guide for the MapImage Reprojection Software Package, Version 1.01
Finn, Michael P.; Trent, Jason R.
2004-01-01
Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets (such as 30-m data) for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Recently, Usery and others (2003a) expanded on the previously limited empirical work with real geographic data by compiling and tabulating the accuracy of categorical areas in projected raster datasets of global extent. Geographers and applications programmers at the U.S. Geological Survey's (USGS) Mid-Continent Mapping Center (MCMC) undertook an effort to expand and evolve an internal USGS software package, MapImage, or mapimg, for raster map projection transformation (Usery and others, 2003a). Daniel R. Steinwand of Science Applications International Corporation, Earth Resources Observation Systems Data Center in Sioux Falls, S. Dak., originally developed mapimg for the USGS, basing it on the USGS's General Cartographic Transformation Package (GCTP). It operated as a command line program on the Unix operating system. Through efforts at MCMC, and in coordination with Mr. Steinwand, this program has been transformed from an application based on a command line into a software package based on a graphic user interface for Windows, Linux, and Unix machines. Usery and others (2003b) pointed out that many commercial software packages do not use exact projection equations and that even when exact projection equations are used, the software often results in error and sometimes does not complete the transformation for specific projections, at specific resampling resolutions, and for specific singularities. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in these software packages, but implementation with data other than points requires specific adaptation of the equations or prior preparation of the data to allow the transformation to succeed. Additional constraints apply to global raster data. It appears that some packages use the USGS's GCTP or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003b). It is most common for programs to compute transformations of raster data in an inverse fashion. Such mapping can result in an erroneous position and replicate data or create pixels not in the original space. As Usery and others (2003a) indicated, mapimg performs a corresponding forward transformation to ensure the same location results from both methods. The primary benefit of this function is to mask cells outside the domain. MapImage 1.01 is now on the Web. You can download the User's Guide, source, and binaries from the following site: http://mcmcweb.er.usgs.gov/carto_research/projection/acc_proj_data.html
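mapimg's key safeguard is the corresponding forward transformation used to check each inverse-mapped cell. The sketch below illustrates that round-trip test with pyproj rather than the USGS GCTP code; the projection pair and tolerance are illustrative.

    import numpy as np
    from pyproj import Transformer

    # Geographic (lon/lat) <-> Mollweide, an equal-area projection often used for global rasters.
    fwd = Transformer.from_crs("EPSG:4326", "ESRI:54009", always_xy=True)
    inv = Transformer.from_crs("ESRI:54009", "EPSG:4326", always_xy=True)

    def round_trips(x, y, tol_m=1.0):
        """True where an output-grid location, mapped back to lon/lat and forward
        again, returns to (approximately) the same projected position."""
        lon, lat = inv.transform(x, y)        # inverse: projected cell -> geographic space
        x2, y2 = fwd.transform(lon, lat)      # corresponding forward transformation
        return np.hypot(np.asarray(x) - np.asarray(x2),
                        np.asarray(y) - np.asarray(y2)) < tol_m

    x = np.array([0.0, 1.0e7, 1.7e7])
    y = np.array([0.0, 4.0e6, 8.0e6])
    print(round_trips(x, y))

Points that fail the check fall outside the projection's valid domain and would be masked, which is the primary benefit of the forward check described above.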
Open-source Software for Exoplanet Atmospheric Modeling
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Blecic, Jasmina; Harrington, Joseph
2018-01-01
I will present a suite of self-standing open-source tools to model and retrieve exoplanet spectra implemented for Python. These include: (1) a Bayesian-statistical package to run Levenberg-Marquardt optimization and Markov-chain Monte Carlo posterior sampling, (2) a package to compress line-transition data from HITRAN or Exomol without loss of information, (3) a package to compute partition functions for HITRAN molecules, (4) a package to compute collision-induced absorption, and (5) a package to produce radiative-transfer spectra of transit and eclipse exoplanet observations and atmospheric retrievals.
Deriving stellar parameters with the SME software package
NASA Astrophysics Data System (ADS)
Piskunov, N.
2017-09-01
Photometry and spectroscopy are complementary tools for deriving accurate stellar parameters. Here I present one of the popular packages for stellar spectroscopy called SME with the emphasis on the latest developments and error assessment for the derived parameters.
Variations in algorithm implementation among quantitative texture analysis software packages
NASA Astrophysics Data System (ADS)
Foy, Joseph J.; Mitta, Prerana; Nowosatka, Lauren R.; Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.; Al-Hallaq, Hania; Armato, Samuel G.
2018-02-01
Open-source texture analysis software allows for the advancement of radiomics research. Variations in texture features, however, result from discrepancies in algorithm implementation. Anatomically matched regions of interest (ROIs) that captured normal breast parenchyma were placed in the magnetic resonance images (MRI) of 20 patients at two time points. Six first-order features and six gray-level co-occurrence matrix (GLCM) features were calculated for each ROI using four texture analysis packages. Features were extracted using package-specific default GLCM parameters and using GLCM parameters modified to yield the greatest consistency among packages. Relative change in the value of each feature between time points was calculated for each ROI. Distributions of relative feature value differences were compared across packages. Absolute agreement among feature values was quantified by the intra-class correlation coefficient. Among first-order features, significant differences were found for max, range, and mean, and only kurtosis showed poor agreement. All six second-order features showed significant differences using package-specific default GLCM parameters, and five second-order features showed poor agreement; with modified GLCM parameters, no significant differences among second-order features were found, and all second-order features showed poor agreement. While relative texture change discrepancies existed across packages, these differences were not significant when consistent parameters were used.
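To make the parameter-dependence issue concrete, the sketch below computes a few first-order and GLCM features for an ROI with scikit-image; the gray-level count, distance and angle passed to graycomatrix are exactly the kind of settings whose defaults differed among the packages compared in the study (the functions are spelled greycomatrix/greycoprops in older scikit-image releases).

    import numpy as np
    from scipy import stats
    from skimage.feature import graycomatrix, graycoprops

    def roi_features(roi, levels=64, distance=1, angle=0.0):
        """A few first-order and GLCM features for a 2-D ROI of raw intensities."""
        first_order = {"mean": float(roi.mean()), "max": float(roi.max()),
                       "range": float(np.ptp(roi)),
                       "kurtosis": float(stats.kurtosis(roi, axis=None))}
        # Quantize to a fixed number of gray levels before building the GLCM --
        # one of the parameters whose defaults differ between packages.
        edges = np.linspace(roi.min(), roi.max(), levels + 1)[1:-1]
        q = np.digitize(roi, edges).astype(np.uint8)
        glcm = graycomatrix(q, [distance], [angle], levels=levels, symmetric=True, normed=True)
        second_order = {p: float(graycoprops(glcm, p)[0, 0])
                        for p in ("contrast", "correlation", "energy", "homogeneity")}
        return {**first_order, **second_order}

    roi = np.random.default_rng(0).integers(0, 4096, size=(32, 32))
    print(roi_features(roi))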
Integrated software system for low level waste management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Worku, G.
1995-12-31
In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.
gr-MRI: A software package for magnetic resonance imaging using software defined radios
NASA Astrophysics Data System (ADS)
Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.
2016-09-01
The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
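As a small, self-contained illustration of the frequency-swept waveform test mentioned above, a linear chirp can be generated and checked against its ideal instantaneous frequency before being handed to an SDR; this is plain NumPy/SciPy, not gr-MRI code, and the sweep parameters are arbitrary.

    import numpy as np
    from scipy.signal import chirp

    fs = 2_000_000                       # sample rate in Hz, arbitrary for illustration
    t = np.arange(0, 2e-3, 1 / fs)       # 2 ms pulse
    # Linear sweep covering a 500 kHz bandwidth, in the spirit of the fidelity
    # test described above.
    sweep = chirp(t, f0=0.0, t1=t[-1], f1=500_000.0, method="linear")

    # Ideal instantaneous frequency of the linear chirp, available for comparison
    # with what the SDR or spectrometer actually produces.
    ideal_freq = 500_000.0 * t / t[-1]
    print(sweep[:4], ideal_freq[-1])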
NASA Astrophysics Data System (ADS)
Hayrapetyan, David B.; Hovhannisyan, Levon; Mantashyan, Paytsar A.
2013-04-01
The analysis of complex spectra is a pressing problem for modern science. This work is devoted to the creation of a software package that analyzes spectra in different formats, possesses a dynamic knowledge database and a self-learning mechanism, and performs automated analysis of spectral composition based on the knowledge database by applying certain algorithms. Hyper-spherical random search algorithms, gradient algorithms and genetic search algorithms were used as search engines in the software package. Raman and IR spectra of diamond-like carbon (DLC) samples were analyzed with the developed program. After processing the data, the program immediately displays all the calculated parameters of the DLC.
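A common concrete instance of the automated decomposition described here is fitting the D and G bands of a DLC Raman spectrum with two Gaussian components. The sketch below does this with SciPy on synthetic data; the package in the abstract uses its own search algorithms and knowledge database rather than this simple least-squares fit.

    import numpy as np
    from scipy.optimize import curve_fit

    def two_gaussians(x, a1, c1, w1, a2, c2, w2):
        """Sum of two Gaussian bands, e.g. the D (~1350 1/cm) and G (~1580 1/cm) bands of DLC."""
        return (a1 * np.exp(-((x - c1) / w1) ** 2) +
                a2 * np.exp(-((x - c2) / w2) ** 2))

    # Synthetic spectrum for illustration only.
    x = np.linspace(1000, 1800, 400)
    rng = np.random.default_rng(1)
    y = two_gaussians(x, 0.6, 1350, 80, 1.0, 1580, 60) + rng.normal(0, 0.02, x.size)

    p0 = [0.5, 1340, 70, 1.0, 1570, 70]      # initial guesses near the expected band positions
    popt, _ = curve_fit(two_gaussians, x, y, p0=p0)
    a1, c1, w1, a2, c2, w2 = popt
    print(f"D band at {c1:.0f} 1/cm, G band at {c2:.0f} 1/cm, I(D)/I(G) = {a1 / a2:.2f}")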
Mold Heating and Cooling Pump Package Operator Interface Controls Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Josh A. Salmond
2009-08-07
The modernization of the Mold Heating and Cooling Pump Package Operator Interface (MHC PP OI) consisted of upgrading the antiquated single board computer with a proprietary operating system to off-the-shelf hardware and off-the-shelf software with customizable software options. The pump package is the machine interface between a central heating and cooling system that pumps heat transfer fluid through an injection or compression mold base on a local plastic molding machine. The operator interface provides the intelligent means of controlling this pumping process. Strict temperature control of a mold allows the production of high quality parts with tight tolerances and low residual stresses. The products fabricated are used on multiple programs.
EQS Goes R: Simulations for SEM Using the Package REQS
ERIC Educational Resources Information Center
Mair, Patrick; Wu, Eric; Bentler, Peter M.
2010-01-01
The REQS package is an interface between the R environment of statistical computing and the EQS software for structural equation modeling. The package consists of 3 main functions that read EQS script files and import the results into R, call EQS script files from R, and run EQS script files from R and import the results after EQS computations.…
PLATSIM: A Simulation and Analysis Package for Large-Order Flexible Systems. Version 2.0
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Kenny, Sean P.; Giesy, Daniel P.
1997-01-01
The software package PLATSIM provides efficient time and frequency domain analysis of large-order generic space platforms. PLATSIM can perform open-loop analysis or closed-loop analysis with linear or nonlinear control system models. PLATSIM exploits the particular form of sparsity of the plant matrices for very efficient linear and nonlinear time domain analysis, as well as frequency domain analysis. A new, original algorithm for the efficient computation of open-loop and closed-loop frequency response functions for large-order systems has been developed and is implemented within the package. Furthermore, a novel and efficient jitter analysis routine which determines jitter and stability values from time simulations in a very efficient manner has been developed and is incorporated in the PLATSIM package. In the time domain analysis, PLATSIM simulates the response of the space platform to disturbances and calculates the jitter and stability values from the response time histories. In the frequency domain analysis, PLATSIM calculates frequency response function matrices and provides the corresponding Bode plots. The PLATSIM software package is written in MATLAB script language. A graphical user interface is developed in the package to provide convenient access to its various features.
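The abstract does not spell out the new frequency-response algorithm, so the sketch below only illustrates the quantity being computed: H(jw) = C (jwI - A)^(-1) B + D evaluated with one sparse solve per frequency. This is the standard approach, written in Python rather than MATLAB; it is not PLATSIM's accelerated algorithm, and the two-mode system is illustrative.

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import spsolve

    def frequency_response(A, b, c, d, omegas):
        """H(jw) = c (jw I - A)^{-1} b + d for a single-input, single-output system,
        evaluated with one sparse solve per frequency."""
        I = sp.identity(A.shape[0], format="csc", dtype=complex)
        return np.array([c @ spsolve((1j * w * I - A).tocsc(), b) + d for w in omegas])

    # Tiny illustrative system: two lightly damped modes in sparse block-diagonal form.
    w1, w2, zeta = 2.0, 5.0, 0.01
    A = sp.block_diag(([[0.0, 1.0], [-w1**2, -2 * zeta * w1]],
                       [[0.0, 1.0], [-w2**2, -2 * zeta * w2]]), format="csc")
    b = np.array([0.0, 1.0, 0.0, 1.0])
    c = np.array([1.0, 0.0, 1.0, 0.0])
    print(np.abs(frequency_response(A, b, c, 0.0, np.linspace(0.1, 10.0, 5))))

Exploiting the sparsity of the plant matrices in this solve step is what makes frequency-domain analysis of large-order flexible systems tractable.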
Wang, Xiuquan; Huang, Guohe; Zhao, Shan; Guo, Junhong
2015-09-01
This paper presents an open-source software package, rSCA, which is developed based upon a stepwise cluster analysis method and serves as a statistical tool for modeling the relationships between multiple dependent and independent variables. The rSCA package is efficient in dealing with both continuous and discrete variables, as well as nonlinear relationships between the variables. It divides the sample sets of dependent variables into different subsets (or subclusters) through a series of cutting and merging operations based upon the theory of multivariate analysis of variance (MANOVA). The modeling results are given by a cluster tree, which includes both intermediate and leaf subclusters as well as the flow paths from the root of the tree to each leaf subcluster specified by a series of cutting and merging actions. The rSCA package is a handy and easy-to-use tool and is freely available at http://cran.r-project.org/package=rSCA . By applying the developed package to air quality management in an urban environment, we demonstrate its effectiveness in dealing with the complicated relationships among multiple variables in real-world problems.
ERIC Educational Resources Information Center
Bitter, Gary G., Ed.
1989-01-01
Describes three software packages: (1) "MacMendeleev"--database/graphic display for chemistry, grades 10-12, Macintosh; (2) "Geometry One: Foundations"--geometry tutorial, grades 7-12, IBM; (3) "Mathematics Exploration Toolkit"--algebra and calculus tutorial, grades 8-12, IBM. (MVL)
Custodial Management in the Information Age.
ERIC Educational Resources Information Center
Harris, Jim, Sr.
1999-01-01
Explains how computerizing the custodial department can be achieved through bar coding, hand-held readers, and the appropriate software packages. Software programs that aid cleaning management, track assets, and manage stock are discussed. (GR)
ERIC Educational Resources Information Center
Science and Children, 1990
1990-01-01
Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor";…
Understanding Medical Words Tutorial: Download Instructions
... are compressed into a zip format, not all software packages will follow the step-by-step directions ... http://www.winzip.com/aboutzip.htm) or similar software and be sure to extract ALL the files ...
The School Advanced Ventilation Engineering Software (SAVES)
The School Advanced Ventilation Engineering Software (SAVES) package is a tool to help school designers assess the potential financial payback and indoor humidity control benefits of Energy Recovery Ventilation (ERV) systems for school applications.
ERIC Educational Resources Information Center
Classroom Computer Learning, 1990
1990-01-01
Reviewed are three computer software packages including "Martin Luther King, Jr.: Instant Replay of History,""Weeds to Trees," and "The New Print Shop, School Edition." Discussed are hardware requirements, costs, grade levels, availability, emphasis, strengths, and weaknesses. (CW)
Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio
2011-01-01
Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, and x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals cost-free access to the most recent developments in the field. The software package is a step forward towards harmonization and standardization. Embedded functionalities render it a suitable tool for education, research, and for receiving distant experts' opinions. Another objective of this effort is to allow the introduction of clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, it provides an effective teaching tool for young professionals who are being introduced to dynamic kidney studies by selected teaching case studies. The software facilitates a better understanding through practically approaching different variables and settings and their effect on the numerical results. An effort was made to introduce instruments of quality assurance at the various levels of the program's execution, including visual inspection and automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both primary dynamic and postmicturition studies. The user can calculate the differential renal function through 2 independent methods, the integral and the Rutland-Patlak approaches. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and of numerical outputs. The software package is undergoing quality assurance procedures to verify the accuracy and the interuser reproducibility with the final aim of launching the program for use by professionals and teaching institutions worldwide. Copyright © 2011 Elsevier Inc. All rights reserved.
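The Rutland-Patlak approach mentioned above estimates relative uptake from the slope of kidney counts over blood-pool counts plotted against the normalized integral of the blood curve. The sketch below applies that idea to synthetic curves with an arbitrary fit window; it is illustrative only and not the IAEA package's implementation.

    import numpy as np

    def patlak_slope(t, kidney, blood, window):
        """Rutland-Patlak plot: kidney(t)/blood(t) against integral(blood)/blood(t);
        the slope over the early uptake window is proportional to renal uptake."""
        cumint = np.array([np.trapz(blood[: i + 1], t[: i + 1]) for i in range(len(t))])
        x, y = cumint / blood, kidney / blood
        sel = (t >= window[0]) & (t <= window[1])
        slope, _ = np.polyfit(x[sel], y[sel], 1)
        return slope

    # Synthetic 1-s frames for illustration only.
    t = np.arange(0.0, 180.0, 1.0)
    blood = np.exp(-t / 200.0) + 0.2
    cumint = np.array([np.trapz(blood[: i + 1], t[: i + 1]) for i in range(len(t))])
    left = 0.008 * cumint + 0.05 * blood      # left kidney takes up less tracer here
    right = 0.012 * cumint + 0.05 * blood

    k_left = patlak_slope(t, left, blood, (30.0, 120.0))
    k_right = patlak_slope(t, right, blood, (30.0, 120.0))
    print(f"split renal function, left kidney: {100 * k_left / (k_left + k_right):.1f}%")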
Pegg, Elise C; Gill, Harinderjit S
2016-09-06
A new software tool to assign the material properties of bone to an ABAQUS finite element mesh was created and compared with Bonemat, a similar tool originally designed to work with Ansys finite element models. Our software tool (py_bonemat_abaqus) was written in Python, which is the chosen scripting language for ABAQUS. The purpose of this study was to compare the software packages in terms of the material assignment calculation and processing speed. Three element types were compared (linear hexahedral (C3D8), linear tetrahedral (C3D4) and quadratic tetrahedral elements (C3D10)), both individually and as part of a mesh. Comparisons were made using a CT scan of a hemi-pelvis as a test case. A small difference, of -0.05 kPa on average, was found between Bonemat version 3.1 (the current version) and our Python package. Errors were found in the previous release of Bonemat (version 3.0 downloaded from www.biomedtown.org) during calculation of the quadratic tetrahedron Jacobian, and conversion of the apparent density to modulus when integrating over the Young's modulus field. These issues caused up to 2 GPa error in the modulus assignment. For these reasons, we recommend users upgrade to the most recent release of Bonemat. Processing speeds were assessed for the three different element types. Our Python package took significantly longer (110 s on average) to perform the calculations compared with the Bonemat software (10 s). Nevertheless, the workflow advantages of the package and its added functionality make 'py_bonemat_abaqus' a useful tool for ABAQUS users. Copyright © 2016 Elsevier Ltd. All rights reserved.
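The density-to-modulus conversion at the heart of both tools is typically a calibration from CT Hounsfield units to apparent density followed by a power law. The sketch below uses made-up calibration coefficients (neither Bonemat's nor py_bonemat_abaqus's validated values) and illustrates why the order of averaging and conversion matters, which is related to the integration issue reported above.

    import numpy as np

    def hu_to_modulus(hu, rho_slope=0.0007, rho_intercept=0.0, a=6.95, b=1.49):
        """Map CT Hounsfield units to Young's modulus (GPa): apparent density
        rho = intercept + slope * HU (g/cm^3), then E = a * rho**b.
        All four coefficients are illustrative placeholders, not validated values."""
        rho = np.clip(rho_intercept + rho_slope * np.asarray(hu, dtype=float), 1e-6, None)
        return a * rho ** b

    # Element-wise assignment: here the modulus is averaged over the HU values
    # sampled inside each element (averaging density first and converting once
    # gives a different answer because the power law is non-linear).
    element_hu = {1: [250, 300, 280], 2: [900, 950, 1020]}
    print({eid: float(np.mean(hu_to_modulus(hu))) for eid, hu in element_hu.items()})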
Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs
2011-01-01
This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.
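As a small illustration of the multitaper spectral estimation mentioned here, the sketch below averages periodograms over DPSS tapers using SciPy; FieldTrip itself is a MATLAB toolbox, so this is a conceptual stand-in rather than FieldTrip code.

    import numpy as np
    from scipy.signal.windows import dpss

    def multitaper_psd(x, fs, nw=3.0):
        """Average the periodograms obtained with a family of DPSS (Slepian) tapers."""
        n = len(x)
        k = int(2 * nw) - 1                    # a standard choice for the number of tapers
        tapers = dpss(n, nw, Kmax=k)           # shape (k, n)
        spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
        return np.fft.rfftfreq(n, d=1 / fs), spectra.mean(axis=0) / fs

    fs = 1000.0
    t = np.arange(0.0, 2.0, 1 / fs)
    x = np.sin(2 * np.pi * 20 * t) + 0.5 * np.random.default_rng(0).standard_normal(t.size)
    freqs, psd = multitaper_psd(x, fs)
    print(f"spectral peak at {freqs[np.argmax(psd)]:.1f} Hz")   # expected near 20 Hz

Averaging over several orthogonal tapers trades a controlled amount of spectral smoothing for a large reduction in the variance of the estimate, which is the rationale for the multitaper option in time-frequency analysis.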
A Roadmap to Continuous Integration for ATLAS Software Development
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, for verifying patches to existing software and for migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open-source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short and long term plans for the incorporation of CI practices.
Report of AAPM Task Group 162: Software for planar image quality metrology.
Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J
2018-02-01
The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE) together with its basis components and derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, modulation transfer function (MTF) using an edge test object, the DQE, and effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built using a Macintosh OSX operating system. The software package contains all the source codes to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals and sample images and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as a baseline for characterization of inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
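The final quantity such a bundle reports can be assembled as DQE(f) = MTF(f)^2 / (q x NNPS(f)), where NNPS is the NPS normalized by the squared large-area signal and q is the incident photon fluence. The sketch below shows only that assembly step with illustrative numbers; the measurement of MTF and NPS themselves, which the report describes in detail, is not reproduced here.

    import numpy as np

    def dqe(mtf, nps, mean_signal, fluence_per_mm2):
        """DQE(f) = MTF(f)^2 / (q * NNPS(f)), with NNPS(f) = NPS(f) / mean_signal^2
        and q the incident photon fluence per mm^2."""
        nnps = np.asarray(nps) / mean_signal**2
        return np.asarray(mtf) ** 2 / (fluence_per_mm2 * nnps)

    # Illustrative numbers only, not measured data.
    freqs = np.linspace(0.0, 3.0, 7)              # cycles/mm
    mtf = np.exp(-freqs / 2.0)
    nps = np.full_like(freqs, 6.7)                # flat noise power, (signal units)^2 * mm^2
    print(dqe(mtf, nps, mean_signal=1000.0, fluence_per_mm2=2.5e5))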
NASA Technical Reports Server (NTRS)
Nickum, J. D.
1978-01-01
The software package developed for the KIM-1 Micro-System and the Mini-L PLL receiver to simplify taking flight test data is described along with the address and data bus buffers used in the KIM-1 Micro-system. The interface hardware and timing are also presented to describe completely the software programs.
Reviews of Instructional Software in Scholarly Journals: A Selected Bibliography.
ERIC Educational Resources Information Center
Bantz, David A.; And Others
This bibliography lists reviews of more than 100 instructional software packages, which are arranged alphabetically by discipline. Information provided for each entry includes the topical emphasis, type of software (i.e., simulation, tutorial, analysis tool, test generator, database, writing tool, drill, plotting tool, videodisc), the journal…
ERIC Educational Resources Information Center
Watkins, Beverly T.
1992-01-01
Course Technology Inc. has developed 10 products combining textbooks with commercial software for college accounting, business, computer science, and statistics courses. Five of the products use Lotus 1-2-3 spreadsheet software. The products have been positively received by teachers and students. (DB)
Methods and Software for Building Bibliographic Data Bases.
ERIC Educational Resources Information Center
Daehn, Ralph M.
1985-01-01
This in-depth look at database management systems (DBMS) for microcomputers covers data entry, information retrieval, security, DBMS software and design, and downloading of literature search results. The advantages of in-house systems versus online search vendors are discussed, and specifications of three software packages and 14 sources are…
Out of This World Software. Teaching with Technology.
ERIC Educational Resources Information Center
Allen, Denise
1996-01-01
Explains that children have a better sense of environmental awareness and introduces the EarthCare Interactive software that is designed for preschoolers to fourth graders. Briefly describes how to navigate in EarthCare and describes four other software packages: "Job City Literature Connections,""Perfect Landing,""Classroom Occupation," and…
Science for the Home: New Products Tackle Such Weighty Subjects as Immunology, Chemistry.
ERIC Educational Resources Information Center
Mace, Scott
1984-01-01
Discusses trends in science software for home and educational use. Examples of software on various science topics are provided, including packages which revolve around such television shows as "Nova" and "Voyage of the Mimi" and those produced by the Human Engineering Software. (JN)
Solernou, Albert; Hanson, Benjamin S; Richardson, Robin A; Welch, Robert; Read, Daniel J; Harlen, Oliver G; Harris, Sarah A
2018-03-01
Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations, which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.
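FFEA's integrator couples continuum finite elements with stochastic thermal noise; as a generic, hedged illustration of combining deterministic elastic forces with fluctuation-dissipation-consistent noise at mesh nodes, here is a minimal overdamped (Brownian dynamics) sketch. It is not FFEA's actual scheme, and the toy spring force and all parameters are invented for illustration.

```python
import numpy as np

# Generic illustration of mixing deterministic (elastic) nodal forces with
# thermal noise satisfying fluctuation-dissipation in an overdamped update.
# This is NOT FFEA's integrator; the toy force field and parameters below
# are illustrative only.

kB = 1.380649e-23   # J/K

def brownian_step(x, force, gamma, temperature, dt, rng):
    """One overdamped step: x += dt/gamma * F(x) + sqrt(2 kB T dt / gamma) * xi."""
    noise = rng.standard_normal(x.shape)
    return (x
            + (dt / gamma) * force(x)
            + np.sqrt(2.0 * kB * temperature * dt / gamma) * noise)

# Toy "mesh": nodes tethered to their rest positions by harmonic springs.
rng = np.random.default_rng(0)
x0 = rng.normal(scale=1e-9, size=(100, 3))   # rest positions (m)
k_spring = 1e-3                              # N/m (illustrative)
force = lambda x: -k_spring * (x - x0)

x = x0.copy()
for _ in range(1000):
    x = brownian_step(x, force, gamma=2e-11, temperature=300.0, dt=1e-12, rng=rng)
print("RMS thermal displacement (m):", np.sqrt(np.mean((x - x0) ** 2)))
```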
Open Source Molecular Modeling
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-01-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126
Ghosh, Adarsh; Singh, Tulika; Singla, Veenu; Bagga, Rashmi; Khandelwal, Niranjan
2017-12-01
Apparent diffusion coefficient (ADC) maps are usually generated by built-in software provided by the MRI scanner vendors; however, various open-source postprocessing software packages are available for image manipulation and parametric map generation. The purpose of this study is to establish the reproducibility of absolute ADC values obtained using different postprocessing software programs. DW images with three b values were obtained with a 1.5-T MRI scanner, and the trace images were obtained. ADC maps were automatically generated by the in-line software provided by the vendor during image generation and were also separately generated with postprocessing software. These ADC maps were compared on the basis of ROIs using the paired t test, Bland-Altman plot, mountain plot, and Passing-Bablok regression plot. There was a statistically significant difference in the mean ADC values obtained from the different postprocessing software programs when the same baseline trace DW images were used for ADC map generation. For using ADC values as a quantitative cutoff for histologic characterization of tissues, standardization of the postprocessing algorithm is essential across processing software packages, especially in view of the implementation of vendor-neutral archiving.
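For context, the monoexponential model underlying most ADC maps is S(b) = S0·exp(-b·ADC), usually fitted per voxel by log-linear least squares over the acquired b values; differences in fitting, masking, and noise handling are one plausible source of the inter-software discrepancies reported here. A minimal NumPy sketch with illustrative shapes and values:

```python
import numpy as np

# Standard monoexponential ADC estimate:
#   S(b) = S0 * exp(-b * ADC)  =>  ln S = ln S0 - b * ADC
# fitted per voxel by linear least squares over the acquired b values.
# Shapes and names below are illustrative, not tied to any vendor package.

def adc_map(dwi, bvals, eps=1e-6):
    """dwi: (nb, ny, nx) trace-weighted images; bvals: (nb,) in s/mm^2.
    Returns an ADC map in mm^2/s."""
    bvals = np.asarray(bvals, dtype=float)
    logs = np.log(np.clip(dwi, eps, None)).reshape(len(bvals), -1)  # (nb, nvox)
    A = np.column_stack([np.ones_like(bvals), -bvals])              # ln S0, ADC
    coef, *_ = np.linalg.lstsq(A, logs, rcond=None)                 # (2, nvox)
    return coef[1].reshape(dwi.shape[1:])

# Synthetic check: uniform ADC of 1.0e-3 mm^2/s
b = np.array([0.0, 500.0, 1000.0])
truth = 1.0e-3
dwi = 1000.0 * np.exp(-b[:, None, None] * truth) * np.ones((3, 4, 4))
print(adc_map(dwi, b).mean())   # ~1.0e-3
```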
A self-referential HOWTO on release engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galassi, Mark C.
Release engineering is a fundamental part of the software development cycle: it is the point at which quality control is exercised and bug fixes are integrated. The way in which software is released also gives the end user her first experience of a software package, while in scientific computing release engineering can guarantee reproducibility. For these reasons and others, the release process is a good indicator of the maturity and organization of a development team. Software teams often do not put in place a release process at the beginning. This is unfortunate because the team does not have early and continuous execution of test suites, and it does not exercise the software in the same conditions as the end users. I describe an approach to release engineering based on the software tools developed and used by the GNU project, together with several specific proposals related to packaging and distribution. I do this in a step-by-step manner, demonstrating how this very paper is written and built using proper release engineering methods. Because many aspects of release engineering are not exercised in the building of the paper, the accompanying software repository also contains examples of software libraries.
DMRfinder: efficiently identifying differentially methylated regions from MethylC-seq data.
Gaspar, John M; Hart, Ronald P
2017-11-29
DNA methylation is an epigenetic modification that is studied at a single-base resolution with bisulfite treatment followed by high-throughput sequencing. After alignment of the sequence reads to a reference genome, methylation counts are analyzed to determine genomic regions that are differentially methylated between two or more biological conditions. Even though a variety of software packages are available for different aspects of the bioinformatics analysis, they often produce biased results or impose excessive computational requirements. DMRfinder is a novel computational pipeline that identifies differentially methylated regions efficiently. Following alignment, DMRfinder extracts methylation counts and performs a modified single-linkage clustering of methylation sites into genomic regions. It then compares methylation levels using beta-binomial hierarchical modeling and Wald tests. Among its innovative attributes are the analyses of novel methylation sites and methylation linkage, as well as the simultaneous statistical analysis of multiple sample groups. To demonstrate its efficiency, DMRfinder is benchmarked against other computational approaches using a large published dataset. Contrasting two replicates of the same sample yielded minimal genomic regions with DMRfinder, whereas two alternative software packages reported a substantial number of false positives. Further analyses of biological samples revealed fundamental differences between DMRfinder and another software package, despite the fact that they utilize the same underlying statistical basis. For each step, DMRfinder completed the analysis in a fraction of the time required by other software. Among the computational approaches for identifying differentially methylated regions from high-throughput bisulfite sequencing datasets, DMRfinder is the first that integrates all the post-alignment steps in a single package. Compared to other software, DMRfinder is extremely efficient and unbiased in this process. DMRfinder is free and open-source software, available on GitHub (github.com/jsh58/DMRfinder); it is written in Python and R, and is supported on Linux.
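As a hedged sketch of the general clustering idea (not DMRfinder's actual implementation, parameters, or defaults), consecutive methylation sites can be grouped into candidate regions by single-linkage on genomic distance:

```python
# Group methylation sites into candidate regions by single-linkage on
# genomic distance: consecutive sites closer than max_gap are merged, and
# clusters with fewer than min_sites are dropped. This is only the general
# idea behind DMRfinder's clustering step; its actual parameters, defaults,
# and modifications differ.

def cluster_sites(positions, max_gap=100, min_sites=3):
    """positions: sorted coordinates of methylation sites on one chromosome.
    Returns a list of (start, end, n_sites) candidate regions."""
    regions, current = [], [positions[0]]
    for pos in positions[1:]:
        if pos - current[-1] <= max_gap:
            current.append(pos)
        else:
            if len(current) >= min_sites:
                regions.append((current[0], current[-1], len(current)))
            current = [pos]
    if len(current) >= min_sites:
        regions.append((current[0], current[-1], len(current)))
    return regions

print(cluster_sites([100, 150, 180, 400, 420, 1000]))
# -> [(100, 180, 3)] with the hypothetical defaults above
```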
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tharrington, Arnold N.
2015-09-09
The NCCS Regression Test Harness is a software package that provides a framework to perform regression and acceptance testing on NCCS High Performance Computers. The package is written in Python and has only the dependency of a Subversion repository to store the regression tests.
Pre-Mastering and CD-WO Evaluations
NASA Technical Reports Server (NTRS)
Hecox, D.; Hyon, J.; Martin, M.; Marski, K.; Shields, E.; Sorensen, S.; Teramae, S.
1993-01-01
This article reviews the features and functionality of five desktop pre-mastering software packages for the PC. Desktop pre-mastering packages are aimed primarily at end-users interested in bringing CD-ROM publishing tasks in-house, rather than traditional CD-ROM developers.
24 CFR 208.108 - Requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... package to process certifications and recertifications and to provide subsidy billings to HUD must update their software packages and begin electronic transmission of that data in a HUD specified format by... TRANSMISSION OF REQUIRED DATA FOR CERTIFICATION AND RECERTIFICATION AND SUBSIDY BILLING PROCEDURES FOR...
24 CFR 208.108 - Requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... package to process certifications and recertifications and to provide subsidy billings to HUD must update their software packages and begin electronic transmission of that data in a HUD specified format by... TRANSMISSION OF REQUIRED DATA FOR CERTIFICATION AND RECERTIFICATION AND SUBSIDY BILLING PROCEDURES FOR...
24 CFR 208.108 - Requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... package to process certifications and recertifications and to provide subsidy billings to HUD must update their software packages and begin electronic transmission of that data in a HUD specified format by... TRANSMISSION OF REQUIRED DATA FOR CERTIFICATION AND RECERTIFICATION AND SUBSIDY BILLING PROCEDURES FOR...
24 CFR 208.108 - Requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... package to process certifications and recertifications and to provide subsidy billings to HUD must update their software packages and begin electronic transmission of that data in a HUD specified format by... TRANSMISSION OF REQUIRED DATA FOR CERTIFICATION AND RECERTIFICATION AND SUBSIDY BILLING PROCEDURES FOR...
SolarSoft Desat Package for the Recovery of Saturated AIA Flare Images
NASA Astrophysics Data System (ADS)
Schwartz, Richard Alan; Torre, Gabriele; Piana, Michele; Massone, AnnaMaria
2015-04-01
The dynamic range of EUV images has been limited by the problem of CCD saturation, as seen countless times in movies of solar flares made using the Solar Dynamics Observatory's Atmospheric Imaging Assembly (SDO AIA). Concurrent with the saturation are the eight rays emanating from the saturation locus, which are the result of diffraction off the wire meshes that support the EUV passband filters. This, in a nutshell, is the problem and the basis of its solution. By utilizing techniques similar to those used for making images from the rotating modulation collimators on the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI), we have developed a software package that can be used to make images of the EUV flare kernels in a highly automated way, as described in Schwartz et al. (2014). Starting from cutouts centered on a flaring region, the software uses the point-spread function (PSF) of the diffraction pattern to identify and reconstruct the region of the primary saturation. The software also uses the best information available to reconstruct the general scene obscured by overflow saturation and subtracts away the diffraction fringes. It is not a complete correction for the PSF but is intended above all to recover the flare images. The software is freely available and distributed within the DESAT package of SolarSoft. (Schwartz, R. A., Torre, G., & Piana, M. (2014), Astrophysical Journal Letters, 793, L23)
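As a toy illustration of the inversion concept (not the DESAT algorithm or its SolarSoft interface), the unsaturated fringe pixels can be modeled as the diffraction PSF applied to the unknown flux in the saturated core, and the core then recovered by least squares restricted to those pixels. The PSF, masks, and values below are entirely synthetic.

```python
import numpy as np
from scipy.signal import convolve2d

# Toy sketch of PSF-based desaturation: model fringe pixels as the PSF
# convolved with the unknown core flux, then solve for the core by least
# squares over the fringe pixels only. NOT the DESAT code; illustration only.

def estimate_core(image, psf, core_mask, fringe_mask):
    """image: 2D array; psf: 2D kernel with odd sizes; masks: boolean arrays.
    Returns estimated fluxes for the pixels in core_mask (row-major order)."""
    ky, kx = psf.shape[0] // 2, psf.shape[1] // 2
    core_idx = np.argwhere(core_mask)
    fringe_idx = np.argwhere(fringe_mask)
    # A[i, j] = PSF response at fringe pixel i from unit flux at core pixel j
    A = np.zeros((len(fringe_idx), len(core_idx)))
    for j, (cy, cx) in enumerate(core_idx):
        for i, (fy, fx) in enumerate(fringe_idx):
            dy, dx = fy - cy + ky, fx - cx + kx
            if 0 <= dy < psf.shape[0] and 0 <= dx < psf.shape[1]:
                A[i, j] = psf[dy, dx]
    flux, *_ = np.linalg.lstsq(A, image[fringe_mask], rcond=None)
    return np.clip(flux, 0.0, None)   # fluxes are non-negative

# Tiny synthetic demo: a 3x3 "saturated" core with a cross-shaped PSF
psf = np.zeros((7, 7))
psf[3, :] = 0.05
psf[:, 3] = 0.05
psf[3, 3] = 1.0
truth = np.zeros((21, 21))
truth[9:12, 9:12] = [[0, 50, 0], [50, 200, 50], [0, 50, 0]]
obs = convolve2d(truth, psf, mode="same")
core = np.zeros_like(obs, dtype=bool)
core[9:12, 9:12] = True
fringe = (obs > 0.5) & ~core
print(np.round(estimate_core(obs, psf, core, fringe)).reshape(3, 3))
```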
Hill, Jon; Davis, Katie E
2014-01-01
Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology, and conservation and biodiversity. No easy-to-use and fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses well-defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) using a revised storage format that integrates both tree data and metadata into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line based tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.
GP Workbench Manual: Technical Manual, User's Guide, and Software Guide
Oden, Charles P.; Moulton, Craig W.
2006-01-01
GP Workbench is an open-source general-purpose geophysical data processing software package written primarily for ground penetrating radar (GPR) data. It also includes support for several USGS prototype electromagnetic instruments such as the VETEM and ALLTEM. The two main programs in the package are GP Workbench and GP Wave Utilities. GP Workbench has routines for filtering, gridding, and migrating GPR data, as well as an inversion routine for characterizing UXO (unexploded ordnance) using ALLTEM data. GP Workbench provides two-dimensional (section view) and three-dimensional (plan view or time slice view) processing for GPR data. GP Workbench can produce high-quality graphics for reports when Surfer 8 or higher (Golden Software) is installed. GP Wave Utilities provides a wide range of processing algorithms for single waveforms, such as filtering, correlation, deconvolution, and calculating GPR waveforms. GP Wave Utilities is used primarily for calibrating radar systems and processing individual traces. Both programs also contain research features related to the calibration of GPR systems and calculating subsurface waveforms. The software is written to run on the Windows operating systems. GP Workbench can import GPR data file formats used by major commercial instrument manufacturers including Sensors and Software, GSSI, and Mala. The GP Workbench native file format is SU (Seismic Unix), and consequently, files generated by GP Workbench can be read by Seismic Unix as well as many other data processing packages.
NASA Technical Reports Server (NTRS)
Stefanski, Philip L.
2015-01-01
Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.
ERIC Educational Resources Information Center
Science and Children, 1989
1989-01-01
Reviews of seven software packages are presented including "The Environment I: Habitats and EcoSystems; II Cycles and Interactions"; "Super Sign Maker"; "The Great Knowledge Race: Substance Abuse"; "Exploring Science: Temperature"; "Fast Food Calculator and RD Aide"; "The Human Body:…
ERIC Educational Resources Information Center
Ruben, Barbara
1994-01-01
Reviews a number of interactive environmental computer education networks and software packages. Computer networks include National Geographic Kids Network, Global Lab, and Global Rivers Environmental Education Network. Computer software involve environmental decision making, simulation games, tropical rainforests, the ocean, the greenhouse…
ERIC Educational Resources Information Center
Mackenzie, Norma N.; And Others
1988-01-01
Reviews four computer software packages including: "The Physical Science Series: Sound" which demonstrates making waves, speed of sound, doppler effect, and human hearing; "Andromeda" depicting celestial motions in any direction; "Biology Quiz: Humans" covering chemistry, cells, viruses, and human biology; and…
Computer Center: Software Review.
ERIC Educational Resources Information Center
Duhrkopf, Richard, Ed.; Belshe, John F., Ed.
1988-01-01
Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color graphics capabilities. Describes the documentation, presentation, and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)
Building CHAOS: An Operating System for Livermore Linux Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garlick, J E; Dunlap, C M
2003-02-21
The Livermore Computing (LC) Linux Integration and Development Project (the Linux Project) produces and supports the Clustered High Availability Operating System (CHAOS), a cluster operating environment based on Red Hat Linux. Each CHAOS release begins with a set of requirements and ends with a formally tested, packaged, and documented release suitable for use on LC's production Linux clusters. One characteristic of CHAOS is that component software packages come from different sources under varying degrees of project control. Some are developed by the Linux Project, some are developed by other LC projects, some are external open source projects, and some are commercial software packages. A challenge to the Linux Project is to adhere to release schedules and testing disciplines in a diverse, highly decentralized development environment. Communication channels are maintained for externally developed packages in order to obtain support, influence development decisions, and coordinate/understand release schedules. The Linux Project embraces open source by releasing locally developed packages under open source license, by collaborating with open source projects where mutually beneficial, and by preferring open source over proprietary software. Project members generally use open source development tools. The Linux Project requires system administrators and developers to work together to resolve problems that arise in production. This tight coupling of production and development is a key strategy for making a product that directly addresses LC's production requirements. It is another challenge to balance support and development activities in such a way that one does not overwhelm the other.
XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.
Ching, Daniel J; Gürsoy, Doğa
2017-03-01
The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.
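A generic sketch of the evaluation loop described, using scikit-image rather than XDesign's own interfaces: build a simple simulated phantom, acquire projections under a chosen scheme, reconstruct with filtered back-projection, and score the result. The phantom, acquisition scheme, and metric are illustrative only.

```python
import numpy as np
from skimage.transform import radon, iradon

# Build a simple circular phantom, simulate projections for a given number
# of view angles, reconstruct with filtered back-projection, and report a
# simple quantitative score (RMSE). Not XDesign's API; illustration only.

def circle_phantom(n=128):
    yy, xx = np.mgrid[:n, :n]
    img = np.zeros((n, n))
    img[(yy - n/2)**2 + (xx - n/2)**2 < (0.4*n)**2] = 1.0   # body
    img[(yy - n/3)**2 + (xx - n/3)**2 < (0.08*n)**2] = 2.0  # small feature
    return img

def evaluate_scheme(phantom, n_angles):
    theta = np.linspace(0.0, 180.0, n_angles, endpoint=False)
    sino = radon(phantom, theta=theta)             # simulated acquisition
    recon = iradon(sino, theta=theta)              # filtered back-projection
    return np.sqrt(np.mean((recon - phantom)**2))  # RMSE as a simple metric

phantom = circle_phantom()
for n_angles in (30, 90, 180):
    print(n_angles, "views -> RMSE", round(evaluate_scheme(phantom, n_angles), 4))
```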
Gas flow calculation method of a ramjet engine
NASA Astrophysics Data System (ADS)
Kostyushin, Kirill; Kagenov, Anuar; Eremin, Ivan; Zhiltsov, Konstantin; Shuvarikov, Vladimir
2017-11-01
In the present study, a methodology for calculating the gas dynamics equations in a ramjet engine is presented. The algorithm is based on Godunov's scheme. To implement the calculation algorithm, a data storage scheme is proposed that does not depend on mesh topology and allows the use of computational meshes with an arbitrary number of cell faces. An algorithm for building a block-structured grid is given. The calculation algorithm is implemented in the software package "FlashFlow". The software package is verified on calculations of simple air intake configurations and scramjet models.
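As a minimal illustration of a Godunov-type finite-volume update (not FlashFlow code, which solves the full gas-dynamics equations on arbitrary polyhedral meshes), here is a 1D sketch for the inviscid Burgers equation with the exact Riemann-solver flux:

```python
import numpy as np

# 1D Godunov finite-volume sketch for u_t + (u^2/2)_x = 0, using the exact
# Riemann-solver (Godunov) flux at each cell interface. Illustrative only.

def godunov_flux(uL, uR):
    f = lambda u: 0.5 * u * u
    if uL > uR:                      # shock
        s = 0.5 * (uL + uR)          # shock speed
        return f(uL) if s > 0.0 else f(uR)
    if uL > 0.0:                     # rarefaction moving right
        return f(uL)
    if uR < 0.0:                     # rarefaction moving left
        return f(uR)
    return 0.0                       # transonic rarefaction

def step(u, dx, dt):
    flux = np.array([godunov_flux(u[i], u[i + 1]) for i in range(len(u) - 1)])
    unew = u.copy()
    unew[1:-1] -= dt / dx * (flux[1:] - flux[:-1])
    return unew                      # boundary cells held fixed

n, dx = 200, 1.0 / 200
x = (np.arange(n) + 0.5) * dx
u = np.where(x < 0.5, 1.0, 0.0)      # Riemann data -> right-moving shock
dt = 0.4 * dx / max(abs(u).max(), 1e-12)   # CFL-limited time step
for _ in range(100):
    u = step(u, dx, dt)
print("shock position ~", x[np.argmin(np.abs(u - 0.5))])  # expect ~0.6 at t=0.2
```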
The development of an engineering computer graphics laboratory
NASA Technical Reports Server (NTRS)
Anderson, D. C.; Garrett, R. E.
1975-01-01
Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN IV subroutine package, in conjunction with a PDP 11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.
XDesign: An open-source software package for designing X-ray imaging phantoms and experiments
Ching, Daniel J.; Gürsoy, Doğa
2017-02-21
Here, the development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.
Scherer, Michael D; Kattadiyil, Mathew T; Parciak, Ewa; Puri, Shweta
2014-01-01
Three-dimensional radiographic imaging for dental implant treatment planning is gaining widespread interest and popularity. However, application of the data from 3D imaging can initially be a complex and daunting process. The purpose of this article is to describe features of three software packages and the respective computerized guided surgical templates (GST) fabricated from them. A step-by-step method of interpreting and ordering a GST to simplify the process of surgical planning and implant placement is discussed.
Lessons Learned in Cyberspace Security
2014-06-01
software; something undesirable is packaged together with something desirable. A classic example was the Elf Bowling attachment, which ran rampant through...the authors' former school. It combined a fun program featuring elves as bowling pins; however, it was packaged with SubSeven (Sub7) malware that...allowed remote access to the infected machine. IExpress, which is delivered in the Windows OS, is one of the legitimate tools for packaging multiple
Virtual rough samples to test 3D nanometer-scale scanning electron microscopy stereo photogrammetry.
Villarrubia, J S; Tondare, V N; Vladár, A E
2016-01-01
The combination of scanning electron microscopy for high spatial resolution, images from multiple angles to provide 3D information, and commercially available stereo photogrammetry software for 3D reconstruction offers promise for nanometer-scale dimensional metrology in 3D. A method is described to test 3D photogrammetry software by the use of virtual samples: mathematical samples from which simulated images are made for use as inputs to the software under test. The virtual sample is constructed by wrapping a rough skin with any desired power spectral density around a smooth near-trapezoidal line with rounded top corners. Reconstruction is performed with images simulated from different angular viewpoints. The software's reconstructed 3D model is then compared to the known geometry of the virtual sample. Three commercial photogrammetry software packages were tested. Two of them produced results for line height and width that were within about 1 nm of the correct values. All of the packages exhibited some difficulty in reconstructing details of the surface roughness.
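One standard way to synthesize roughness with a prescribed power spectral density is to give each Fourier mode an amplitude derived from the target PSD and a random phase, then inverse-transform. The sketch below shows only this 1D roughness-generation step with an illustrative PSD; the paper's virtual samples wrap such roughness around a 3D near-trapezoidal line.

```python
import numpy as np

# Synthesize a random rough profile with a prescribed one-sided PSD S(f):
# assign each Fourier mode the amplitude N * sqrt(S(f_k) * df / 2) (matching
# NumPy's irfft normalization) and a uniformly random phase, then invert.
# The PSD below is illustrative only.

rng = np.random.default_rng(1)
N, dx = 4096, 1.0             # samples, spacing in nm
f = np.fft.rfftfreq(N, d=dx)  # cycles/nm
df = f[1] - f[0]

# Illustrative PSD: band-limited 1/f^2 roughness, in nm^3
S = np.zeros_like(f)
band = (f > 1e-3) & (f < 0.2)
S[band] = 1e-4 / f[band]**2

phases = rng.uniform(0.0, 2.0 * np.pi, size=f.size)
Z = N * np.sqrt(S * df / 2.0) * np.exp(1j * phases)
Z[0] = 0.0                    # zero-mean profile
profile = np.fft.irfft(Z, n=N)

print("target RMS (nm):", np.sqrt(np.sum(S * df)))
print("sample RMS (nm):", profile.std())
```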
GenoMatrix: A Software Package for Pedigree-Based and Genomic Prediction Analyses on Complex Traits.
Nazarian, Alireza; Gezan, Salvador Alejandro
2016-07-01
Genomic and pedigree-based best linear unbiased prediction methodologies (G-BLUP and P-BLUP) have proven themselves efficient for partitioning the phenotypic variance of complex traits into its components, estimating the individuals' genetic merits, and predicting unobserved (or yet-to-be-observed) phenotypes in many species and fields of study. The GenoMatrix software, presented here, is a user-friendly package to facilitate the process of using genome-wide marker data and parentage information for G-BLUP and P-BLUP analyses on complex traits. It provides users with a collection of applications that help them with a set of tasks, from performing quality control on the data to constructing and manipulating the genomic and pedigree-based relationship matrices and obtaining their inverses. Such matrices can then be used in downstream analyses by other statistical packages. The package also enables users to obtain predicted values for unobserved individuals based on the genetic values of observed related individuals. GenoMatrix is available to the research community as a Windows 64-bit executable and can be downloaded free of charge at: http://compbio.ufl.edu/software/genomatrix/. © The American Genetic Association. 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
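As a hedged sketch of the kind of genomic relationship matrix such packages construct, here is a minimal NumPy implementation of a VanRaden-type G matrix; GenoMatrix's exact options (allele-frequency source, scaling, blending with the pedigree matrix) may differ, and the genotype data are simulated.

```python
import numpy as np

# VanRaden-type genomic relationship matrix:
#   G = Z Z' / (2 * sum_j p_j (1 - p_j)),   Z = M - 2P,
# where M is the n x m genotype matrix coded 0/1/2 and p_j are allele
# frequencies. A generic sketch; not GenoMatrix's exact construction.

def grm(M):
    M = np.asarray(M, dtype=float)
    p = M.mean(axis=0) / 2.0          # allele frequency per marker
    Z = M - 2.0 * p                   # center by expected genotype
    denom = 2.0 * np.sum(p * (1.0 - p))
    return Z @ Z.T / denom

rng = np.random.default_rng(7)
genotypes = rng.binomial(2, 0.3, size=(10, 500))   # 10 individuals, 500 SNPs
G = grm(genotypes)
print(G.shape, round(float(np.mean(np.diag(G))), 3))   # diagonal ~1 under HWE
```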
Implementation of interconnect simulation tools in spice
NASA Technical Reports Server (NTRS)
Satsangi, H.; Schutt-Aine, J. E.
1993-01-01
Accurate computer simulation of high-speed digital computer circuits and communication circuits requires a multimode approach to simulate both the devices and the interconnects between devices. Classical circuit analysis algorithms (lumped parameter) are needed for the circuit devices and the network formed by the interconnected devices. The interconnects, however, have to be modeled as transmission lines, which incorporate electromagnetic field analysis. One approach to writing a multimode simulator is to take an existing software package that performs either lumped-parameter analysis or field analysis and add the missing type of analysis routines to the package. In this work a traditionally lumped-parameter simulator, SPICE, is modified so that it will perform lossy transmission line analysis using a different modeling approach. Modifying SPICE3E2 or any other large software package is not a trivial task. An understanding of the programming conventions used, the simulation software, and the simulation algorithms is required. This thesis was written to clarify the procedure for installing a device into SPICE3E2. The installation of three devices is documented, and the installations of the first two provide a foundation for installation of the lossy line, which is the third device. The details of the discussion are specific to SPICE, but the concepts will be helpful when performing installations into other circuit analysis packages.
Scilab software as an alternative low-cost computing in solving the linear equations problem
NASA Astrophysics Data System (ADS)
Agus, Fahrul; Haviluddin
2017-02-01
Numerical computation packages are widely used in both teaching and research. These packages include licensed (proprietary) and open-source (non-proprietary) software. One reason to use such a package is the complexity of mathematical functions (e.g., linear problems); in addition, the number of variables in linear and non-linear functions has increased. The aim of this paper was to reflect on key aspects related to method, didactics, and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative, low-cost computing environment. In this paper, Scilab was used for several activities related to the mathematical models. In this experiment, four numerical methods were implemented: Gaussian elimination, Gauss-Jordan elimination, matrix inversion, and lower-upper (LU) decomposition. The results of this study show that routines for these numerical methods were created and explored using Scilab procedures, and that these routines can be exploited as teaching material for a course.
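The paper implements these routines in Scilab; as a language-neutral illustration of the LU-decomposition approach it describes, here is an equivalent SciPy sketch on a made-up 3 x 3 system:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# LU-decomposition approach to simultaneous linear equations A x = b.
# The paper uses Scilab; this sketch uses SciPy on a made-up example.

A = np.array([[ 4.0, -2.0,  1.0],
              [-2.0,  4.0, -2.0],
              [ 1.0, -2.0,  4.0]])
b = np.array([11.0, -16.0, 17.0])

lu, piv = lu_factor(A)        # PA = LU factorization
x = lu_solve((lu, piv), b)    # forward/back substitution
print("x =", x)
print("residual:", np.linalg.norm(A @ x - b))
```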
The hypercholinergic Flinders Sensitive Line (FSL) rat was significantly more sensitive than the Flinders Resistant Line (FRL) rat to the biotelemetrically recorded hypothermic effects of oxotremorine, a directly acting muscarinic agonist, and diisopropyl fluorophosphate (DFP), a...
ERIC Educational Resources Information Center
Ahl, David H.
1985-01-01
The "College Explorer" is a software package (for the 64K Apple II, IBM PC, TRS-80 model III and 4 microcomputers) which aids in choosing a college. The major features of this package (manufactured by The College Board) are described and evaluated. Sample input/output is included. (JN)
Reviews: The Molecular Animator.
ERIC Educational Resources Information Center
Journal of Chemical Education, 1987
1987-01-01
Provided is a review of a chemical software package. The package makes possible an instructional technique that is not effective by any other means, namely the ability to view molecular shapes in three dimensions. The program can be used with either IBM or Apple hardware. (RH)
INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT
A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...
An FPGA-Based People Detection System
NASA Astrophysics Data System (ADS)
Nair, Vinod; Laprise, Pierre-Olivier; Clark, James J.
2005-12-01
This paper presents an FPGA-based system for detecting people from video. The system is designed to use JPEG-compressed frames from a network camera. Unlike previous approaches that use techniques such as background subtraction and motion detection, we use a machine-learning-based approach to train an accurate detector. We address the hardware design challenges involved in implementing such a detector, along with JPEG decompression, on an FPGA. We also present an algorithm that efficiently combines JPEG decompression with the detection process. This algorithm carries out the inverse DCT step of JPEG decompression only partially. Therefore, it is computationally more efficient and simpler to implement, and it takes up less space on the chip than the full inverse DCT algorithm. The system is demonstrated on an automated video surveillance application and the performance of both hardware and software implementations is analyzed. The results show that the system can detect people accurately at a rate of about [InlineEquation not available: see fulltext.] frames per second on a Virtex-II 2V1000 using a MicroBlaze processor running at [InlineEquation not available: see fulltext.], communicating with dedicated hardware over FSL links.
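One common form of partial inverse DCT keeps only the DC coefficient of each 8 x 8 block, which collapses the block to its mean and yields a 1/8-scale image almost directly from the compressed-domain coefficients. The sketch below only verifies that relationship on synthetically generated DCT blocks; it is an illustration of the general idea, not the authors' FPGA pipeline.

```python
import numpy as np
from scipy.fft import dctn

# For the orthonormal 2D DCT-II on an 8x8 block, DC = 8 * block mean, so a
# 1/8-scale image can be read from the DC coefficients without a full IDCT.
# Synthetic verification only; not the paper's hardware implementation.

rng = np.random.default_rng(3)
image = rng.integers(0, 256, size=(64, 64)).astype(float)

# Forward DCT per 8x8 block (stand-in for coefficients parsed from a JPEG)
blocks = image.reshape(8, 8, 8, 8).swapaxes(1, 2)       # (brow, bcol, 8, 8)
coeffs = dctn(blocks, axes=(-2, -1), norm="ortho")

# DC-only "decode": each block collapses to its mean -> 1/8-scale image
thumb = coeffs[..., 0, 0] / 8.0
exact = blocks.mean(axis=(-2, -1))
print("max abs error:", np.abs(thumb - exact).max())    # ~1e-12
```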
Bravo, Paco E; Chien, David; Javadi, Mehrbod; Merrill, Jennifer; Bengel, Frank M
2010-06-01
Electrocardiographic gating is increasingly used for (82)Rb cardiac PET/CT, but reference ranges for global functional parameters are not well defined. We sought to establish reference values for left ventricular ejection fraction (LVEF), end-systolic volume (ESV), and end-diastolic volume (EDV) using 4 different commercial software packages. Additionally, we compared 2 different approaches for the definition of a healthy individual. Sixty-two subjects (mean age +/- SD, 49 +/- 9 y; 85% women; mean body mass index +/- SD, 34 +/- 10 kg/m(2)) who underwent (82)Rb-gated myocardial perfusion PET/CT were evaluated. All subjects had normal myocardial perfusion and no history of coronary artery disease (CAD) or cardiomyopathy. Subgroup 1 consisted of 34 individuals with low pretest probability of CAD (<10%), and subgroup 2 comprised 28 subjects who had no atherosclerosis on a coronary CT angiogram obtained concurrently during the PET/CT session. LVEF, ESV, and EDV were calculated at rest and during dipyridamole-induced stress, using CardIQ Physio (a dedicated PET software) and the 3 major SPECT software packages (Emory Cardiac Toolbox, Quantitative Gated SPECT, and 4DM-SPECT). Mean LVEF was significantly different among all 4 software packages. LVEF was most comparable between CardIQ Physio (62% +/- 6% and 54% +/- 7% at stress and rest, respectively) and 4DM-SPECT (64% +/- 7% and 56% +/- 8%, respectively), whereas Emory Cardiac Toolbox yielded higher values (71% +/- 6% and 65% +/- 6%, respectively, P < 0.001) and Quantitative Gated SPECT lower values (56% +/- 8% and 50% +/- 8%, respectively, P < 0.001). Subgroup 1 (low likelihood) demonstrated higher LVEF values than did subgroup 2 (normal CT angiography findings), using all software packages (P < 0.05). However, mean ESV and EDV at stress and rest were comparable between the subgroups (P = NS). Intra- and interobserver agreement were excellent for all methods. The reference ranges of LVEF and LV volumes from gated (82)Rb PET/CT vary significantly among available software programs and therefore cannot be used interchangeably. LVEF results were higher when healthy subjects were defined by a low pretest probability of CAD than by normal CT angiography results.
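For reference, the three global functional parameters compared above are related by the standard definition (common to gated analyses generally, not specific to any one of the four packages):

```latex
\mathrm{LVEF}\ (\%) = 100 \times \frac{\mathrm{EDV} - \mathrm{ESV}}{\mathrm{EDV}}
```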
Software for Middle School Physical Science.
ERIC Educational Resources Information Center
Podany, Zita
This final report in the MicroSIFT series reviews 10 software packages that deal mainly with the areas of electricity, magnetism, and heat energy. Software titles appearing in this report were selected because they were judged to be exemplary according to various criteria in the MicroSIFT Evaluator's Guide, with some additions to address science…
CIP's Eighth Annual Educational Software Contest: The Winners.
ERIC Educational Resources Information Center
Donnelly, Denis
1997-01-01
Announces the winners of an annual software contest for innovative software in physics education. Winning entries include an application to help students visualize the origin of energy bands in a solid, a package on the radioastronomy of pulsars, and a school-level science simulation program. Also includes student winners, honorable mentions,…
Resource Sharing of Micro-Software, or, What Ever Happened to All That CP/M Compatibility?
ERIC Educational Resources Information Center
DeYoung, Barbara
1984-01-01
Explores incompatible operating systems as the basic reason why software packages will not work on different microcomputers; defines operating system; explores compatibility issues surrounding the IBM MS-DOS; and presents two future trends in hardware and software developments which indicate a return to true compatibility. (Author/MBR)
Microcomputer Database Management Systems that Interface with Online Public Access Catalogs.
ERIC Educational Resources Information Center
Rice, James
1988-01-01
Describes a study that assessed the availability and use of microcomputer database management interfaces to online public access catalogs. The software capabilities needed to effect such an interface are identified, and available software packages are evaluated by these criteria. A directory of software vendors is provided. (4 notes with…
ERIC Educational Resources Information Center
Dresden Associates, Dresden, ME.
This preliminary directory represents the offerings of 45 software suppliers and information about instructional software currently available for three microcomputers widely used in schools. It is geared towards a wide variety of users including school planners contemplating microcomputer acquisition, teachers planning courses and curricula, media…
Integrating Statistical Visualization Research into the Political Science Classroom
ERIC Educational Resources Information Center
Draper, Geoffrey M.; Liu, Baodong; Riesenfeld, Richard F.
2011-01-01
The use of computer software to facilitate learning in political science courses is well established. However, the statistical software packages used in many political science courses can be difficult to use and counter-intuitive. We describe the results of a preliminary user study suggesting that visually-oriented analysis software can help…