Sample records for standardized computer analyses

  1. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 1, Part 2: Control modules S1--H1; Revision 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  2. Onboard Navigation Systems Characteristics

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The space shuttle onboard navigation systems characteristics are described. A standard source of equations and numerical data for use in error analyses and mission simulations related to space shuttle development is reported. The sensor characteristics described are used for shuttle onboard navigation performance assessment. The use of complete models in the studies depends on the analyses to be performed, the capabilities of the computer programs, and the availability of computer resources.

  3. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.

    SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.

  4. Effect size calculation in meta-analyses of psychotherapy outcome research.

    PubMed

    Hoyt, William T; Del Re, A C

    2018-05-01

    Meta-analysis of psychotherapy intervention research normally examines differences between treatment groups and some form of comparison group (e.g., wait list control; alternative treatment group). The effect of treatment is normally quantified as a standardized mean difference (SMD). We describe procedures for computing unbiased estimates of the population SMD from sample data (e.g., group Ms and SDs), and provide guidance about a number of complications that may arise related to effect size computation. These complications include (a) incomplete data in research reports; (b) use of baseline data in computing SMDs and estimating the population standard deviation (σ); (c) combining effect size data from studies using different research designs; and (d) appropriate techniques for analysis of data from studies providing multiple estimates of the effect of interest (i.e., dependent effect sizes). Clinical or Methodological Significance of this article: Meta-analysis is a set of techniques for producing valid summaries of existing research. The initial computational step for meta-analyses of research on intervention outcomes involves computing an effect size quantifying the change attributable to the intervention. We discuss common issues in the computation of effect sizes and provide recommended procedures to address them.
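
    The procedures summarized above reduce, in the simplest two-group case, to the standard bias-corrected SMD (Hedges' g). The sketch below is illustrative only: it uses the common textbook formulas rather than code from the cited article, and the group summary statistics are made-up values.

    ```python
    # Minimal sketch: bias-corrected standardized mean difference (Hedges' g)
    # from group means, SDs, and sample sizes. Example values are invented.
    import math

    def hedges_g(m_t, sd_t, n_t, m_c, sd_c, n_c):
        """Return (g, var_g) for a treatment vs. comparison group contrast."""
        # Pooled SD as the estimate of the population standard deviation (sigma).
        sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                              / (n_t + n_c - 2))
        d = (m_t - m_c) / sd_pooled                   # Cohen's d
        j = 1 - 3 / (4 * (n_t + n_c) - 9)             # small-sample bias correction
        g = j * d
        var_g = j**2 * ((n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c)))
        return g, var_g

    # Example: symptom scores where lower means indicate improvement.
    g, var_g = hedges_g(m_t=22.1, sd_t=7.4, n_t=30, m_c=27.8, sd_c=8.1, n_c=32)
    print(f"g = {g:.2f}, SE = {var_g ** 0.5:.2f}")
    ```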

  5. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    ERIC Educational Resources Information Center

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  6. Manual vs. computer-assisted sperm analysis: can CASA replace manual assessment of human semen in clinical practice?

    PubMed

    Talarczyk-Desole, Joanna; Berger, Anna; Taszarek-Hauke, Grażyna; Hauke, Jan; Pawelczyk, Leszek; Jedrzejczak, Piotr

    2017-01-01

    The aim of the study was to check the quality of the computer-assisted sperm analysis (CASA) system in comparison to the reference manual method, as well as the standardization of computer-assisted semen assessment. The study was conducted between January and June 2015 at the Andrology Laboratory of the Division of Infertility and Reproductive Endocrinology, Poznań University of Medical Sciences, Poland. The study group consisted of 230 men who gave sperm samples for the first time in our center as part of an infertility investigation. The samples underwent manual and computer-assisted assessment of concentration, motility and morphology. A total of 184 samples were examined twice: manually, according to the 2010 WHO recommendations, and with CASA, using the program settings provided by the manufacturer. Additionally, 46 samples underwent two manual analyses and two computer-assisted analyses. A p-value < 0.05 was considered statistically significant. Statistically significant differences were found between all of the investigated sperm parameters, except for non-progressive motility, measured with CASA and manually. In the group of patients where all analyses with each method were performed twice on the same sample, we found no significant differences between the two assessments of the same sample, neither in the samples analyzed manually nor in those analyzed with CASA, although the standard deviation was higher in the CASA group. Our results suggest that computer-assisted sperm analysis requires further improvement before wider application in clinical practice.

  7. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    PubMed

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs and "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-source signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  8. Numerical Simulation Of Cutting Of Gear Teeth

    NASA Technical Reports Server (NTRS)

    Oswald, Fred B.; Huston, Ronald L.; Mavriplis, Dimitrios

    1994-01-01

    Shapes of gear teeth produced by gear cutters of specified shape are simulated computationally, according to an approach based on principles of differential geometry. Results of the computer simulation are displayed as computer graphics and/or used in analyses of the design, manufacturing, and performance of gears. The approach is applicable to both standard and non-standard gear-tooth forms, and it accelerates and facilitates analysis of alternative designs of gears and cutters. The simulation has been extended to study generation of surfaces other than gears and has been applied to cams, bearings, and surfaces of arbitrary rolling elements as well as to gears. It may be possible to develop analogous procedures for simulating manufacture of skin surfaces like automobile fenders, airfoils, and ship hulls.

  9. Design Aids for Real-Time Systems (DARTS)

    NASA Technical Reports Server (NTRS)

    Szulewski, P. A.

    1982-01-01

    Design-Aids for Real-Time Systems (DARTS) is a tool that assists in defining embedded computer systems through tree-structured graphics, military standard documentation support, and various analyses, including automated Software Science parameter counting and metrics calculation. These analyses provide both static and dynamic design quality feedback, which can potentially aid in producing efficient, high-quality software systems.

  10. MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data

    PubMed Central

    2014-01-01

    Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time-consuming and complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and Drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and Drosophila. The entire set of analyses for 4 datasets (34 total microarrays) finished in about one hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103

  11. Audio-Visual Perception of 3D Cinematography: An fMRI Study Using Condition-Based and Computation-Based Analyses

    PubMed Central

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard “condition-based” designs and “computational” methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-source signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli. PMID:24194828

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eperin, A.P.; Zakharzhevsky, Yu.O.; Arzhaev, A.I.

    A two-year Finnish-Russian cooperation program was initiated in 1995 to demonstrate the applicability of the leak-before-break concept (LBB) to the primary circuit piping of the Leningrad NPP. The program includes J-R curve testing of authentic pipe materials at full operating temperature, screening and computational LBB analyses complying with the USNRC Standard Review Plan 3.6.3, and exchange of LBB-related information with emphasis on NDE. Domestic computer codes are mainly used, and all tests and analyses are independently carried out by each party. The results are believed to apply generally to RBMK type plants of the first generation.

  13. The impact of slice-reduced computed tomography on histogram-based densitometry assessment of lung fibrosis in patients with systemic sclerosis.

    PubMed

    Nguyen-Kim, Thi Dan Linh; Maurer, Britta; Suliman, Yossra A; Morsbach, Fabian; Distler, Oliver; Frauenfelder, Thomas

    2018-04-01

    To evaluate the usability of slice-reduced sequential computed tomography (CT) compared to standard high-resolution CT (HRCT) in patients with systemic sclerosis (SSc) for qualitative and quantitative assessment of interstitial lung disease (ILD) with respect to (I) detection of lung parenchymal abnormalities, (II) qualitative and semiquantitative visual assessment, (III) quantification of ILD by histograms and (IV) accuracy for the 20% cut-off discrimination. From standard chest HRCT of 60 SSc patients, sequential 9-slice computed tomography (reduced HRCT) was retrospectively reconstructed. ILD was assessed by visual scoring and quantitative histogram parameters. Results from standard and reduced HRCT were compared using non-parametric tests and analysed by univariate linear regression analyses. With respect to the detection of parenchymal abnormalities, only the detection of intrapulmonary bronchiectasis was significantly lower in reduced HRCT compared to standard HRCT (P=0.039). No differences were found comparing visual scores for fibrosis severity and extension from standard and reduced HRCT (P=0.051-0.073). All scores correlated significantly (P<0.001) with histogram parameters derived from both standard and reduced HRCT. Significantly higher values of kurtosis and skewness were found for reduced HRCT (both P<0.001). In contrast to standard HRCT, histogram parameters from reduced HRCT showed significant discrimination at the 20% fibrosis cut-off (sensitivity 88% for kurtosis and skewness; specificity 81% for kurtosis and 86% for skewness; cut-off kurtosis ≤26, cut-off skewness ≤4; both P<0.001). Reduced HRCT is a robust method to assess lung fibrosis in SSc at minimal radiation dose, with no difference in the scoring assessment of lung fibrosis severity and extension in comparison to standard HRCT. In contrast to standard HRCT, histogram parameters derived from the reduced HRCT approach could discriminate at a threshold of 20% lung fibrosis with high sensitivity and specificity. Hence it might be used to detect early disease progression of lung fibrosis in the context of monitoring and treatment of SSc patients.
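
    As a rough illustration of the histogram parameters used above (skewness and kurtosis of the lung attenuation histogram), the sketch below uses synthetic attenuation values and an arbitrary attenuation window; it does not reproduce the study's segmentation or its ≤26/≤4 cut-offs.

    ```python
    # Illustrative histogram-based densitometry on fake lung attenuation values (HU).
    import numpy as np
    from scipy import stats

    hu = np.random.default_rng(0).normal(loc=-820, scale=90, size=50_000)  # fake voxels
    lung = hu[(hu >= -1000) & (hu <= -200)]        # crude "lung" attenuation window

    skew = stats.skew(lung)
    kurt = stats.kurtosis(lung, fisher=False)      # Pearson definition (normal = 3)

    # In lung densitometry, lower skewness and kurtosis generally indicate a broader,
    # right-shifted histogram, i.e. denser (more fibrotic) lung.
    print(f"skewness = {skew:.2f}, kurtosis = {kurt:.2f}")
    ```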

  14. An efficient numerical technique for calculating thermal spreading resistance

    NASA Technical Reports Server (NTRS)

    Gale, E. H., Jr.

    1977-01-01

    An efficient numerical technique for solving the equations resulting from finite difference analyses of fields governed by Poisson's equation is presented. The method is direct (noniterative) and the computer work required varies with the square of the order of the coefficient matrix. The computational work required varies with the cube of this order for standard inversion techniques, e.g., Gaussian elimination, Jordan, Doolittle, etc.

  15. Empirical Synthesis of the Effect of Standard Error of Measurement on Decisions Made within Brief Experimental Analyses of Reading Fluency

    ERIC Educational Resources Information Center

    Burns, Matthew K.; Taylor, Crystal N.; Warmbold-Brann, Kristy L.; Preast, June L.; Hosp, John L.; Ford, Jeremy W.

    2017-01-01

    Intervention researchers often use curriculum-based measurement of reading fluency (CBM-R) with a brief experimental analysis (BEA) to identify an effective intervention for individual students. The current study synthesized data from 22 studies that used CBM-R data within a BEA by computing the standard error of measurement (SEM) for the median data…

  16. GPU-computing in econophysics and statistical physics

    NASA Astrophysics Data System (ADS)

    Preis, T.

    2011-03-01

    A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction to the field of GPU computing and includes examples. In particular, computationally expensive analyses employed in a financial market context are coded on a graphics card architecture, which leads to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics, the Ising model, is ported to a graphics card architecture as well, resulting in large speedup values.
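
    For reference, a plain CPU sketch of the standard model mentioned above, the 2D Ising model with single-spin Metropolis updates; lattice size, temperature, and sweep count are arbitrary, and GPU ports typically parallelize checkerboard sublattice updates rather than this serial loop.

    ```python
    # CPU reference sketch of the 2D Ising model (Metropolis single-spin flips).
    import numpy as np

    rng = np.random.default_rng(1)
    L, beta, sweeps = 32, 0.44, 100           # lattice size, inverse temperature, sweeps
    spins = rng.choice([-1, 1], size=(L, L))

    for _ in range(sweeps):
        for _ in range(L * L):                # one sweep = L*L attempted flips
            i, j = rng.integers(0, L, size=2)
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
                  spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2 * spins[i, j] * nb         # energy change if spin (i, j) flips
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1

    print("magnetization per spin:", spins.mean())
    ```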

  17. Perl One-Liners: Bridging the Gap Between Large Data Sets and Analysis Tools.

    PubMed

    Hokamp, Karsten

    2015-01-01

    Computational analyses of biological data are becoming increasingly powerful, and researchers intending on carrying out their own analyses can often choose from a wide array of tools and resources. However, their application might be obstructed by the wide variety of different data formats that are in use, from standard, commonly used formats to output files from high-throughput analysis platforms. The latter are often too large to be opened, viewed, or edited by standard programs, potentially leading to a bottleneck in the analysis. Perl one-liners provide a simple solution to quickly reformat, filter, and merge data sets in preparation for downstream analyses. This chapter presents example code that can be easily adjusted to meet individual requirements. An online version is available at http://bioinf.gen.tcd.ie/pol.

  18. The impact of slice-reduced computed tomography on histogram-based densitometry assessment of lung fibrosis in patients with systemic sclerosis

    PubMed Central

    Maurer, Britta; Suliman, Yossra A.; Morsbach, Fabian; Distler, Oliver; Frauenfelder, Thomas

    2018-01-01

    Background To evaluate the usability of slice-reduced sequential computed tomography (CT) compared to standard high-resolution CT (HRCT) in patients with systemic sclerosis (SSc) for qualitative and quantitative assessment of interstitial lung disease (ILD) with respect to (I) detection of lung parenchymal abnormalities, (II) qualitative and semiquantitative visual assessment, (III) quantification of ILD by histograms and (IV) accuracy for the 20% cut-off discrimination. Methods From standard chest HRCT of 60 SSc patients, sequential 9-slice computed tomography (reduced HRCT) was retrospectively reconstructed. ILD was assessed by visual scoring and quantitative histogram parameters. Results from standard and reduced HRCT were compared using non-parametric tests and analysed by univariate linear regression analyses. Results With respect to the detection of parenchymal abnormalities, only the detection of intrapulmonary bronchiectasis was significantly lower in reduced HRCT compared to standard HRCT (P=0.039). No differences were found comparing visual scores for fibrosis severity and extension from standard and reduced HRCT (P=0.051–0.073). All scores correlated significantly (P<0.001) with histogram parameters derived from both standard and reduced HRCT. Significantly higher values of kurtosis and skewness were found for reduced HRCT (both P<0.001). In contrast to standard HRCT, histogram parameters from reduced HRCT showed significant discrimination at the 20% fibrosis cut-off (sensitivity 88% for kurtosis and skewness; specificity 81% for kurtosis and 86% for skewness; cut-off kurtosis ≤26, cut-off skewness ≤4; both P<0.001). Conclusions Reduced HRCT is a robust method to assess lung fibrosis in SSc at minimal radiation dose, with no difference in the scoring assessment of lung fibrosis severity and extension in comparison to standard HRCT. In contrast to standard HRCT, histogram parameters derived from the reduced HRCT approach could discriminate at a threshold of 20% lung fibrosis with high sensitivity and specificity. Hence it might be used to detect early disease progression of lung fibrosis in the context of monitoring and treatment of SSc patients. PMID:29850118

  19. Assessment of Computer Literacy of Nurses in Lesotho.

    PubMed

    Mugomeri, Eltony; Chatanga, Peter; Maibvise, Charles; Masitha, Matseliso

    2016-11-01

    Health systems worldwide are moving toward use of information technology to improve healthcare delivery. However, this requires basic computer skills. This study assessed the computer literacy of nurses in Lesotho using a cross-sectional quantitative approach. A structured questionnaire with 32 standardized computer skills was distributed to 290 randomly selected nurses in Maseru District. Univariate and multivariate logistic regression analyses in Stata 13 were performed to identify factors associated with having inadequate computer skills. Overall, 177 (61%) nurses scored below 16 of the 32 skills assessed. Finding hyperlinks on Web pages (63%), use of advanced search parameters (60.2%), and downloading new software (60.1%) proved to be challenging to the highest proportions of nurses. Age, sex, year of obtaining latest qualification, computer experience, and work experience were significantly (P < .05) associated with inadequate computer skills in univariate analysis. However, in multivariate analyses, sex (P = .001), year of obtaining latest qualification (P = .011), and computer experience (P < .001) emerged as significant factors. The majority of nurses in Lesotho have inadequate computer skills, and this is significantly associated with having many years since obtaining their latest qualification, being female, and lack of exposure to computers. These factors should be considered during planning of the training curriculum for nurses in Lesotho.

  20. BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.

    PubMed

    Huang, Hailiang; Tata, Sandeep; Prill, Robert J

    2013-01-01

    Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
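
    As a minimal sketch of one of the computations mentioned above, estimating an empirical p-value by permuting the phenotype for a single variant; the data are synthetic, and the code does not use the BlueSNP API or Hadoop.

    ```python
    # Empirical p-value via phenotype permutation for one SNP (synthetic data).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    genotype = rng.integers(0, 3, size=500)             # fake 0/1/2 allele counts
    phenotype = 0.2 * genotype + rng.normal(size=500)   # fake quantitative trait

    observed = abs(stats.pearsonr(genotype, phenotype)[0])
    n_perm = 1_000
    exceed = sum(
        abs(stats.pearsonr(genotype, rng.permutation(phenotype))[0]) >= observed
        for _ in range(n_perm)
    )
    p_empirical = (exceed + 1) / (n_perm + 1)           # add-one correction
    print("empirical p-value:", p_empirical)
    ```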

  1. Evaluation of a continuous-rotation, high-speed scanning protocol for micro-computed tomography.

    PubMed

    Kerl, Hans Ulrich; Isaza, Cristina T; Boll, Hanne; Schambach, Sebastian J; Nolte, Ingo S; Groden, Christoph; Brockmann, Marc A

    2011-01-01

    Micro-computed tomography is used frequently in preclinical in vivo research. Limiting factors are radiation dose and long scan times. The purpose of the study was to compare a standard step-and-shoot to a continuous-rotation, high-speed scanning protocol. Micro-computed tomography of a lead grid phantom and a rat femur was performed using a step-and-shoot and a continuous-rotation protocol. Detail discriminability and image quality were assessed by 3 radiologists. The signal-to-noise ratio and the modulation transfer function were calculated, and volumetric analyses of the femur were performed. The radiation dose of the scan protocols was measured using thermoluminescence dosimeters. The 40-second continuous-rotation protocol allowed a detail discriminability comparable to the step-and-shoot protocol at significantly lower radiation doses. No marked differences in volumetric or qualitative analyses were observed. Continuous-rotation micro-computed tomography significantly reduces scanning time and radiation dose without relevantly reducing image quality compared with a normal step-and-shoot protocol.

  2. MEMOPS: data modelling and automatic code generation.

    PubMed

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-03-25

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.

  3. The History of the AutoChemist®: From Vision to Reality.

    PubMed

    Peterson, H E; Jungner, I

    2014-05-22

    This paper discusses the early history and development of a clinical analyser system in Sweden (AutoChemist, 1965). It highlights the importance of such a high-capacity system both for clinical use and for health care screening. The device was developed to assure the quality of results and, using the analyser's computer, to automatically handle the orders, store the results in digital form for later statistical analyses, and distribute the results to the patients' physicians. The most important result of the construction of an analyser able to produce analytical results on a mass scale was the development of a mechanical multi-channel analyser for clinical laboratories that handled discrete sample technology and could prevent carry-over to the next test samples while incorporating computer technology to improve the quality of test results. The AutoChemist could handle 135 samples per hour in an 8-hour shift, with up to 24 analysis channels, resulting in 3,200 results per hour; later versions would double this capacity. Some customers used the equipment 24 hours per day. With a capacity of 3,000 to 6,000 analyses per hour, pneumatically driven pipettes, special units for corrosive liquids or special activities, and an integrated computer, the AutoChemist system was unique and the largest of its kind for many years. Its successor, the AutoChemist PRISMA (PRogrammable Individually Selective Modular Analyzer), was smaller in size but had a higher capacity. Both analysers established new standards of operation for clinical laboratories and encouraged others to use new technologies for building new analysers.

  4. The development and piloting of electronic standardized measures on nursing work: combining engineering and nursing knowledge.

    PubMed

    Bragadóttir, Helga; Gunnarsdóttir, Sigrún; Ingason, Helgi T

    2013-05-01

    This paper describes the development and piloting of electronic standardized measures on nursing work (e-SMNW) for rich data gathering on the work and work environment of registered nurses (RNs) and practical nurses (PNs). Efficient and valid methods are needed to measure nursing work to enhance the optimal use of the nursing workforce for safe patient care. The study combined human factors engineering (HFE) and nursing knowledge to develop electronic standardized measures for observational studies on nursing work in acute care. The work and work environment of RNs and PNs in acute care medical and surgical inpatient units was successfully measured using e-SMNW. With predetermined items of work activities and influencing factors in the work of nurses, and full use of computer technology, multi-layered rich standardized data were gathered, analysed and displayed. The combination of nursing knowledge, HFE and computer technology enables observational data collection for a rich picture of the complex work of nursing. Information collected by standardized and multi-layered measures makes it easier to identify potential improvements, with regard to influencing factors and management of the work and work environment of nurses. Further use of computer technology in health services research is encouraged. © 2012 Blackwell Publishing Ltd.

  5. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.

  6. Does H → γγ taste like vanilla new physics?

    NASA Astrophysics Data System (ADS)

    Almeida, L. G.; Bertuzzo, E.; Machado, P. A. N.; Funchal, R. Zukanovich

    2012-11-01

    We analyse the interplay between the Higgs-to-diphoton rate and electroweak precision measurement constraints in extensions of the Standard Model with new uncolored charged fermions that do not mix with the ordinary ones. We also compute the pair production cross sections for the lightest fermion and compare them with current bounds.

  7. Innovation from a Computational Social Science Perspective: Analyses and Models

    ERIC Educational Resources Information Center

    Casstevens, Randy M.

    2013-01-01

    Innovation processes are critical for preserving and improving our standard of living. While innovation has been studied by many disciplines, the focus has been on qualitative measures that are specific to a single technological domain. I adopt a quantitative approach to investigate underlying regularities that generalize across multiple domains.…

  8. Two-part models with stochastic processes for modelling longitudinal semicontinuous data: Computationally efficient inference and modelling the overall marginal mean.

    PubMed

    Yiu, Sean; Tom, Brian Dm

    2017-01-01

    Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. However, in practice, the high dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicates model fitting. Thus, non-standard computationally intensive procedures based on simulating the marginal likelihood have so far only been proposed. In this paper, we describe an efficient method of implementation by demonstrating how the high dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and when it is of interest to directly model the overall marginal mean. The methodology is applied on a psoriatic arthritis data set concerning functional disability.
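
    As a rough illustration of the computational point above (a high-dimensional integral expressed as the CDF of a multivariate normal distribution and evaluated numerically), the sketch below uses a synthetic covariance matrix rather than the two-part model likelihood itself.

    ```python
    # Evaluating a 10-dimensional multivariate normal orthant probability with SciPy,
    # the kind of quantity the transformed marginal likelihood reduces to.
    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(0)
    d = 10                                    # dimension of the integral
    A = rng.normal(size=(d, d))
    cov = A @ A.T + d * np.eye(d)             # a positive-definite covariance
    upper = np.zeros(d)                       # integrate the density over (-inf, 0]^d

    prob = multivariate_normal(mean=np.zeros(d), cov=cov).cdf(upper)
    print("P(Y <= 0) =", prob)
    ```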

  9. SNL Mechanical Computer Aided Design (MCAD) guide 2007.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Brandon; Pollice, Stephanie L.; Martinez, Jack R.

    2007-12-01

    This document is considered a mechanical design best-practice guide to new and experienced designers alike. The contents consist of topics related to using Computer Aided Design (CAD) software, performing basic analyses, and using configuration management. The details specific to a particular topic have been leveraged against existing Product Realization Standard (PRS) and Technical Business Practice (TBP) requirements while maintaining alignment with sound engineering and design practices. This document is to be considered dynamic in that subsequent updates will be reflected in the main title, and each update will be published on an annual basis.

  10. Equilibrium paths analysis of materials with rheological properties by using the chaos theory

    NASA Astrophysics Data System (ADS)

    Bednarek, Paweł; Rządkowski, Jan

    2018-01-01

    Numerical equilibrium path analysis of a material with random rheological properties using standard procedures and specialist computer programs was not successful. A proper solution for the analysed heuristic model of the material was obtained on the basis of elements of chaos theory and neural networks. The paper discusses the mathematical rationale of the computer programs used and elaborates on the properties of the attractor used in the analysis. Results of the numerical analyses are presented in both numerical and graphical form for the procedures used.

  11. Numerical Analyses for Low Reynolds Flow in a Ventricular Assist Device.

    PubMed

    Lopes, Guilherme; Bock, Eduardo; Gómez, Luben

    2017-06-01

    Scientific and technological advances in blood pump development have been driven by their importance in cardiac patient treatment and in improving the quality of life of assisted people. To improve and optimize design and development, numerical tools have been incorporated into the analyses of these mechanisms and have become indispensable to their advance. This study analyzes flow behavior at low impeller Reynolds number, for which there is no consensus on the full development of turbulence in ventricular assist devices (VAD). To support the analyses, computational numerical simulations were carried out in different scenarios at the same rotation speed. Two modeling approaches were applied: laminar flow, and turbulent flow with the standard, RNG and realizable κ-ε models, the standard and SST κ-ω models, and the Spalart-Allmaras model. The results agree with the VAD literature and with the transient-flow range reported for stirred tanks, with an impeller Reynolds number around 2800 for the tested scenarios. The turbulence models were compared and, based on the expected physical behavior, the use of the RNG κ-ε, the standard and SST κ-ω, and the Spalart-Allmaras models is suggested for numerical analyses at low impeller Reynolds numbers in the tested flow scenarios. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  12. Computational and Experimental Flow Field Analyses of Separate Flow Chevron Nozzles and Pylon Interaction

    NASA Technical Reports Server (NTRS)

    Massey, Steven J.; Thomas, Russell H.; AbdolHamid, Khaled S.; Elmiligui, Alaa A.

    2003-01-01

    Computational and experimental flow field analyses of separate flow chevron nozzles are presented. The goal of this study is to identify important flow physics and modeling issues required to provide highly accurate flow field data, which will later serve as input to the Jet3D acoustic prediction code. Four configurations are considered: a baseline round nozzle with and without a pylon, and a chevron core nozzle with and without a pylon. The flow is simulated by solving the asymptotically steady, compressible, Reynolds-averaged Navier-Stokes equations using an implicit, upwind, flux-difference splitting finite volume scheme and a standard two-equation kappa-epsilon turbulence model with a linear stress representation and the addition of an eddy viscosity dependence on the total temperature gradient normalized by the local turbulence length scale. The current CFD results are seen to be in excellent agreement with Jet Noise Lab data and show great improvement over previous computations, which did not compensate for enhanced mixing due to high temperature gradients.

  13. Epidemiologic programs for computers and calculators. A microcomputer program for multiple logistic regression by unconditional and conditional maximum likelihood methods.

    PubMed

    Campos-Filho, N; Franco, E L

    1989-02-01

    A frequent procedure in matched case-control studies is to report results from the multivariate unmatched analyses if they do not differ substantially from the ones obtained after conditioning on the matching variables. Although conceptually simple, this rule requires that an extensive series of logistic regression models be evaluated by both the conditional and unconditional maximum likelihood methods. Most computer programs for logistic regression employ only one maximum likelihood method, which requires that the analyses be performed in separate steps. This paper describes a Pascal microcomputer (IBM PC) program that performs multiple logistic regression by both maximum likelihood estimation methods, which obviates the need for switching between programs to obtain relative risk estimates from both matched and unmatched analyses. The program calculates most standard statistics and allows factoring of categorical or continuous variables by two distinct methods of contrast. A built-in, descriptive statistics option allows the user to inspect the distribution of cases and controls across categories of any given variable.
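
    To illustrate the two estimation methods discussed above (this is not the cited Pascal program): for 1:1 matched data the conditional likelihood reduces to an intercept-free logistic regression on within-pair covariate differences, which can be compared with the pooled unconditional fit. The data below are synthetic, and the use of statsmodels is an implementation choice, not the paper's.

    ```python
    # Unconditional vs. conditional ML for 1:1 matched case-control data (synthetic).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n_pairs = 200
    x_case = rng.normal(loc=0.5, size=(n_pairs, 2))    # fake exposures, cases
    x_ctrl = rng.normal(loc=0.0, size=(n_pairs, 2))    # fake exposures, matched controls

    # Unconditional analysis: ignore the matching and pool all subjects.
    X = sm.add_constant(np.vstack([x_case, x_ctrl]))
    y = np.r_[np.ones(n_pairs), np.zeros(n_pairs)]
    uncond = sm.Logit(y, X).fit(disp=False)

    # Conditional analysis (1:1 matching): logistic model on pair differences, no intercept.
    cond = sm.Logit(np.ones(n_pairs), x_case - x_ctrl).fit(disp=False)

    print("unconditional log-odds ratios:", uncond.params[1:])
    print("conditional   log-odds ratios:", cond.params)
    ```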

  14. An interactive environment for agile analysis and visualization of ChIP-sequencing data.

    PubMed

    Lerdrup, Mads; Johansen, Jens Vilstrup; Agrawal-Singh, Shuchi; Hansen, Klaus

    2016-04-01

    To empower experimentalists with a means for fast and comprehensive chromatin immunoprecipitation sequencing (ChIP-seq) data analyses, we introduce an integrated computational environment, EaSeq. The software combines the exploratory power of genome browsers with an extensive set of interactive and user-friendly tools for genome-wide abstraction and visualization. It enables experimentalists to easily extract information and generate hypotheses from their own data and public genome-wide datasets. For demonstration purposes, we performed meta-analyses of public Polycomb ChIP-seq data and established a new screening approach to analyze more than 900 datasets from mouse embryonic stem cells for factors potentially associated with Polycomb recruitment. EaSeq, which is freely available and works on a standard personal computer, can substantially increase the throughput of many analysis workflows, facilitate transparency and reproducibility by automatically documenting and organizing analyses, and enable a broader group of scientists to gain insights from ChIP-seq data.

  15. A Study to Determine the Need for a Standard Limiting the Horsepower of Recreational Boats.

    DTIC Science & Technology

    1978-09-01

    The abstract text is fragmentary (OCR of tables and table-of-contents entries). The recoverable portions indicate counts of fatal and non-fatal accidents by category (e.g., loss of control, no attempt to avoid collision, attempted to avoid but not in time), a description of the data base, and an explanation of the computer model designed to aid in organizing and analyzing the data, presented with the results of the analyses. Surviving table-of-contents entries include a non-powering-related accident sample, coded information and coding form, and an effectiveness evaluation of the current standard.

  16. Laboratory techniques and rhythmometry

    NASA Technical Reports Server (NTRS)

    Halberg, F.

    1973-01-01

    Some of the procedures used for the analysis of rhythms are illustrated, notably as these apply to current medical and biological practice. For a quantitative approach to medical and broader socio-ecologic goals, the chronobiologist gathers numerical objective reference standards for rhythmic biophysical, biochemical, and behavioral variables. These biological reference standards can be derived by specialized computer analyses of largely self-measured (until eventually automatically recorded) time series (autorhythmometry). Objective numerical values for individual and population parameters of reproductive cycles can be obtained concomitantly with characteristics of about-yearly (circannual), about-daily (circadian) and other rhythms.

  17. Improving estimates of streamflow characteristics using LANDSAT-1 (ERTS-1) imagery. [Delmarva Peninsula

    NASA Technical Reports Server (NTRS)

    Hollyday, E. F. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Streamflow characteristics in the Delmarva Peninsula derived from the records of daily discharge of 20 gaged basins are representative of the full range in flow conditions and include all of those commonly used for design or planning purposes. They include annual flood peaks with recurrence intervals of 2, 5, 10, 25, and 50 years, mean annual discharge, standard deviation of the mean annual discharge, mean monthly discharges, standard deviation of the mean monthly discharges, low-flow characteristics, flood volume characteristics, and the discharge equalled or exceeded 50 percent of the time. Streamflow and basin characteristics were related by a technique of multiple regression using a digital computer. A control group of equations was computed using basin characteristics derived from maps and climatological records. An experimental group of equations was computed using basin characteristics derived from LANDSAT imagery as well as from maps and climatological records. Based on a reduction in standard error of estimate equal to or greater than 10 percent, the equations for 12 stream flow characteristics were substantially improved by adding to the analyses basin characteristics derived from LANDSAT imagery.

  18. Linear regression metamodeling as a tool to summarize and present simulation model results.

    PubMed

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
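
    A minimal sketch of the metamodeling step described above, with synthetic PSA output standing in for the paper's cancer cure model: the simulated outcome is regressed on standardized input parameters, so the intercept approximates the base-case result and the coefficients summarize each parameter's influence.

    ```python
    # Linear regression metamodel fitted to synthetic probabilistic sensitivity
    # analysis (PSA) output; parameter effects and noise level are invented.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000
    params = rng.normal(size=(n, 3))                   # standardized inputs for the PSA
    outcome = 5.0 + params @ np.array([2.0, -0.7, 0.3]) + rng.normal(scale=0.5, size=n)

    X = np.column_stack([np.ones(n), params])          # intercept + standardized inputs
    coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)

    print("intercept (approx. base-case outcome):", round(coef[0], 2))
    print("standardized coefficients (parameter influence):", np.round(coef[1:], 2))
    ```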

  19. Effects of computer-based training on procedural modifications to standard functional analyses.

    PubMed

    Schnell, Lauren K; Sidener, Tina M; DeBar, Ruth M; Vladescu, Jason C; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to training materials using interactive software during a 1-day session. Following the training, mean scores on the posttest, novel cases probe, and maintenance probe increased for all participants. These results replicate previous findings during a 1-day session and include a measure of participant acceptability of the training. Recommendations for future research on computer-based training and functional analysis are discussed. © 2017 Society for the Experimental Analysis of Behavior.

  20. EOS MLS Science Data Processing System: A Description of Architecture and Capabilities

    NASA Technical Reports Server (NTRS)

    Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.

    2006-01-01

    This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.

  1. 4P: fast computing of population genetics statistics from large DNA polymorphism panels

    PubMed Central

    Benazzo, Andrea; Panziera, Alex; Bertorelle, Giorgio

    2015-01-01

    Massive DNA sequencing has significantly increased the amount of data available for population genetics and molecular ecology studies. However, the parallel computation of simple statistics within and between populations from large panels of polymorphic sites is not yet available, making the exploratory analyses of a set or subset of data a very laborious task. Here, we present 4P (parallel processing of polymorphism panels), a stand-alone software program for the rapid computation of genetic variation statistics (including the joint frequency spectrum) from millions of DNA variants in multiple individuals and multiple populations. It handles a standard input file format commonly used to store DNA variation from empirical or simulation experiments. The computational performance of 4P was evaluated using large SNP (single nucleotide polymorphism) datasets from human genomes or obtained by simulations. 4P was faster or much faster than other comparable programs, and the impact of parallel computing using multicore computers or servers was evident. 4P is a useful tool for biologists who need a simple and rapid computer program to run exploratory population genetics analyses in large panels of genomic data. It is also particularly suitable to analyze multiple data sets produced in simulation studies. Unix, Windows, and MacOs versions are provided, as well as the source code for easier pipeline implementations. PMID:25628874
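
    As a small illustration of one of the statistics mentioned above, the sketch below computes an unfolded site frequency spectrum from a synthetic genotype matrix; it is not 4P itself, and 4P's input format and parallel implementation are not reproduced here.

    ```python
    # Unfolded site frequency spectrum from fake diploid genotypes (0/1/2 derived copies).
    import numpy as np

    rng = np.random.default_rng(5)
    n_ind, n_sites = 20, 10_000
    geno = rng.integers(0, 3, size=(n_ind, n_sites))   # individuals x sites

    derived_counts = geno.sum(axis=0)                  # derived-allele count per site
    n_chrom = 2 * n_ind
    sfs = np.bincount(derived_counts, minlength=n_chrom + 1)

    print("site frequency spectrum (first 10 classes):", sfs[:10])
    ```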

  2. Consulting room computers and their effect on general practitioner-patient communication.

    PubMed

    Noordman, Janneke; Verhaak, Peter; van Beljouw, Ilse; van Dulmen, Sandra

    2010-12-01

    In the western medical world, computers form part of the standard equipment in the consulting rooms of most GPs. As the use of a computer requires time and attention from GPs, this may well interfere with the communication process. Yet the information accessed on the computer may also enhance communication. The present study affords insight into the relationship between computer use and GP-patient communication recorded by the same GPs over two periods. Videotaped GP consultations collected in 2001 and 2008 were used to observe computer use and GP-patient communication. In addition, patients' questionnaires about their experiences with communication by the GP were analysed using multilevel models with patients (Level 1) nested within GPs (Level 2). Both in 2008 and in 2001, GPs used their computer in almost every consultation. Still, our study showed a change in computer use by the GPs over time. In addition, the results indicate that computer use is negatively related to some communication aspects: the patient-directed gaze of the GP and the amount of information given by GPs. There is also a negative association between computer use and the body posture of the GP. Computer use by GPs is not associated with other (analysed) non-verbal and verbal behaviour of GPs and patients. Moreover, computer use is scarcely related to patients' experiences with the communication behaviour of the GP. GPs show greater reluctance to use computers in 2008 compared to 2001. Computer use can indeed affect the communication between GPs and patients. Therefore, GPs ought to remain aware of their computer use during consultations and at the same time keep the interaction with the patient alive.

  3. Cryogenic Information Center

    NASA Technical Reports Server (NTRS)

    Mohling, Robert A.; Marquardt, Eric D.; Fusilier, Fred C.; Fesmire, James E.

    2003-01-01

    The Cryogenic Information Center (CIC) is a not-for-profit corporation dedicated to preserving and distributing cryogenic information to government, industry, and academia. The heart of the CIC is a uniform source of cryogenic data including analyses, design, materials and processes, and test information traceable back to the Cryogenic Data Center of the former National Bureau of Standards. The electronic database is a national treasure containing over 146,000 specific bibliographic citations of cryogenic literature and thermophysical property data dating back to 1829. A new technical/bibliographic inquiry service can perform searches and technical analyses. The Cryogenic Material Properties (CMP) Program consists of computer codes using empirical equations to determine thermophysical material properties with emphasis on the 4-300K range. CMP's objective is to develop a user-friendly standard material property database using the best available data so government and industry can conduct more accurate analyses. The CIC serves to benefit researchers, engineers, and technologists in cryogenics and cryogenic engineering, whether they are new or experienced in the field.

  4. Comprehensive analysis of a Radiology Operations Management computer system.

    PubMed

    Arenson, R L; London, J W

    1979-11-01

    The Radiology Operations Management computer system at the Hospital of the University of Pennsylvania is discussed. The scheduling and file room modules are based on the system at Massachusetts General Hospital. Patient delays are indicated by the patient tracking module. A reporting module allows CRT/keyboard entry by transcriptionists, entry of standard reports by radiologists using bar code labels, and entry by radiologists using a specially designed diagnostic reporting terminal. Time-flow analyses demonstrate a significant improvement in scheduling, patient waiting, retrieval of radiographs, and report delivery. Recovery of previously lost billing contributes to the proven cost-effectiveness of this system.

  5. Spacelab experiment computer study. Volume 1: Executive summary (presentation)

    NASA Technical Reports Server (NTRS)

    Lewis, J. L.; Hodges, B. C.; Christy, J. O.

    1976-01-01

    A quantitative cost for various Spacelab flight hardware configurations is provided along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented. The cost study is discussed based on utilization of a central experiment computer with optional auxiliary equipment. Groundrules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented. The groundrules and assumptions are analysed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.

  6. Evaluating the Comparability of Paper-and-Pencil and Computerized Versions of a Large-Scale Certification Test. Research Report. ETS RR-05-21

    ERIC Educational Resources Information Center

    Puhan, Gautam; Boughton, Keith A.; Kim, Sooyeon

    2005-01-01

    The study evaluated the comparability of two versions of a teacher certification test: a paper-and-pencil test (PPT) and computer-based test (CBT). Standardized mean difference (SMD) and differential item functioning (DIF) analyses were used as measures of comparability at the test and item levels, respectively. Results indicated that effect sizes…

  7. Computation of forces from deformed visco-elastic biological tissues

    NASA Astrophysics Data System (ADS)

    Muñoz, José J.; Amat, David; Conte, Vito

    2018-04-01

    We present a least-squares based inverse analysis of visco-elastic biological tissues. The proposed method computes the set of contractile forces (dipoles) at the cell boundaries that induce the observed and quantified deformations. We show that the computation of these forces requires the regularisation of the problem functional for some load configurations, which we study here. The functional measures the error of the dynamic problem, which is discretised in time with a second-order implicit time-stepping scheme and in space with standard finite elements. We analyse the uniqueness of the inverse problem and estimate the regularisation parameter by means of an L-curve criterion. We apply the methodology to a simple toy problem and to an in vivo set of morphogenetic deformations of the Drosophila embryo.
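
    The regularisation and parameter-selection ideas mentioned above can be illustrated on a generic discrete problem: add a Tikhonov penalty to the least-squares functional and trace the residual norm against the solution norm over a range of regularisation parameters, whose corner gives the L-curve choice. The Python sketch below shows this generic procedure under those assumptions; it is not the paper's finite-element formulation.

    ```python
    import numpy as np

    def tikhonov(A, b, lam):
        """Solve argmin_x ||A x - b||^2 + lam^2 ||x||^2 via the normal equations."""
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + lam ** 2 * np.eye(n), A.T @ b)

    def l_curve(A, b, lambdas):
        """Return (residual norm, solution norm) pairs for each lambda.

        Plotting these on log-log axes gives the L-curve; its corner is a
        common heuristic for picking the regularisation parameter.
        """
        points = []
        for lam in lambdas:
            x = tikhonov(A, b, lam)
            points.append((np.linalg.norm(A @ x - b), np.linalg.norm(x)))
        return points
    ```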

  8. Analyses of requirements for computer control and data processing experiment subsystems. Volume 2: ATM experiment S-056 image data processing system software development

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The IDAPS (Image Data Processing System) is a user-oriented, computer-based language and control system, which provides a framework or standard for implementing image data processing applications, simplifies the set-up of image processing runs so that the system may be used without a working knowledge of computer programming or operation, streamlines operation of the image processing facility, and allows multiple applications to be run in sequence without operator interaction. The control system loads the operators, interprets the input, constructs the necessary parameters for each application, and calls the application. The overlay feature of the IBSYS loader (IBLDR) provides the means of running multiple operators which would otherwise overflow core storage.

  9. Developments in REDES: The rocket engine design expert system

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth O.

    1990-01-01

    The Rocket Engine Design Expert System (REDES) is being developed at the NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP, a nozzle design program named RAO, a regenerative cooling channel performance evaluation code named RTE, and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES is built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.

  10. Developments in REDES: The Rocket Engine Design Expert System

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth O.

    1990-01-01

    The Rocket Engine Design Expert System (REDES) was developed at NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP; a nozzle design program named RAO; a regenerative cooling channel performance evaluation code named RTE; and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES was built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.

  11. Gluon-fusion Higgs production in the Standard Model Effective Field Theory

    NASA Astrophysics Data System (ADS)

    Deutschmann, Nicolas; Duhr, Claude; Maltoni, Fabio; Vryonidou, Eleni

    2017-12-01

    We provide the complete set of predictions needed to achieve NLO accuracy in the Standard Model Effective Field Theory at dimension six for Higgs production in gluon fusion. In particular, we compute for the first time the contribution of the chromomagnetic operator Q̄_L Φ σ q_R G at NLO in QCD, which entails two-loop virtual and one-loop real contributions, as well as renormalisation and mixing with the Yukawa operator Φ†Φ Q̄_L Φ q_R and the gluon-fusion operator Φ†Φ G G. Focusing on the top-quark-Higgs couplings, we consider the phenomenological impact of the NLO corrections in constraining the three relevant operators by implementing the results into the MadGraph5_aMC@NLO framework. This allows us to compute total cross sections as well as to perform event generation at NLO that can be directly employed in experimental analyses.
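
    For readability, the three dimension-six operators named above can be written out in a common Warsaw-basis-like convention; exact normalisations and Wilson-coefficient conventions differ between papers, so the forms below are indicative rather than the precise definitions used in the cited work.

    ```latex
    % Indicative operator forms (normalisations vary between conventions)
    \begin{align}
      O_{tG}     &= \big(\bar{Q}_L \sigma^{\mu\nu} T^A t_R\big)\, \tilde{\Phi}\, G^A_{\mu\nu} , \\
      O_{t\Phi}  &= \big(\Phi^\dagger \Phi\big)\big(\bar{Q}_L t_R\, \tilde{\Phi}\big) , \\
      O_{\Phi G} &= \big(\Phi^\dagger \Phi\big)\, G^A_{\mu\nu} G^{A\,\mu\nu} .
    \end{align}
    ```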

  12. VCF-Explorer: filtering and analysing whole genome VCF files.

    PubMed

    Akgün, Mete; Demirci, Hüseyin

    2017-11-01

    The decreasing cost of high-throughput technologies has led to a number of sequencing projects consisting of thousands of whole genomes. The paradigm shift from exome to whole genome brings a significant increase in the size of output files. Most of the existing tools that were developed to analyse exome files are not adequate for the larger VCF files produced by whole genome studies. In this work we present VCF-Explorer, a variant analysis software package capable of handling large files. Memory efficiency and the avoidance of a computationally costly pre-processing step enable the analysis to be performed on ordinary computers. VCF-Explorer provides an easy-to-use environment where users can define various types of queries based on variant- and sample-genotype-level annotations. VCF-Explorer can be run in different environments and computational platforms ranging from a standard laptop to a high-performance server. VCF-Explorer is freely available at: http://vcfexplorer.sourceforge.net/. Contact: mete.akgun@tubitak.gov.tr. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
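
    As a flavour of the kind of annotation-based filtering such a tool performs, the Python sketch below streams a (possibly gzipped) VCF file line by line and keeps variants whose INFO field passes a simple threshold, so memory use stays flat for whole-genome files. Field names and thresholds are illustrative assumptions; this is not VCF-Explorer's implementation.

    ```python
    import gzip

    def info_to_dict(info_field):
        """Parse a VCF INFO column ('KEY=VAL;FLAG;...') into a dict."""
        out = {}
        for item in info_field.split(";"):
            key, _, val = item.partition("=")
            out[key] = val if val else True
        return out

    def filter_vcf(path, min_af=0.01):
        """Yield variant records whose INFO 'AF' value is >= min_af.

        Streams the file one line at a time instead of loading it whole.
        """
        opener = gzip.open if path.endswith(".gz") else open
        with opener(path, "rt") as handle:
            for line in handle:
                if line.startswith("#"):           # skip header lines
                    continue
                fields = line.rstrip("\n").split("\t")
                info = info_to_dict(fields[7])
                try:
                    if float(info.get("AF", 0.0)) >= min_af:
                        yield fields
                except ValueError:                 # e.g. multi-allelic 'AF=0.1,0.2'
                    continue
    ```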

  13. Development of the PROMIS positive emotional and sensory expectancies of smoking item banks.

    PubMed

    Tucker, Joan S; Shadel, William G; Edelen, Maria Orlando; Stucky, Brian D; Li, Zhen; Hansen, Mark; Cai, Li

    2014-09-01

    The positive emotional and sensory expectancies of cigarette smoking include improved cognitive abilities, positive affective states, and pleasurable sensorimotor sensations. This paper describes development of Positive Emotional and Sensory Expectancies of Smoking item banks that will serve to standardize the assessment of this construct among daily and nondaily cigarette smokers. Data came from daily (N = 4,201) and nondaily (N = 1,183) smokers who completed an online survey. To identify a unidimensional set of items, we conducted item factor analyses, item response theory analyses, and differential item functioning analyses. Additionally, we evaluated the performance of fixed-item short forms (SFs) and computer adaptive tests (CATs) to efficiently assess the construct. Eighteen items were included in the item banks (15 common across daily and nondaily smokers, 1 unique to daily, 2 unique to nondaily). The item banks are strongly unidimensional, highly reliable (reliability = 0.95 for both), and perform similarly across gender, age, and race/ethnicity groups. An SF common to daily and nondaily smokers consists of 6 items (reliability = 0.86). Results from simulated CATs indicated that, on average, fewer than 8 items are needed to assess the construct with adequate precision using the item banks. These analyses identified a new set of items that can assess the positive emotional and sensory expectancies of smoking in a reliable and standardized manner. Considerable efficiency in assessing this construct can be achieved by using the item bank SF, employing computer adaptive tests, or selecting subsets of items tailored to specific research or clinical purposes. © The Author 2014. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
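
    Computer adaptive tests of the kind evaluated above typically administer, at each step, the item with maximum Fisher information at the current trait estimate. The Python sketch below shows that generic selection step for a 2PL model; it is illustrative only, with made-up item parameters, and is not the PROMIS scoring engine (which uses graded-response calibrations).

    ```python
    import numpy as np

    def next_item(theta, a, b, administered):
        """Return the index of the unadministered item with maximum Fisher
        information at the current ability estimate theta, under a 2PL model.

        a and b are arrays of item discriminations and difficulties.
        """
        a, b = np.asarray(a, float), np.asarray(b, float)
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # probability of endorsement
        info = a ** 2 * p * (1.0 - p)                # 2PL item information
        info[list(administered)] = -np.inf           # never repeat an item
        return int(np.argmax(info))

    # Example: pick the first item for an average respondent (theta = 0)
    print(next_item(0.0, a=[1.2, 0.8, 1.8], b=[-1.0, 0.0, 0.3], administered=[]))
    ```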

  14. LiPD and CSciBox: A Case Study in Why Data Standards are Important for Paleoscience

    NASA Astrophysics Data System (ADS)

    Weiss, I.; Bradley, E.; McKay, N.; Emile-Geay, J.; de Vesine, L. R.; Anderson, K. A.; White, J. W. C.; Marchitto, T. M., Jr.

    2016-12-01

    CSciBox [1] is an integrated software system that helps geoscientists build and evaluate age models. Its user chooses from a number of built-in analysis tools, composing them into an analysis workflow and applying it to paleoclimate proxy datasets. CSciBox employs modern database technology to store both the data and the analysis results in an easily accessible and searchable form, and offers the user access to the computational toolbox, the data, and the results via a graphical user interface and a sophisticated plotter. Standards are a staple of modern life, and underlie any form of automation. Without data standards, it is difficult, if not impossible, to construct effective computer tools for paleoscience analysis. The LiPD (Linked Paleo Data) framework [2] enables the storage of both data and metadata in systematic, meaningful, machine-readable ways. LiPD has been a primary enabler of CSciBox's goals of usability, interoperability, and reproducibility. Building LiPD capabilities into CSciBox's importer, for instance, eliminated the need to ask the user about file formats, variable names, relationships between columns in the input file, etc. Building LiPD capabilities into the exporter facilitated the storage of complete details about the input data (provenance, preprocessing steps, etc.) as well as full descriptions of any analyses that were performed using the CSciBox tool, along with citations to appropriate references. This comprehensive collection of data and metadata, which is all linked together in a semantically meaningful, machine-readable way, not only completely documents the analyses and makes them reproducible, but also enables interoperability with any other software system that employs the LiPD standard. [1] www.cs.colorado.edu/ lizb/cscience.html [2] McKay & Emile-Geay, Climate of the Past 12:1093 (2016)

  15. Health risk characterization of maximum legal exposures for persistent organic pollutant (POP) pesticides in residential soil: An analysis.

    PubMed

    Li, Zijian

    2018-01-01

    Regulations for pesticides in soil are important for controlling human health risk; humans can be exposed to pesticides by ingesting soil, inhaling soil dust, and through dermal contact. Previous studies focused on analyses of numerical standard values for pesticides and evaluated the same pesticide using different standards among different jurisdictions. To understand the health consequences associated with pesticide soil standard values, lifetime theoretical maximum contribution and risk characterization factors were used in this study to quantify the severity of damage using disability-adjusted life years (DALYs) under the maximum "legal" exposure to persistent organic pollutant (POP) pesticides that are commonly regulated by the Stockholm Convention. Results show that computed soil characterization factors for some pesticides present lognormal distributions, and some of them have DALY values higher than 1000.0 per million population (e.g., the DALY for dichlorodiphenyltrichloroethane [DDT] is 14,065 in the Netherlands, which exceeds the tolerable risk of uncertainty upper bound of 1380.0 DALYs). Health risk characterization factors computed from national jurisdictions illustrate that values can vary over eight orders of magnitude. Further, the computed characterization factors can vary over four orders of magnitude within the same national jurisdiction. These data indicate that there is little agreement regarding pesticide soil regulatory guidance values (RGVs) among worldwide national jurisdictions or even RGV standard values within the same jurisdiction. Among these POP pesticides, lindane has the lowest median (0.16 DALYs) and geometric mean (0.28 DALYs) risk characterization factors, indicating that worldwide national jurisdictions provide relatively conservative soil RGVs for lindane. In addition, we found that some European nations and members of the former Union of Soviet Socialist Republics share the same pesticide RGVs and data clusters for the computed characterization factors. Copyright © 2017 Elsevier Ltd. All rights reserved.
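
    The exposure arithmetic underlying such screening starts from a chronic average daily dose via incidental soil ingestion at the regulated soil concentration. The Python sketch below shows that standard calculation with illustrative default intake values; the article's lifetime theoretical maximum contribution and DALY-based characterization factors involve additional pesticide-specific severity factors that are not reproduced here.

    ```python
    def soil_ingestion_dose(c_soil_mg_per_kg, soil_intake_mg_per_day=50.0,
                            body_weight_kg=70.0):
        """Chronic average daily dose (mg pesticide per kg body weight per day)
        from incidental ingestion of soil at concentration c_soil.

        The intake and body-weight defaults are illustrative assumptions only.
        """
        soil_intake_kg_per_day = soil_intake_mg_per_day * 1e-6   # mg soil -> kg soil
        return c_soil_mg_per_kg * soil_intake_kg_per_day / body_weight_kg

    # e.g. soil at a hypothetical guidance value of 4 mg/kg for a POP pesticide
    print(soil_ingestion_dose(4.0))   # about 2.9e-06 mg per kg body weight per day
    ```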

  16. Boundary Layer Depth In Coastal Regions

    NASA Astrophysics Data System (ADS)

    Porson, A.; Schayes, G.

    The results of earlier studies of sea breeze simulations have shown that this is a relevant feature of the Planetary Boundary Layer that atmospheric models still struggle to diagnose properly. Based on observations made during the ESCOMPTE campaign over the Mediterranean Sea, different CBL and SBL height estimation methods have been tested with a meso-scale model, TVM. The aim was to compare the critical points of the BL height determination computed using the turbulent kinetic energy profile with some other standard evaluations. Moreover, these results have been analysed with different mixing length formulations. The sensitivity to the formulation is also analysed for a simple coastal configuration.

  17. Prognostic Significance of Tumor Size of Small Lung Adenocarcinomas Evaluated with Mediastinal Window Settings on Computed Tomography

    PubMed Central

    Sakao, Yukinori; Kuroda, Hiroaki; Mun, Mingyon; Uehara, Hirofumi; Motoi, Noriko; Ishikawa, Yuichi; Nakagawa, Ken; Okumura, Sakae

    2014-01-01

    Background We aimed to clarify whether the size of a lung adenocarcinoma evaluated using mediastinal window settings on computed tomography is an important and useful predictor of invasiveness, lymph node metastasis and prognosis in small adenocarcinomas. Methods We evaluated 176 patients with small lung adenocarcinomas (diameter, 1–3 cm) who underwent standard surgical resection. Tumours were examined using computed tomography with thin-section conditions (1.25 mm thick on high-resolution computed tomography), with tumour dimensions evaluated under two settings: lung window and mediastinal window. We also determined the patient age, gender, preoperative nodal status, tumour size, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and pathological status (lymphatic vessel, vascular vessel or pleural invasion). Recurrence-free survival was used for prognosis. Results Lung window, mediastinal window, tumour disappearance ratio and preoperative nodal status were significant predictive factors for recurrence-free survival in univariate analyses. Areas under the receiver operator curves for recurrence were 0.76, 0.73 and 0.65 for mediastinal window, tumour disappearance ratio and lung window, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant predictive factors for lymph node metastasis in univariate analyses; areas under the receiver operator curves were 0.61, 0.76, 0.72 and 0.66 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant factors for lymphatic vessel, vascular vessel or pleural invasion in univariate analyses; areas under the receiver operator curves were 0.60, 0.81, 0.81 and 0.65 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Conclusions Based on the univariate analyses, including logistic regression and receiver operator curves computed for variables with p-values of <0.05, our results suggest that measuring tumour size using mediastinal window settings on high-resolution computed tomography is a simple and useful preoperative prognostic modality in small adenocarcinomas. PMID:25365326

  18. S-Boxes Based on Affine Mapping and Orbit of Power Function

    NASA Astrophysics Data System (ADS)

    Khan, Mubashar; Azam, Naveed Ahmed

    2015-06-01

    The demand for data security against computational attacks such as algebraic, differential, linear and interpolation attacks has increased as a result of rapid advancement in the field of computation. It is, therefore, necessary to develop cryptosystems that can resist current cryptanalysis and further computational attacks in the future. In this paper, we present a multiple S-box scheme based on an affine mapping and the orbit of the power function used in the Advanced Encryption Standard (AES). The proposed technique results in 256 different S-boxes, named orbital S-boxes. Rigorous tests and comparisons are performed to analyse the cryptographic strength of each of the orbital S-boxes. Furthermore, grayscale images are encrypted using multiple orbital S-boxes. Results and simulations show that the encryption strength of the orbital S-boxes against computational attacks is better than that of existing S-boxes.
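
    For context, the baseline construction that such schemes build on is the AES S-box itself: multiplicative inversion in GF(2^8) followed by an affine mapping. The Python sketch below reproduces that standard construction and checks it against known S-box values; it does not generate the paper's 256 orbital S-boxes, which vary the mapping and the orbit of the power function.

    ```python
    def gf_mul(a, b, modulus=0x11B):
        """Multiply two elements of GF(2^8) with the AES reduction polynomial."""
        result = 0
        while b:
            if b & 1:
                result ^= a
            a <<= 1
            if a & 0x100:
                a ^= modulus
            b >>= 1
        return result

    def gf_inv(a):
        """Multiplicative inverse in GF(2^8) by exhaustive search (0 maps to 0)."""
        if a == 0:
            return 0
        for x in range(1, 256):
            if gf_mul(a, x) == 1:
                return x

    def aes_affine(x, c=0x63):
        """The AES affine mapping applied to one byte."""
        out = 0
        for i in range(8):
            bit = ((x >> i) ^ (x >> ((i + 4) % 8)) ^ (x >> ((i + 5) % 8)) ^
                   (x >> ((i + 6) % 8)) ^ (x >> ((i + 7) % 8)) ^ (c >> i)) & 1
            out |= bit << i
        return out

    sbox = [aes_affine(gf_inv(x)) for x in range(256)]
    assert sbox[0x00] == 0x63 and sbox[0x01] == 0x7C and sbox[0x53] == 0xED
    ```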

  19. CMG-biotools, a free workbench for basic comparative microbial genomics.

    PubMed

    Vesth, Tammi; Lagesen, Karin; Acar, Öncel; Ussery, David

    2013-01-01

    Today, there are more than a hundred times as many sequenced prokaryotic genomes as there were in the year 2000. The economical sequencing of genomic DNA has facilitated a whole new approach to microbial genomics. The real power of genomics is manifested through comparative genomics that can reveal strain-specific characteristics, diversity within species and many other aspects. However, comparative genomics is a field not easily entered into by scientists with few computational skills. The CMG-biotools package is designed for microbiologists with limited knowledge of computational analysis and can be used to perform a number of analyses and comparisons of genomic data. The CMG-biotools system presents a stand-alone interface for comparative microbial genomics. The package is a customized operating system, based on Xubuntu 10.10, available through the open source Ubuntu project. The system can be installed on a virtual computer, allowing the user to run the system alongside any other operating system. Source codes for all programs are provided under GNU license, which makes it possible to transfer the programs to other systems if so desired. We here demonstrate the package by comparing and analyzing the diversity within the class Negativicutes, represented by 31 genomes including 10 genera. The analyses include 16S rRNA phylogeny, basic DNA and codon statistics, proteome comparisons using BLAST and graphical analyses of DNA structures. This paper shows the strength and diverse use of the CMG-biotools system. The system can be installed on a wide range of host operating systems and utilizes as much of the host computer as desired. It allows the user to compare multiple genomes from various sources, using standardized data formats and intuitive visualizations of results. The examples presented here clearly show that users with limited computational experience can perform complicated analyses without much training.

  20. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    PubMed Central

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in a rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438

  1. Chest compression rate feedback based on transthoracic impedance.

    PubMed

    González-Otero, Digna M; Ruiz de Gauna, Sofía; Ruiz, Jesus; Daya, Mohamud R; Wik, Lars; Russell, James K; Kramer-Johansen, Jo; Eftestøl, Trygve; Alonso, Erik; Ayala, Unai

    2015-08-01

    Quality of cardiopulmonary resuscitation (CPR) is an important determinant of survival from cardiac arrest. The use of feedback devices is encouraged by current resuscitation guidelines as it helps rescuers to improve quality of CPR performance. The aim was to determine the feasibility of a generic algorithm for feedback on chest compression (CC) rate using the transthoracic impedance (TTI) signal recorded through the defibrillation pads. We analysed 180 episodes collected equally from three different emergency services, each using a different defibrillator model. The new algorithm computed the CC-rate every 2 s by analysing the TTI signal in the frequency domain. The obtained CC-rate values were compared with the gold standard, computed using the compression force or the ECG and TTI signals when the force was not recorded. The accuracy of the CC-rate, the proportion of alarms of inadequate CC-rate, chest compression fraction (CCF) and the mean CC-rate per episode were calculated. Intervals with CCs were detected with a mean sensitivity and a mean positive predictive value per episode of 96.3% and 97.0%, respectively. The estimated CC-rate had an error below 10% for 95.8% of the time. The mean percentage of accurate alarms per episode was 98.2%. No statistical differences were found between the gold standard and the estimated values for any of the computed metrics. We developed an accurate algorithm to calculate and provide feedback on CC-rate using the TTI signal. This could be integrated into automated external defibrillators and help improve the quality of CPR in basic-life-support settings. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
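
    The frequency-domain idea described above can be illustrated with a short Python sketch: detrend and window a 2-s segment of the TTI signal, take its FFT, and convert the dominant spectral peak within a plausible compression band into a rate per minute. This is a minimal generic sketch, not the published algorithm; the band limits and sampling rate below are illustrative assumptions.

    ```python
    import numpy as np

    def cc_rate_from_tti(tti_window, fs, band=(1.0, 4.0)):
        """Estimate chest-compression rate (per minute) from a short TTI window.

        Detrend, taper, FFT, then pick the dominant spectral peak inside a
        plausible compression-frequency band (band limits are assumptions).
        """
        x = np.asarray(tti_window, dtype=float)
        x = (x - x.mean()) * np.hanning(len(x))        # remove DC, reduce leakage
        spectrum = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        f_peak = freqs[in_band][np.argmax(spectrum[in_band])]
        return 60.0 * f_peak                           # Hz -> compressions per minute

    # Example: a 2-s window containing a 120-per-minute compression artefact
    fs = 250
    t = np.arange(0, 2, 1.0 / fs)
    window = 0.5 * np.sin(2 * np.pi * 2.0 * t)         # 2 Hz = 120 per minute
    print(cc_rate_from_tti(window, fs))                # 120.0
    ```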

  2. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.

    PubMed

    Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-26

    Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than others. Integration of content control improved quality of decision making (SMD 0.59 vs 0.23 for knowledge; SMD 0.39 vs 0.29 for decisional conflict). In contrast, tailoring reduced quality of decision making (SMD 0.40 vs 0.71 for knowledge; SMD 0.25 vs 0.52 for decisional conflict). Similarly, patient narratives also reduced quality of decision making (SMD 0.43 vs 0.65 for knowledge; SMD 0.17 vs 0.46 for decisional conflict). Results were varied for different types of explicit values clarification, feedback, and social support. Integration of media rich or interactive features into computer-based decision aids can improve quality of preference-sensitive decision making. However, this is an emerging field with limited evidence to guide use. The systematic review and thematic synthesis identified features that have been integrated into available computer-based decision aids, in an effort to facilitate reporting of these features and to promote integration of such features into decision aids. The meta-analyses and associated subgroup analyses provide preliminary evidence to support integration of specific features into future decision aids. Further research can focus on clarifying independent contributions of specific features through experimental designs and refining the designs of features to improve effectiveness.
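
    The subgroup comparisons summarized above rest on pooling standardized mean differences across trials. A minimal random-effects pooling routine in the commonly used DerSimonian-Laird style is sketched below in Python; it is a generic illustration, not the review's analysis code, and the example numbers are made up.

    ```python
    import numpy as np

    def pooled_smd_dl(smd, var):
        """Random-effects pooled SMD and its standard error (DerSimonian-Laird)."""
        smd, var = np.asarray(smd, float), np.asarray(var, float)
        w = 1.0 / var
        fixed = np.sum(w * smd) / np.sum(w)
        q = np.sum(w * (smd - fixed) ** 2)               # Cochran's Q
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(smd) - 1)) / c)        # between-study variance
        w_star = 1.0 / (var + tau2)
        pooled = np.sum(w_star * smd) / np.sum(w_star)
        return pooled, np.sqrt(1.0 / np.sum(w_star))

    # Three hypothetical trials: SMDs with their sampling variances
    print(pooled_smd_dl([0.59, 0.23, 0.40], [0.04, 0.02, 0.03]))
    ```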

  3. Cost-effectiveness of PET and PET/computed tomography: a systematic review.

    PubMed

    Gerke, Oke; Hermansson, Ronnie; Hess, Søren; Schifter, Søren; Vach, Werner; Høilund-Carlsen, Poul Flemming

    2015-01-01

    The development of clinical diagnostic procedures comprises early-phase and late-phase studies to elucidate diagnostic accuracy and patient outcome. Economic assessments of new diagnostic procedures compared with established work-ups express, by means of incremental cost-effectiveness ratios, the additional cost per additional unit of effectiveness when the standard regimen is replaced by a new diagnostic procedure. This article discusses economic assessments of PET and PET/computed tomography reported until mid-July 2014. Forty-seven studies on cancer and noncancer indications were identified but, because of the widely varying scope of the analyses, a substantial amount of work remains to be done. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Numerical Taxonomy of Some Bacteria Isolated from Antarctic and Tropical Seawaters1

    PubMed Central

    Pfister, Robert M.; Burkholder, Paul R.

    1965-01-01

    Pfister, Robert M. (Lamont Geological Observatory, Palisades, N.Y.), and Paul R. Burkholder. Numerical taxonomy of some bacteria isolated from Antarctic and tropical seawaters. J. Bacteriol. 90:863–872. 1965.—Microorganisms from Antarctic seas and from tropical waters near Puerto Rico were examined with a series of morphological, physiological, and biochemical tests. The results of these analyses were coded on punch cards, and similarity matrices were computed with a program for an IBM 1620 computer. When the matrix was reordered by use of the single-linkage technique, and the results were plotted with four symbols for different per cent similarity ranges, nine groups of microorganisms were revealed. The data suggest that organisms occurring in different areas of the open ocean may be profitably studied with standardized computer techniques. PMID:5847807

  5. Computer-based Written Emotional Disclosure: The Effects of Advance or Real-time Guidance and Moderation by Big 5 Personality Traits

    PubMed Central

    Beyer, Jonathan A.; Lumley, Mark A.; Latsch, Deborah A.; Oberleitner, Lindsay M.S.; Carty, Jennifer N.; Radcliffe, Alison M.

    2014-01-01

    Standard written emotional disclosure (WED) about stress, which is private and unguided, yields small health benefits. Providing individualized guidance to writers may enhance WED, but this has not been tested. This trial of computer-based WED compared two novel therapist-guided forms of WED—advance guidance (before sessions) or real-time guidance (during sessions, through instant messaging)—to both standard WED and control writing; it also tested Big 5 personality traits as moderators of guided WED. Young adult participants (n = 163) with unresolved stressful experiences were randomized to conditions, had three 30-min computer-based writing sessions, and were reassessed 6 weeks later. Contrary to hypotheses, real-time guidance WED had poorer outcomes than the other conditions on several measures, and advance guidance WED also showed some poorer outcomes. Moderator analyses revealed that participants with low baseline agreeableness, low extraversion, or high conscientiousness had relatively poor responses to guidance. We conclude that providing guidance for WED, especially in real-time, may interfere with emotional processing of unresolved stress, particularly for people whose personalities have poor fit with this interactive form of WED. PMID:24266598

  6. Break modeling for RELAP5 analyses of ISP-27 Bethsy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petelin, S.; Gortnar, O.; Mavko, B.

    This paper presents pre- and posttest analyses of International Standard Problem (ISP) 27 on the Bethsy facility and separate RELAP5 break model tests considering the measured boundary condition at the break inlet. This contribution also demonstrates modifications that significantly improved the model response in posttest simulations. Calculations were performed using the RELAP5/MOD2/36.05 and RELAP5/MOD3.5M5 codes on the MicroVAX, SUN, and CONVEX computers. Bethsy is an integral test facility that simulates a typical 900-MW (electric) Framatome pressurized water reactor. The ISP-27 scenario involves a 2-in. cold-leg break without HPSI and with delayed operator procedures for secondary system depressurization.

  7. gene GIS: Computational Tools for Spatial Analyses of DNA Profiles with Associated Photo-Identification and Telemetry Records of Marine Mammals

    DTIC Science & Technology

    2011-09-30

    DNA profiles. Referred to as geneGIS, the program will provide the ability to display, browse, select, filter and summarize spatial or temporal...of the SPLASH photo-identification records and available DNA profiles is underway through integration and crosschecking by Cascadia and MMI . An...Darwin Core standards where possible and can accommodate the current databases developed for telemetry data at MMI and SPLASH collection records at

  8. Internal audit in a microbiology laboratory.

    PubMed Central

    Mifsud, A J; Shafi, M S

    1995-01-01

    AIM--To set up a programme of internal laboratory audit in a medical microbiology laboratory. METHODS--A model of laboratory based process audit is described. Laboratory activities were examined in turn by specimen type. Standards were set using laboratory standard operating procedures; practice was observed using a purpose designed questionnaire and the data were analysed by computer; performance was assessed at laboratory audit meetings; and the audit circle was closed by re-auditing topics after an interval. RESULTS--Improvements in performance scores (objective measures) and in staff morale (subjective impression) were observed. CONCLUSIONS--This model of process audit could be applied, with amendments to take local practice into account, in any microbiology laboratory. PMID:7665701

  9. CAD/CAE Integration Enhanced by New CAD Services Standard

    NASA Technical Reports Server (NTRS)

    Claus, Russell W.

    2002-01-01

    A Government-industry team led by the NASA Glenn Research Center has developed a computer interface standard for accessing data from computer-aided design (CAD) systems. The Object Management Group, an international computer standards organization, has adopted this CAD services standard. The new standard allows software (e.g., computer-aided engineering (CAE) and computer-aided manufacturing (CAM) software) to access multiple CAD systems through one programming interface. The interface is built on top of a distributed computing system called the Common Object Request Broker Architecture (CORBA). CORBA allows the CAD services software to operate in a distributed, heterogeneous computing environment.

  10. Large-scale inverse model analyses employing fast randomized data reduction

    NASA Astrophysics Data System (ADS)

    Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan

    2017-08-01

    When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
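
    The "sketching" step at the heart of the approach can be illustrated in a few lines: multiply the (linearised) observation equations by a short random matrix so that only a reduced system has to be solved. The Python sketch below shows this generic idea with a dense Gaussian sketch; it is not the Julia/MADS implementation, and practical large-scale codes use structured or streamed sketches instead of forming a dense random matrix.

    ```python
    import numpy as np

    def sketch_least_squares(J, d, k, seed=0):
        """Solve J m ~= d after compressing its n observation rows to k sketched rows.

        Uses a dense Gaussian sketching matrix for clarity only.
        """
        rng = np.random.default_rng(seed)
        n = J.shape[0]
        S = rng.standard_normal((k, n)) / np.sqrt(k)   # k x n sketching matrix
        m, *_ = np.linalg.lstsq(S @ J, S @ d, rcond=None)
        return m

    # Example: 20,000 synthetic observations of a 50-parameter linear model
    rng = np.random.default_rng(1)
    J = rng.standard_normal((20_000, 50))
    m_true = rng.standard_normal(50)
    d = J @ m_true + 0.01 * rng.standard_normal(20_000)
    m_est = sketch_least_squares(J, d, k=500)
    print(np.max(np.abs(m_est - m_true)))   # parameter error stays small
    ```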

  11. Molecular mechanics and dynamics characterization of an in silico mutated protein: a stand-alone lab module or support activity for in vivo and in vitro analyses of targeted proteins.

    PubMed

    Chiang, Harry; Robinson, Lucy C; Brame, Cynthia J; Messina, Troy C

    2013-01-01

    Over the past 20 years, the biological sciences have increasingly incorporated chemistry, physics, computer science, and mathematics to aid in the development and use of mathematical models. Such combined approaches have been used to address problems from protein structure-function relationships to the workings of complex biological systems. Computer simulations of molecular events can now be accomplished quickly and with standard computer technology. Also, simulation software is freely available for most computing platforms, and online support for the novice user is ample. We have therefore created a molecular dynamics laboratory module to enhance undergraduate student understanding of molecular events underlying organismal phenotype. This module builds on a previously described project in which students use site-directed mutagenesis to investigate functions of conserved sequence features in members of a eukaryotic protein kinase family. In this report, we detail the laboratory activities of an MD module that complements phenotypic outcomes by providing a hypothesis-driven and quantifiable measure of predicted structural changes caused by targeted mutations. We also present examples of analyses students may perform. These laboratory activities can be integrated with genetics or biochemistry experiments as described, but could also be used independently in any course that would benefit from a quantitative approach to protein structure-function relationships. Copyright © 2013 Wiley Periodicals, Inc.

  12. The effectiveness of various computer-based interventions for patients with chronic pain or functional somatic syndromes: A systematic review and meta-analysis.

    PubMed

    Vugts, Miel A P; Joosen, Margot C W; van der Geer, Jessica E; Zedlitz, Aglaia M E E; Vrijhoef, Hubertus J M

    2018-01-01

    Computer-based interventions target improvement of physical and emotional functioning in patients with chronic pain and functional somatic syndromes. However, it remains unclear which interventions work, to what extent, and for whom. This systematic review and meta-analysis (registered at PROSPERO, 2016: CRD42016050839) assesses efficacy relative to passive and active control conditions, and explores patient and intervention factors. Controlled studies were identified from MEDLINE, EMBASE, PsychInfo, Web of Science, and Cochrane Library. Pooled standardized mean differences were calculated by comparison type for somatic symptom, health-related quality of life, functional interference, catastrophizing, and depression outcomes at post-treatment and at 6 or more months' follow-up. Risk of bias was assessed. Sub-group analyses were performed by patient and intervention characteristics when heterogeneous outcomes were observed. Maximally, 30 out of 46 eligible studies and 3,387 participants were included per meta-analysis. Mostly, internet-based cognitive behavioral therapies were identified. Significantly higher patient-reported outcomes were found in comparisons with passive control groups (standardized mean differences ranged between -.41 and -.18), but not in comparisons with active control groups (SMDs ranged between -.26 and -.14). For some outcomes, significant heterogeneity related to patient and intervention characteristics. To conclude, there is a minority of good quality evidence for small positive average effects of computer-based (cognitive) behavior change interventions, similar to traditional modes. These effects may be sustainable. Indications were found as to which interventions work better, or more consistently across outcomes, for which patients. Future process analyses are recommended with the aim of better understanding individual chances of clinically relevant outcomes.

  13. A general concept for consistent documentation of computational analyses

    PubMed Central

    Müller, Fabian; Nordström, Karl; Lengauer, Thomas; Schulz, Marcel H.

    2015-01-01

    The ever-growing amount of data in the field of life sciences demands standardized ways of high-throughput computational analysis. This standardization requires a thorough documentation of each step in the computational analysis to enable researchers to understand and reproduce the results. However, due to the heterogeneity in software setups and the high rate of change during tool development, reproducibility is hard to achieve. One reason is that there is no common agreement in the research community on how to document computational studies. In many cases, simple flat files or other unstructured text documents are provided by researchers as documentation, which are often missing software dependencies, versions and sufficient documentation to understand the workflow and parameter settings. As a solution we suggest a simple and modest approach for documenting and verifying computational analysis pipelines. We propose a two-part scheme that defines a computational analysis using a Process and an Analysis metadata document, which jointly describe all necessary details to reproduce the results. In this design we separate the metadata specifying the process from the metadata describing an actual analysis run, thereby reducing the effort of manual documentation to an absolute minimum. Our approach is independent of a specific software environment, results in human readable XML documents that can easily be shared with other researchers and allows an automated validation to ensure consistency of the metadata. Because our approach has been designed with little to no assumptions concerning the workflow of an analysis, we expect it to be applicable in a wide range of computational research fields. Database URL: http://deep.mpi-inf.mpg.de/DAC/cmds/pub/pyvalid.zip PMID:26055099
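
    To make the two-part scheme concrete, the Python sketch below writes a minimal Process document (describing a pipeline step) and a matching Analysis document (describing one run of it) as XML. The element and attribute names are hypothetical illustrations, not the schema published with the article.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical element and attribute names, for illustration only.
    process = ET.Element("process", name="align_and_count")
    ET.SubElement(process, "tool", name="bwa", version="0.7.17")
    ET.SubElement(process, "parameter", name="threads")

    analysis = ET.Element("analysis", process="align_and_count")
    ET.SubElement(analysis, "input", path="sample1.fastq.gz")
    ET.SubElement(analysis, "setting", name="threads", value="8")

    ET.ElementTree(process).write("process.xml", encoding="utf-8", xml_declaration=True)
    ET.ElementTree(analysis).write("analysis.xml", encoding="utf-8", xml_declaration=True)
    ```

    Keeping the two documents separate means the Process file can be written once per pipeline, while a small Analysis file records each concrete run.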

  14. Application of transient CFD-procedures for S-shape computation in pump-turbines with and without FSI

    NASA Astrophysics Data System (ADS)

    Casartelli, E.; Mangani, L.; Ryan, O.; Schmid, A.

    2016-11-01

    CFD entered the product development process for hydraulic machines more than three decades ago. Besides the actual design process, in which the most appropriate geometry for a certain task is iteratively sought, several steady-state simulations and related analyses are performed with the help of CFD. Basic transient CFD analysis is becoming more and more routine for rotor-stator interaction assessment, but in general unsteady CFD is still not standard due to the large computational effort. Especially for FSI simulations, where mesh motion is involved, a considerable amount of computational time is necessary for the mesh handling and deformation as well as the related unsteady flow field resolution. Therefore this kind of CFD computation is still unusual and mostly performed during trouble-shooting analysis rather than in the standard development process, i.e. in order to understand what went wrong instead of preventing failure or, even better, increasing the available knowledge. In this paper the application of an efficient and particularly robust algorithm for fast computations with moving mesh is presented for the analysis of transient effects encountered during highly dynamic procedures in the operation of a pump-turbine, such as runaway at fixed GV position and load rejection with GV motion imposed as one-way FSI. In both cases the computations extend through the S-shape of the machine into the turbine-brake and reverse-pump domain, showing that such exotic computations can be performed on a more regular basis, even if quite time consuming. Besides the presentation of the procedure and global results, some highlights of the encountered flow physics are also given.

  15. Leveraging transcript quantification for fast computation of alternative splicing profiles.

    PubMed

    Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo

    2015-09-01

    Alternative splicing plays an essential role in many cellular processes and is of major relevance for the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events, exploiting fast transcript quantification. Measured against experimentally validated events, SUPPA accuracy is comparable to, and sometimes better than, that of standard methods on simulated as well as real RNA-sequencing data. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than simply more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
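
    The relative inclusion ("PSI") values mentioned above are, at their core, ratios of transcript abundances: the summed abundance of the transcripts that include an event form divided by the summed abundance of all transcripts defining the event. A minimal Python sketch of that calculation follows; it is not SUPPA's actual code, and the transcript names are made up.

    ```python
    def psi(tpm, inclusion_ids, event_ids):
        """Proportion spliced-in for one event from transcript abundances.

        tpm maps transcript id -> TPM; inclusion_ids are the transcripts that
        include the event form, event_ids all transcripts defining the event.
        """
        inc = sum(tpm.get(t, 0.0) for t in inclusion_ids)
        tot = sum(tpm.get(t, 0.0) for t in event_ids)
        return inc / tot if tot > 0 else float("nan")

    # Example: two isoforms include an exon, one skips it.
    tpm = {"tx1": 8.0, "tx2": 2.0, "tx3": 10.0}
    print(psi(tpm, ["tx1", "tx2"], ["tx1", "tx2", "tx3"]))  # 0.5
    ```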

  16. Statistical methods and computing for big data.

    PubMed

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing; Yan, Jun

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with focuses on the open source R and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay.
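
    As an illustration of the online-updating idea for stream data, the following Python sketch fits a logistic regression one batch at a time with a simple stochastic-gradient update, so only the current batch ever needs to be in memory. It is a generic sketch; the methods surveyed in the article use more refined updating schemes than the plain SGD shown here.

    ```python
    import numpy as np

    def online_logistic(batches, dim, lr=0.5):
        """Fit logistic-regression coefficients one data batch at a time.

        batches is an iterable of (X, y) chunks; only the current chunk is
        held in memory, so the full stream never has to fit at once.
        """
        beta = np.zeros(dim)
        for X, y in batches:
            p = 1.0 / (1.0 + np.exp(-X @ beta))       # predicted probabilities
            beta += lr * X.T @ (y - p) / len(y)       # averaged gradient step
        return beta
    ```

    Each batch is visited once and then discarded, so memory scales with the batch size rather than with the length of the stream.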

  17. A Rocket Engine Design Expert System

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1989-01-01

    The overall structure and capabilities of an expert system designed to evaluate rocket engine performance are described. The expert system incorporates a JANNAF standard reference computer code to determine rocket engine performance and a state-of-the-art finite element computer code to calculate the interactions between propellant injection, energy release in the combustion chamber, and regenerative cooling heat transfer. Rule-of-thumb heuristics were incorporated for the H2-O2 coaxial injector design, including a minimum gap size constraint on the total number of injector elements. One-dimensional equilibrium chemistry was used in the energy release analysis of the combustion chamber. A 3-D conduction and/or 1-D advection analysis is used to predict heat transfer and coolant channel wall temperature distributions, in addition to coolant temperature and pressure drop. Inputting values to describe the geometry and state properties of the entire system is done directly from the computer keyboard. Graphical display of all output results from the computer code analyses is facilitated by menu selection of up to five dependent variables per plot.

  18. A rocket engine design expert system

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1989-01-01

    The overall structure and capabilities of an expert system designed to evaluate rocket engine performance are described. The expert system incorporates a JANNAF standard reference computer code to determine rocket engine performance and a state-of-the-art finite element computer code to calculate the interactions between propellant injection, energy release in the combustion chamber, and regenerative cooling heat transfer. Rule-of-thumb heuristics were incorporated for the hydrogen-oxygen coaxial injector design, including a minimum gap size constraint on the total number of injector elements. One-dimensional equilibrium chemistry was employed in the energy release analysis of the combustion chamber and three-dimensional finite-difference analysis of the regenerative cooling channels was used to calculate the pressure drop along the channels and the coolant temperature as it exits the coolant circuit. Inputting values to describe the geometry and state properties of the entire system is done directly from the computer keyboard. Graphical display of all output results from the computer code analyses is facilitated by menu selection of up to five dependent variables per plot.

  19. Statistical methods and computing for big data

    PubMed Central

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with focuses on the open source R and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593

  20. Digital pathology in nephrology clinical trials, research, and pathology practice.

    PubMed

    Barisoni, Laura; Hodgin, Jeffrey B

    2017-11-01

    In this review, we will discuss (i) how the recent advancements in digital technology and computational engineering are currently applied to nephropathology in the setting of clinical research, trials, and practice; (ii) the benefits of the new digital environment; (iii) how recognizing its challenges provides opportunities for transformation; and (iv) nephropathology in the upcoming era of kidney precision and predictive medicine. Recent studies highlighted how new standardized protocols facilitate the harmonization of digital pathology database infrastructure and morphologic, morphometric, and computer-aided quantitative analyses. Digital pathology enables robust protocols for clinical trials and research, with the potential to identify previously underused or unrecognized clinically useful parameters. The integration of digital pathology with molecular signatures is leading the way to establishing clinically relevant morpho-omic taxonomies of renal diseases. The introduction of digital pathology in clinical research and trials, and the progressive implementation of the modern software ecosystem, opens opportunities for the development of new predictive diagnostic paradigms and computer-aided algorithms, transforming the practice of renal disease into a modern computational science.

  1. A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.

    PubMed

    Grando, M Adela; Glasspool, David; Fox, John

    2012-01-01

    To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages based on a standard set of workflow patterns expressed as Petri nets (PNs) and notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PN P requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous similar but informal comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results to a comparable previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones. Copyright © 2011 Elsevier B.V. All rights reserved.
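
    For readers unfamiliar with the workflow-pattern formalism, a Petri net's behaviour is fully determined by a simple enabling-and-firing rule over token markings; the Python sketch below implements that generic rule and fires a one-transition "sequence" pattern. It is a generic illustration, not part of the proof machinery described in the article.

    ```python
    def enabled(marking, transition):
        """A transition is enabled when every input place holds enough tokens."""
        return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

    def fire(marking, transition):
        """Fire an enabled transition: consume input tokens, produce output tokens."""
        m = dict(marking)
        for p, n in transition["in"].items():
            m[p] = m[p] - n
        for p, n in transition["out"].items():
            m[p] = m.get(p, 0) + n
        return m

    # Sequence pattern: transition t moves a token from place a to place b.
    t = {"in": {"a": 1}, "out": {"b": 1}}
    print(fire({"a": 1}, t) if enabled({"a": 1}, t) else "not enabled")  # {'a': 0, 'b': 1}
    ```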

  2. Influence of counting chamber type on CASA outcomes of equine semen analysis.

    PubMed

    Hoogewijs, M K; de Vliegher, S P; Govaere, J L; de Schauwer, C; de Kruif, A; van Soom, A

    2012-09-01

    Sperm motility is considered to be one of the key features of semen analysis. Assessment of motility is frequently performed using computer-assisted sperm analysis (CASA). Nevertheless, no uniform standards are present to analyse a semen sample using CASA. We hypothesised that the type of counting chamber used might influence the results of analysis and aimed to study the effect of chamber type on estimated concentration and motility of an equine semen sample assessed using CASA. Commonly used disposable Leja chambers of different depths were compared with disposable and reusable ISAS chambers, a Makler chamber and a World Health Organization (WHO) motility slide. Motility parameters and concentrations obtained with CASA using these different chambers were analysed. The NucleoCounter was used as gold standard for determining concentration. Concentration and motility parameters were significantly influenced by the chamber type used. Using the NucleoCounter as the gold standard for determining concentration, the correlation coefficients were low for all of the various chambers evaluated, with the exception of the 12 µm deep Leja chamber. Filling a chamber by capillary forces resulted in a lower observed concentration and reduced motility parameters. All chambers evaluated in this study resulted in significant lower progressive motility than the WHO prepared slide, with the exception of the Makler chamber, which resulted in a slight, but statistically significant, increase in progressive motility estimates. Computer-assisted sperm analysis can only provide a rough estimate of sperm concentration and overestimation is likely when drop-filled slides with a coverslip are used. Motility estimates using CASA are highly influenced by the counting chamber; therefore, a complete description of the chamber type used should be provided in semen reports and in scientific articles. © 2011 EVJ Ltd.
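
    The dependence of estimated concentration on chamber geometry comes down to simple volume arithmetic: the cells counted are divided by the volume actually observed (number of fields x field area x chamber depth). The Python sketch below shows that calculation with illustrative numbers; it is not the formula used by any particular CASA system.

    ```python
    def concentration_per_ml(cells_counted, fields, field_area_um2, depth_um):
        """Concentration implied by counting in a fixed-depth chamber.

        Geometry-based arithmetic only; values below are illustrative.
        """
        observed_volume_um3 = fields * field_area_um2 * depth_um
        observed_volume_ml = observed_volume_um3 * 1e-12   # 1 mL = 1e12 um^3
        return cells_counted / observed_volume_ml

    # e.g. 200 sperm counted over 10 fields of 0.25 mm^2 (250,000 um^2)
    # in a 12-um-deep chamber -> about 6.7 million per mL
    print(concentration_per_ml(200, 10, 250_000, 12))
    ```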

  3. Inconsistency in 9 mm bullets: correlation of jacket thickness to post-impact geometry measured with non-destructive X-ray computed tomography.

    PubMed

    Thornby, John; Landheer, Dirk; Williams, Tim; Barnes-Warden, Jane; Fenne, Paul; Norman, Danielle G; Attridge, Alex; Williams, Mark A

    2014-01-01

    Fundamental to any ballistic armour standard is the reference projectile to be defeated. Typically, for certification purposes, a consistent and symmetrical bullet geometry is assumed; however, variations in bullet jacket dimensions can have far-reaching consequences. Traditionally, characteristics and internal dimensions have been analysed by physically sectioning bullets--an approach which is of restricted scope and which precludes subsequent ballistic assessment. The use of a non-destructive X-ray computed tomography (CT) method has been demonstrated and validated (Kumar et al., 2011 [15]); the authors now apply this technique to correlate bullet impact response with jacket thickness variations. A set of 20 bullets (9 mm DM11) was selected for comparison and an image-based analysis method was employed to map jacket thickness and determine the centre of gravity of each specimen. Both intra- and inter-bullet variations were investigated, with thickness variations of the order of 200 μm commonly found along the length of all bullets and angular variations of up to 50 μm in some. The bullets were subsequently impacted against a rigid flat plate under controlled conditions (observed on a high-speed video camera) and the resulting deformed projectiles were re-analysed. The results of the experiments demonstrate a marked difference in ballistic performance between bullets from different manufacturers and an asymmetric thinning of the jacket is observed in regions of pre-impact weakness. The conclusions are relevant for future soft armour standards and provide important quantitative data for numerical model correlation and development. The implications of the findings of the work on the reliability and repeatability of the industry standard V50 ballistic test are also discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
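    As a rough illustration of the kind of image-based measurement mentioned above (not the authors' actual pipeline), the sketch below thresholds a synthetic CT volume and takes the intensity-weighted centroid as the centre of gravity; the volume, threshold and voxel pitch are made-up stand-ins.
```python
import numpy as np

rng = np.random.default_rng(3)
volume = rng.random((64, 64, 128))              # stand-in CT reconstruction, not real data
metal = np.where(volume > 0.95, volume, 0.0)    # crude segmentation by intensity threshold

coords = np.indices(metal.shape)                # z, y, x index grids
total = metal.sum()
cog_voxels = (coords * metal).reshape(3, -1).sum(axis=1) / total  # intensity-weighted centroid
voxel_pitch_mm = 0.05                           # hypothetical scan resolution
print(cog_voxels * voxel_pitch_mm)              # centre of gravity in mm
```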

  4. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses

    PubMed Central

    Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-01

    Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than others. Integration of content control improved quality of decision making (SMD 0.59 vs 0.23 for knowledge; SMD 0.39 vs 0.29 for decisional conflict). In contrast, tailoring reduced quality of decision making (SMD 0.40 vs 0.71 for knowledge; SMD 0.25 vs 0.52 for decisional conflict). Similarly, patient narratives also reduced quality of decision making (SMD 0.43 vs 0.65 for knowledge; SMD 0.17 vs 0.46 for decisional conflict). Results were varied for different types of explicit values clarification, feedback, and social support. Conclusions Integration of media rich or interactive features into computer-based decision aids can improve quality of preference-sensitive decision making. However, this is an emerging field with limited evidence to guide use. The systematic review and thematic synthesis identified features that have been integrated into available computer-based decision aids, in an effort to facilitate reporting of these features and to promote integration of such features into decision aids. The meta-analyses and associated subgroup analyses provide preliminary evidence to support integration of specific features into future decision aids. Further research can focus on clarifying independent contributions of specific features through experimental designs and refining the designs of features to improve effectiveness. PMID:26813512
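    For readers unfamiliar with the pooling step behind the SMD figures quoted above, the following sketch shows one standard way to compute Hedges' g from group summary statistics and to pool effects under a DerSimonian-Laird random-effects model; the trial numbers are invented for illustration and are not taken from the review.
```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction factor
    g = j * d
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var_g

def pool_random_effects(g, v):
    """DerSimonian-Laird random-effects pooled SMD and its standard error."""
    g, v = np.asarray(g), np.asarray(v)
    w = 1 / v
    q = np.sum(w * (g - np.sum(w * g) / np.sum(w))**2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(g) - 1)) / c)   # between-study variance estimate
    w_star = 1 / (v + tau2)
    pooled = np.sum(w_star * g) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return pooled, se

# Illustrative (made-up) trial summaries: decision-aid vs. control knowledge scores.
trials = [(72, 15, 60, 64, 16, 58), (68, 12, 45, 63, 13, 47), (75, 14, 80, 66, 15, 82)]
effects = [hedges_g(*t) for t in trials]
print(pool_random_effects(*zip(*effects)))
```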

  5. Computation of Standard Errors

    PubMed Central

    Dowd, Bryan E; Greene, William H; Norton, Edward C

    2014-01-01

    Objectives We discuss the problem of computing the standard errors of functions involving estimated parameters and provide the relevant computer code for three different computational approaches using two popular computer packages. Study Design We show how to compute the standard errors of several functions of interest: the predicted value of the dependent variable for a particular subject, and the effect of a change in an explanatory variable on the predicted value of the dependent variable for an individual subject and average effect for a sample of subjects. Empirical Application Using a publicly available dataset, we explain three different methods of computing standard errors: the delta method, Krinsky–Robb, and bootstrapping. We provide computer code for Stata 12 and LIMDEP 10/NLOGIT 5. Conclusions In most applications, choice of the computational method for standard errors of functions of estimated parameters is a matter of convenience. However, when computing standard errors of the sample average of functions that involve both estimated parameters and nonstochastic explanatory variables, it is important to consider the sources of variation in the function's values. PMID:24800304
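    The paper's code targets Stata and LIMDEP; as a rough Python analogue (simulated data, with the ratio of two OLS slopes as the function of interest), the sketch below applies the same three approaches: the delta method, Krinsky-Robb, and the nonparametric bootstrap. It is a sketch of the ideas, not a reproduction of the paper's code.
```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
X = sm.add_constant(rng.normal(size=(n, 2)))
beta = np.array([1.0, 2.0, 0.5])
y = X @ beta + rng.normal(size=n)

res = sm.OLS(y, X).fit()
b, V = res.params, res.cov_params()
f = b[1] / b[2]                                  # function of the estimated parameters

# 1) Delta method: sqrt(grad(f)' V grad(f)) with an analytic gradient.
grad = np.array([0.0, 1.0 / b[2], -b[1] / b[2]**2])
se_delta = np.sqrt(grad @ V @ grad)

# 2) Krinsky-Robb: simulate parameters from their asymptotic normal distribution.
draws = rng.multivariate_normal(b, V, size=5000)
se_kr = np.std(draws[:, 1] / draws[:, 2], ddof=1)

# 3) Nonparametric bootstrap: resample observations and re-estimate each time.
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, n)
    bb = sm.OLS(y[idx], X[idx]).fit().params
    boot.append(bb[1] / bb[2])
se_boot = np.std(boot, ddof=1)

print(f, se_delta, se_kr, se_boot)
```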

  6. Computation of incompressible viscous flows through turbopump components

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Chang, Leon

    1993-01-01

    Flow through pump components, such as an inducer and an impeller, is efficiently simulated by solving the incompressible Navier-Stokes equations. The solution method is based on the pseudocompressibility approach and uses an implicit-upwind differencing scheme together with the Gauss-Seidel line relaxation method. The equations are solved in steadily rotating reference frames and the centrifugal force and the Coriolis force are added to the equation of motion. Current computations use a one-equation Baldwin-Barth turbulence model which is derived from a simplified form of the standard k-epsilon model equations. The resulting computer code is applied to the flow analysis inside a generic rocket engine pump inducer, a fuel pump impeller, and the SSME high-pressure fuel turbopump impeller. Numerical results of inducer flow are compared with experimental measurements. In the fuel pump impeller, the effect of downstream boundary conditions is investigated. Flow analyses at 80 percent, 100 percent, and 120 percent of design conditions are presented.

  7. Robust Optimization Design for Turbine Blade-Tip Radial Running Clearance using Hierarchically Response Surface Method

    NASA Astrophysics Data System (ADS)

    Zhiying, Chen; Ping, Zhou

    2017-11-01

    Considering the need to balance computational precision and efficiency in the robust optimization of complex mechanical assembly relationships such as turbine blade-tip radial running clearance, a hierarchical response surface robust optimization algorithm is proposed. The distributed collaborative response surface method is used to generate an assembly-system-level approximation model relating the overall parameters to blade-tip clearance, and a set of samples of the design parameters and of the mean and/or standard deviation of the objective response is then generated using this system approximation model together with a design-of-experiments method. Finally, a new response surface approximation model is constructed from those samples, and this approximation model is used in the robust optimization process. The analysis results demonstrate that the proposed method can dramatically reduce the computational cost while maintaining computational precision. The presented research offers an effective way to perform robust optimization design of turbine blade-tip radial running clearance.

  8. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives.

    PubMed

    Zhao, Min; Wang, Qingguo; Wang, Quan; Jia, Peilin; Zhao, Zhongming

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays have been the standard technologies for detecting large genomic regions subject to copy number changes; only recently has high-resolution sequence data from next-generation sequencing (NGS) made sequence-based analysis practical. During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development.

  9. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives

    PubMed Central

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays have been the standard technologies for detecting large genomic regions subject to copy number changes; only recently has high-resolution sequence data from next-generation sequencing (NGS) made sequence-based analysis practical. During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development. PMID:24564169

  10. Genomic cloud computing: legal and ethical points to consider

    PubMed Central

    Dove, Edward S; Joly, Yann; Tassé, Anne-Marie; Burton, Paul; Chisholm, Rex; Fortier, Isabel; Goodwin, Pat; Harris, Jennifer; Hveem, Kristian; Kaye, Jane; Kent, Alistair; Knoppers, Bartha Maria; Lindpaintner, Klaus; Little, Julian; Riegman, Peter; Ripatti, Samuli; Stolk, Ronald; Bobrow, Martin; Cambon-Thomsen, Anne; Dressler, Lynn; Joly, Yann; Kato, Kazuto; Knoppers, Bartha Maria; Rodriguez, Laura Lyman; McPherson, Treasa; Nicolás, Pilar; Ouellette, Francis; Romeo-Casabona, Carlos; Sarin, Rajiv; Wallace, Susan; Wiesner, Georgia; Wilson, Julia; Zeps, Nikolajs; Simkevitz, Howard; De Rienzo, Assunta; Knoppers, Bartha M

    2015-01-01

    The biggest challenge in twenty-first century data-intensive genomic science is developing vast computer infrastructure and advanced software tools to perform comprehensive analyses of genomic data sets for biomedical research and clinical practice. Researchers are increasingly turning to cloud computing both as a solution to integrate data from genomics, systems biology and biomedical data mining and as an approach to analyze data to solve biomedical problems. Although cloud computing provides several benefits such as lower costs and greater efficiency, it also raises legal and ethical issues. In this article, we discuss three key 'points to consider' (data control; data security, confidentiality and transfer; and accountability) based on a preliminary review of several publicly available cloud service providers' Terms of Service. These 'points to consider' should be borne in mind by genomic research organizations when negotiating legal arrangements to store genomic data on a large commercial cloud service provider's servers. Diligent genomic cloud computing means leveraging security standards and evaluation processes as a means to protect data and entails many of the same good practices that researchers should always consider in securing their local infrastructure. PMID:25248396

  11. Genomic cloud computing: legal and ethical points to consider.

    PubMed

    Dove, Edward S; Joly, Yann; Tassé, Anne-Marie; Knoppers, Bartha M

    2015-10-01

    The biggest challenge in twenty-first century data-intensive genomic science is developing vast computer infrastructure and advanced software tools to perform comprehensive analyses of genomic data sets for biomedical research and clinical practice. Researchers are increasingly turning to cloud computing both as a solution to integrate data from genomics, systems biology and biomedical data mining and as an approach to analyze data to solve biomedical problems. Although cloud computing provides several benefits such as lower costs and greater efficiency, it also raises legal and ethical issues. In this article, we discuss three key 'points to consider' (data control; data security, confidentiality and transfer; and accountability) based on a preliminary review of several publicly available cloud service providers' Terms of Service. These 'points to consider' should be borne in mind by genomic research organizations when negotiating legal arrangements to store genomic data on a large commercial cloud service provider's servers. Diligent genomic cloud computing means leveraging security standards and evaluation processes as a means to protect data and entails many of the same good practices that researchers should always consider in securing their local infrastructure.

  12. Graphics processing units in bioinformatics, computational biology and systems biology.

    PubMed

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  13. A bee-hive frequency selective surface for Wi-Max and GPS applications

    NASA Astrophysics Data System (ADS)

    Ray, A.; Kahar, M.; Sarkar, P. P.

    2013-10-01

    The paper presents investigations of a bee-hive cell, concentric aperture frequency selective surface (FSS) tuned to pass 1.5 GHz for global positioning system applications and 3.5 GHz for worldwide interoperability for microwave access applications. The designed dual-band FSS screen is lightweight, easy to fabricate from low-cost materials, and exhibits two broad transmission bands, with a maximum recorded -10 dB transmission percentage bandwidth of 68.67%. Due to the symmetrical nature of the design, the FSS is insensitive to variation of the RF incidence angle for rotations of up to 60°. A computationally efficient method for analysing this FSS is presented. Experimental investigation is performed using a standard microwave test bench. It is observed that the computed and experimental results are in close agreement.

  14. Computational analyses in cognitive neuroscience: in defense of biological implausibility.

    PubMed

    Dror, I E; Gallogly, D P

    1999-06-01

    Because cognitive neuroscience researchers attempt to understand the human mind by bridging behavior and brain, they expect computational analyses to be biologically plausible. In this paper, biologically implausible computational analyses are shown to have critical and essential roles in the various stages and domains of cognitive neuroscience research. Specifically, biologically implausible computational analyses can contribute to (1) understanding and characterizing the problem that is being studied, (2) examining the availability of information and its representation, and (3) evaluating and understanding the neuronal solution. In the context of the distinct types of contributions made by certain computational analyses, the biological plausibility of those analyses is altogether irrelevant. These biologically implausible models are nevertheless relevant and important for biologically driven research.

  15. BioModels.net Web Services, a free and integrated toolkit for computational modelling software.

    PubMed

    Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille

    2010-05-01

    Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology) which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services which allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM Annotations, as well as getting the details about the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services take researchers one step further towards simulating and understanding a biological system in its entirety, by allowing them to retrieve biological models into their own tools, combine queries in workflows and efficiently analyse models.

  16. Consumer-based technology for distribution of surgical videos for objective evaluation.

    PubMed

    Gonzalez, Ray; Martinez, Jose M; Lo Menzo, Emanuele; Iglesias, Alberto R; Ro, Charles Y; Madan, Atul K

    2012-08-01

    The Global Operative Assessment of Laparoscopic Skill (GOALS) is one validated metric utilized to grade laparoscopic skills and has been utilized to score recorded operative videos. To facilitate easier viewing of these recorded videos, we are developing novel techniques for making them available to surgeons. The objective of this study is to determine the feasibility of utilizing widespread current consumer-based technology to assist in distributing appropriate videos for objective evaluation. Videos from residents were recorded through a direct connection from the camera processor's S-video output, via a cable and hub, to a standard laptop computer's universal serial bus (USB) port. A standard consumer-based video editing program was utilized to capture the video and record it in an appropriate format. We utilized the mp4 format, and depending on the size of the file, the videos were scaled down (compressed), their format changed (using a standard video editing program), or sliced into multiple videos. Standard available consumer-based programs were utilized to convert the video into a more appropriate format for handheld personal digital assistants. In addition, the videos were uploaded to a social networking website and video sharing websites. Recorded cases of laparoscopic cholecystectomy in a porcine model were utilized. Compression was required for all formats. All formats were accessed from home computers, work computers, and iPhones without difficulty. Qualitative analyses by four surgeons confirmed that all formats were of sufficient quality for grading. Our preliminary results show promise that, utilizing consumer-based technology, videos can be easily distributed through various methods to surgeons for grading with GOALS. Easy accessibility may help make evaluation of resident videos less complicated and cumbersome.

  17. CMG-Biotools, a Free Workbench for Basic Comparative Microbial Genomics

    PubMed Central

    Vesth, Tammi; Lagesen, Karin; Acar, Öncel; Ussery, David

    2013-01-01

    Background Today, there are more than a hundred times as many sequenced prokaryotic genomes as there were in the year 2000. The economical sequencing of genomic DNA has facilitated a whole new approach to microbial genomics. The real power of genomics is manifested through comparative genomics that can reveal strain-specific characteristics, diversity within species and many other aspects. However, comparative genomics is a field not easily entered into by scientists with few computational skills. The CMG-biotools package is designed for microbiologists with limited knowledge of computational analysis and can be used to perform a number of analyses and comparisons of genomic data. Results The CMG-biotools system presents a stand-alone interface for comparative microbial genomics. The package is a customized operating system, based on Xubuntu 10.10, available through the open source Ubuntu project. The system can be installed on a virtual computer, allowing the user to run the system alongside any other operating system. Source codes for all programs are provided under GNU license, which makes it possible to transfer the programs to other systems if so desired. We here demonstrate the package by comparing and analyzing the diversity within the class Negativicutes, represented by 31 genomes including 10 genera. The analyses include 16S rRNA phylogeny, basic DNA and codon statistics, proteome comparisons using BLAST and graphical analyses of DNA structures. Conclusion This paper shows the strength and diverse use of the CMG-biotools system. The system can be installed on a wide range of host operating systems and utilizes as much of the host computer as desired. It allows the user to compare multiple genomes, from various sources using standardized data formats and intuitive visualizations of results. The examples presented here clearly show that users with limited computational experience can perform complicated analysis without much training. PMID:23577086

  18. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

    The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses including multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework, OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  19. Justification of CT scans using referral guidelines for imaging.

    PubMed

    Stanescu, G; Rosca-Fartat, G; Stanescu, D

    2015-07-01

    This study analyses the efficiency of the justification of individual computed tomography (CT) procedures using the good practice guide. The conformity of the CT scans with guide's recommendations was retrospectively analysed in a paediatric emergency hospital in Romania. The involved patient doses were estimated. The results show that around one-third of the examinations were not prescribed in conformity with the guide's recommendations, but these results are affected by unclear guide provisions, discussed here. The implications of the provisions of the revised International Atomic Energy Agency's Basic Safety Standards and of the Council Directive 2013/59/EURATOM were analysed. The education and training courses for medical doctors disseminating the provisions of the good practice guide should be considered as the main support for the justification of the CT scans at the individual level. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. 76 FR 79609 - Federal Acquisition Regulation; Clarification of Standards for Computer Generation of Forms

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-22

    ... Regulation; Clarification of Standards for Computer Generation of Forms AGENCY: Department of Defense (DoD... American National Standards Institute X12, as the valid standard to use for computer-generated forms. FAR... optional forms on their computers. In addition to clarifying that FIPS 161 is no longer in use, public...

  1. An evaluation of nasal bone and aperture shape among three South African populations.

    PubMed

    McDowell, Jennifer L; Kenyhercz, Michael W; L'Abbé, Ericka N

    2015-07-01

    Reliable and valid population-specific standards are necessary to accurately develop a biological profile, which includes an estimation of peer-reported social identification (Hefner, 2009). During the last 300 years, colonialism, slavery and apartheid created geographic, physical and social divisions of population groups in South Africa. The purpose of this study was to evaluate variation in nasal bone and aperture shape in a modern population of black, white, and coloured South Africans using standard craniometric variables and geometric morphometrics, namely generalized Procrustes and elliptical Fourier analyses. Fourteen standard landmarks were digitally recorded or computationally derived from 310 crania using a 3D coordinate digitizer for discriminant function, principal components and generalized Procrustes analyses. For elliptical Fourier analysis, outlines of the nasal aperture were generated from standardized photographs. All classification accuracies were better than chance; the lowest accuracies were for coloured and the highest accuracies were for white South Africans. Most difficulties arose in distinguishing coloured and black South African groups from each other. Generally, misclassifications were noted between the sexes within each group rather than among groups, which suggests that sex has less influence on nasal bone and aperture shape than ancestry. Quantifiable variation in shape of the nasal aperture region between white and non-white South African groups was observed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
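    As a minimal illustration of the generalized Procrustes step mentioned above (not the study's actual analysis, and using random stand-in landmark configurations rather than cranial data), the sketch below removes location, scale and rotation from a set of 14-landmark 3D configurations and returns the aligned shapes and a consensus shape.
```python
import numpy as np

def align(shape, ref):
    """Procrustes rotation of one landmark configuration onto a reference
    (reflections are not explicitly excluded in this simple sketch)."""
    u, _, vt = np.linalg.svd(ref.T @ shape)
    return shape @ (u @ vt).T

def gpa(shapes, iters=10):
    """Generalized Procrustes analysis: remove location, scale and rotation."""
    shapes = np.asarray(shapes, dtype=float)
    shapes -= shapes.mean(axis=1, keepdims=True)                   # center each configuration
    shapes /= np.linalg.norm(shapes, axis=(1, 2), keepdims=True)   # scale to unit centroid size
    mean = shapes[0]
    for _ in range(iters):
        shapes = np.array([align(s, mean) for s in shapes])
        new_mean = shapes.mean(axis=0)
        mean = new_mean / np.linalg.norm(new_mean)
    return shapes, mean

# Illustrative: 5 configurations of 14 landmarks in 3D (random stand-ins for crania).
rng = np.random.default_rng(1)
base = rng.normal(size=(14, 3))
configs = [base + 0.05 * rng.normal(size=(14, 3)) for _ in range(5)]
aligned, consensus = gpa(configs)
print(aligned.shape, consensus.shape)
```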

  2. Solar Thermal Upper Stage Liquid Hydrogen Pressure Control Testing and Analytical Modeling

    NASA Technical Reports Server (NTRS)

    Olsen, A. D.; Cady, E. C.; Jenkins, D. S.; Chandler, F. O.; Grayson, G. D.; Lopez, A.; Hastings, L. J.; Flachbart, R. H.; Pedersen, K. W.

    2012-01-01

    The demonstration of a unique liquid hydrogen (LH2) storage and feed system concept for solar thermal upper stage was cooperatively accomplished by a Boeing/NASA Marshall Space Flight Center team. The strategy was to balance thermodynamic venting with the engine thrusting timeline during a representative 30-day mission, thereby, assuring no vent losses. Using a 2 cubic m (71 cubic ft) LH2 tank, proof-of-concept testing consisted of an engineering checkout followed by a 30-day mission simulation. The data were used to anchor a combination of standard analyses and computational fluid dynamics (CFD) modeling. Dependence on orbital testing has been incrementally reduced as CFD codes, combined with standard modeling, continue to be challenged with test data such as this.

  3. Contracting for Computer Software in Standardized Computer Languages

    PubMed Central

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  4. Single Top Production at Next-to-Leading Order in the Standard Model Effective Field Theory.

    PubMed

    Zhang, Cen

    2016-04-22

    Single top production processes at hadron colliders provide information on the relation between the top quark and the electroweak sector of the standard model. We compute the next-to-leading order QCD corrections to the three main production channels: t-channel, s-channel, and tW associated production, in the standard model including operators up to dimension six. The calculation can be matched to parton shower programs and can therefore be directly used in experimental analyses. The QCD corrections are found to significantly impact the extraction of the current limits on the operators, because of both the improved accuracy and the better precision of the theoretical predictions. In addition, the distributions of some of the key discriminating observables are modified in a nontrivial way, which could change the interpretation of measurements in terms of UV complete models.

  5. Improvement of the mechanical reliability of monolithic refractory linings for coal gasification process vessels. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potter, R.A.

    1981-09-01

    Eighteen heat-up tests were run on nine standard and experimental dual component monolithic refractory concrete linings. These tests were run with a five foot diameter by 14-ft high Pressure Vessel/Test Furnace designed to accommodate a 12-inch thick by 5-ft high refractory lining, heat the hot face to 2000 °F and expose the lining to air or steam pressures up to 150 psig. Results obtained from standard type linings in the test facility indicated that lining degradation duplicated that observed in field installations. The lining performance was significantly improved due to information gained from a systematic study of the cracking that occurred in the linings; the analysis of the lining strains, shell stresses and acoustic emission results; and the stress analyses performed on the standard and experimental lining designs with the finite element analysis computer programs, REFSAM and RESGAP.

  6. Assessment of Alternative [U] and [Th] Zircon Standards for SIMS

    NASA Astrophysics Data System (ADS)

    Monteleone, B. D.; van Soest, M. C.; Hodges, K.; Moore, G. M.; Boyce, J. W.; Hervig, R. L.

    2009-12-01

    The quality of in situ (U-Th)/He zircon dates is dependent upon the accuracy and precision of spatially distributed [U] and [Th] measurements on often complexly zoned zircon crystals. Natural zircon standards for SIMS traditionally have been used to obtain precise U-Pb ages rather than precise U and Th concentration. [U] and [Th] distributions within even the most homogeneous U-Pb age standards are not sufficient to make good microbeam standards (i.e., yield good precision: 2σ < 5%) for (U-Th)/He dates. In the absence of sufficiently homogeneous natural zircon crystals, we evaluate the use of the NIST 610 glass standard and a synthetic polycrystalline solid “zircon synrock” made by powdering and pressing natural zircon crystals at 2 GPa and 1100°C within a 13 mm piston cylinder for 24 hours. SIMS energy spectra and multiple spot analyses help assess the matrix-dependence of secondary ion emission and [U] and [Th] homogeneity of these materials. Although spot analyses on NIST 610 glass yielded spatially consistent ratios of 238U/30Si and 232Th/30Si (2σ = 2%, n = 14), comparison of energy spectra collected on glass and zircon reveal significant differences in U, UO, Th, and ThO ion intensities over the range of initial kinetic energies commonly used for trace element analyses. Computing [U] and [Th] in zircon using NIST glass yields concentrations that vary by more than 10% for [U] and [Th], depending on the initial kinetic energy and ion mass (elemental, oxide, or sum of elemental and oxide) used for the analysis. The observed effect of chemistry on secondary ion energy spectra suggests that NIST glass cannot be used as a standard for trace [U] and [Th] in zircon without a correction factor (presently unknown). Energy spectra of the zircon synrock are similar to those of natural zircon, suggesting matrix compatibility and therefore potential for accurate standardization. Spot analyses on the zircon powder pellets, however, show that adequate homogeneity of [U] and [Th] (2σ = 37% and 33% for 238U/30Si and 232Th/30Si, respectively, n = 8) has yet to be achieved. Modeling shows that homogenization of [U] and [Th] within these pellets requires preparation of powders with <2 micron sized particles, which has yet to be achieved in sample preparation. Thus, the zircon synrock pellet remains a viable potential [U], [Th] standard, although the preparation of a sufficiently fine grained, homogeneous pellet is a work in progress.

  7. Ergonomics standards and guidelines for computer workstation design and the impact on users' health - a review.

    PubMed

    Woo, E H C; White, P; Lai, C W K

    2016-03-01

    This paper presents an overview of global ergonomics standards and guidelines for design of computer workstations, with particular focus on their inconsistency and associated health risk impact. Overall, considerable disagreements were found in the design specifications of computer workstations globally, particularly in relation to the results from previous ergonomics research and the outcomes from current ergonomics standards and guidelines. To cope with the rapid advancement in computer technology, this article provides justifications and suggestions for modifications in the current ergonomics standards and guidelines for the design of computer workstations. Practitioner Summary: A research gap exists in ergonomics standards and guidelines for computer workstations. We explore the validity and generalisability of ergonomics recommendations by comparing previous ergonomics research through to recommendations and outcomes from current ergonomics standards and guidelines.

  8. Underestimating extreme events in power-law behavior due to machine-dependent cutoffs

    NASA Astrophysics Data System (ADS)

    Radicchi, Filippo

    2014-11-01

    Power-law distributions are typical macroscopic features occurring in almost all complex systems observable in nature. As a result, researchers in quantitative analyses must often generate random synthetic variates obeying power-law distributions. The task is usually performed through standard methods that map uniform random variates into the desired probability space. Whereas all these algorithms are theoretically solid, in this paper we show that they are subject to severe machine-dependent limitations. As a result, two dramatic consequences arise: (i) the sampling in the tail of the distribution is not random but deterministic; (ii) the moments of the sample distribution, which are theoretically expected to diverge as functions of the sample sizes, converge instead to finite values. We provide quantitative indications for the range of distribution parameters that can be safely handled by standard libraries used in computational analyses. Whereas our findings indicate possible reinterpretations of numerical results obtained through flawed sampling methodologies, they also pave the way for the search for a concrete solution to this central issue shared by all quantitative sciences dealing with complexity.
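    A minimal sketch of the standard inverse-transform recipe the paper examines, with the machine-precision ceiling on the tail made explicit; the exponent and sample size here are purely illustrative.
```python
import numpy as np

# Inverse-transform generator for a continuous power law p(x) ~ x^(-alpha), x >= xmin:
# x = xmin * (1 - u)^(-1/(alpha - 1)) with u uniform on [0, 1). Because u comes from a
# finite-precision generator, 1 - u cannot get arbitrarily close to 0, so the tail is
# effectively truncated and the sample moments stay finite.
def power_law_sample(alpha, xmin, size, rng):
    u = rng.random(size)
    return xmin * (1.0 - u)**(-1.0 / (alpha - 1.0))

rng = np.random.default_rng(42)
alpha, xmin = 2.5, 1.0
x = power_law_sample(alpha, xmin, 1_000_000, rng)

# Largest value the generator can ever return: with 53-bit doubles the smallest
# nonzero (1 - u) is 2**-53, which caps the sampled tail deterministically.
x_cap = xmin * (2.0**-53)**(-1.0 / (alpha - 1.0))
print(x.max(), x_cap)        # observed maximum vs. machine-imposed ceiling
```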

  9. 76 FR 7817 - Announcing Draft Federal Information Processing Standard 180-4, Secure Hash Standard, and Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... before May 12, 2011. ADDRESSES: Written comments may be sent to: Chief, Computer Security Division... FURTHER INFORMATION CONTACT: Elaine Barker, Computer Security Division, National Institute of Standards... Quynh Dang, Computer Security Division, National Institute of Standards and Technology, Gaithersburg, MD...

  10. Simulation of Blood flow in Artificial Heart Valve Design through Left heart

    NASA Astrophysics Data System (ADS)

    Hafizah Mokhtar, N.; Abas, Aizat

    2018-05-01

    In this work, an artificial heart valve is designed for use in a real heart, with further consideration of the effects of thrombosis, vorticity, and stress. The artificial heart valve model is constructed by computer-aided design (CAD) modelling and simulated using computational fluid dynamics (CFD) software. The blood flow pattern, velocity and vorticity of the artificial heart valve design have been analysed in this research work. Based on the results, the artificial heart valve design has a Doppler velocity index within the allowable standards for the left heart, with values of more than 0.30 and less than 2.2, indicating that the design is safe for use as a replacement for the human heart valve.

  11. Energy Frontier Research With ATLAS: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, John; Black, Kevin; Ahlen, Steve

    2016-06-14

    The Boston University (BU) group is playing key roles across the ATLAS experiment: in detector operations, the online trigger, the upgrade, computing, and physics analysis. Our team has been critical to the maintenance and operations of the muon system since its installation. During Run 1 we led the muon trigger group and that responsibility continues into Run 2. BU maintains and operates the ATLAS Northeast Tier 2 computing center. We are actively engaged in the analysis of ATLAS data from Run 1 and Run 2. Physics analyses we have contributed to include Standard Model measurements (W and Z cross sections, ttbar differential cross sections, WWW* production), evidence for the Higgs boson decaying to tau-lepton pairs, and searches for new phenomena (technicolor, Z' and W' bosons, vector-like quarks, dark matter).

  12. Use of computed tomography as a non-invasive method for diagnosing cephenemyiosis in roe deer (Capreolus capreolus).

    PubMed

    Fidalgo, L E; López-Beceiro, A M; Vila-Pastor, M; Martínez-Carrasco, C; Barreiro-Vázquez, J D; Pérez, J M

    2015-03-01

    This study was conducted to assess the reliability of computed tomography (CT) for diagnosing bot fly infestations by Cephenemyia stimulator (Clark) (Diptera: Oestridae) in roe deer (Capreolus capreolus L.) (Artiodactyla: Cervidae). For this purpose, the heads of 30 animals were analysed, firstly by CT and then by necropsy, which was used as the reference standard method. The prevalence values obtained by both methods were identical; the prevalence of infestation was 40.0% overall, and was higher in males (45.5%) than in females (25.0%). These results highlight the usefulness of CT as an alternative or non-invasive method for diagnosing cephenemyiosis in live-captured roe deer and in hunting trophies or museum collections that cannot be destroyed or damaged. © 2014 The Royal Entomological Society.

  13. The remote, the mouse, and the no. 2 pencil: the household media environment and academic achievement among third grade students.

    PubMed

    Borzekowski, Dina L G; Robinson, Thomas N

    2005-07-01

    Media can influence aspects of a child's physical, social, and cognitive development; however, the associations between a child's household media environment, media use, and academic achievement have yet to be determined. The objective was to examine relationships among a child's household media environment, media use, and academic achievement. During a single academic year, data were collected through classroom surveys and telephone interviews from an ethnically diverse sample of third-grade students and their parents from 6 northern California public elementary schools. The majority of our analyses derive from spring 2000 data, including academic achievement assessed through the mathematics, reading, and language arts sections of the Stanford Achievement Test. We fit linear regression models to determine the associations between variations in household media and performance on the standardized tests, adjusting for demographic and media use variables. The household media environment is significantly associated with students' performance on the standardized tests. It was found that having a bedroom television set was significantly and negatively associated with students' test scores, while home computer access and use were positively associated with the scores. Regression models significantly predicted up to 24% of the variation in the scores. Absence of a bedroom television combined with access to a home computer was consistently associated with the highest standardized test scores. This study adds to the growing literature reporting that having a bedroom television set may be detrimental to young elementary school children. It also suggests that having and using a home computer may be associated with better academic achievement.

  14. Norms and Standards for Computer Education (MCA, BCA) through Distance Mode.

    ERIC Educational Resources Information Center

    Rausaria, R.R., Ed.; Lele, Nalini A., Ed.; Bhushan, Bharat, Ed.

    This document presents the norms and standards for computer education in India through distance mode, including the Masters in Computer Applications (MCA) and Bachelor in Computer Applications (BCA) programs. These norms and standards were considered and approved by the Distance Education Council, Indira Gandhi National Open University (India), at…

  15. Some methodical peculiarities of analysis of small-mass samples by SRXFA

    NASA Astrophysics Data System (ADS)

    Kudryashova, A. F.; Tarasov, L. S.; Ulyanov, A. A.; Baryshev, V. B.

    1989-10-01

    The stability of the element analysis station on the VEPP-3 and VEPP-4 storage rings at INP (Novosibirsk, USSR) was demonstrated using three sets of rare element analyses carried out by SRXFA in May 1985, January 1988 and May-June 1988. These data show that there are some systematic deviations in the results of measurements of Zr and La contents. SRXFA and INAA data have been compared for the latter element. A false linear correlation on the Rb-Sr plot in one set of analyses has been attributed to an artificial Sr peak overlapping a Rb peak. The authors propose sequences for spectrum registration and computer processing of samples and standards; such sequences result in better final concentration data.

  16. MIUS community conceptual design study

    NASA Technical Reports Server (NTRS)

    Fulbright, B. E.

    1976-01-01

    The feasibility, practicality, and applicability of the modular integrated utility systems (MIUS) concept to a satellite new-community development with a population of approximately 100,000 were analyzed. Two MIUS design options, the 29-MIUS-unit (option 1) and the 8-MIUS-unit (option 2) facilities were considered. Each resulted in considerable resource savings when compared to a conventional utility system. Economic analyses indicated that the total cash outlay and operations and maintenance costs for these two options were considerably less than for a conventional system. Computer analyses performed in support of this study provided corroborative data for the study group. An environmental impact assessment was performed to determine whether the MIUS meets or will meet necessary environmental standards. The MIUS can provide improved efficiency in the conservation of natural resources while not adversely affecting the physical environment.

  17. A Model for Simulating the Response of Aluminum Honeycomb Structure to Transverse Loading

    NASA Technical Reports Server (NTRS)

    Ratcliffe, James G.; Czabaj, Michael W.; Jackson, Wade C.

    2012-01-01

    A 1-dimensional material model was developed for simulating the transverse (thickness-direction) loading and unloading response of aluminum honeycomb structure. The model was implemented as a user-defined material subroutine (UMAT) in the commercial finite element analysis code, ABAQUS/Standard. The UMAT has been applied to analyses for simulating quasi-static indentation tests on aluminum honeycomb-based sandwich plates. Comparison of analysis results with data from these experiments shows overall good agreement. Specifically, analyses of quasi-static indentation tests yielded accurate global specimen responses. Predicted residual indentation was also in reasonable agreement with measured values. Overall, this simple model does not involve a significant computational burden, which makes it more tractable to simulate other damage mechanisms in the same analysis.

  18. Quantitative assessment of commercial filter 'aids' for red-green colour defectives.

    PubMed

    Moreland, Jack D; Westland, Steven; Cheung, Vien; Dain, Steven J

    2010-09-01

    The claims made for 43 commercial filter 'aids', that they improve the colour discrimination of red-green colour defectives, are assessed for protanomaly and deuteranomaly by changes in the colour spacing of traffic signals (European Standard EN 1836:2005) and of the Farnsworth D15 test. Spectral transmittances of the 'aids' are measured and tristimulus values with and without 'aids' are computed using cone fundamentals and the spectral power distributions of either the D15 chips illuminated by CIE Illuminant C or of traffic signals. Chromaticities (l,s) are presented in cone excitation diagrams for protanomaly and deuteranomaly in terms of the relative excitation of their long (L), medium (M) and short (S) wavelength-sensitive cones. After correcting for non-uniform colour spacing in these diagrams, standard deviations parallel to the l and s axes are computed and enhancement factors E(l) and E(s) are derived as the ratio of 'aided' to 'unaided' standard deviations. Values of E(l) for traffic signals with most 'aids' are <1 and many do not meet the European signal detection standard. A few 'aids' have expansive E(l) factors but with inadequate utility: the largest being 1.2 for traffic signals and 1.3 for the D15 colours. Analyses, replicated for 19 'aids' from one manufacturer using 658 Munsell colours inside the D15 locus, yield E(l) factors within 1% of those found for the 16 D15 colours. © 2010 The Authors, Ophthalmic and Physiological Optics © 2010 The College of Optometrists.
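    A hedged sketch of the enhancement-factor idea described above, using random stand-in spectra rather than the real D15, illuminant, cone-fundamental or filter data, and omitting the correction for non-uniform colour spacing applied in the study.
```python
import numpy as np

# Cone excitations L, M, S are wavelength sums of (illuminant x reflectance x filter
# transmittance) weighted by the cone fundamentals; chromaticities are l = L/(L+M+S)
# and s = S/(L+M+S); E(l) is the ratio of the 'aided' to 'unaided' standard deviation
# of l over a set of test colours. All spectra below are random placeholders.
rng = np.random.default_rng(0)
wl = np.arange(400, 701, 10)                          # wavelength grid, nm
cones = np.abs(rng.normal(size=(3, wl.size)))         # stand-in L, M, S fundamentals
illuminant = np.ones(wl.size)                         # stand-in for Illuminant C
chips = np.abs(rng.normal(size=(16, wl.size)))        # stand-in D15 reflectances
filt = np.clip(rng.normal(0.7, 0.2, wl.size), 0, 1)   # stand-in filter transmittance

def chromaticities(transmittance):
    lms = (cones[None, :, :] * (illuminant * transmittance * chips)[:, None, :]).sum(axis=2)
    total = lms.sum(axis=1)
    return lms[:, 0] / total, lms[:, 2] / total        # l and s for each chip

l_un, s_un = chromaticities(np.ones(wl.size))          # unaided viewing
l_ai, s_ai = chromaticities(filt)                      # viewing through the 'aid'
E_l = np.std(l_ai, ddof=1) / np.std(l_un, ddof=1)
E_s = np.std(s_ai, ddof=1) / np.std(s_un, ddof=1)
print(E_l, E_s)                                        # >1 expansive, <1 compressive
```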

  19. [Automated procedures for microscopic analyses of blood smears: medical testing a MECOS-Ts2 complex].

    PubMed

    Pliasunova, S A; Balugian, R Sh; Khmel'nitskiĭ, K E; Medovyĭ, V S; Parpara, A A; Piatnitskiĭ, A M; Sokolinskiĭ, B Z; Dem'ianov, V L; Nikolaenko, D S

    2006-10-01

    The paper presents the results of medical tests of a group of computer-aided procedures for microscopic analysis by means of a MECOS-Ts2 complex (ZAO "MECOS", Russia), which have been conducted at the Republican Children's Clinical Hospital, the Research Institute of Emergency Pediatric Surgery and Traumatology, and Moscow City Clinical Hospital No. 23. Computer-aided procedures for calculating the differential count and for analyzing the morphology of red blood cells were tested on blood smears from a total of 443 patients and donors; computer-aided calculation of the reticulocyte count was tested on 318 smears. The tests were carried out under the US standard NCCLS-H20A. Manual microscopy (443 smears) and flow blood analysis on a Coulter GEN*S (125 smears) were used as reference methods. The quality of collection of samples and laboriousness were additionally assessed. The certified MECOS-Ts2 subsystems were additionally used as reference tools. The tests indicated the advantage of computer-aided MECOS-Ts2 complex microscopy over manual microscopy.

  20. Analyzing Spacecraft Telecommunication Systems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.

  1. Brian: a simulator for spiking neural networks in python.

    PubMed

    Goodman, Dan; Brette, Romain

    2008-01-01

    "Brian" is a new simulator for spiking neural networks, written in Python (http://brian. di.ens.fr). It is an intuitive and highly flexible tool for rapidly developing new models, especially networks of single-compartment neurons. In addition to using standard types of neuron models, users can define models by writing arbitrary differential equations in ordinary mathematical notation. Python scientific libraries can also be used for defining models and analysing data. Vectorisation techniques allow efficient simulations despite the overheads of an interpreted language. Brian will be especially valuable for working on non-standard neuron models not easily covered by existing software, and as an alternative to using Matlab or C for simulations. With its easy and intuitive syntax, Brian is also very well suited for teaching computational neuroscience.

  2. The European computer driving licence and the use of computers by dental students.

    PubMed

    Antonarakis, G S

    2009-02-01

    The use of computers within the dental curriculum for students is vital for many aspects of their studies. The aim of this study was to assess how dental students who had obtained the European computer driving licence (ECDL) qualification (an internationally-recognised standard of competence) through taught courses felt about the qualification, and how it changed their habits vis-à-vis computers, and information and communication technology. This study was carried out as a descriptive, one-off, cross-sectional survey. A questionnaire was distributed to 100 students who had successfully completed the course, with questions pertaining to the use of email, word processing and Internet for coursework, Medline for research, computer based learning, online lecture notes, and online communication with members of staff, both before and after ECDL qualification. Scaled responses were given. The attitudes of students towards the course were also assessed. The frequencies and percentage distributions of the responses to each question were analysed. It was found that dental students who follow ECDL teaching and successfully complete its requirements seem to increase the frequency with which they use email, word processing and Internet for coursework, Medline for research purposes, computer based learning, online lecture notes, and online communication with staff. Opinions about the ECDL course varied, many dental students finding the course easy, enjoying it only a little, but admitting that it improved their computer skills.

  3. Automated Hypothesis Tests and Standard Errors for Nonstandard Problems with Description of Computer Package: A Draft.

    ERIC Educational Resources Information Center

    Lord, Frederic M.; Stocking, Martha

    A general Computer program is described that will compute asymptotic standard errors and carry out significance tests for an endless variety of (standard and) nonstandard large-sample statistical problems, without requiring the statistician to derive asymptotic standard error formulas. The program assumes that the observations have a multinormal…
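    One common way such a program can avoid hand-derived formulas is to differentiate the target function numerically and apply the delta method. The sketch below illustrates that idea only; it is not the package described in the report, and the parameter estimates and covariance matrix are made up.
```python
import numpy as np

def numerical_delta_se(func, theta_hat, cov, eps=1e-6):
    """Asymptotic standard error of func(theta) via a central finite-difference
    gradient, so no analytic derivative has to be worked out by hand."""
    theta_hat = np.asarray(theta_hat, dtype=float)
    grad = np.empty_like(theta_hat)
    for i in range(theta_hat.size):
        step = np.zeros_like(theta_hat)
        step[i] = eps
        grad[i] = (func(theta_hat + step) - func(theta_hat - step)) / (2 * eps)
    return np.sqrt(grad @ cov @ grad)

# Illustrative use: SE of a ratio of two estimates with a made-up covariance matrix.
theta = np.array([2.0, 4.0])
cov = np.array([[0.04, 0.01], [0.01, 0.09]])
print(numerical_delta_se(lambda t: t[0] / t[1], theta, cov))
```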

  4. TLIFE: a Program for Spur, Helical and Spiral Bevel Transmission Life and Reliability Modeling

    NASA Technical Reports Server (NTRS)

    Savage, M.; Prasanna, M. G.; Rubadeux, K. L.

    1994-01-01

    This report describes a computer program, 'TLIFE', which models the service life of a transmission. The program is written in ANSI standard Fortran 77 and has an executable size of about 157 K bytes for use on a personal computer running DOS. It can also be compiled and executed in UNIX. The computer program can analyze any one of eleven unit transmissions either singly or in a series combination of up to twenty-five unit transmissions. Metric or English unit calculations are performed with the same routines using consistent input data and a units flag. Primary outputs are the dynamic capacity of the transmission and the mean lives of the transmission and of the sum of its components. The program uses a modular approach to separate the load analyses from the system life calculations. The program and its input and output data files are described herein. Three examples illustrate its use. A development of the theory behind the analysis in the program is included after the examples.

  5. Guidelines and standard procedures for continuous water-quality monitors: Site selection, field operation, calibration, record computation, and reporting

    USGS Publications Warehouse

    Wagner, Richard J.; Mattraw, Harold C.; Ritz, George F.; Smith, Brett A.

    2000-01-01

    The U.S. Geological Survey uses continuous water-quality monitors to assess variations in the quality of the Nation's surface water. A common system configuration for data collection is the four-parameter water-quality monitoring system, which collects temperature, specific conductance, dissolved oxygen, and pH data, although systems can be configured to measure other properties such as turbidity or chlorophyll. The sensors that are used to measure these water properties require careful field observation, cleaning, and calibration procedures, as well as thorough procedures for the computation and publication of final records. Data from sensors can be used in conjunction with collected samples and chemical analyses to estimate chemical loads. This report provides guidelines for site-selection considerations, sensor test methods, field procedures, error correction, data computation, and review and publication processes. These procedures have evolved over the past three decades, and the process continues to evolve with newer technologies.

  6. StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab

    NASA Astrophysics Data System (ADS)

    Grund, Michael

    2017-04-01

    The SplitLab package (Wüstefeld et al., Computers and Geosciences, 2008), written in MATLAB, is a powerful and widely used tool for analysing seismological shear wave splitting of single event measurements. However, in many cases, especially for temporary station deployments close to the seaside or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. In order to extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time allows the most flexible processing of individual multi-event splitting measurements for a single recording station. Besides the provided functions of the plugin, no other external program is needed for the multi-event analyses, since StackSplit performs within the available SplitLab structure.

  7. Trends of mortality from Alzheimer's disease in the European Union, 1994-2013.

    PubMed

    Niu, H; Alvarez-Alvarez, I; Guillen-Grima, F; Al-Rahamneh, M J; Aguinaga-Ontoso, I

    2017-06-01

    In many countries, Alzheimer's disease (AD) has gradually become a common disease in elderly populations. The aim of this study was to analyse trends of mortality caused by AD in the 28 member countries in the European Union (EU) over the last two decades. We extracted data for AD deaths for the period 1994-2013 in the EU from the Eurostat and World Health Organization database. Age-standardized mortality rates per 100 000 were computed. Joinpoint regression was used to analyse the trends and compute the annual percent change in the EU as a whole and by country. Analyses by gender and by European regions were conducted. Mortality from AD has risen in the EU throughout the study period. Most of the countries showed upward trends, with the sharpest increases in Slovakia, Lithuania and Romania. We recorded statistically significant increases of 4.7% and 6.0% in mortality rates in men and women, respectively, in the whole EU. Several countries showed changing trends during the study period. According to the regional analysis, northern and eastern countries showed the steepest increases, whereas in the latter years mortality has declined in western countries. Our findings provide evidence that AD mortality has increased in the EU, especially in eastern and northern European countries and in the female population. Our results could be a reference for the development of primary prevention policies. © 2017 EAN.
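
    The age-standardized rates mentioned above follow the usual direct-standardization recipe: weight each age-specific rate by a standard population and scale to 100,000. The sketch below illustrates the arithmetic with hypothetical age bands and weights; it is not the study's data.

```python
# Direct age standardization with hypothetical numbers (not the study's data).
deaths     = [12, 85, 430]                 # AD deaths per age band
person_yrs = [900_000, 650_000, 210_000]   # person-years at risk per age band
std_weight = [0.55, 0.30, 0.15]            # standard population weights (sum to 1)

age_specific = [d / n for d, n in zip(deaths, person_yrs)]
asr_per_100k = 100_000 * sum(r * w for r, w in zip(age_specific, std_weight))
print(f'age-standardized mortality rate: {asr_per_100k:.1f} per 100,000')
```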

  8. A Web-Based Development Environment for Collaborative Data Analysis

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.

    2014-06-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.

  9. A Comparison of Three Methods for Computing Scale Score Conditional Standard Errors of Measurement. ACT Research Report Series, 2013 (7)

    ERIC Educational Resources Information Center

    Woodruff, David; Traynor, Anne; Cui, Zhongmin; Fang, Yu

    2013-01-01

    Professional standards for educational testing recommend that both the overall standard error of measurement and the conditional standard error of measurement (CSEM) be computed on the score scale used to report scores to examinees. Several methods have been developed to compute scale score CSEMs. This paper compares three methods, based on…

  10. Factors influencing use of an e-health website in a community sample of older adults.

    PubMed

    Czaja, Sara J; Sharit, Joseph; Lee, Chin Chin; Nair, Sankaran N; Hernández, Mario A; Arana, Neysarí; Fu, Shih Hua

    2013-01-01

    The use of the internet as a source of health information and link to healthcare services has raised concerns about the ability of consumers, especially vulnerable populations such as older adults, to access these applications. This study examined the influence of training on the ability of adults (aged 45+ years) to use the Medicare.gov website to solve problems related to health management. The influence of computer experience and cognitive abilities on performance was also examined. Seventy-one participants, aged 47-92, were randomized into a Multimedia training, Unimodal training, or Cold Start condition and completed three healthcare management problems. Computer/internet experience was measured via questionnaire, and cognitive abilities were assessed using standard neuropsychological tests. Performance metrics included measures of navigation, accuracy and efficiency. Data were analyzed using analysis of variance, χ2 and regression techniques. The data indicate that there was no difference among the three conditions on measures of accuracy, efficiency, or navigation. However, results of the regression analyses showed that, overall, people who received training performed better on the tasks, as evidenced by greater accuracy and efficiency. Performance was also significantly influenced by prior computer experience and cognitive abilities. Participants with more computer experience and higher cognitive abilities performed better. The findings indicate that training, experience, and abilities are important when using complex health websites. However, training alone is not sufficient. The complexity of web content needs to be considered to ensure successful use of these websites by those with lower abilities.

  11. Factors influencing use of an e-health website in a community sample of older adults

    PubMed Central

    Sharit, Joseph; Lee, Chin Chin; Nair, Sankaran N; Hernández, Mario A; Arana, Neysarí; Fu, Shih Hua

    2013-01-01

    Objective The use of the internet as a source of health information and link to healthcare services has raised concerns about the ability of consumers, especially vulnerable populations such as older adults, to access these applications. This study examined the influence of training on the ability of adults (aged 45+ years) to use the Medicare.gov website to solve problems related to health management. The influence of computer experience and cognitive abilities on performance was also examined. Design Seventy-one participants, aged 47–92, were randomized into a Multimedia training, Unimodal training, or Cold Start condition and completed three healthcare management problems. Measurement and analyses Computer/internet experience was measured via questionnaire, and cognitive abilities were assessed using standard neuropsychological tests. Performance metrics included measures of navigation, accuracy and efficiency. Data were analyzed using analysis of variance, χ2 and regression techniques. Results The data indicate that there was no difference among the three conditions on measures of accuracy, efficiency, or navigation. However, results of the regression analyses showed that, overall, people who received training performed better on the tasks, as evidenced by greater accuracy and efficiency. Performance was also significantly influenced by prior computer experience and cognitive abilities. Participants with more computer experience and higher cognitive abilities performed better. Conclusions The findings indicate that training, experience, and abilities are important when using complex health websites. However, training alone is not sufficient. The complexity of web content needs to be considered to ensure successful use of these websites by those with lower abilities. PMID:22802269

  12. CUGatesDensity—Quantum circuit analyser extended to density matrices

    NASA Astrophysics Data System (ADS)

    Loke, T.; Wang, J. B.

    2013-12-01

    CUGatesDensity is an extension of the original quantum circuit analyser CUGates (Loke and Wang, 2011) [7] to provide explicit support for the use of density matrices. The new package enables simulation of quantum circuits involving statistical ensemble of mixed quantum states. Such analysis is of vital importance in dealing with quantum decoherence, measurements, noise and error correction, and fault tolerant computation. Several examples involving mixed state quantum computation are presented to illustrate the use of this package. Catalogue identifier: AEPY_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEPY_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 5368 No. of bytes in distributed program, including test data, etc.: 143994 Distribution format: tar.gz Programming language: Mathematica. Computer: Any computer installed with a copy of Mathematica 6.0 or higher. Operating system: Any system with a copy of Mathematica 6.0 or higher installed. Classification: 4.15. Nature of problem: To simulate arbitrarily complex quantum circuits comprised of single/multiple qubit and qudit quantum gates with mixed state registers. Solution method: A density matrix representation for mixed states and a state vector representation for pure states are used. The construct is based on an irreducible form of matrix decomposition, which allows a highly efficient implementation of general controlled gates with multiple conditionals. Running time: The examples provided in the notebook CUGatesDensity.nb take approximately 30 s to run on a laptop PC.
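
    CUGatesDensity itself is a Mathematica package; as a generic illustration (not the package's API) of the mixed-state bookkeeping it automates, a density matrix ρ evolves under a gate U as ρ → UρU†:

```python
# Generic density-matrix gate application in numpy (illustration only,
# not the CUGatesDensity API): apply a Hadamard gate to a mixed qubit state.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
rho = np.diag([0.75, 0.25])                    # mixed state: 75% |0>, 25% |1>

rho_out = H @ rho @ H.conj().T                 # rho -> U rho U†
print(np.round(rho_out, 3), 'trace =', round(np.trace(rho_out).real, 3))
```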

  13. Estimating Interaction Effects With Incomplete Predictor Variables

    PubMed Central

    Enders, Craig K.; Baraldi, Amanda N.; Cham, Heining

    2014-01-01

    The existing missing data literature does not provide a clear prescription for estimating interaction effects with missing data, particularly when the interaction involves a pair of continuous variables. In this article, we describe maximum likelihood and multiple imputation procedures for this common analysis problem. We outline 3 latent variable model specifications for interaction analyses with missing data. These models apply procedures from the latent variable interaction literature to analyses with a single indicator per construct (e.g., a regression analysis with scale scores). We also discuss multiple imputation for interaction effects, emphasizing an approach that applies standard imputation procedures to the product of 2 raw score predictors. We thoroughly describe the process of probing interaction effects with maximum likelihood and multiple imputation. For both missing data handling techniques, we outline centering and transformation strategies that researchers can implement in popular software packages, and we use a series of real data analyses to illustrate these methods. Finally, we use computer simulations to evaluate the performance of the proposed techniques. PMID:24707955

  14. Colour calibration of a laboratory computer vision system for quality evaluation of pre-sliced hams.

    PubMed

    Valous, Nektarios A; Mendoza, Fernando; Sun, Da-Wen; Allen, Paul

    2009-01-01

    Due to the high variability and complex colour distribution in meats and meat products, the colour signal calibration of any computer vision system used for colour quality evaluations represents an essential condition for objective and consistent analyses. This paper compares two methods for CIE colour characterization using a computer vision system (CVS) based on digital photography; namely, the polynomial transform procedure and the transform proposed by the sRGB standard. It also presents a procedure for evaluating the colour appearance and presence of pores and fat-connective tissue on pre-sliced hams made from pork, turkey and chicken. Our results showed high precision in colour matching for device characterization when the polynomial transform was used to match the CIE tristimulus values, in comparison with the sRGB standard approach, as indicated by their ΔE*ab values. The [3×20] polynomial transfer matrix yielded a modelling accuracy averaging below 2.2 ΔE*ab units. Using the sRGB transform, high variability was observed among the computed ΔE*ab values (8.8±4.2). The calibrated laboratory CVS, implemented with a low-cost digital camera, exhibited reproducible colour signals in a wide range of colours capable of pinpointing regions-of-interest and allowed the extraction of quantitative information from the overall ham slice surface with high accuracy. The extracted colour and morphological features showed potential for characterizing the appearance of ham slice surfaces. CVS is a tool that can objectively specify colour and appearance properties of non-uniformly coloured commercial ham slices.
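
    The polynomial transform referred to above is, in essence, a least-squares regression from expanded camera RGB terms to measured tristimulus values. The sketch below shows that generic idea with an illustrative 10-term basis and placeholder data; the paper's actual [3×20] term set and training patches are not reproduced here.

```python
# Generic polynomial camera characterization by least squares.
# The term basis and the data below are placeholders, not the paper's.
import numpy as np

def expand(rgb):
    r, g, b = rgb
    return [1, r, g, b, r*g, r*b, g*b, r*r, g*g, b*b]   # illustrative 10-term basis

rgb_patches = np.random.rand(24, 3)     # camera RGB of a colour chart (placeholder)
xyz_ref     = np.random.rand(24, 3)     # reference tristimulus values (placeholder)

A = np.array([expand(p) for p in rgb_patches])    # design matrix, shape (24, 10)
M, *_ = np.linalg.lstsq(A, xyz_ref, rcond=None)   # transform matrix, shape (10, 3)
xyz_pred = A @ M                                  # calibrated tristimulus estimates
```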

  15. Computational identification and validation of alternative splicing in ZSF1 rat RNA-seq data, a preclinical model for type 2 diabetic nephropathy.

    PubMed

    Zhang, Chi; Dower, Ken; Zhang, Baohong; Martinez, Robert V; Lin, Lih-Ling; Zhao, Shanrong

    2018-05-16

    Obese ZSF1 rats exhibit spontaneous time-dependent diabetic nephropathy and are considered to be a highly relevant animal model of progressive human diabetic kidney disease. We previously identified gene expression changes between disease and control animals across six time points from 12 to 41 weeks. In this study, the same data were analysed at the isoform and exon levels to reveal additional disease mechanisms that may be governed by alternative splicing. Our analyses identified alternative splicing patterns in genes that may be implicated in disease pathogenesis (such as Shc1, Serpinc1, Epb4.1l5, and Il-33), which would have been overlooked in standard gene-level analysis. The alternatively spliced genes were enriched in pathways related to cell adhesion, cell-cell interactions/junctions, and cytoskeleton signalling, whereas the differentially expressed genes were enriched in pathways related to immune response, G protein-coupled receptor, and cAMP signalling. Our findings indicate that additional mechanistic insights can be gained from exon- and isoform-level data analyses over standard gene-level analysis. Considering alternative splicing is poorly conserved between rodents and humans, it is noted that this work is not translational, but the point holds true that additional insights can be gained from alternative splicing analysis of RNA-seq data.

  16. Condor-COPASI: high-throughput computing for biochemical networks

    PubMed Central

    2012-01-01

    Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945

  17. The computation of equating errors in international surveys in education.

    PubMed

    Monseur, Christian; Berezner, Alla

    2007-01-01

    Since the IEA's Third International Mathematics and Science Study, one of the major objectives of international surveys in education has been to report trends in achievement. The names of the two current IEA surveys reflect this growing interest: Trends in International Mathematics and Science Study (TIMSS) and Progress in International Reading Literacy Study (PIRLS). Similarly a central concern of the OECD's PISA is with trends in outcomes over time. To facilitate trend analyses these studies link their tests using common item equating in conjunction with item response modelling methods. IEA and PISA policies differ in terms of reporting the error associated with trends. In IEA surveys, the standard errors of the trend estimates do not include the uncertainty associated with the linking step while PISA does include a linking error component in the standard errors of trend estimates. In other words, PISA implicitly acknowledges that trend estimates partly depend on the selected common items, while the IEA's surveys do not recognise this source of error. Failing to recognise the linking error leads to an underestimation of the standard errors and thus increases the Type I error rate, thereby resulting in reporting of significant changes in achievement when in fact these are not significant. The growing interest of policy makers in trend indicators and the impact of the evaluation of educational reforms appear to be incompatible with such underestimation. However, the procedure implemented by PISA raises a few issues about the underlying assumptions for the computation of the equating error. After a brief introduction, this paper will describe the procedure PISA implemented to compute the linking error. The underlying assumptions of this procedure will then be discussed. Finally an alternative method based on replication techniques will be presented, based on a simulation study and then applied to the PISA 2000 data.
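
    One common formulation treats the shifts of the common (link) items between cycles as a simple random sample, so the linking error is the standard error of their mean shift; the sketch below shows that calculation under exactly the independence assumption the paper questions, with hypothetical shifts.

```python
# Hedged sketch: linking error as the standard error of the mean item-difficulty
# shift across common items, assuming independent items (the assumption
# discussed in the paper). Shift values are hypothetical.
import statistics as st

shifts = [0.05, -0.12, 0.03, 0.08, -0.02, 0.10, -0.07, 0.01]   # logit shifts of link items
L = len(shifts)
linking_error = st.stdev(shifts) / L ** 0.5
print(f'linking error: {linking_error:.3f} logits from {L} link items')
```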

  18. Determining the spill flow discharge of combined sewer overflows using rating curves based on computational fluid dynamics instead of the standard weir equation.

    PubMed

    Fach, S; Sitzenfrei, R; Rauch, W

    2009-01-01

    It is state of the art to evaluate and optimise sewer systems with urban drainage models. Since spill flow data are essential in the calibration process of conceptual models, it is important to enhance the quality of such data. A widespread approach is to calculate the spill flow volume by using standard weir equations together with measured water levels. However, these equations are only applicable to combined sewer overflow (CSO) structures whose weir constructions correspond with the standard weir layout. The objective of this work is to outline an alternative approach to obtain spill flow discharge data based on measurements with a sonic depth finder. The idea is to determine the relation between water level and rate of spill flow by running a detailed 3D computational fluid dynamics (CFD) model. Two real-world CSO structures were chosen because of their complex structure, especially with respect to the weir construction. In a first step, the simulation results were analysed to identify flow conditions for discrete steady states. It is shown that the flow conditions in the CSO structure change once the spill flow pipe acts as a controlled outflow, so the spill flow discharge can no longer be described with a standard weir equation. In a second step, the CFD results are used to derive rating curves which can be easily applied in everyday practice. The rating curves are therefore developed on the basis of the standard weir equation and the equation for orifice-type outlets. Because the intersection of both equations is not known, the coefficients of discharge are regressed from the CFD simulation results. Furthermore, the regression of the CFD simulation results is compared with that of the standard weir equation, using historic water levels and hydrographs generated with a hydrodynamic model. The uncertainties resulting from the widespread use of the standard weir equation are demonstrated.
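
    The two flow regimes described above can be written with the textbook weir and orifice relations, with discharge coefficients regressed from the CFD runs; the schematic sketch below uses placeholder geometry and coefficients, not values from the paper.

```python
# Schematic CSO rating curve: weir-type flow at low heads, orifice-type flow
# once the spill pipe is pressurized. All numbers are placeholders.
import math

G = 9.81        # gravitational acceleration, m/s^2
B = 2.0         # weir crest width, m (placeholder)
A_PIPE = 0.5    # spill pipe cross-section, m^2 (placeholder)
H_TRANS = 0.35  # head at which the outlet becomes pressurized, m (placeholder)

def spill_discharge(h, cd_weir=0.62, cd_orifice=0.80):
    """Spill flow (m^3/s) as a function of water level h above the crest (m)."""
    if h <= 0:
        return 0.0
    if h < H_TRANS:
        return cd_weir * (2.0 / 3.0) * B * math.sqrt(2.0 * G) * h ** 1.5   # weir regime
    return cd_orifice * A_PIPE * math.sqrt(2.0 * G * h)                    # orifice regime
```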

  19. Orthogonal analytical methods for botanical standardization: Determination of green tea catechins by qNMR and LC-MS/MS

    PubMed Central

    Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.

    2013-01-01

    The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106

  20. A Statistical Method for Synthesizing Mediation Analyses Using the Product of Coefficient Approach Across Multiple Trials

    PubMed Central

    Huang, Shi; MacKinnon, David P.; Perrino, Tatiana; Gallo, Carlos; Cruden, Gracelyn; Brown, C Hendricks

    2016-01-01

    Mediation analysis often requires larger sample sizes than main effect analysis to achieve the same statistical power. Combining results across similar trials may be the only practical option for increasing statistical power for mediation analysis in some situations. In this paper, we propose a method to estimate: 1) marginal means for mediation path a, the relation of the independent variable to the mediator; 2) marginal means for path b, the relation of the mediator to the outcome, across multiple trials; and 3) the between-trial level variance-covariance matrix based on a bivariate normal distribution. We present the statistical theory and an R computer program to combine regression coefficients from multiple trials to estimate a combined mediated effect and confidence interval under a random effects model. Values of coefficients a and b, along with their standard errors from each trial are the input for the method. This marginal likelihood based approach with Monte Carlo confidence intervals provides more accurate inference than the standard meta-analytic approach. We discuss computational issues, apply the method to two real-data examples and make recommendations for the use of the method in different settings. PMID:28239330
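
    The Monte Carlo confidence interval mentioned above follows the usual product-of-coefficients recipe: draw a and b from normal distributions centred on their (pooled) estimates and take percentiles of the product. A minimal sketch with hypothetical pooled estimates:

```python
# Monte Carlo confidence interval for a mediated effect a*b.
# Pooled estimates and standard errors below are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
a_hat, se_a = 0.40, 0.10   # pooled path-a estimate and SE (hypothetical)
b_hat, se_b = 0.25, 0.08   # pooled path-b estimate and SE (hypothetical)

draws = rng.normal(a_hat, se_a, 100_000) * rng.normal(b_hat, se_b, 100_000)
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f'mediated effect {a_hat * b_hat:.3f}, 95% Monte Carlo CI [{lo:.3f}, {hi:.3f}]')
```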

  1. Electronic trigger for capacitive touchscreen and extension of ISO 15781 standard time lag measurements to smartphones

    NASA Astrophysics Data System (ADS)

    Bucher, François-Xavier; Cao, Frédéric; Viard, Clément; Guichard, Frédéric

    2014-03-01

    We present in this paper a novel capacitive device that stimulates the touchscreen interface of a smartphone (or of any imaging device equipped with a capacitive touchscreen) and synchronizes triggering with the DxO LED Universal Timer to measure shooting time lag and shutter lag according to ISO 15781:2013. The device and protocol extend the time lag measurement beyond the standard by including negative shutter lag, a phenomenon that is more and more commonly found in smartphones. The device is computer-controlled, and this feature, combined with measurement algorithms, makes it possible to automatize a large series of captures so as to provide more refined statistical analyses when, for example, the shutter lag of "zero shutter lag" devices is limited by the frame time as our measurements confirm.

  2. Nemesis I: Parallel Enhancements to ExodusII

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hennigan, Gary L.; John, Matthew S.; Shadid, John N.

    2006-03-28

    NEMESIS I is an enhancement to the EXODUS II finite element database model used to store and retrieve data for unstructured parallel finite element analyses. NEMESIS I adds data structures which facilitate the partitioning of a scalar (standard serial) EXODUS II file onto the parallel disk systems found on many parallel computers. Since the NEMESIS I application programming interface (API) can be used to append information to an existing EXODUS II file, software that reads EXODUS II files can still be used on files which contain NEMESIS I information. The NEMESIS I information is written and read via C or C++ callable functions which comprise the NEMESIS I API.

  3. Linear Calibration of Radiographic Mineral Density Using Video-Digitizing Methods

    NASA Technical Reports Server (NTRS)

    Martin, R. Bruce; Papamichos, Thomas; Dannucci, Greg A.

    1990-01-01

    Radiographic images can provide quantitative as well as qualitative information if they are subjected to densitometric analysis. Using modern video-digitizing techniques, such densitometry can be readily accomplished using relatively inexpensive computer systems. However, such analyses are made more difficult by the fact that the density values read from the radiograph have a complex, nonlinear relationship to bone mineral content. This article derives the relationship between these variables from the nature of the intermediate physical processes, and presents a simple mathematical method for obtaining a linear calibration function using a step wedge or other standard.

  4. Linear Calibration of Radiographic Mineral Density Using Video-Digitizing Methods

    NASA Technical Reports Server (NTRS)

    Martin, R. Bruce; Papamichos, Thomas; Dannucci, Greg A.

    1990-01-01

    Radiographic images can provide quantitative as well as qualitative information if they are subjected to densitometric analysis. Using modern video-digitizing techniques, such densitometry can be readily accomplished using relatively inexpensive computer systems. However, such analyses are made more difficult by the fact that the density values read from the radiograph have a complex, nonlinear relationship to bone mineral content. This article derives the relationship between these variables from the nature of the intermediate physical processes, and presents a simple mathematical method for obtaining a linear calibration function using a step wedge or other standard.

  5. A cost and utility analysis of NIM/CAMAC standards and equipment for shuttle payload data acquisition and control systems. Volume 2: Tasks 1 and 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A representative set of payloads for both science and applications disciplines was selected to ensure a realistic and statistically significant estimate of equipment utilization. The selected payloads were analyzed to determine the applicability of Nuclear Instrumentation Module (NIM)/Computer Automated Measurement and Control (CAMAC) equipment in satisfying their data acquisition and control requirements. The analysis results were combined with comparable results from related studies to arrive at an overall assessment of the applicability and commonality of NIM/CAMAC equipment usage across the spectrum of payloads.

  6. Mental Computation or Standard Algorithm? Children's Strategy Choices on Multi-Digit Subtractions

    ERIC Educational Resources Information Center

    Torbeyns, Joke; Verschaffel, Lieven

    2016-01-01

    This study analyzed children's use of mental computation strategies and the standard algorithm on multi-digit subtractions. Fifty-eight Flemish 4th graders of varying mathematical achievement level were individually offered subtractions that either stimulated the use of mental computation strategies or the standard algorithm in one choice and two…

  7. Influence of wind-speed on short-duration NO2 measurements using Palmes and Ogawa passive diffusion samplers

    NASA Astrophysics Data System (ADS)

    Masey, Nicola; Gillespie, Jonathan; Heal, Mathew R.; Hamilton, Scott; Beverland, Iain J.

    2017-07-01

    We assessed the precision and accuracy of nitrogen dioxide (NO2) concentrations over 2-day, 3-day and 7-day exposure periods measured with the following types of passive diffusion samplers: standard (open) Palmes tubes; standard Ogawa samplers with commercially-prepared Ogawa absorbent pads (Ogawa[S]); and modified Ogawa samplers with absorbent-impregnated stainless steel meshes normally used in Palmes tubes (Ogawa[P]). We deployed these passive samplers close to the inlet of a chemiluminescence NO2 analyser at an urban background site in Glasgow, UK over 32 discrete measurement periods. Duplicate relative standard deviation was <7% for all passive samplers. The Ogawa[P], Ogawa[S] and Palmes samplers explained 93%, 87% and 58% of temporal variation in analyser concentrations respectively. Uptake rates for Palmes and Ogawa[S] samplers were positively and linearly associated with wind-speed (P < 0.01 and P < 0.05 respectively). Computation of adjusted uptake rates using average wind-speed observed during each sampling period increased the variation in analyser concentrations explained by Palmes and Ogawa[S] estimates to 90% and 92% respectively, suggesting that measurements can be corrected for shortening of diffusion path lengths due to wind-speed to improve the accuracy of estimates of short-term NO2 exposure. Monitoring situations where it is difficult to reliably estimate wind-speed variations, e.g. across multiple sites with different unknown exposures to local winds, and personal exposure monitoring, are likely to benefit from protection of these sampling devices from the effects of wind, for example by use of a mesh or membrane across the open end. The uptake rate of Ogawa[P] samplers was not associated with wind-speed resulting in a high correlation between estimated concentrations and observed analyser concentrations. The use of Palmes meshes in Ogawa[P] samplers reduced the cost of sampler preparation and removed uncertainty associated with the unknown manufacturing process for the commercially-prepared collection pads.
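
    The wind-speed adjustment described above amounts to regressing the effective uptake rate on period-average wind speed and then converting the collected mass back to a concentration with that period-specific rate. The sketch below shows the shape of such a calculation with made-up coefficients; the paper's regression parameters are not reproduced here.

```python
# Hedged sketch of a wind-speed-adjusted passive sampler calculation.
# The uptake-rate regression coefficients are made up, not the paper's.
def no2_concentration(mass_ug, exposure_h, wind_ms, a=1.10, b=0.15):
    """NO2 concentration (ug/m^3) from collected mass, exposure time and mean wind speed."""
    uptake_m3_per_h = (a + b * wind_ms) / 1000.0   # effective uptake rate rises with wind speed
    return mass_ug / (uptake_m3_per_h * exposure_h)

print(round(no2_concentration(mass_ug=2.4, exposure_h=72, wind_ms=3.2), 1))
```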

  8. Precision calculations for h → WW/ZZ → 4 fermions in a singlet extension of the Standard Model with Prophecy4f

    NASA Astrophysics Data System (ADS)

    Altenkamp, Lukas; Boggia, Michele; Dittmaier, Stefan

    2018-04-01

    We consider an extension of the Standard Model by a real singlet scalar field with a ℤ2-symmetric Lagrangian and spontaneous symmetry breaking with vacuum expectation value for the singlet. Considering the lighter of the two scalars of the theory to be the 125 GeV Higgs particle, we parametrize the scalar sector by the mass of the heavy Higgs boson, a mixing angle α, and a scalar Higgs self-coupling λ12. Taking into account theoretical constraints from perturbativity and vacuum stability, we compute next-to-leading-order electroweak and QCD corrections to the decays h → WW/ZZ → 4 fermions of the light Higgs boson for some scenarios proposed in the literature. We formulate two renormalization schemes and investigate the conversion of the input parameters between the schemes, finding sizeable effects. Solving the renormalization-group equations for the MS-bar parameters α and λ12, we observe a significantly reduced scale and scheme dependence in the next-to-leading-order results. For some scenarios suggested in the literature, the total decay width for the process h → 4f is computed as a function of the mixing angle and compared to the width of a corresponding Standard Model Higgs boson, revealing deviations below 10%. Differential distributions do not show significant distortions by effects beyond the Standard Model. The calculations are implemented in the Monte Carlo generator Prophecy4f, which is ready for applications in data analyses in the framework of the singlet extension.

  9. 3-D modeling of ductile tearing using finite elements: Computational aspects and techniques

    NASA Astrophysics Data System (ADS)

    Gullerud, Arne Stewart

    This research focuses on the development and application of computational tools to perform large-scale, 3-D modeling of ductile tearing in engineering components under quasi-static to mild loading rates. Two standard models for ductile tearing, the computational cell methodology and crack growth controlled by the crack tip opening angle (CTOA), are described and their 3-D implementations are explored. For the computational cell methodology, quantification of the effects of several numerical issues (computational load step size, procedures for force release after cell deletion, and the porosity for cell deletion) enables construction of computational algorithms to remove the dependence of predicted crack growth on these issues. This work also describes two extensions of the CTOA approach into 3-D: a general 3-D method and a constant front technique. Analyses compare the characteristics of the extensions, and a validation study explores the ability of the constant front extension to predict crack growth in thin aluminum test specimens over a range of specimen geometries, absolute sizes, and levels of out-of-plane constraint. To provide a computational framework suitable for the solution of these problems, this work also describes the parallel implementation of a nonlinear, implicit finite element code. The implementation employs an explicit message-passing approach using the MPI standard to maintain portability, a domain decomposition of element data to provide parallel execution, and a master-worker organization of the computational processes to enhance future extensibility. A linear preconditioned conjugate gradient (LPCG) solver serves as the core of the solution process. The parallel LPCG solver utilizes an element-by-element (EBE) structure of the computations to permit a dual-level decomposition of the element data: domain decomposition of the mesh provides efficient coarse-grain parallel execution, while decomposition of the domains into blocks of similar elements (same type, constitutive model, etc.) provides fine-grain parallel computation on each processor. A major focus of the LPCG solver is a new implementation of the Hughes-Winget element-by-element (HW) preconditioner. The implementation employs a weighted dependency graph combined with a new coloring algorithm to provide load-balanced scheduling for the preconditioner and overlapped communication/computation. This approach enables efficient parallel application of the HW preconditioner for arbitrary unstructured meshes.

  10. Clinical value of whole body fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography in the detection of metastatic bladder cancer.

    PubMed

    Yang, Zhongyi; Pan, Lingling; Cheng, Jingyi; Hu, Silong; Xu, Junyan; Ye, Dingwei; Zhang, Yingjian

    2012-07-01

    To investigate the value of whole-body fluorine-18 2-fluoro-2-deoxy-D-glucose positron emission tomography/computed tomography for the detection of metastatic bladder cancer. From December 2006 to August 2010, 60 bladder cancer patients (median age 60.5 years, range 32-96) underwent whole-body positron emission tomography/computed tomography. The diagnostic accuracy was assessed by performing both organ-based and patient-based analyses. Identified lesions were further studied by biopsy or followed clinically for at least 6 months. One hundred and thirty-four suspicious lesions were identified. Among them, 4 primary cancers (2 pancreatic cancers, 1 colonic and 1 nasopharyngeal cancer) were incidentally detected, and the patients could be treated on time. For the remaining 130 lesions, positron emission tomography/computed tomography detected 118 true positive lesions (sensitivity = 95.9%). On the patient-based analysis, the overall sensitivity and specificity were 87.1% and 89.7%, respectively. There was no difference in sensitivity and specificity between patients with or without adjuvant treatment in terms of detection of metastatic sites by positron emission tomography/computed tomography. Compared with conventional imaging modalities, positron emission tomography/computed tomography correctly changed the management in 15 patients (25.0%). Positron emission tomography/computed tomography has excellent sensitivity and specificity in the detection of metastatic bladder cancer and it provides additional diagnostic information compared to standard imaging techniques. © 2012 The Japanese Urological Association.

  11. Standard terminology and labeling of ocular tissue for transplantation.

    PubMed

    Armitage, W John; Ashford, Paul; Crow, Barbara; Dahl, Patricia; DeMatteo, Jennifer; Distler, Pat; Gopinathan, Usha; Madden, Peter W; Mannis, Mark J; Moffatt, S Louise; Ponzin, Diego; Tan, Donald

    2013-06-01

    To develop an internationally agreed terminology for describing ocular tissue grafts to improve the accuracy and reliability of information transfer, to enhance tissue traceability, and to facilitate the gathering of comparative global activity data, including denominator data for use in biovigilance analyses. ICCBBA, the international standards organization for terminology, coding, and labeling of blood, cells, and tissues, approached the major Eye Bank Associations to form an expert advisory group. The group met by regular conference calls to develop a standard terminology, which was released for public consultation and amended accordingly. The terminology uses broad definitions (Classes) with modifying characteristics (Attributes) to define each ocular tissue product. The terminology may be used within the ISBT 128 system to label tissue products with standardized bar codes enabling the electronic capture of critical data in the collection, processing, and distribution of tissues. Guidance on coding and labeling has also been developed. The development of a standard terminology for ocular tissue marks an important step for improving traceability and reducing the risk of mistakes due to transcription errors. ISBT 128 computer codes have been assigned and may now be used to label ocular tissues. Eye banks are encouraged to adopt this standard terminology and move toward full implementation of ISBT 128 nomenclature, coding, and labeling.

  12. Multi-Strain Deterministic Chaos in Dengue Epidemiology, A Challenge for Computational Mathematics

    NASA Astrophysics Data System (ADS)

    Aguiar, Maíra; Kooi, Bob W.; Stollenwerk, Nico

    2009-09-01

    Recently, we have analysed epidemiological models of competing strains of pathogens, and hence differences in transmission for first versus secondary infection due to interaction of the strains with previously acquired immunities, as has been described for dengue fever and is known as antibody dependent enhancement (ADE). These models show a rich variety of dynamics through bifurcations up to deterministic chaos. Including temporary cross-immunity even enlarges the parameter range of such chaotic attractors, and also gives rise to various coexisting attractors, which are difficult to identify by standard numerical bifurcation programs using continuation methods. A combination of techniques, including classical bifurcation plots and Lyapunov exponent spectra, has to be applied in comparison to get further insight into such dynamical structures. Especially, Lyapunov spectra, which quantify the predictability horizon in the epidemiological system, are computationally very demanding. We show ways to speed up computations of such Lyapunov spectra by a factor of more than ten by parallelizing previously used sequential C programs. Such fast computations of Lyapunov spectra will be especially useful in future investigations of seasonally forced versions of the present models, as they are needed for data analysis.

  13. Classical boson sampling algorithms with superior performance to near-term experiments

    NASA Astrophysics Data System (ADS)

    Neville, Alex; Sparrow, Chris; Clifford, Raphaël; Johnston, Eric; Birchall, Patrick M.; Montanaro, Ashley; Laing, Anthony

    2017-12-01

    It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of linear optics, which has sparked interest as a rapid way to demonstrate such quantum supremacy. Photon statistics are governed by intractable matrix functions, which suggests that sampling from the distribution obtained by injecting photons into a linear optical network could be solved more quickly by a photonic experiment than by a classical computer. The apparently low resource requirements for large boson sampling experiments have raised expectations of a near-term demonstration of quantum supremacy by boson sampling. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. Our classical algorithm, based on Metropolised independence sampling, allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. Compared to current experiments, a demonstration of quantum supremacy over a successful implementation of these classical methods on a supercomputer would require the number of photons and experimental components to increase by orders of magnitude, while tackling exponentially scaling photon loss.
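
    Metropolised independence sampling, the core of the classical algorithm above, is a generic MCMC scheme: proposals are drawn from a fixed distribution q, and a move from x to y is accepted with probability min(1, p(y)q(x) / (p(x)q(y))). The toy sketch below uses placeholder one-dimensional densities, not the boson sampling distributions.

```python
# Toy Metropolised independence sampler with placeholder target and proposal.
import math
import random

def p(x):                 # unnormalized target density (placeholder)
    return math.exp(-0.5 * (x - 2.0) ** 2)

def q(x):                 # independence proposal density, N(0, 2), up to a constant
    return math.exp(-0.5 * (x / 2.0) ** 2)

def q_sample():
    return random.gauss(0.0, 2.0)

x, samples = q_sample(), []
for _ in range(10_000):
    y = q_sample()
    if random.random() < min(1.0, (p(y) * q(x)) / (p(x) * q(y))):
        x = y                         # accept the proposed move
    samples.append(x)
print(sum(samples) / len(samples))    # should be close to the target mean of 2
```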

  14. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages, such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; 4. and providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  15. 76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-07

    ...--Cloud Computing Forum & Workshop IV AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop IV to be held on... to help develop open standards in interoperability, portability and security in cloud computing. This...

  16. Computer Science and Technology Publications. NBS Publications List 84.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  17. Fast gravitational wave radiometry using data folding

    NASA Astrophysics Data System (ADS)

    Ain, Anirban; Dalvi, Prathamesh; Mitra, Sanjit

    2015-07-01

    Gravitational waves (GWs) from the early universe and unresolved astrophysical sources are expected to create a stochastic GW background (SGWB). The GW radiometer algorithm is well suited to probe such a background using data from ground-based laser interferometric detectors. Radiometer analysis can be performed in different bases, e.g., isotropic, pixel or spherical harmonic. Each of these analyses possesses a common temporal symmetry which we exploit here to fold the whole data set for every detector pair, typically a few hundred to a thousand days of data, to only one sidereal day, without any compromise in precision. We develop the algebra and a software pipeline needed to fold data, accounting for the effect of overlapping windows and nonstationary noise. We implement this on LIGO's fifth science run data and validate it by performing a standard anisotropic SGWB search on both folded and unfolded data. Folded data not only lead to orders of magnitude reduction in computation cost, but also result in a conveniently small data volume of a few gigabytes, making it possible to perform an actual analysis on a personal computer, as well as easy movement of data. A few important analyses, yet unaccomplished due to computational limitations, will now become feasible. Folded data, being independent of the radiometer basis, will also be useful in reducing processing redundancies in multiple searches and provide a common ground for mutual consistency checks. Most importantly, folded data will allow a vast amount of experimentation with existing searches and provide substantial help in developing new strategies to find unknown sources.
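
    The folding step itself is conceptually simple: each data segment is assigned to a sidereal-time bin (its timestamp modulo one sidereal day) and segments falling in the same bin are combined with inverse-noise weights. The schematic sketch below illustrates that idea; it is not the actual radiometer pipeline.

```python
# Schematic folding of segment-level data onto one sidereal day with
# inverse-variance weighting (illustration only, not the LIGO pipeline).
import numpy as np

SIDEREAL_DAY_S = 86164.0905
N_BINS = 1800                              # ~48 s of sidereal time per bin

def fold(t_start, values, variances):
    """Weighted average of segment 'values' (numpy arrays) into sidereal-time bins."""
    frac = (t_start % SIDEREAL_DAY_S) / SIDEREAL_DAY_S
    bins = (frac * N_BINS).astype(int)
    w = 1.0 / variances
    num = np.bincount(bins, weights=w * values, minlength=N_BINS)
    den = np.bincount(bins, weights=w, minlength=N_BINS)
    return np.divide(num, den, out=np.zeros(N_BINS), where=den > 0)
```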

  18. The Box Task: A tool to design experiments for assessing visuospatial working memory.

    PubMed

    Kessels, Roy P C; Postma, Albert

    2017-09-15

    The present paper describes the Box Task, a paradigm for the computerized assessment of visuospatial working memory. In this task, hidden objects have to be searched by opening closed boxes that are shown at different locations on the computer screen. The set size (i.e., number of boxes that must be searched) can be varied and different error scores can be computed that measure specific working memory processes (i.e., the number of within-search and between-search errors). The Box Task also has a developer's mode in which new stimulus displays can be designed for use in tailored experiments. The Box Task comes with a standard set of stimulus displays (including practice trials, as well as stimulus displays with 4, 6, and 8 boxes). The raw data can be analyzed easily and the results of individual participants can be aggregated into one spreadsheet for further statistical analyses.

  19. The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science

    PubMed Central

    Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo

    2008-01-01

    The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570

  20. Rapid protein alignment in the cloud: HAMOND combines fast DIAMOND alignments with Hadoop parallelism.

    PubMed

    Yu, Jia; Blom, Jochen; Sczyrba, Alexander; Goesmann, Alexander

    2017-09-10

    The introduction of next generation sequencing has caused a steady increase in the amounts of data that have to be processed in modern life science. Sequence alignment plays a key role in the analysis of sequencing data e.g. within whole genome sequencing or metagenome projects. BLAST is a commonly used alignment tool that was the standard approach for more than two decades, but in the last years faster alternatives have been proposed including RapSearch, GHOSTX, and DIAMOND. Here we introduce HAMOND, an application that uses Apache Hadoop to parallelize DIAMOND computation in order to scale-out the calculation of alignments. HAMOND is fault tolerant and scalable by utilizing large cloud computing infrastructures like Amazon Web Services. HAMOND has been tested in comparative genomics analyses and showed promising results both in efficiency and accuracy. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  1. Prediction of elemental creep. [steady state and cyclic data from regression analysis

    NASA Technical Reports Server (NTRS)

    Davis, J. W.; Rummler, D. R.

    1975-01-01

    Cyclic and steady-state creep tests were performed to provide data that were used to develop predictive equations. These equations, describing creep as a function of stress, temperature, and time, were developed through the use of a least-squares regression analysis computer program for both the steady-state and cyclic data sets. Comparison of the data from the two types of tests revealed that there was no significant difference between the cyclic and steady-state creep strains for the L-605 sheet under the experimental conditions investigated (for the same total time at load). Attempts to develop a single linear equation describing the combined steady-state and cyclic creep data resulted in standard errors of estimate higher than those obtained for the individual data sets. A proposed approach to predict elemental creep in metals uses the cyclic creep equation and a computer program which applies strain- and time-hardening theories of creep accumulation.
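
    A least-squares fit of creep data is typically carried out on a linearized form such as log ε = b0 + b1·log σ + b2/T + b3·log t; the sketch below shows that generic form with hypothetical data, since the report's actual equation and coefficients are not reproduced in the abstract.

```python
# Generic least-squares fit of a linearized creep law (illustrative form and data):
#   log(strain) = b0 + b1*log(stress) + b2/T + b3*log(time)
import numpy as np

stress = np.array([120, 150, 150, 180, 180, 120], dtype=float)      # MPa
temp_K = np.array([1100, 1100, 1150, 1150, 1200, 1200], dtype=float)
time_h = np.array([10, 50, 100, 250, 500, 1000], dtype=float)
strain = np.array([2e-4, 9e-4, 1.8e-3, 6e-3, 1.5e-2, 4e-3])

X = np.column_stack([np.ones_like(time_h), np.log(stress), 1.0 / temp_K, np.log(time_h)])
coeffs, *_ = np.linalg.lstsq(X, np.log(strain), rcond=None)
print('b0..b3 =', np.round(coeffs, 3))
```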

  2. Effective teaching strategies and methods of delivery for patient education: a systematic review and practice guideline recommendations.

    PubMed

    Friedman, Audrey Jusko; Cosby, Roxanne; Boyko, Susan; Hatton-Bauer, Jane; Turnbull, Gale

    2011-03-01

    The objective of this study was to determine effective teaching strategies and methods of delivery for patient education (PE). A systematic review was conducted and reviews with or without meta-analyses, which examined teaching strategies and methods of delivery for PE, were included. Teaching strategies identified are traditional lectures, discussions, simulated games, computer technology, written material, audiovisual sources, verbal recall, demonstration, and role playing. Methods of delivery focused on how to deliver the teaching strategies. Teaching strategies that increased knowledge, decreased anxiety, and increased satisfaction included computer technology, audio and videotapes, written materials, and demonstrations. Various teaching strategies used in combination were similarly successful. Moreover, structured-, culturally appropriate- and patient-specific teachings were found to be better than ad hoc teaching or generalized teaching. Findings provide guidance for establishing provincial standards for the delivery of PE. Recommendations concerning the efficacy of the teaching strategies and delivery methods are provided.

  3. Identifying controlling variables for math computation fluency through experimental analysis: the interaction of stimulus control and reinforcing consequences.

    PubMed

    Hofstadter-Duke, Kristi L; Daly, Edward J

    2015-03-01

    This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made. © The Author(s) 2014.

  4. Air slab-correction for Γ-ray attenuation measurements

    NASA Astrophysics Data System (ADS)

    Mann, Kulwinder Singh

    2017-12-01

    The gamma (γ)-ray shielding behaviour (GSB) of a material can be ascertained from its linear attenuation coefficient (μ, cm-1). Narrow-beam transmission geometry is required for μ-measurement. In such measurements, a thin slab of the material has to be inserted between the point-isotropic γ-ray source and the detector assembly. Accurate measurement requires that the sample's optical thickness (OT) remain below 0.5 mean free path (mfp). It is sometimes very difficult to produce a thin slab of the sample (absorber); on the other hand, for a thick absorber, i.e. OT > 0.5 mfp, the influence of the air displaced by it cannot be ignored during μ-measurements. Thus, for a thick sample, a correction factor has been suggested that compensates for the air present in the transmission geometry; this correction factor has been named the air slab-correction (ASC). Six samples of low-Z engineering materials (cement-black, clay, red-mud, lime-stone, cement-white and plaster-of-paris) were selected for investigating the effect of ASC on μ-measurements at three γ-ray energies (661.66, 1173.24, 1332.50 keV). The measurements were made using point-isotropic γ-ray sources (Cs-137 and Co-60), a NaI(Tl) detector and a multi-channel analyser coupled to a personal computer. Theoretical values of μ were computed using the GRIC2 toolkit (a standardized computer programme). Elemental compositions of the samples were measured with a Wavelength Dispersive X-ray Fluorescence (WDXRF) analyser. Inter-comparison of measured and computed μ-values suggested that the application of ASC helps in precise μ-measurement for thick samples of low-Z materials. Thus, this hitherto widely ignored ASC factor is recommended for use in similar γ-ray measurements.
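
    For orientation, a minimal illustration of how such a correction can enter the standard narrow-beam attenuation relation is sketched below; this is a generic air-displacement argument under the stated assumptions, not necessarily the ASC expression derived in the paper. Here x is the slab thickness, I_air the transmitted intensity with the air-filled gap, and I_sample the intensity with the slab in place.

```latex
% Generic air-displacement correction (illustrative; not necessarily the paper's ASC form).
% The slab of thickness x replaces an equal thickness of air in the beam path.
\[
  I_{\text{air}} = I_0\,e^{-\mu_{\text{air}} x}, \qquad
  I_{\text{sample}} = I_0\,e^{-\mu_{\text{sample}} x}
  \;\;\Longrightarrow\;\;
  \mu_{\text{app}} = \frac{1}{x}\,\ln\!\frac{I_{\text{air}}}{I_{\text{sample}}}
                   = \mu_{\text{sample}} - \mu_{\text{air}},
  \qquad
  \mu_{\text{sample}} = \mu_{\text{app}} + \mu_{\text{air}} .
\]
```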

  5. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  6. Compliance with Standard Precautions and Associated Factors among Healthcare Workers in Gondar University Comprehensive Specialized Hospital, Northwest Ethiopia

    PubMed Central

    Haile, Tariku Gebre

    2017-01-01

    Background. In many studies, compliance with standard precautions among healthcare workers has been reported to be inadequate. Objective. The aim of this study was to assess compliance with standard precautions and associated factors among healthcare workers in northwest Ethiopia. Methods. An institution-based cross-sectional study was conducted from March 01 to April 30, 2014. A simple random sampling technique was used to select participants. Data were entered into Epi Info 3.5.1 and exported to SPSS version 20.0 for statistical analysis. Multivariate logistic regression analyses were performed, and adjusted odds ratios with 95% confidence intervals were calculated to identify associated factors. Results. The proportion of healthcare workers who always comply with standard precautions was found to be 12%. Being a female healthcare worker (AOR [95% CI] 2.18 [1.12–4.23]), higher infection risk perception (AOR [95% CI] 3.46 [1.67–7.18]), training on standard precautions (AOR [95% CI] 2.90 [1.20–7.02]), accessibility of personal protective equipment (AOR [95% CI] 2.87 [1.41–5.86]), and management support (AOR [95% CI] 2.23 [1.11–4.53]) were found to be statistically significant. Conclusion and Recommendation. Compliance with standard precautions among the healthcare workers is very low. Interventions that include training of healthcare workers on standard precautions and consistent management support are recommended. PMID:28191020
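
    The sketch below illustrates, on fabricated data, how adjusted odds ratios with 95% confidence intervals can be obtained from a logistic regression of the kind reported above, using statsmodels. The variable names (female, high_risk_perception, trained, ppe_accessible, complies) are hypothetical placeholders, not the study's dataset.

```python
"""Illustrative logistic regression producing adjusted odds ratios (AORs)
with 95% CIs, in the spirit of the analysis described above. Variable
names and data are hypothetical."""
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "high_risk_perception": rng.integers(0, 2, n),
    "trained": rng.integers(0, 2, n),
    "ppe_accessible": rng.integers(0, 2, n),
})
# Simulated outcome: always complies with standard precautions (1) or not (0)
logit = -2.0 + 0.8 * df["female"] + 1.2 * df["high_risk_perception"]
df["complies"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["female", "high_risk_perception", "trained", "ppe_accessible"]])
fit = sm.Logit(df["complies"], X).fit(disp=False)

# Adjusted odds ratios and 95% confidence intervals
aor = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.concat([aor.rename("AOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```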

  7. Are compression garments effective for the recovery of exercise-induced muscle damage? A systematic review with meta-analysis.

    PubMed

    Marqués-Jiménez, Diego; Calleja-González, Julio; Arratibel, Iñaki; Delextrat, Anne; Terrados, Nicolás

    2016-01-01

    The aim was to identify the benefits of compression garments used for recovery from exercise-induced muscle damage. A computer-based literature search was performed in September 2015 using four online databases: Medline (PubMed), Cochrane, WOS (Web of Science), and Scopus. The analysis of risk of bias was completed in accordance with the Cochrane Collaboration guidelines. Standardized mean differences and 95% confidence intervals were calculated with Hedges' g for continuous outcomes. A random-effects meta-analysis model was used. Systematic differences (heterogeneity) were assessed with the I² statistic. Most results obtained had high heterogeneity, so they should be interpreted with caution. Our findings showed that creatine kinase (standardized mean difference = -0.02, 9 studies) was unaffected when using compression garments for recovery purposes. In contrast, blood lactate concentration was increased (standardized mean difference = 0.98, 5 studies). Applying compression reduced lactate dehydrogenase (standardized mean difference = -0.52, 2 studies), muscle swelling (standardized mean difference = -0.73, 5 studies), and perceptual measurements (standardized mean difference = -0.43, 15 studies). Analyses of power (standardized mean difference = 1.63, 5 studies) and strength (standardized mean difference = 1.18, 8 studies) indicate faster recovery of muscle function after exercise. These results suggest that the application of compression clothing may aid in the recovery of exercise-induced muscle damage, although the findings need corroboration. Copyright © 2015 Elsevier Inc. All rights reserved.
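
    The following sketch shows, on fabricated study summaries, how Hedges' g and a DerSimonian-Laird random-effects pooled standardized mean difference are typically computed; it is a generic illustration of the method named in the abstract, not a reproduction of the review's analysis.

```python
"""Illustrative computation of Hedges' g per study and a DerSimonian-Laird
random-effects pooled estimate. Study summary data are fabricated for the
example and do not come from the review."""
import numpy as np

# (mean_tx, sd_tx, n_tx, mean_ctrl, sd_ctrl, n_ctrl) for a few hypothetical studies
studies = np.array([
    (310.0, 120.0, 12, 360.0, 130.0, 12),
    (295.0, 100.0, 10, 340.0, 110.0, 10),
    (405.0, 150.0, 15, 420.0, 140.0, 15),
])

m1, s1, n1, m2, s2, n2 = studies.T
df = n1 + n2 - 2
s_pooled = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
d = (m1 - m2) / s_pooled
J = 1 - 3 / (4 * df - 1)            # small-sample bias correction
g = J * d
var_g = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))

# DerSimonian-Laird random-effects pooling
w = 1 / var_g
Q = np.sum(w * (g - np.sum(w * g) / np.sum(w))**2)
C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (len(g) - 1)) / C)
w_star = 1 / (var_g + tau2)
g_pooled = np.sum(w_star * g) / np.sum(w_star)
se_pooled = np.sqrt(1 / np.sum(w_star))
print(f"pooled g = {g_pooled:.2f} (95% CI {g_pooled - 1.96*se_pooled:.2f}"
      f" to {g_pooled + 1.96*se_pooled:.2f})")
```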

  8. Status of emerging standards for removable computer storage media and related contributions of NIST

    NASA Technical Reports Server (NTRS)

    Podio, Fernando L.

    1992-01-01

    Standards for removable computer storage media are needed so that users may reliably interchange data both within and among various computer installations. Furthermore, media interchange standards support competition in industry and prevent sole-source lock-in. NIST participates in magnetic tape and optical disk standards development through Technical Committees X3B5, Digital Magnetic Tapes, X3B11, Optical Digital Data Disk, and the Joint Technical Commission on Data Permanence. NIST also participates in other relevant national and international standards committees for removable computer storage media. Industry standards for digital magnetic tapes require the use of Standard Reference Materials (SRM's) developed and maintained by NIST. In addition, NIST has been studying care and handling procedures required for digital magnetic tapes. NIST has developed a methodology for determining the life expectancy of optical disks. NIST is developing care and handling procedures for optical digital data disks and is involved in a program to investigate error reporting capabilities of optical disk drives. This presentation reflects the status of emerging magnetic tape and optical disk standards, as well as NIST's contributions in support of these standards.

  9. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update

    PubMed Central

    Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy

    2016-01-01

    High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889

  10. Atmospheric effects on cluster analyses. [for remote sensing application]

    NASA Technical Reports Server (NTRS)

    Kiang, R. K.

    1979-01-01

    Ground-reflected radiance, from which information is extracted through techniques of cluster analysis for remote sensing applications, is altered by the atmosphere by the time it reaches the satellite. Therefore, it is essential to understand the effects of the atmosphere on Landsat measurements, cluster characteristics, and analysis accuracy. A doubling model is employed to compute the effective reflectivity, observed from the satellite, as a function of ground reflectivity, solar zenith angle, and aerosol optical thickness for a standard atmosphere. The relation between the effective reflectivity and ground reflectivity is approximately linear. It is shown that for a horizontally homogeneous atmosphere, the classification statistics from a maximum-likelihood classifier remain unchanged under these transforms. If inhomogeneity is present, the divergence between clusters is reduced, and the correlation between spectral bands increases. Radiance reflected by the background area surrounding the target may also reach the satellite. The influence of background reflectivity on effective reflectivity is discussed.

  11. Recent Progress on Labfit: a Multispectrum Analysis Program for Fitting Lineshapes Including the Htp Model and Temperature Dependence

    NASA Astrophysics Data System (ADS)

    Cich, Matthew J.; Guillaume, Alexandre; Drouin, Brian; Benner, D. Chris

    2017-06-01

    Multispectrum analysis can be a challenge for a variety of reasons. It can be computationally intensive to fit a proper lineshape model, especially for high-resolution experimental data. Band-wide analyses that include many transitions, along with their interactions, across many pressures and temperatures are essential to accurately model, for example, atmospherically relevant systems. Labfit is a fast multispectrum analysis program originally developed by D. Chris Benner with a text-based interface. More recently, at JPL, a graphical user interface was developed with the goal of increasing not only the ease of use but also the number of potential users. The HTP lineshape model has been added to Labfit, keeping it up to date with community standards. Recent analyses using Labfit will be shown to demonstrate its ability to competently handle large experimental datasets, including high-order lineshape effects, that are otherwise unmanageable.

  12. Programmers, professors, and parasites: credit and co-authorship in computer science.

    PubMed

    Solomon, Justin

    2009-12-01

    This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.

  13. Rapid and Robust Cross-Correlation-Based Seismic Phase Identification Using an Approximate Nearest Neighbor Method

    NASA Astrophysics Data System (ADS)

    Tibi, R.; Young, C. J.; Gonzales, A.; Ballard, S.; Encarnacao, A. V.

    2016-12-01

    The matched filtering technique, involving the cross-correlation of a waveform of interest with archived signals from a template library, has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive, and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an Approximate Nearest Neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation without requiring a complex distributed computing system. Our method begins with a projection into a reduced-dimensionality space based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors is accomplished by using randomized K-dimensional trees. We used the approach to search for matches to each of 2700 analyst-reviewed signal detections reported for May 2010 for the IMS station MKAR. The template library in this case consists of a dataset of more than 200,000 analyst-reviewed signal detections for the same station from 2002-2014 (excluding May 2010). Of these signal detections, 60% are teleseismic first P arrivals and 15% are regional phases (Pn, Pg, Sn, and Lg). The analyses performed on a standard desktop computer show that the proposed approach performs the search of the large template library about 20 times faster than a standard full linear search, while achieving recall rates greater than 80%, with the recall rate increasing for higher correlation values. To decide whether to confirm a match, we use a hybrid method involving a cluster approach for queries with two or more matches and the correlation score for single matches. Of the signal detections that passed our confirmation process, 52% were teleseismic first P arrivals and 30% were regional phases.
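
    The sketch below illustrates the two-stage idea described above on synthetic data: project each waveform into a low-dimensional space by correlating it with a random reference subset of templates, index the projections with a k-d tree, and confirm candidate matches by full correlation. It is a conceptual illustration only; the authors' randomized K-dimensional tree implementation and confirmation rules are not reproduced here.

```python
"""Conceptual sketch of the approximate-nearest-neighbor idea described
above: project waveforms via correlation with a random subset of templates,
then search with a k-d tree. Synthetic data; not the authors' code."""
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
n_templates, n_samples, n_ref = 5000, 400, 32

templates = rng.standard_normal((n_templates, n_samples))
templates /= np.linalg.norm(templates, axis=1, keepdims=True)

# Reduced-dimensionality space: correlation with a random reference subset
ref_idx = rng.choice(n_templates, size=n_ref, replace=False)
reference = templates[ref_idx]
projected = templates @ reference.T          # shape (n_templates, n_ref)

tree = cKDTree(projected)

# Query: a noisy copy of one template
query = templates[123] + 0.1 * rng.standard_normal(n_samples)
query /= np.linalg.norm(query)
q_proj = query @ reference.T

dist, idx = tree.query(q_proj, k=5)          # 5 approximate nearest neighbors
# Confirm candidates with full normalized cross-correlation at zero lag
scores = templates[idx] @ query
print(list(zip(idx.tolist(), np.round(scores, 3).tolist())))
```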

  14. One laptop per child, local refurbishment or overseas donations? Sustainability assessment of computer supply scenarios for schools in Colombia.

    PubMed

    Streicher-Porte, Martin; Marthaler, Christian; Böni, Heinz; Schluep, Mathias; Camacho, Angel; Hilty, Lorenz M

    2009-08-01

    With the intention of bridging the 'digital divide', many programmes have been launched to provide computers for educational institutions, ranging from refurbishing second-hand computers to delivering low-cost new computers. The fast and economical provision of large quantities of equipment is one of the many challenges faced by such programmes. If an increase is to be achieved in the sustainability of computer supplies for schools, not only must equipment be provided, but also suitable training and maintenance delivered. Furthermore, appropriate recycling has to be ensured so that end-of-life equipment can be dealt with properly. This study evaluated the suitability of three computer supply scenarios for schools in Colombia: (i) 'Colombian refurbishment', the refurbishment of computers donated in Colombia; (ii) 'Overseas refurbishment', the import of computers donated and refurbished abroad; and (iii) 'XO Laptop', the purchase of low-cost computers manufactured in Korea. The methods applied were Material Flow Assessment to assess the quantities, Life Cycle Assessment to assess the environmental impacts, and Multiple Attribute Utility Theory to analyse, evaluate, and compare the different scenarios. The most sustainable solution proved to be the local refurbishment of second-hand computers of Colombian origin to an appropriate technical standard. The environmental impacts of such practices need to be evaluated carefully, as second-hand appliances have to be maintained, require spare parts, and sometimes use more energy than newer equipment. Providing schools with second-hand computers from overseas and through programmes such as 'One Laptop Per Child' has the disadvantage that the potential for social improvements - such as the creation of jobs and local industry involvement - is very low.

  15. Ultrafast Comparison of Personal Genomes via Precomputed Genome Fingerprints.

    PubMed

    Glusman, Gustavo; Mauldin, Denise E; Hood, Leroy E; Robinson, Max

    2017-01-01

    We present an ultrafast method for comparing personal genomes. We transform the standard genome representation (lists of variants relative to a reference) into "genome fingerprints" via locality sensitive hashing. The resulting genome fingerprints can be meaningfully compared even when the input data were obtained using different sequencing technologies, processed using different pipelines, represented in different data formats and relative to different reference versions. Furthermore, genome fingerprints are robust to up to 30% missing data. Because of their reduced size, computation on the genome fingerprints is fast and requires little memory. For example, we could compute all-against-all pairwise comparisons among the 2504 genomes in the 1000 Genomes data set in 67 s at high quality (21 μs per comparison, on a single processor), and achieved a lower quality approximation in just 11 s. Efficient computation enables scaling up a variety of important genome analyses, including quantifying relatedness, recognizing duplicative sequenced genomes in a set, population reconstruction, and many others. The original genome representation cannot be reconstructed from its fingerprint, effectively decoupling genome comparison from genome interpretation; the method thus has significant implications for privacy-preserving genome analytics.
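
    As a rough illustration of the locality-sensitive-hashing idea, the sketch below builds a SimHash-style bit fingerprint from a genome's variant list and compares two genomes by the fraction of matching bits. The hashing scheme, dimensions, and variant encoding are assumptions for illustration and are not the published fingerprint construction.

```python
"""Minimal locality-sensitive-hashing sketch in the spirit of genome
fingerprints: hash each variant into a feature vector, sign-project it with
random hyperplanes, and compare genomes by fingerprint similarity. This is
a generic SimHash illustration, not the published fingerprint algorithm."""
import hashlib
import numpy as np

DIM = 512           # feature dimension
BITS = 256          # fingerprint length in bits
rng = np.random.default_rng(42)
planes = rng.standard_normal((BITS, DIM))

def variant_to_index(variant: str) -> int:
    # Stable hash of a variant string such as "chr1:12345:A>G"
    return int(hashlib.sha1(variant.encode()).hexdigest(), 16) % DIM

def fingerprint(variants: list[str]) -> np.ndarray:
    vec = np.zeros(DIM)
    for v in variants:
        vec[variant_to_index(v)] += 1.0
    return (planes @ vec) > 0          # boolean fingerprint of length BITS

def similarity(fp_a: np.ndarray, fp_b: np.ndarray) -> float:
    # Fraction of matching bits; ~0.5 for unrelated inputs, approaching 1.0 for similar
    return float(np.mean(fp_a == fp_b))

genome_a = [f"chr1:{p}:A>G" for p in range(0, 100000, 97)]
genome_b = genome_a[: int(0.7 * len(genome_a))]   # 30% of the variants missing
print(similarity(fingerprint(genome_a), fingerprint(genome_b)))
```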

  16. Fracture mechanics life analytical methods verification testing

    NASA Technical Reports Server (NTRS)

    Favenesi, J. A.; Clemmons, T. G.; Lambert, T. J.

    1994-01-01

    Verification and validation of the basic information capabilities in NASCRAC has been completed. The basic information includes computation of K versus a, J versus a, and crack opening area versus a. These quantities represent building blocks which NASCRAC uses in its other computations such as fatigue crack life and tearing instability. Several methods were used to verify and validate the basic information capabilities. The simple configurations such as the compact tension specimen and a crack in a finite plate were verified and validated versus handbook solutions for simple loads. For general loads using weight functions, offline integration using standard FORTRAN routines was performed. For more complicated configurations such as corner cracks and semielliptical cracks, NASCRAC solutions were verified and validated versus published results and finite element analyses. A few minor problems were identified in the basic information capabilities of the simple configurations. In the more complicated configurations, significant differences between NASCRAC and reference solutions were observed because NASCRAC calculates its solutions as averaged values across the entire crack front whereas the reference solutions were computed for a single point.

  17. Computer aided manual validation of mass spectrometry-based proteomic data.

    PubMed

    Curran, Timothy G; Bryson, Bryan D; Reigelhaupt, Michael; Johnson, Hannah; White, Forest M

    2013-06-15

    Advances in mass spectrometry-based proteomic technologies have increased the speed of analysis and the depth provided by a single analysis. Computational tools to evaluate the accuracy of peptide identifications from these high-throughput analyses have not kept pace with technological advances; currently the most common quality evaluation methods are based on statistical analysis of the likelihood of false positive identifications in large-scale data sets. While helpful, these calculations do not consider the accuracy of each identification, thus creating a precarious situation for biologists relying on the data to inform experimental design. Manual validation is the gold standard approach to confirm accuracy of database identifications, but is extremely time-intensive. To palliate the increasing time required to manually validate large proteomic datasets, we provide computer aided manual validation software (CAMV) to expedite the process. Relevant spectra are collected, catalogued, and pre-labeled, allowing users to efficiently judge the quality of each identification and summarize applicable quantitative information. CAMV significantly reduces the burden associated with manual validation and will hopefully encourage broader adoption of manual validation in mass spectrometry-based proteomics. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Guidelines and standard procedures for continuous water-quality monitors: Station operation, record computation, and data reporting

    USGS Publications Warehouse

    Wagner, Richard J.; Boulger, Robert W.; Oblinger, Carolyn J.; Smith, Brett A.

    2006-01-01

    The U.S. Geological Survey uses continuous water-quality monitors to assess the quality of the Nation's surface water. A common monitoring-system configuration for water-quality data collection is the four-parameter monitoring system, which collects temperature, specific conductance, dissolved oxygen, and pH data. Such systems also can be configured to measure other properties, such as turbidity or fluorescence. Data from sensors can be used in conjunction with chemical analyses of samples to estimate chemical loads. The sensors that are used to measure water-quality field parameters require careful field observation, cleaning, and calibration procedures, as well as thorough procedures for the computation and publication of final records. This report provides guidelines for site- and monitor-selection considerations; sensor inspection and calibration methods; field procedures; data evaluation, correction, and computation; and record-review and data-reporting processes, which supersede the guidelines presented previously in U.S. Geological Survey Water-Resources Investigations Report WRIR 00-4252. These procedures have evolved over the past three decades, and the process continues to evolve with newer technologies.

  19. Portfolio Analysis Tool

    NASA Technical Reports Server (NTRS)

    Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

    2005-01-01

    Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but do not have access for modifying criteria for analyzing projects: access for modifying criteria is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or to rehire experts, and thereby further reducing the cost of maintaining and upgrading computer code. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.

  20. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  1. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  2. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  3. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.706(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  4. Gold standards and expert panels: a pulmonary nodule case study with challenges and solutions

    NASA Astrophysics Data System (ADS)

    Miller, Dave P.; O'Shaughnessy, Kathryn F.; Wood, Susan A.; Castellino, Ronald A.

    2004-05-01

    Comparative evaluations of reader performance using different modalities, e.g. CT with computer-aided detection (CAD) vs. CT without CAD, generally require a "truth" definition based on a gold standard. There are many situations in which a true invariant gold standard is impractical or impossible to obtain. For instance, small pulmonary nodules are generally not assessed by biopsy or resection. In such cases, it is common to use a unanimous consensus or majority agreement from an expert panel as a reference standard for actionability in lieu of the unknown gold standard for disease. Nonetheless, there are three major concerns about expert panel reference standards: (1) actionability is not synonymous with disease, (2) it may be possible to obtain different conclusions about which modality is better using different rules (e.g. majority vs. unanimous consensus), and (3) the variability associated with the panelists is not formally captured in the p-values or confidence intervals that are generally produced for estimating the extent to which one modality is superior to the other. A multi-reader-multi-case (MRMC) receiver operating characteristic (ROC) study was performed using 90 cases, 15 readers, and a reference truth based on 3 experienced panelists. The primary analyses were conducted using a reference truth of unanimous consensus regarding actionability (3 out of 3 panelists). To assess the three concerns noted above, (1) additional data from the original radiology reports were compared to the panel, (2) the complete analysis was repeated using different definitions of truth, and (3) bootstrap analyses were conducted in which new truth panels were constructed by picking 1, 2, or 3 panelists at random. The definition of the reference truth affected the results for each modality (CT with CAD and CT without CAD) considered by itself, but the effects were similar, so the primary analysis comparing the modalities was robust to the choice of the reference truth.

  5. Results of the CCRI(II)-S12.H-3 supplementary comparison: Comparison of methods for the calculation of the activity and standard uncertainty of a tritiated-water source measured using the LSC-TDCR method.

    PubMed

    Cassette, Philippe; Altzitzoglou, Timotheos; Antohe, Andrei; Rossi, Mario; Arinc, Arzu; Capogni, Marco; Galea, Raphael; Gudelis, Arunas; Kossert, Karsten; Lee, K B; Liang, Juncheng; Nedjadi, Youcef; Oropesa Verdecia, Pilar; Shilnikova, Tanya; van Wyngaardt, Winifred; Ziemek, Tomasz; Zimmerman, Brian

    2018-04-01

    A comparison of calculations of the activity of a ³H₂O liquid scintillation source using the same experimental data set collected at the LNE-LNHB with a triple-to-double coincidence ratio (TDCR) counter was completed. A total of 17 laboratories calculated the activity and standard uncertainty of the LS source using the files with experimental data provided by the LNE-LNHB. The results, as well as relevant information on the computation techniques, are presented and analysed in this paper. All results are compatible, even though there is significant dispersion among the reported uncertainties. An output of this comparison is the estimation of the dispersion of TDCR measurement results when measurement conditions are well defined. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Probabilistic sensitivity analysis incorporating the bootstrap: an example comparing treatments for the eradication of Helicobacter pylori.

    PubMed

    Pasta, D J; Taylor, J L; Henning, J M

    1999-01-01

    Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
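
    A schematic of the approach, on entirely fabricated patient-level data, is sketched below: bootstrap resamples of the cost and effect data are pushed through a trivial decision comparison and the resulting distribution of incremental cost-effectiveness ratios is summarized. It illustrates the general bootstrap-based probabilistic sensitivity analysis, not the published H. pylori model.

```python
"""Schematic probabilistic sensitivity analysis incorporating the bootstrap:
resample patient-level cost and effect data, feed each resample through a
simple decision comparison, and summarize the distribution of incremental
cost-effectiveness. Entirely fabricated data; not the H. pylori model."""
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical patient-level data for two strategies (cost in $, effect = cure 0/1)
cost_a = rng.gamma(shape=2.0, scale=150.0, size=200)
cure_a = rng.binomial(1, 0.85, size=200)
cost_b = rng.gamma(shape=2.0, scale=100.0, size=200)
cure_b = rng.binomial(1, 0.75, size=200)

n_boot = 5000
icers = np.empty(n_boot)
for i in range(n_boot):
    ia = rng.integers(0, len(cost_a), len(cost_a))   # bootstrap indices, strategy A
    ib = rng.integers(0, len(cost_b), len(cost_b))   # bootstrap indices, strategy B
    d_cost = cost_a[ia].mean() - cost_b[ib].mean()
    d_effect = cure_a[ia].mean() - cure_b[ib].mean()
    icers[i] = d_cost / d_effect if d_effect != 0 else np.nan

icers = icers[np.isfinite(icers)]
lo, hi = np.percentile(icers, [2.5, 97.5])
print(f"median ICER ${np.median(icers):,.0f} per additional cure "
      f"(95% interval ${lo:,.0f} to ${hi:,.0f})")
```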

  7. Objective Data Assessment (ODA) Methods as Nutritional Assessment Tools.

    PubMed

    Hamada, Yasuhiro

    2015-01-01

    Nutritional screening and assessment should be a standard of care for all patients because nutritional management plays an important role in clinical practice. However, there is no gold standard for the diagnosis of malnutrition or undernutrition, although a large number of nutritional screening and assessment tools have been developed. Nutritional screening and assessment tools are classified into two categories, namely, subjective global assessment (SGA) and objective data assessment (ODA). SGA assesses nutritional status based on the features of medical history and physical examination. On the other hand, ODA consists of objective data provided from various analyses, such as anthropometry, bioimpedance analysis (BIA), dual-energy X-ray absorptiometry (DEXA), computed tomography (CT), magnetic resonance imaging (MRI), laboratory tests, and functional tests. This review highlights knowledge on the performance of ODA methods for the assessment of nutritional status in clinical practice. J. Med. Invest. 62: 119-122, August, 2015.

  8. Computed statistics at streamgages, and methods for estimating low-flow frequency statistics and development of regional regression equations for estimating low-flow frequency statistics at ungaged locations in Missouri

    USGS Publications Warehouse

    Southard, Rodney E.

    2013-01-01

    The weather and precipitation patterns in Missouri vary considerably from year to year. In 2008, the statewide average rainfall was 57.34 inches and in 2012, the statewide average rainfall was 30.64 inches. This variability in precipitation and resulting streamflow in Missouri underlies the necessity for water managers and users to have reliable streamflow statistics and a means to compute select statistics at ungaged locations for a better understanding of water availability. Knowledge of surface-water availability is dependent on the streamflow data that have been collected and analyzed by the U.S. Geological Survey for more than 100 years at approximately 350 streamgages throughout Missouri. The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, computed streamflow statistics at streamgages through the 2010 water year, defined periods of drought and defined methods to estimate streamflow statistics at ungaged locations, and developed regional regression equations to compute selected streamflow statistics at ungaged locations. Streamflow statistics and flow durations were computed for 532 streamgages in Missouri and in neighboring States of Missouri. For streamgages with more than 10 years of record, Kendall’s tau was computed to evaluate for trends in streamflow data. If trends were detected, the variable length method was used to define the period of no trend. Water years were removed from the dataset from the beginning of the record for a streamgage until no trend was detected. Low-flow frequency statistics were then computed for the entire period of record and for the period of no trend if 10 or more years of record were available for each analysis. Three methods are presented for computing selected streamflow statistics at ungaged locations. The first method uses power curve equations developed for 28 selected streams in Missouri and neighboring States that have multiple streamgages on the same streams. Statistical estimates on one of these streams can be calculated at an ungaged location that has a drainage area that is between 40 percent of the drainage area of the farthest upstream streamgage and within 150 percent of the drainage area of the farthest downstream streamgage along the stream of interest. The second method may be used on any stream with a streamgage that has operated for 10 years or longer and for which anthropogenic effects have not changed the low-flow characteristics at the ungaged location since collection of the streamflow data. A ratio of drainage area of the stream at the ungaged location to the drainage area of the stream at the streamgage was computed to estimate the statistic at the ungaged location. The range of applicability is between 40- and 150-percent of the drainage area of the streamgage, and the ungaged location must be located on the same stream as the streamgage. The third method uses regional regression equations to estimate selected low-flow frequency statistics for unregulated streams in Missouri. This report presents regression equations to estimate frequency statistics for the 10-year recurrence interval and for the N-day durations of 1, 2, 3, 7, 10, 30, and 60 days. Basin and climatic characteristics were computed using geographic information system software and digital geospatial data. A total of 35 characteristics were computed for use in preliminary statewide and regional regression analyses based on existing digital geospatial data and previous studies. 
Spatial analyses for geographical bias in the predictive accuracy of the regional regression equations defined three low-flow regions within the State, representing the three major physiographic provinces in Missouri. Region 1 includes the Central Lowlands, Region 2 includes the Ozark Plateaus, and Region 3 includes the Mississippi Alluvial Plain. A total of 207 streamgages were used in the regression analyses for the regional equations. Of the 207 U.S. Geological Survey streamgages, 77 were located in Region 1, 120 were located in Region 2, and 10 were located in Region 3. Streamgages located outside of Missouri were selected to extend the range of data used for the independent variables in the regression analyses. Streamgages included in the regression analyses had 10 or more years of record and were considered to be affected minimally by anthropogenic activities or trends. Regional regression analyses identified three characteristics as statistically significant for the development of regional equations. For Region 1, drainage area, longest flow path, and streamflow-variability index were statistically significant. The range in the standard error of estimate for Region 1 is 79.6 to 94.2 percent. For Region 2, drainage area and streamflow-variability index were statistically significant, and the range in the standard error of estimate is 48.2 to 72.1 percent. For Region 3, drainage area and streamflow-variability index also were statistically significant with a range in the standard error of estimate of 48.1 to 96.2 percent. Limitations on estimating low-flow frequency statistics at ungaged locations depend on the method used. The first method outlined for use in Missouri, power curve equations, was developed to estimate the selected statistics for ungaged locations on 28 selected streams with multiple streamgages located on the same stream. A second method uses a drainage-area ratio to compute statistics at an ungaged location using data from a single streamgage on the same stream with 10 or more years of record. Ungaged locations on these streams may use the ratio of the drainage area at an ungaged location to the drainage area at a streamgage location to scale the selected statistic value from the streamgage location to the ungaged location. This method can be used if the drainage area of the ungaged location is within 40 to 150 percent of the streamgage drainage area. The third method is the use of the regional regression equations. The limits for the use of these equations are based on the ranges of the characteristics used as independent variables and on the requirement that streams be affected minimally by anthropogenic activities.
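
    The drainage-area-ratio calculation described as the second method can be sketched as below, assuming the statistic scales in direct proportion to drainage area and enforcing the 40- to 150-percent applicability range; the numbers in the example are illustrative only.

```python
"""Sketch of a drainage-area-ratio estimate: scale a statistic from a
streamgage to an ungaged location on the same stream, subject to the
40- to 150-percent drainage-area applicability check. Illustrative numbers;
direct proportional scaling is assumed here."""

def drainage_area_ratio_estimate(stat_gaged: float,
                                 area_gaged_mi2: float,
                                 area_ungaged_mi2: float) -> float:
    ratio = area_ungaged_mi2 / area_gaged_mi2
    if not 0.40 <= ratio <= 1.50:
        raise ValueError(
            "Ungaged drainage area must be within 40 to 150 percent of the "
            f"streamgage drainage area (ratio = {ratio:.2f})."
        )
    # Low-flow statistic scaled in direct proportion to drainage area
    return stat_gaged * ratio

# Example: a 7-day, 10-year low flow of 12 ft^3/s at a gage draining 250 mi^2,
# estimated at an ungaged site draining 180 mi^2 on the same stream
print(drainage_area_ratio_estimate(12.0, 250.0, 180.0))
```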

  9. 48 CFR 227.7203-5 - Government rights.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...

  10. 48 CFR 227.7203-5 - Government rights.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...

  11. 48 CFR 227.7203-5 - Government rights.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...

  12. 48 CFR 227.7203-5 - Government rights.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...

  13. 48 CFR 227.7203-5 - Government rights.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...

  14. Diagnostic accuracy of semi-automatic quantitative metrics as an alternative to expert reading of CT myocardial perfusion in the CORE320 study.

    PubMed

    Ostovaneh, Mohammad R; Vavere, Andrea L; Mehra, Vishal C; Kofoed, Klaus F; Matheson, Matthew B; Arbab-Zadeh, Armin; Fujisawa, Yasuko; Schuijf, Joanne D; Rochitte, Carlos E; Scholte, Arthur J; Kitagawa, Kakuya; Dewey, Marc; Cox, Christopher; DiCarli, Marcelo F; George, Richard T; Lima, Joao A C

    The objective was to determine the diagnostic accuracy of semi-automatic quantitative metrics compared to expert reading for interpretation of computed tomography perfusion (CTP) imaging. The CORE320 multicenter diagnostic accuracy clinical study enrolled patients between 45 and 85 years of age who were clinically referred for invasive coronary angiography (ICA). Computed tomography angiography (CTA), CTP, single photon emission computed tomography (SPECT), and ICA images were interpreted manually in blinded core laboratories by two experienced readers. Additionally, eight quantitative CTP metrics as continuous values were computed semi-automatically from myocardial and blood attenuation and were combined using logistic regression to derive a final quantitative CTP metric score. For the reference standard, hemodynamically significant coronary artery disease (CAD) was defined as a quantitative ICA stenosis of 50% or greater and a corresponding perfusion defect by SPECT. Diagnostic accuracy was determined by area under the receiver operating characteristic curve (AUC). Of the total 377 included patients, 66% were male, median age was 62 (IQR: 56, 68) years, and 27% had prior myocardial infarction. In the patient-based analysis, the AUC (95% CI) for combined CTA-CTP expert reading and combined CTA-CTP semi-automatic quantitative metrics was 0.87 (0.84-0.91) and 0.86 (0.83-0.90), respectively. In vessel-based analyses, the AUCs were 0.85 (0.82-0.88) and 0.84 (0.81-0.87), respectively. No significant difference in AUC was found between combined CTA-CTP expert reading and CTA-CTP semi-automatic quantitative metrics in patient-based or vessel-based analyses (p > 0.05 for all). Combined CTA-CTP semi-automatic quantitative metrics are as accurate as CTA-CTP expert reading for detecting hemodynamically significant CAD. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
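
    The sketch below illustrates, on synthetic data, the general pattern of combining several continuous perfusion metrics with logistic regression and summarizing diagnostic accuracy with the area under the ROC curve; it is not the CORE320 model or data.

```python
"""Illustration of combining several continuous perfusion metrics with
logistic regression and assessing diagnostic accuracy by AUC, analogous in
spirit to the semi-automatic score described above. Synthetic data only."""
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n, n_metrics = 377, 8
X = rng.standard_normal((n, n_metrics))           # 8 quantitative CTP-like metrics
truth = (X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, truth, test_size=0.3, random_state=0, stratify=truth)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
score = model.predict_proba(X_test)[:, 1]          # combined quantitative score
print(f"AUC = {roc_auc_score(y_test, score):.2f}")
```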

  15. Radiographical measurements for distal intra-articular fractures of the radius using plain radiographs and cone beam computed tomography images.

    PubMed

    Suojärvi, Nora; Sillat, T; Lindfors, N; Koskinen, S K

    2015-12-01

    Operative treatment of an intra-articular distal radius fracture is one of the most common procedures in orthopedic and hand surgery. The intra- and interobserver agreement of common radiographical measurements of these fractures using cone beam computed tomography (CBCT) and plain radiographs was evaluated. Thirty-seven patients undergoing open reduction and volar fixation for a distal radius fracture were studied. Two radiologists analyzed the preoperative radiographs and CBCT images. Agreement of the measurements was assessed with the intra-class correlation coefficient (ICC) and Bland-Altman analyses. Plain radiographs provided a slightly poorer level of agreement. For fracture diastasis, excellent intraobserver agreement was achieved for radiographs and good or excellent agreement for CBCT, compared with poor interobserver agreement (ICC 0.334) for radiographs and good interobserver agreement (ICC 0.621) for CBCT images. The Bland-Altman analyses indicated a small mean difference between the measurements but rather large variation with both imaging methods, especially in the angular measurements. For most of the measurements, radiographs perform well and may be used in clinical practice. Two different measurements by the same reader or by two different readers can lead to different decisions, and therefore standardization of the measurements is imperative. More detailed analysis of the articular surface requires cross-sectional imaging modalities.
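
    A minimal Bland-Altman computation of the kind used in the study is sketched below on fabricated paired measurements; the ICC analysis is omitted and the values are not from the study.

```python
"""Minimal Bland-Altman agreement computation for paired measurements from
two readers (or two imaging methods). Values are fabricated; the ICC step
from the study is omitted here."""
import numpy as np

reader1 = np.array([2.1, 3.4, 1.8, 4.0, 2.9, 3.1, 2.5, 3.8])  # e.g., diastasis, mm
reader2 = np.array([2.3, 3.1, 1.9, 4.4, 2.7, 3.3, 2.4, 4.1])

diff = reader1 - reader2
mean_diff = diff.mean()
sd_diff = diff.std(ddof=1)
loa_low, loa_high = mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

print(f"mean difference = {mean_diff:.2f} mm")
print(f"95% limits of agreement = {loa_low:.2f} to {loa_high:.2f} mm")
```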

  16. Quality Analysis on 3D Building Models Reconstructed from UAV Imagery

    NASA Astrophysics Data System (ADS)

    Jarzabek-Rychard, M.; Karpina, M.

    2016-06-01

    Recent developments in UAV technology and structure-from-motion techniques have made UAVs standard platforms for 3D data collection. Because of their flexibility and ability to reach inaccessible urban areas, drones appear to be an optimal solution for urban applications. Building reconstruction from data collected with UAVs has important potential to reduce the labour cost of rapidly updating already reconstructed 3D cities. However, especially for updating existing scenes derived from different sensors (e.g. airborne laser scanning), a proper quality assessment is necessary. The objective of this paper is thus to evaluate the potential of UAV imagery as an information source for automatic 3D building modeling at LOD2. The investigation is conducted in three parts: (1) comparing the generated SfM point cloud to ALS data; (2) computing internal consistency measures of the reconstruction process; and (3) analysing the deviation of check points identified on building roofs and measured with a tacheometer. In order to gain deep insight into the modeling performance, various quality indicators are computed and analysed. The assessment against the ground truth shows that the building models acquired with UAV photogrammetry have an accuracy better than 18 cm for the planimetric position and about 15 cm for the height component.

  17. Estimates of Flow Duration, Mean Flow, and Peak-Discharge Frequency Values for Kansas Stream Locations

    USGS Publications Warehouse

    Perry, Charles A.; Wolock, David M.; Artman, Joshua C.

    2004-01-01

    Streamflow statistics of flow duration and peak-discharge frequency were estimated for 4,771 individual locations on streams listed on the 1999 Kansas Surface Water Register. These statistics included the flow-duration values of 90, 75, 50, 25, and 10 percent, as well as the mean flow value. Peak-discharge frequency values were estimated for the 2-, 5-, 10-, 25-, 50-, and 100-year floods. Least-squares multiple regression techniques were used, along with Tobit analyses, to develop equations for estimating flow-duration values of 90, 75, 50, 25, and 10 percent and the mean flow for uncontrolled flow stream locations. The contributing-drainage areas of 149 U.S. Geological Survey streamflow-gaging stations in Kansas and parts of surrounding States that had flow uncontrolled by Federal reservoirs and used in the regression analyses ranged from 2.06 to 12,004 square miles. Logarithmic transformations of climatic and basin data were performed to yield the best linear relation for developing equations to compute flow durations and mean flow. In the regression analyses, the significant climatic and basin characteristics, in order of importance, were contributing-drainage area, mean annual precipitation, mean basin permeability, and mean basin slope. The analyses yielded a model standard error of prediction range of 0.43 logarithmic units for the 90-percent duration analysis to 0.15 logarithmic units for the 10-percent duration analysis. The model standard error of prediction was 0.14 logarithmic units for the mean flow. Regression equations used to estimate peak-discharge frequency values were obtained from a previous report, and estimates for the 2-, 5-, 10-, 25-, 50-, and 100-year floods were determined for this report. The regression equations and an interpolation procedure were used to compute flow durations, mean flow, and estimates of peak-discharge frequency for locations along uncontrolled flow streams on the 1999 Kansas Surface Water Register. Flow durations, mean flow, and peak-discharge frequency values determined at available gaging stations were used to interpolate the regression-estimated flows for the stream locations where available. Streamflow statistics for locations that had uncontrolled flow were interpolated using data from gaging stations weighted according to the drainage area and the bias between the regression-estimated and gaged flow information. On controlled reaches of Kansas streams, the streamflow statistics were interpolated between gaging stations using only gaged data weighted by drainage area.

  18. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics model of the ARED for use with biomechanics models in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV and C) assessment of the analyses of the model in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  19. Space crew radiation exposure analysis system based on a commercial stand-alone CAD system

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Golightly, Michael J.; Hardy, Alva C.

    1992-01-01

    Major improvements have recently been completed in the approach to spacecraft shielding analysis. A Computer-Aided Design (CAD)-based system has been developed for determining the shielding provided to any point within or external to the spacecraft. Shielding analysis is performed using a commercially available stand-alone CAD system and a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design projects such as a Mars transfer habitat, pressurized lunar rover, and the redesigned Space Station. Results of these analyses are provided to demonstrate the applicability and versatility of the system.

  20. Using FUN3D for Aeroelastic, Sonic Boom, and AeroPropulsoServoElastic (APSE) Analyses of a Supersonic Configuration

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph; Kopasakis, George

    2016-01-01

    An overview of recent applications of the FUN3D CFD code to computational aeroelastic, sonic boom, and aeropropulsoservoelasticity (APSE) analyses of a low-boom supersonic configuration is presented. The overview includes details of the computational models developed, including multiple unstructured CFD grids suitable for aeroelastic and sonic boom analyses. In addition, aeroelastic Reduced-Order Models (ROMs) are generated and used to rapidly compute the aeroelastic response and flutter boundaries at multiple flight conditions.

  1. 77 FR 74829 - Notice of Public Meeting-Cloud Computing and Big Data Forum and Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-18

    ...--Cloud Computing and Big Data Forum and Workshop AGENCY: National Institute of Standards and Technology... Standards and Technology (NIST) announces a Cloud Computing and Big Data Forum and Workshop to be held on... followed by a one-day hands-on workshop. The NIST Cloud Computing and Big Data Forum and Workshop will...

  2. Towards a Framework for Developing Semantic Relatedness Reference Standards

    PubMed Central

    Pakhomov, Serguei V.S.; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B.; Ruggieri, Alexander; Chute, Christopher G.

    2010-01-01

    Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available, and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the “moderate” range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. PMID:21044697
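
    Two of the evaluations such a reference standard supports can be sketched as below on fabricated ratings: inter-rater reliability as the mean pairwise Spearman correlation among raters, and agreement of a computerized relatedness measure with the mean expert rating. The rating scale and noise model are assumptions for illustration.

```python
"""Sketch of two evaluations a relatedness reference standard supports:
inter-rater reliability (mean pairwise Spearman correlation) and agreement
of a computerized measure with the mean expert rating. Fabricated ratings."""
import itertools
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(11)
n_pairs, n_raters = 101, 13
true_relatedness = rng.uniform(1, 10, n_pairs)
# Each rater is modeled as a noisy view of the underlying relatedness (1-10 scale)
ratings = np.clip(true_relatedness + rng.normal(0, 2.0, (n_raters, n_pairs)), 1, 10)

pairwise = [spearmanr(ratings[i], ratings[j])[0]
            for i, j in itertools.combinations(range(n_raters), 2)]
print(f"mean pairwise inter-rater correlation = {np.mean(pairwise):.2f}")

computerized = true_relatedness + rng.normal(0, 3.0, n_pairs)   # a mock measure
rho, _ = spearmanr(computerized, ratings.mean(axis=0))
print(f"correlation of computerized measure with mean expert rating = {rho:.2f}")
```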

  3. 21 CFR 1311.08 - Incorporation by reference.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of Standards and Technology, Computer Security Division, Information Technology Laboratory, National... standards are available from the National Institute of Standards and Technology, Computer Security Division... 140-2, Security Requirements for Cryptographic Modules, May 25, 2001, as amended by Change Notices 2...

  4. Estimates of Median Flows for Streams on the 1999 Kansas Surface Water Register

    USGS Publications Warehouse

    Perry, Charles A.; Wolock, David M.; Artman, Joshua C.

    2004-01-01

    The Kansas State Legislature, by enacting Kansas Statute KSA 82a-2001 et seq., mandated the criteria for determining which Kansas stream segments would be subject to classification by the State. One criterion for the selection as a classified stream segment is based on the statistic of median flow being equal to or greater than 1 cubic foot per second. As specified by KSA 82a-2001 et seq., median flows were determined from U.S. Geological Survey streamflow-gaging-station data by using the most-recent 10 years of gaged data (KSA) for each streamflow-gaging station. Median flows also were determined by using gaged data from the entire period of record (all-available hydrology, AAH). Least-squares multiple regression techniques were used, along with Tobit analyses, to develop equations for estimating median flows for uncontrolled stream segments. The drainage area of the gaging stations on uncontrolled stream segments used in the regression analyses ranged from 2.06 to 12,004 square miles. A logarithmic transformation of the data was needed to develop the best linear relation for computing median flows. In the regression analyses, the significant climatic and basin characteristics, in order of importance, were drainage area, mean annual precipitation, mean basin permeability, and mean basin slope. Tobit analyses of KSA data yielded a model standard error of prediction of 0.285 logarithmic units, and the best equations using Tobit analyses of AAH data had a model standard error of prediction of 0.250 logarithmic units. These regression equations and an interpolation procedure were used to compute median flows for the uncontrolled stream segments on the 1999 Kansas Surface Water Register. Measured median flows from gaging stations were incorporated into the regression-estimated median flows along the stream segments where available. The segments that were uncontrolled were interpolated using gaged data weighted according to the drainage area and the bias between the regression-estimated and gaged flow information. On controlled segments of Kansas streams, the median flow information was interpolated between gaging stations using only gaged data weighted by drainage area. Of the 2,232 total stream segments on the Kansas Surface Water Register, 34.5 percent of the segments had an estimated median streamflow of less than 1 cubic foot per second when the KSA analysis was used. When the AAH analysis was used, 36.2 percent of the segments had an estimated median streamflow of less than 1 cubic foot per second. This report supersedes U.S. Geological Survey Water-Resources Investigations Report 02-4292.
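
    To make the regression approach concrete, here is a hypothetical Python sketch with invented data. It fits an ordinary least-squares model to log-transformed median flows and basin characteristics and back-transforms a prediction; it deliberately omits the Tobit (censored-regression) treatment and the additional predictors used in the actual report.

    ```python
    import numpy as np

    # Hypothetical gaged data: median flow (cfs) and basin characteristics.
    median_flow = np.array([0.8, 3.2, 15.0, 120.0, 640.0])
    drainage_area = np.array([5.0, 40.0, 210.0, 1500.0, 9800.0])   # square miles
    precip = np.array([18.0, 22.0, 27.0, 31.0, 35.0])               # inches/year

    # Log-transform the response and predictors, then fit by least squares.
    X = np.column_stack([np.ones_like(drainage_area),
                         np.log10(drainage_area),
                         np.log10(precip)])
    y = np.log10(median_flow)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Predict the median flow for an ungaged segment (back-transform from log10).
    x_new = np.array([1.0, np.log10(300.0), np.log10(25.0)])
    print("estimated median flow (cfs):", 10 ** (x_new @ coef))
    ```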

  5. The feasibility of using UML to compare the impact of different brands of computer system on the clinical consultation.

    PubMed

    Kumarapeli, Pushpa; de Lusignan, Simon; Koczan, Phil; Jones, Beryl; Sheeler, Ian

    2007-01-01

    UK general practice is universally computerised, with computers used in the consulting room at the point of care. Practices use a range of different brands of computer system, which have developed organically to meet the needs of general practitioners and health service managers. Unified Modelling Language (UML) is a standard modelling and specification notation widely used in software engineering. To examine the feasibility of UML notation to compare the impact of different brands of general practice computer system on the clinical consultation. Multi-channel video recordings of simulated consultation sessions were recorded on three different clinical computer systems in common use (EMIS, iSOFT Synergy and IPS Vision). User action recorder software recorded time logs of keyboard and mouse use, and pattern recognition software captured non-verbal communication. The outputs of these were used to create UML class and sequence diagrams for each consultation. We compared 'definition of the presenting problem' and 'prescribing', as these tasks were present in all the consultations analysed. Class diagrams identified the entities involved in the clinical consultation. Sequence diagrams identified common elements of the consultation (such as prescribing) and enabled comparisons to be made between the different brands of computer system. The clinician and computer system interaction varied greatly between the different brands. UML sequence diagrams are useful in identifying common tasks in the clinical consultation, and for contrasting the impact of the different brands of computer system on the clinical consultation. Further research is needed to see if patterns demonstrated in this pilot study are consistently displayed.

  6. A computationally efficient parallel Levenberg-Marquardt algorithm for highly parameterized inverse model analyses

    NASA Astrophysics Data System (ADS)

    Lin, Youzuo; O'Malley, Daniel; Vesselinov, Velimir V.

    2016-09-01

    Inverse modeling seeks model parameters given a set of observations. However, because practical problems often involve a large number of measurements and numerous model parameters, conventional methods for inverse modeling can be computationally expensive. We have developed a new, computationally efficient parallel Levenberg-Marquardt method for solving inverse modeling problems with a highly parameterized model space. Levenberg-Marquardt methods require the solution of a linear system of equations, which can be prohibitively expensive to compute for moderate to large-scale problems. Our novel method projects the original linear problem down to a Krylov subspace such that the dimensionality of the problem can be significantly reduced. Furthermore, we store the Krylov subspace computed when using the first damping parameter and recycle the subspace for the subsequent damping parameters. The efficiency of our new inverse modeling algorithm is significantly improved by these computational techniques. We apply this new inverse modeling method to invert for random transmissivity fields in 2-D and a random hydraulic conductivity field in 3-D. Our algorithm is fast enough to solve for the distributed model parameters (transmissivity) in the model domain. The algorithm is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Compared with Levenberg-Marquardt methods using standard linear inversion techniques such as QR or SVD methods, our Levenberg-Marquardt method yields a speed-up ratio on the order of ~10¹ to ~10² in a multicore computational environment. Therefore, our new inverse modeling method is a powerful tool for characterizing subsurface heterogeneity for moderate to large-scale problems.
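
    To clarify what the Krylov-subspace projection is accelerating, the sketch below shows a bare-bones dense Levenberg-Marquardt iteration in Python (the paper's implementation is in Julia within MADS). Each step solves the damped normal equations (JᵀJ + λI)δ = −Jᵀr directly; avoiding this direct solve for large, highly parameterized problems is precisely the paper's contribution.

    ```python
    import numpy as np

    def levenberg_marquardt(residual, jacobian, x0, lam=1e-3, max_iter=50, tol=1e-10):
        """Minimize 0.5*||r(x)||^2 with a basic damped Gauss-Newton (LM) iteration."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            r = residual(x)
            J = jacobian(x)
            g = J.T @ r
            if np.linalg.norm(g) < tol:
                break
            # Damped normal equations; large problems would replace this dense
            # solve with the Krylov-subspace approach described in the abstract.
            step = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -g)
            if np.sum(residual(x + step) ** 2) < np.sum(r ** 2):
                x, lam = x + step, lam * 0.5      # accept step, relax damping
            else:
                lam *= 10.0                        # reject step, increase damping
        return x

    # Toy example: fit y = a*exp(b*t) to synthetic data.
    t = np.linspace(0.0, 1.0, 20)
    y = 2.0 * np.exp(-1.5 * t)
    res = lambda p: p[0] * np.exp(p[1] * t) - y
    jac = lambda p: np.column_stack([np.exp(p[1] * t), p[0] * t * np.exp(p[1] * t)])
    print(levenberg_marquardt(res, jac, x0=[1.0, -1.0]))
    ```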

  7. Analysing the performance of personal computers based on Intel microprocessors for sequence aligning bioinformatics applications.

    PubMed

    Nair, Pradeep S; John, Eugene B

    2007-01-01

    Aligning specific sequences against a very large number of other sequences is a central aspect of bioinformatics. With the widespread availability of personal computers in biology laboratories, sequence alignment is now often performed locally. This makes it necessary to analyse the performance of personal computers for sequence aligning bioinformatics benchmarks. In this paper, we analyse the performance of a personal computer for the popular BLAST and FASTA sequence alignment suites. Results indicate that these benchmarks have a large number of recurring operations and use memory operations extensively. It seems that the performance can be improved with a bigger L1-cache.

  8. Stability and change in screen-based sedentary behaviours and associated factors among Norwegian children in the transition between childhood and adolescence.

    PubMed

    Gebremariam, Mekdes K; Totland, Torunn H; Andersen, Lene F; Bergh, Ingunn H; Bjelland, Mona; Grydeland, May; Ommundsen, Yngvar; Lien, Nanna

    2012-02-06

    In order to inform interventions to prevent sedentariness, more longitudinal studies are needed focusing on stability and change over time in multiple sedentary behaviours. This paper investigates patterns of stability and change in TV/DVD use, computer/electronic game use and total screen time (TST) and factors associated with these patterns among Norwegian children in the transition between childhood and adolescence. The baseline of this longitudinal study took place in September 2007 and included 975 students from 25 control schools of an intervention study, the HEalth In Adolescents (HEIA) study. The first follow-up took place in May 2008 and the second follow-up in May 2009, with 885 students participating at all time points (average age at baseline = 11.2, standard deviation ± 0.3). Time used for/spent on TV/DVD and computer/electronic games was self-reported, and a TST variable (hours/week) was computed. Tracking analyses based on absolute and rank measures, as well as regression analyses to assess factors associated with change in TST and with tracking high TST were conducted. Time spent on all sedentary behaviours investigated increased in both genders. Findings based on absolute and rank measures revealed a fair to moderate level of tracking over the 2 year period. High parental education was inversely related to an increase in TST among females. In males, self-efficacy related to barriers to physical activity and living with married or cohabitating parents were inversely related to an increase in TST. Factors associated with tracking high vs. low TST in the multinomial regression analyses were low self-efficacy and being of an ethnic minority background among females, and low self-efficacy, being overweight/obese and not living with married or cohabitating parents among males. Use of TV/DVD and computer/electronic games increased with age and tracked over time in this group of 11-13 year old Norwegian children. Interventions targeting these sedentary behaviours should thus be introduced early. The identified modifiable and non-modifiable factors associated with change in TST and tracking of high TST should be taken into consideration when planning such interventions.

  9. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.

  10. Multi-Scale Modeling to Improve Single-Molecule, Single-Cell Experiments

    NASA Astrophysics Data System (ADS)

    Munsky, Brian; Shepherd, Douglas

    2014-03-01

    Single-cell, single-molecule experiments are producing an unprecedented amount of data to capture the dynamics of biological systems. When integrated with computational models, observations of spatial, temporal and stochastic fluctuations can yield powerful quantitative insight. We concentrate on experiments that localize and count individual molecules of mRNA. These high-precision experiments have large imaging and computational processing costs, and we explore how improved computational analyses can dramatically reduce overall data requirements. In particular, we show how analyses of spatial, temporal and stochastic fluctuations can significantly enhance parameter estimation results for small, noisy data sets. We also show how full probability distribution analyses can constrain parameters with far less data than bulk analyses or statistical moment closures. Finally, we discuss how a systematic modeling progression from simple to more complex analyses can reduce total computational costs by orders of magnitude. We illustrate our approach using single-molecule, spatial mRNA measurements of Interleukin 1-alpha mRNA induction in human THP1 cells following stimulation. Our approach could improve the effectiveness of single-molecule gene regulation analyses for many other processes.
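
    As a toy illustration of why full-distribution analyses can constrain parameters more tightly than moment summaries, the hypothetical Python sketch below fits a negative binomial model to simulated mRNA copy-number counts both by moment matching and by maximum likelihood over the full distribution. It is not the authors' analysis pipeline; the distributional model and the numbers are assumptions.

    ```python
    import numpy as np
    from scipy import stats, optimize

    # Simulated mRNA counts per cell (negative binomial: n dispersion, p success prob).
    rng = np.random.default_rng(1)
    true_n, true_p = 5.0, 0.25
    counts = stats.nbinom.rvs(true_n, true_p, size=200, random_state=rng)

    # Moment-matching estimate from the sample mean and variance.
    m, v = counts.mean(), counts.var(ddof=1)
    p_mom = m / v
    n_mom = m * p_mom / (1.0 - p_mom)

    # Full-distribution (maximum likelihood) estimate.
    def neg_loglik(theta):
        n, p = theta
        if n <= 0 or not (0 < p < 1):
            return np.inf
        return -stats.nbinom.logpmf(counts, n, p).sum()

    mle = optimize.minimize(neg_loglik, x0=[n_mom, p_mom], method="Nelder-Mead")
    print("moment fit :", n_mom, p_mom)
    print("ML fit     :", mle.x)
    ```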

  11. Bias and inference from misspecified mixed-effect models in stepped wedge trial analysis.

    PubMed

    Thompson, Jennifer A; Fielding, Katherine L; Davey, Calum; Aiken, Alexander M; Hargreaves, James R; Hayes, Richard J

    2017-10-15

    Many stepped wedge trials (SWTs) are analysed by using a mixed-effect model with a random intercept and fixed effects for the intervention and time periods (referred to here as the standard model). However, it is not known whether this model is robust to misspecification. We simulated SWTs with three groups of clusters and two time periods; one group received the intervention during the first period and two groups in the second period. We simulated period and intervention effects that were either common-to-all or varied-between clusters. Data were analysed with the standard model or with additional random effects for period effect or intervention effect. In a second simulation study, we explored the weight given to within-cluster comparisons by simulating a larger intervention effect in the group of the trial that experienced both the control and intervention conditions and applying the three analysis models described previously. Across 500 simulations, we computed bias and confidence interval coverage of the estimated intervention effect. We found up to 50% bias in intervention effect estimates when period or intervention effects varied between clusters and were treated as fixed effects in the analysis. All misspecified models showed undercoverage of 95% confidence intervals, particularly the standard model. A large weight was given to within-cluster comparisons in the standard model. In the SWTs simulated here, mixed-effect models were highly sensitive to departures from the model assumptions, which can be explained by the high dependence on within-cluster comparisons. Trialists should consider including a random effect for time period in their SWT analysis model. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
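
    For readers who want to see what the "standard model" and the suggested extension look like in practice, here is a hypothetical Python sketch using statsmodels; the simulations above were not necessarily run this way. Column names, cluster counts, and effect sizes are invented. The first fit is a random intercept with fixed period and intervention effects; the second adds a random cluster-by-period component.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical two-period stepped wedge data: one group of clusters is treated
    # in the first period, all clusters are treated in the second period.
    rng = np.random.default_rng(2)
    df = pd.DataFrame({
        "cluster": np.repeat(np.arange(12), 40),
        "period": np.tile(np.repeat([0, 1], 20), 12),
    })
    df["treat"] = ((df["period"] == 1) | (df["cluster"] < 4)).astype(int)
    df["y"] = (0.5 * df["treat"] + 0.3 * df["period"]
               + rng.normal(0, 1, len(df)) + rng.normal(0, 0.5, 12)[df["cluster"]])

    # "Standard model": random cluster intercept, fixed period and intervention effects.
    standard = smf.mixedlm("y ~ treat + C(period)", df, groups=df["cluster"]).fit()

    # Extension suggested by the simulations: add a random cluster-by-period effect.
    extended = smf.mixedlm("y ~ treat + C(period)", df, groups=df["cluster"],
                           re_formula="1",
                           vc_formula={"period": "0 + C(period)"}).fit()
    print(standard.params["treat"], extended.params["treat"])
    ```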

  12. Bias and inference from misspecified mixed‐effect models in stepped wedge trial analysis

    PubMed Central

    Fielding, Katherine L.; Davey, Calum; Aiken, Alexander M.; Hargreaves, James R.; Hayes, Richard J.

    2017-01-01

    Many stepped wedge trials (SWTs) are analysed by using a mixed‐effect model with a random intercept and fixed effects for the intervention and time periods (referred to here as the standard model). However, it is not known whether this model is robust to misspecification. We simulated SWTs with three groups of clusters and two time periods; one group received the intervention during the first period and two groups in the second period. We simulated period and intervention effects that were either common‐to‐all or varied‐between clusters. Data were analysed with the standard model or with additional random effects for period effect or intervention effect. In a second simulation study, we explored the weight given to within‐cluster comparisons by simulating a larger intervention effect in the group of the trial that experienced both the control and intervention conditions and applying the three analysis models described previously. Across 500 simulations, we computed bias and confidence interval coverage of the estimated intervention effect. We found up to 50% bias in intervention effect estimates when period or intervention effects varied between clusters and were treated as fixed effects in the analysis. All misspecified models showed undercoverage of 95% confidence intervals, particularly the standard model. A large weight was given to within‐cluster comparisons in the standard model. In the SWTs simulated here, mixed‐effect models were highly sensitive to departures from the model assumptions, which can be explained by the high dependence on within‐cluster comparisons. Trialists should consider including a random effect for time period in their SWT analysis model. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28556355

  13. Documenting Models for Interoperability and Reusability ...

    EPA Pesticide Factsheets

    Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration between scientific communities, since component-based modeling can integrate models from different disciplines. Integrated Environmental Modeling (IEM) systems focus on transferring information between components by capturing a conceptual site model; establishing local metadata standards for input/output of models and databases; managing data flow between models and throughout the system; facilitating quality control of data exchanges (e.g., checking units, unit conversions, transfers between software languages); warning and error handling; and coordinating sensitivity/uncertainty analyses. Although many computational software systems facilitate communication between, and execution of, components, there are no common approaches, protocols, or standards for turn-key linkages between software systems and models, especially if modifying components is not the intent. Using a standard ontology, this paper reviews how models can be described for discovery, understanding, evaluation, access, and implementation to facilitate interoperability and reusability. In the proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Mod

  14. Phenotypes of intermediate forms of Fasciola hepatica and F. gigantica in buffaloes from Central Punjab, Pakistan.

    PubMed

    Afshan, K; Valero, M A; Qayyum, M; Peixoto, R V; Magraner, A; Mas-Coma, S

    2014-12-01

    Fascioliasis is an important food-borne parasitic disease caused by the two trematode species, Fasciola hepatica and Fasciola gigantica. The phenotypic features of fasciolid adults and eggs infecting buffaloes inhabiting the Central Punjab area, Pakistan, have been studied to characterize the fasciolid populations involved. Morphometric analyses were made with a computer image analysis system (CIAS) applied on the basis of standardized measurements. Since this is the first study of this kind undertaken in Pakistan, the results are compared to pure fasciolid populations: (a) F. hepatica from the European Mediterranean area; and (b) F. gigantica from Burkina Faso; i.e. geographical areas where the two species do not co-exist. Only parasites obtained from bovines were used. The multivariate analysis showed that the characteristics of fasciolids from Central Punjab, Pakistan, including egg morphometrics, are intermediate between the F. hepatica and F. gigantica standard populations. Similarly, the morphometric measurements of fasciolid eggs from Central Punjab also fall between those of the F. hepatica and F. gigantica standard populations. These results demonstrate the existence of fasciolid intermediate forms in endemic areas of Pakistan.

  15. KENO-VI Primer: A Primer for Criticality Calculations with SCALE/KENO-VI Using GeeWiz

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowman, Stephen M

    2008-09-01

    The SCALE (Standardized Computer Analyses for Licensing Evaluation) computer software system developed at Oak Ridge National Laboratory is widely used and accepted around the world for criticality safety analyses. The well-known KENO-VI three-dimensional Monte Carlo criticality computer code is one of the primary criticality safety analysis tools in SCALE. The KENO-VI primer is designed to help a new user understand and use the SCALE/KENO-VI Monte Carlo code for nuclear criticality safety analyses. It assumes that the user has a college education in a technical field. There is no assumption of familiarity with Monte Carlo codes in general or with SCALE/KENO-VI in particular. The primer is designed to teach by example, with each example illustrating two or three features of SCALE/KENO-VI that are useful in criticality analyses. The primer is based on SCALE 6, which includes the Graphically Enhanced Editing Wizard (GeeWiz) Windows user interface. Each example uses GeeWiz to provide the framework for preparing input data and viewing output results. Starting with a Quickstart section, the primer gives an overview of the basic requirements for SCALE/KENO-VI input and allows the user to quickly run a simple criticality problem with SCALE/KENO-VI. The sections that follow Quickstart include a list of basic objectives at the beginning that identifies the goal of the section and the individual SCALE/KENO-VI features that are covered in detail in the sample problems in that section. Upon completion of the primer, a new user should be comfortable using GeeWiz to set up criticality problems in SCALE/KENO-VI. The primer provides a starting point for the criticality safety analyst who uses SCALE/KENO-VI. Complete descriptions are provided in the SCALE/KENO-VI manual. Although the primer is self-contained, it is intended as a companion volume to the SCALE/KENO-VI documentation. (The SCALE manual is provided on the SCALE installation DVD.) The primer provides specific examples of using SCALE/KENO-VI for criticality analyses; the SCALE/KENO-VI manual provides information on the use of SCALE/KENO-VI and all its modules. The primer also contains an appendix with sample input files.

  16. PyMVPA: A python toolbox for multivariate pattern analysis of fMRI data.

    PubMed

    Hanke, Michael; Halchenko, Yaroslav O; Sederberg, Per B; Hanson, Stephen José; Haxby, James V; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability.
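
    The classifier-based analysis that PyMVPA supports can be illustrated with a generic sketch. The example below is not PyMVPA code; it uses scikit-learn on a synthetic samples-by-voxels matrix to show the core idea of cross-validated decoding of a cognitive state from multivoxel activity patterns.

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score, StratifiedKFold

    # Hypothetical data: 120 fMRI volumes (samples) x 500 voxels (features),
    # with a binary cognitive-state label per volume.
    rng = np.random.default_rng(3)
    X = rng.normal(size=(120, 500))
    y = rng.integers(0, 2, size=120)
    X[y == 1, :20] += 0.5          # weak multivariate signal in 20 voxels

    # Cross-validated decoding accuracy of a linear classifier.
    clf = LinearSVC(max_iter=10000)
    scores = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=5))
    print("mean decoding accuracy:", scores.mean())
    ```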

  17. PyMVPA: A Python toolbox for multivariate pattern analysis of fMRI data

    PubMed Central

    Hanke, Michael; Halchenko, Yaroslav O.; Sederberg, Per B.; Hanson, Stephen José; Haxby, James V.; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine-learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability. PMID:19184561

  18. Statistical analysis of solid waste composition data: Arithmetic mean, standard deviation and correlation coefficients.

    PubMed

    Edjabou, Maklawe Essonanawe; Martín-Fernández, Josep Antoni; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2017-11-01

    Data for fractional solid waste composition provide relative magnitudes of individual waste fractions, the percentages of which always sum to 100, thereby connecting them intrinsically. Due to this sum constraint, waste composition data represent closed data, and their interpretation and analysis require statistical methods other than classical statistics, which are suitable only for non-constrained data such as absolute values. However, the closed characteristics of waste composition data are often ignored when analysed. The results of this study showed, for example, that unavoidable animal-derived food waste amounted to 2.21±3.12% with a confidence interval of (-4.03; 8.45), which highlights the problem of biased negative proportions. A Pearson's correlation test, applied to waste fraction generation (kg mass), indicated a positive correlation between avoidable vegetable food waste and plastic packaging. However, correlation tests applied to waste fraction compositions (percentage values) showed a negative association in this regard, thus demonstrating that statistical analyses applied to compositional waste fraction data, without addressing the closed characteristics of these data, have the potential to generate spurious or misleading results. Therefore, compositional data should be transformed adequately prior to any statistical analysis, such as computing means, standard deviations and correlation coefficients. Copyright © 2017 Elsevier Ltd. All rights reserved.
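
    A minimal sketch of the kind of transformation the authors argue for: the centred log-ratio (clr) transform maps closed compositional data onto unconstrained coordinates before means, standard deviations, or correlations are computed. This is a generic Python illustration with invented percentages, not the study's own code.

    ```python
    import numpy as np

    def clr(parts):
        """Centred log-ratio transform of compositions (rows sum to a constant).
        Zero fractions must be replaced by small positive values before taking logs."""
        logp = np.log(parts)
        return logp - logp.mean(axis=1, keepdims=True)

    # Hypothetical waste compositions (% of total mass) for four samples.
    comp = np.array([[62.0, 25.0, 13.0],
                     [55.0, 30.0, 15.0],
                     [70.0, 20.0, 10.0],
                     [58.0, 27.0, 15.0]])

    z = clr(comp)                       # unconstrained real-valued coordinates
    print("clr means:", z.mean(axis=0))
    print("clr correlation matrix:\n", np.corrcoef(z, rowvar=False))
    ```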

  19. Computer-delivered and web-based interventions to improve depression, anxiety, and psychological well-being of university students: a systematic review and meta-analysis.

    PubMed

    Davies, E Bethan; Morriss, Richard; Glazebrook, Cris

    2014-05-16

    Depression and anxiety are common mental health difficulties experienced by university students and can impair academic and social functioning. Students are limited in seeking help from professionals. As university students are highly connected to digital technologies, Web-based and computer-delivered interventions could be used to improve students' mental health. The effectiveness of these intervention types requires investigation to identify whether these are viable prevention strategies for university students. The intent of the study was to systematically review and analyze trials of Web-based and computer-delivered interventions to improve depression, anxiety, psychological distress, and stress in university students. Several databases were searched using keywords relating to higher education students, mental health, and eHealth interventions. The eligibility criteria for studies included in the review were: (1) the study aimed to improve symptoms relating to depression, anxiety, psychological distress, and stress, (2) the study involved computer-delivered or Web-based interventions accessed via computer, laptop, or tablet, (3) the study was a randomized controlled trial, and (4) the study was trialed on higher education students. Trials were reviewed and outcome data analyzed through random effects meta-analyses for each outcome and each type of trial arm comparison. The Cochrane Collaboration risk of bias tool was used to assess study quality. A total of 17 trials were identified, of which seven evaluated the same three interventions on separate samples; 14 reported sufficient information for meta-analysis. The majority (n=13) were website-delivered and nine interventions were based on cognitive behavioral therapy (CBT). A total of 1795 participants were randomized and 1480 analyzed. Risk of bias was considered moderate, as many publications did not sufficiently report their methods and seven explicitly conducted completers' analyses. In comparison to the inactive control, sensitivity meta-analyses supported intervention in improving anxiety (pooled standardized mean difference [SMD] -0.56; 95% CI -0.77 to -0.35, P<.001), depression (pooled SMD -0.43; 95% CI -0.63 to -0.22, P<.001), and stress (pooled SMD -0.73; 95% CI -1.27 to -0.19, P=.008). In comparison to active controls, sensitivity analyses did not support either condition for anxiety (pooled SMD -0.18; 95% CI -0.98 to 0.62, P=.66) or depression (pooled SMD -0.28; 95% CI -0.75 to -0.20, P=.25). In comparison to an alternative intervention, neither condition was supported in sensitivity analyses for anxiety (pooled SMD -0.10; 95% CI -0.39 to 0.18, P=.48) or depression (pooled SMD -0.33; 95% CI -0.43 to 1.09, P=.40). The findings suggest Web-based and computer-delivered interventions can be effective in improving students' depression, anxiety, and stress outcomes when compared to inactive controls, but some caution is needed when compared to other trial arms, and methodological issues were noticeable. Interventions need to be trialed on more heterogeneous student samples and would benefit from user evaluation. Future trials should address methodological considerations to improve reporting of trial quality and address skewed post-intervention data.
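
    To make the pooling step concrete, the hypothetical Python sketch below performs a DerSimonian-Laird random-effects pooling of standardized mean differences. The per-trial SMDs and variances are invented for illustration and do not correspond to the trials in this review.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical per-trial SMDs and their variances (intervention vs. control).
    smd = np.array([-0.62, -0.41, -0.35, -0.80, -0.15])
    var = np.array([0.040, 0.055, 0.030, 0.090, 0.060])

    # DerSimonian-Laird estimate of the between-study variance (tau^2).
    w_fixed = 1.0 / var
    q = np.sum(w_fixed * (smd - np.sum(w_fixed * smd) / np.sum(w_fixed)) ** 2)
    df = len(smd) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights, pooled SMD, and 95% confidence interval.
    w = 1.0 / (var + tau2)
    pooled = np.sum(w * smd) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    ci = pooled + np.array([-1, 1]) * stats.norm.ppf(0.975) * se
    print(f"pooled SMD = {pooled:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
    ```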

  20. ResidPlots-2: Computer Software for IRT Graphical Residual Analyses

    ERIC Educational Resources Information Center

    Liang, Tie; Han, Kyung T.; Hambleton, Ronald K.

    2009-01-01

    This article discusses ResidPlots-2, a software package that provides a powerful tool for IRT graphical residual analyses. ResidPlots-2 consists of two components: a component for computing residual statistics and another component for communicating with users and for plotting the residual graphs. The features of the ResidPlots-2 software are…

  1. D Recording for 2d Delivering - the Employment of 3d Models for Studies and Analyses -

    NASA Astrophysics Data System (ADS)

    Rizzi, A.; Baratti, G.; Jiménez, B.; Girardi, S.; Remondino, F.

    2011-09-01

    In recent years, thanks to advances in surveying sensors and techniques, many heritage sites have been accurately replicated in digital form with very detailed and impressive results. The actual limits are mainly related to hardware capabilities, computation time and the low performance of personal computers. Often, the produced models cannot be viewed on a normal computer, and the only practical way to visualize them easily is offline, using rendered videos. This kind of 3D representation is useful for digital conservation, dissemination purposes or virtual tourism, where people can visit places otherwise closed for preservation or security reasons. But many more potential applications are available using a 3D model. The problem is the ability to handle 3D data: without adequate knowledge, this information is reduced to standard 2D data. This article presents some surveying and 3D modeling experiences within the APSAT project ("Ambiente e Paesaggi dei Siti d'Altura Trentini", i.e. Environment and Landscapes of Upland Sites in Trentino). APSAT is a multidisciplinary project funded by the Autonomous Province of Trento (Italy) with the aim of documenting, surveying, studying, analysing and preserving mountainous and hill-top heritage sites located in the region. The project focuses on theoretical, methodological and technological aspects of the archaeological investigation of the mountain landscape, considered as the product of sequences of settlements, parcelling-outs, communication networks, resources and symbolic places. The mountain environment preserves better than others the traces of hunting and gathering, breeding, agricultural, metallurgical and symbolic activities characterised by different durations and environmental impacts, from Prehistory to the Modern Period. Therefore the correct surveying and documentation of these heritage sites and materials is very important. Within the project, the 3DOM unit of FBK is delivering all the surveying and 3D material to the interdisciplinary partners of the project to allow subsequent analyses or the derivation of restoration plans and conservation policies.

  2. 3D Position and Velocity Vector Computations of Objects Jettisoned from the International Space Station Using Close-Range Photogrammetry Approach

    NASA Technical Reports Server (NTRS)

    Papanyan, Valeri; Oshle, Edward; Adamo, Daniel

    2008-01-01

    Measurement of a jettisoned object's departure trajectory and velocity vector in the International Space Station (ISS) reference frame is vitally important for prompt evaluation of the object's imminent orbit. We report on the first successful application of photogrammetric analysis of ISS imagery for the prompt computation of a jettisoned object's position and velocity vectors. As examples of post-EVA analyses, we present the Floating Potential Probe (FPP) and the Russian "Orlan" space suit jettisons, as well as the near-real-time (provided within several hours after separation) computations of the Video Stanchion Support Assembly Flight Support Assembly (VSSA-FSA) and Early Ammonia Servicer (EAS) jettisons during the US astronauts' space-walk. Standard close-range photogrammetry analysis was used during this EVA to analyze two on-board camera image sequences down-linked from the ISS. In this approach the ISS camera orientations were computed from known coordinates of several reference points on the ISS hardware. The position of the jettisoned object at each time-frame was then computed from its image in each frame of the video clips. In another, "quick-look" approach used in near-real time, the orientation of the cameras was computed from their position (from the ISS CAD model) and operational data (pan and tilt); the location of the jettisoned object was then calculated for only a few frames of the two synchronized movies. Keywords: Photogrammetry, International Space Station, jettisons, image analysis.
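
    The two-camera workflow described above can be sketched generically with OpenCV: camera pose is estimated from known reference points (solvePnP), and the object is triangulated from a synchronized frame pair. All coordinates below are invented, and the example is an illustrative stand-in rather than the photogrammetry software actually used for the ISS analyses.

    ```python
    import numpy as np
    import cv2

    # Hypothetical intrinsics, known 3D reference points on ISS hardware (metres),
    # and their pixel coordinates in each of the two camera frames.
    K = np.array([[1200.0, 0.0, 640.0], [0.0, 1200.0, 360.0], [0.0, 0.0, 1.0]])
    ref_3d = np.array([[0, 0, 0], [2, 0, 0], [0, 2, 0], [0, 0, 2], [2, 2, 1]], float)
    ref_px_cam1 = np.array([[640, 360], [900, 355], [645, 120], [500, 300], [870, 140]], float)
    ref_px_cam2 = np.array([[600, 380], [850, 370], [610, 150], [470, 330], [830, 170]], float)

    def projection_matrix(obj_pts, img_pts, K):
        """Camera pose from known reference points, returned as a 3x4 projection matrix."""
        ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, None)
        R, _ = cv2.Rodrigues(rvec)
        return K @ np.hstack([R, tvec])

    P1 = projection_matrix(ref_3d, ref_px_cam1, K)
    P2 = projection_matrix(ref_3d, ref_px_cam2, K)

    # Pixel position of the jettisoned object in one synchronized frame pair.
    obj_px_cam1 = np.array([[700.0], [300.0]])
    obj_px_cam2 = np.array([[660.0], [320.0]])
    X_h = cv2.triangulatePoints(P1, P2, obj_px_cam1, obj_px_cam2)
    X = (X_h[:3] / X_h[3]).ravel()
    print("object position in the reference frame (m):", X)
    ```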

  3. An inherited FGFR2 mutation increased osteogenesis gene expression and result in Crouzon syndrome.

    PubMed

    Fan, Jiayan; Li, Yinwei; Jia, Renbing; Fan, Xianqun

    2018-05-30

    FGFR2 encodes a fibroblast growth factor receptor whose mutations are responsible for Crouzon syndrome, involving craniosynostosis and facial dysostosis with shallow orbits. However, few reports quantify orbital volume in Crouzon syndrome, and there has been little direct evidence that FGFR2 mutations actually influence orbital morphology. Ten Crouzon syndrome patients underwent a standard ophthalmologic assessment. A morphological study based on 3-dimensional computed tomography scans was carried out to calculate orbital volume. Genomic DNA was extracted from peripheral blood leukocytes of the patients and used for genomic screening of FGFR2. A three-dimensional computer model was used to analyse the structural position of the mutation site and to predict its possible impact on the function of the FGFR2 protein. Real-time PCR was performed to analyse the expression of bone marker genes. We describe an FGFR2 mutation (p.G338R, c.1012G > C) in a Chinese family with Crouzon syndrome. Computational analysis showed that the mutant protein clearly changes the local spatial structure compared with wild-type FGFR2. The expression of osteocalcin and alkaline phosphatase, two osteoblast-specific genes, was significantly increased in orbital bone from the patient compared with a normal individual, which may lead to facial dysostosis. This is compatible with the shallow and round orbits in our Crouzon syndrome patient. Our study further indicates that the G338R FGFR2 mutation (c.1012G > C) leads to inherited Crouzon syndrome. Thus, early intervention, both medical and surgical, as well as management by multiple interdisciplinary teams, is crucial to the management of this disorder.

  4. Spatial Ensemble Postprocessing of Precipitation Forecasts Using High Resolution Analyses

    NASA Astrophysics Data System (ADS)

    Lang, Moritz N.; Schicker, Irene; Kann, Alexander; Wang, Yong

    2017-04-01

    Ensemble prediction systems are designed to account for errors or uncertainties in the initial and boundary conditions, imperfect parameterizations, etc. However, due to sampling errors and underestimation of the model errors, these ensemble forecasts tend to be underdispersive, and to lack both reliability and sharpness. To overcome such limitations, statistical postprocessing methods are commonly applied to these forecasts. In this study, a full-distributional spatial post-processing method is applied to short-range precipitation forecasts over Austria using Standardized Anomaly Model Output Statistics (SAMOS). Following Stauffer et al. (2016), observation and forecast fields are transformed into standardized anomalies by subtracting a site-specific climatological mean and dividing by the climatological standard deviation. Because only a single regression model needs to be fitted for the whole domain, the SAMOS framework provides a computationally inexpensive method to create operationally calibrated probabilistic forecasts for any arbitrary location or for all grid points in the domain simultaneously. Taking advantage of the INCA system (Integrated Nowcasting through Comprehensive Analysis), high resolution analyses are used for the computation of the observed climatology and for model training. The INCA system operationally combines station measurements and remote sensing data into real-time objective analysis fields at 1 km-horizontal resolution and 1 h-temporal resolution. The precipitation forecast used in this study is obtained from a limited area model ensemble prediction system also operated by ZAMG. The so-called ALADIN-LAEF system provides, by applying a multi-physics approach, a 17-member forecast at a horizontal resolution of 10.9 km and a temporal resolution of 1 hour. The SAMOS approach thus statistically combines the in-house high-resolution analysis and ensemble prediction systems. The station-based validation of 6 hour precipitation sums shows a mean improvement of more than 40% in CRPS when compared to bilinearly interpolated uncalibrated ensemble forecasts. The validation on randomly selected grid points, representing the true height distribution over Austria, still indicates a mean improvement of 35%. The applied statistical model is currently set up for 6-hourly and daily accumulation periods, but will be extended to a temporal resolution of 1-3 hours within a new probabilistic nowcasting system operated by ZAMG.
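
    The standardized-anomaly idea at the heart of SAMOS can be sketched in a few lines. The hypothetical Python example below standardizes observations and ensemble-mean forecasts with a site climatology, fits one regression for the whole domain, and de-standardizes a calibrated value. It uses a Gaussian toy model and omits the censored, full-distributional treatment that real precipitation post-processing requires; all numbers are invented.

    ```python
    import numpy as np

    # Hypothetical 6 h precipitation data pooled over many grid points and times:
    # ensemble-mean forecast, observation (analysis), and site climatology.
    rng = np.random.default_rng(4)
    clim_mean = rng.uniform(0.5, 3.0, size=5000)      # climatological mean (mm)
    clim_sd = rng.uniform(0.5, 2.0, size=5000)        # climatological std dev (mm)
    obs = clim_mean + clim_sd * rng.normal(size=5000)
    fcst = clim_mean + clim_sd * (0.8 * (obs - clim_mean) / clim_sd
                                  + 0.4 * rng.normal(size=5000))

    # Standardized anomalies: subtract the climatological mean, divide by the sd.
    obs_anom = (obs - clim_mean) / clim_sd
    fcst_anom = (fcst - clim_mean) / clim_sd

    # One regression for the whole domain relating observed to forecast anomalies.
    slope, intercept = np.polyfit(fcst_anom, obs_anom, deg=1)

    # Calibrated forecast at a new point: correct the anomaly, then de-standardize.
    new_fcst_anom, new_clim_mean, new_clim_sd = 1.2, 2.0, 1.5
    calibrated = new_clim_mean + new_clim_sd * (slope * new_fcst_anom + intercept)
    print("calibrated forecast (mm):", calibrated)
    ```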

  5. Are physical activity studies in Hispanics meeting reporting guidelines for continuous monitoring technology? A systematic review.

    PubMed

    Layne, Charles S; Parker, Nathan H; Soltero, Erica G; Rosales Chavez, José; O'Connor, Daniel P; Gallagher, Martina R; Lee, Rebecca E

    2015-09-18

    Continuous monitoring technologies such as accelerometers and pedometers are the gold standard for physical activity (PA) measurement. However, inconsistencies in use, analysis, and reporting limit the understanding of dose-response relationships involving PA and the ability to make comparisons across studies and population subgroups. These issues are particularly detrimental to the study of PA across different ethnicities with different PA habits. This systematic review examined the inclusion of published guidelines involving data collection, processing, and reporting among articles using accelerometers or pedometers in Hispanic or Latino populations. English (PubMed; EbscoHost) and Spanish (SCIELO; Biblioteca Virtual en Salud) articles published between 2000 and 2013 using accelerometers or pedometers to measure PA among Hispanics or Latinos were identified through systematic literature searches. Of the 253 abstracts that were initially reviewed, 57 met eligibility criteria (44 accelerometer, 13 pedometer). Articles were coded and reviewed to evaluate compliance with recommended guidelines (N = 20), and the percentages of accelerometer and pedometer articles following each guideline were computed and reported. On average, 57.1 % of accelerometer and 62.2 % of pedometer articles reported each recommended guideline for data collection. Device manufacturer and model were reported most frequently, and provision of instructions for device wear in Spanish was reported least frequently. On average, 29.6 % of accelerometer articles reported each guideline for data processing. Definitions of an acceptable day for inclusion in analyses were reported most frequently, and definitions of an acceptable hour for inclusion in analyses were reported least frequently. On average, 18.8 % of accelerometer and 85.7 % of pedometer articles included each guideline for data reporting. Accelerometer articles most frequently included the average number of valid days and least frequently included the percentage of wear time. Inclusion of standard collection and reporting procedures in studies using continuous monitoring devices in Hispanic or Latino populations is generally low. Lack of reporting consistency in continuous monitoring studies limits researchers' ability to compare studies or draw meaningful conclusions concerning amounts, quality, and benefits of PA among Hispanic or Latino populations. Reporting of data collection, computation, and decision-making standards should be required. Improved interpretability would allow practitioners and researchers to apply scientific findings to promote PA.

  6. A computer graphics program for general finite element analyses

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.; Sawyer, L. M.

    1978-01-01

    Documentation for a computer graphics program for displays from general finite element analyses is presented. A general description of display options and detailed user instructions are given. Several plots made in structural, thermal and fluid finite element analyses are included to illustrate program options. Sample data files are given to illustrate use of the program.

  7. Femoral articular shape and geometry. A three-dimensional computerized analysis of the knee.

    PubMed

    Siu, D; Rudan, J; Wevers, H W; Griffiths, P

    1996-02-01

    An average, three-dimensional anatomic shape and geometry of the distal femur were generated from x-ray computed tomography data of five fresh asymptomatic cadaver knees using AutoCAD (AutoDesk, Sausalito, CA), a computer-aided design and drafting software. Each femur model was graphically repositioned to a standardized orientation using a series of alignment templates and scaled to a nominal size of 85 mm in mediolateral and 73 mm in anteroposterior dimensions. An average generic shape of the distal femur was synthesized by combining these pseudosolid models and reslicing the composite structure at different elevations using clipping and smoothing techniques in interactive computer graphics. The resulting distal femoral geometry was imported into a computer-aided manufacturing system, and anatomic prototypes of the distal femur were produced. Quantitative geometric analyses of the generic femur in the coronal and transverse planes revealed definite condylar camber (3 degrees-6 degrees) and toe-in (8 degrees-10 degrees) with an oblique patellofemoral groove (15 degrees) with respect to the mechanical axis of the femur. In the sagittal plane, each condyle could be approximated by three concatenated circular arcs (anterior, distal, and posterior) with slope continuity and a single arc for the patellofemoral groove. The results of this study may have important implications in future femoral prosthesis design and clinical applications.

  8. Experimental investigations, modeling, and analyses of high-temperature devices for space applications: Part 1. Final report, June 1996--December 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tournier, J.; El-Genk, M.S.; Huang, L.

    1999-01-01

    The Institute of Space and Nuclear Power Studies at the University of New Mexico has developed a computer simulation of cylindrical geometry alkali metal thermal-to-electric converter cells using a standard Fortran 77 computer code. The objective and use of this code was to compare the experimental measurements with computer simulations, upgrade the model as appropriate, and conduct investigations of various methods to improve the design and performance of the devices for improved efficiency, durability, and longer operational lifetime. The Institute of Space and Nuclear Power Studies participated in vacuum testing of PX series alkali metal thermal-to-electric converter cells and developed the alkali metal thermal-to-electric converter Performance Evaluation and Analysis Model. This computer model consisted of a sodium pressure loss model, a cell electrochemical and electric model, and a radiation/conduction heat transfer model. The code closely predicted the operation and performance of a wide variety of PX series cells which led to suggestions for improvements to both lifetime and performance. The code provides valuable insight into the operation of the cell, predicts parameters of components within the cell, and is a useful tool for predicting both the transient and steady state performance of systems of cells.

  9. Experimental investigations, modeling, and analyses of high-temperature devices for space applications: Part 2. Final report, June 1996--December 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tournier, J.; El-Genk, M.S.; Huang, L.

    1999-01-01

    The Institute of Space and Nuclear Power Studies at the University of New Mexico has developed a computer simulation of cylindrical geometry alkali metal thermal-to-electric converter cells using a standard Fortran 77 computer code. The objective and use of this code was to compare the experimental measurements with computer simulations, upgrade the model as appropriate, and conduct investigations of various methods to improve the design and performance of the devices for improved efficiency, durability, and longer operational lifetime. The Institute of Space and Nuclear Power Studies participated in vacuum testing of PX series alkali metal thermal-to-electric converter cells and developed the alkali metal thermal-to-electric converter Performance Evaluation and Analysis Model. This computer model consisted of a sodium pressure loss model, a cell electrochemical and electric model, and a radiation/conduction heat transfer model. The code closely predicted the operation and performance of a wide variety of PX series cells which led to suggestions for improvements to both lifetime and performance. The code provides valuable insight into the operation of the cell, predicts parameters of components within the cell, and is a useful tool for predicting both the transient and steady state performance of systems of cells.

  10. Novel 3D ultrasound image-based biomarkers based on a feature selection from a 2D standardized vessel wall thickness map: a tool for sensitive assessment of therapies for carotid atherosclerosis

    NASA Astrophysics Data System (ADS)

    Chiu, Bernard; Li, Bing; Chow, Tommy W. S.

    2013-09-01

    With the advent of new therapies and management strategies for carotid atherosclerosis, there is a parallel need for measurement tools or biomarkers to evaluate the efficacy of these new strategies. 3D ultrasound has been shown to provide reproducible measurements of plaque area/volume and vessel wall volume. However, since carotid atherosclerosis is a focal disease that predominantly occurs at bifurcations, biomarkers based on local plaque change may be more sensitive than global volumetric measurements in demonstrating efficacy of new therapies. The ultimate goal of this paper is to develop a biomarker that is based on the local distribution of vessel-wall-plus-plaque thickness change (VWT-Change) that has occurred during the course of a clinical study. To allow comparison between different treatment groups, the VWT-Change distribution of each subject must first be mapped to a standardized domain. In this study, we developed a technique to map the 3D VWT-Change distribution to a 2D standardized template. We then applied a feature selection technique to identify regions on the 2D standardized map on which subjects in different treatment groups exhibit greater difference in VWT-Change. The proposed algorithm was applied to analyse the VWT-Change of 20 subjects in a placebo-controlled study of the effect of atorvastatin (Lipitor). The average VWT-Change for each subject was computed (i) over all points in the 2D map and (ii) over feature points only. For the average computed over all points, 97 subjects per group would be required to detect an effect size of 25% that of atorvastatin in a six-month study. The sample size is reduced to 25 subjects if the average were computed over feature points only. The introduction of this sensitive quantification technique for carotid atherosclerosis progression/regression would allow many proof-of-principle studies to be performed before a more costly and longer study involving a larger population is held to confirm the treatment efficacy.
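
    The reported reduction in required sample size follows from a standard two-sample power calculation once the biomarker's effect size is known. The sketch below is a hypothetical Python illustration using statsmodels; the two effect sizes are invented stand-ins for the whole-map and feature-point averages, not values taken from the study.

    ```python
    from statsmodels.stats.power import TTestIndPower

    # Hypothetical standardized effect sizes (Cohen's d) for the whole-map average
    # and for the feature-point average of the VWT-Change biomarker.
    analysis = TTestIndPower()
    for label, d in [("all points", 0.41), ("feature points only", 0.82)]:
        n = analysis.solve_power(effect_size=d, alpha=0.05, power=0.8,
                                 alternative="two-sided")
        print(f"{label}: ~{int(round(n))} subjects per group")
    ```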

  11. Computing tools for implementing standards for single-case designs.

    PubMed

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were found to be accurate. When missing data were present, most tools either gave an error message or conducted the analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.

  12. Adaptive MCMC in Bayesian phylogenetics: an application to analyzing partitioned data in BEAST.

    PubMed

    Baele, Guy; Lemey, Philippe; Rambaut, Andrew; Suchard, Marc A

    2017-06-15

    Advances in sequencing technology continue to deliver increasingly large molecular sequence datasets that are often heavily partitioned in order to accurately model the underlying evolutionary processes. In phylogenetic analyses, partitioning strategies involve estimating conditionally independent models of molecular evolution for different genes and different positions within those genes, requiring a large number of evolutionary parameters that have to be estimated, leading to an increased computational burden for such analyses. The past two decades have also seen the rise of multi-core processors, both in the central processing unit (CPU) and graphics processing unit (GPU) markets, enabling massively parallel computations that are not yet fully exploited by many software packages for multipartite analyses. We here propose a Markov chain Monte Carlo (MCMC) approach using an adaptive multivariate transition kernel to estimate in parallel a large number of parameters, split across partitioned data, by exploiting multi-core processing. Across several real-world examples, we demonstrate that our approach enables the estimation of these multipartite parameters more efficiently than standard approaches that typically use a mixture of univariate transition kernels. In one case, when estimating the relative rate parameter of the non-coding partition in a heterochronous dataset, MCMC integration efficiency improves by > 14-fold. Our implementation is part of the BEAST code base, a widely used open source software package to perform Bayesian phylogenetic inference. guy.baele@kuleuven.be. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
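
    The adaptive multivariate transition kernel can be illustrated generically: proposals are drawn from a multivariate normal whose covariance is learned from the chain's own history, in the spirit of adaptive Metropolis, rather than updating one parameter at a time. The Python sketch below is a toy illustration under that assumption, not BEAST's implementation.

    ```python
    import numpy as np

    def adaptive_metropolis(log_post, x0, n_iter=5000, adapt_start=500):
        """Random-walk Metropolis with a proposal covariance adapted from the chain."""
        rng = np.random.default_rng(5)
        d = len(x0)
        chain = np.empty((n_iter, d))
        x = np.asarray(x0, dtype=float)
        lp = log_post(x)
        cov = 0.1 * np.eye(d)
        for i in range(n_iter):
            if i >= adapt_start:                       # learn covariance from history
                cov = np.cov(chain[:i].T) * 2.38**2 / d + 1e-8 * np.eye(d)
            prop = rng.multivariate_normal(x, cov)
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:    # Metropolis accept/reject
                x, lp = prop, lp_prop
            chain[i] = x
        return chain

    # Toy target: correlated bivariate normal (stand-in for correlated rate parameters).
    corr = np.array([[1.0, 0.9], [0.9, 1.0]])
    prec = np.linalg.inv(corr)
    log_post = lambda z: -0.5 * z @ prec @ z
    samples = adaptive_metropolis(log_post, x0=[0.0, 0.0])
    print("posterior mean estimate:", samples[2000:].mean(axis=0))
    ```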

  13. INTEGRATION OF SYSTEMS GLYCOBIOLOGY WITH BIOINFORMATICS TOOLBOXES, GLYCOINFORMATICS RESOURCES AND GLYCOPROTEOMICS DATA

    PubMed Central

    Liu, Gang; Neelamegham, Sriram

    2015-01-01

    The glycome constitutes the entire complement of free carbohydrates and glycoconjugates expressed on whole cells or tissues. ‘Systems Glycobiology’ is an emerging discipline that aims to quantitatively describe and analyse the glycome. Here, instead of developing a detailed understanding of single biochemical processes, a combination of computational and experimental tools is used to seek an integrated or ‘systems-level’ view. This can explain how multiple biochemical reactions and transport processes interact with each other to control glycome biosynthesis and function. Computational methods in this field commonly build in silico reaction network models to describe experimental data derived from structural studies that measure cell-surface glycan distribution. While considerable progress has been made, several challenges remain due to the complex and heterogeneous nature of this post-translational modification. First, for the in silico models to be standardized and shared among laboratories, it is necessary to integrate glycan structure information and glycosylation-related enzyme definitions into the mathematical models. Second, as glycoinformatics resources grow, it would be attractive to utilize ‘Big Data’ stored in these repositories for model construction and validation. Third, while the technology for profiling the glycome at the whole-cell level has been standardized, there is a need to integrate mass spectrometry derived site-specific glycosylation data into the models. The current review discusses progress that is being made to resolve the above bottlenecks. The focus is on how computational models can bridge the gap between ‘data’ generated in wet-laboratory studies and ‘knowledge’ that can enhance our understanding of the glycome. PMID:25871730

  14. Rapid and Robust Cross-Correlation-Based Seismic Signal Identification Using an Approximate Nearest Neighbor Method

    DOE PAGES

    Tibi, Rigobert; Young, Christopher; Gonzales, Antonio; ...

    2017-07-04

    The matched filtering technique that uses the cross correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this paper, we introduce an approximate nearest neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation. Our method begins with a projection into a reduced dimensionality space, based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors for a query waveform is accomplished by iteratively comparing it with the neighbors of its immediate neighbors. We used the approach to search for matches to each of ~2300 analyst-reviewed signal detections reported in May 2010 for the International Monitoring System station MKAR. The template library in this case consists of a data set of more than 200,000 analyst-reviewed signal detections for the same station from February 2002 to July 2016 (excluding May 2010). Of these signal detections, 73% are teleseismic first P and 17% regional phases (Pn, Pg, Sn, and Lg). Finally, the analyses performed on a standard desktop computer show that the proposed ANN approach performs a search of the large template libraries about 25 times faster than the standard full linear search and achieves recall rates greater than 80%, with the recall rate increasing for higher correlation thresholds.
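
    The core idea of the record above, replacing exhaustive waveform correlation by a nearest-neighbor search in a reduced space built from correlations with a random subset of templates, can be sketched in a few lines. The snippet below is only a schematic stand-in: it uses synthetic waveforms, a zero-lag normalized cross-correlation, and scikit-learn's generic NearestNeighbors in place of the iterative neighbors-of-neighbors search described by the authors.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

def ncc(a, b):
    """Zero-lag normalized cross-correlation between two equal-length waveforms."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(a @ b) / len(a)

# Synthetic stand-ins for an archive of templates and a query detection.
n_templates, n_samples = 2000, 512
templates = rng.standard_normal((n_templates, n_samples))
query = templates[123] + 0.3 * rng.standard_normal(n_samples)   # noisy repeat of template 123

# 1) Project every waveform into a low-dimensional space: its correlations
#    with a small randomized subset of the archive.
pivot_idx = rng.choice(n_templates, size=32, replace=False)
pivots = templates[pivot_idx]
features = np.array([[ncc(t, p) for p in pivots] for t in templates])
q_feat = np.array([[ncc(query, p) for p in pivots]])

# 2) Approximate the expensive full correlation search by a nearest-neighbor
#    search in the reduced space (a library neighbor search stands in here for
#    the iterative neighbors-of-neighbors strategy described in the abstract).
nn = NearestNeighbors(n_neighbors=5).fit(features)
dist, idx = nn.kneighbors(q_feat)
print("candidate templates:", idx[0])   # template 123 should rank near the top
```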

  15. Rapid and Robust Cross-Correlation-Based Seismic Signal Identification Using an Approximate Nearest Neighbor Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tibi, Rigobert; Young, Christopher; Gonzales, Antonio

    The matched filtering technique that uses the cross correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this paper, we introduce an approximate nearest neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation. Our method begins with a projection into a reduced dimensionality space, based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors for a query waveform is accomplished by iteratively comparing it with the neighbors of its immediate neighbors. We used the approach to search for matches to each of ~2300 analyst-reviewed signal detections reported in May 2010 for the International Monitoring System station MKAR. The template library in this case consists of a data set of more than 200,000 analyst-reviewed signal detections for the same station from February 2002 to July 2016 (excluding May 2010). Of these signal detections, 73% are teleseismic first P and 17% regional phases (Pn, Pg, Sn, and Lg). Finally, the analyses performed on a standard desktop computer show that the proposed ANN approach performs a search of the large template libraries about 25 times faster than the standard full linear search and achieves recall rates greater than 80%, with the recall rate increasing for higher correlation thresholds.

  16. Current Perspectives in Imaging Modalities for the Assessment of Unruptured Intracranial Aneurysms: A Comparative Analysis and Review.

    PubMed

    Turan, Nefize; Heider, Robert A; Roy, Anil K; Miller, Brandon A; Mullins, Mark E; Barrow, Daniel L; Grossberg, Jonathan; Pradilla, Gustavo

    2018-05-01

    Intracranial aneurysms (IAs) are pathologic dilatations of cerebral arteries. This systematic review summarizes and compares imaging techniques for assessing unruptured IAs (UIAs). This review also addresses their uses in different scopes of practice. Pathophysiologic mechanisms are reviewed to better understand the clinical usefulness of each imaging modality. A literature review was performed using PubMed with these search terms: "intracranial aneurysm," "cerebral aneurysm," "magnetic resonance angiography (MRA)," "computed tomography angiography (CTA)," "catheter angiography," "digital subtraction angiography," "molecular imaging," "ferumoxytol," and "myeloperoxidase." Only studies in English were cited. Since the development and improvement of noninvasive diagnostic imaging (computed tomography angiography and magnetic resonance angiography), many prospective studies and meta-analyses have compared these tests with gold standard digital subtraction angiography (DSA). Although computed tomography angiography and magnetic resonance angiography have lower detection rates for UIAs, they are vital in the treatment and follow-up of UIAs. The reduction in ionizing radiation and lack of endovascular instrumentation with these modalities provide benefits compared with DSA. Novel molecular imaging techniques to detect inflammation within the aneurysmal wall with the goal of stratifying risk based on level of inflammation are under investigation. DSA remains the gold standard for preoperative planning and follow-up for patients with IA. Newer imaging modalities such as ferumoxytol-enhanced magnetic resonance imaging are emerging techniques that provide critical in vivo information about the inflammatory milieu within aneurysm walls. With further study, these techniques may provide aneurysm rupture risk and prediction models for individualized patient care. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Use of Coronary Computed Tomographic Angiography to Guide Management of Patients With Coronary Disease

    PubMed Central

    Williams, Michelle C.; Hunter, Amanda; Shah, Anoop S.V.; Assi, Valentina; Lewis, Stephanie; Smith, Joel; Berry, Colin; Boon, Nicholas A.; Clark, Elizabeth; Flather, Marcus; Forbes, John; McLean, Scott; Roditi, Giles; van Beek, Edwin J.R.; Timmis, Adam D.; Newby, David E.

    2016-01-01

    Background In a prospective, multicenter, randomized controlled trial, 4,146 patients were randomized to receive standard care or standard care plus coronary computed tomography angiography (CCTA). Objectives The purpose of this study was to explore the consequences of CCTA-assisted diagnosis on invasive coronary angiography, preventive treatments, and clinical outcomes. Methods In post hoc analyses, we assessed changes in invasive coronary angiography, preventive treatments, and clinical outcomes using national electronic health records. Results Despite similar overall rates (409 vs. 401; p = 0.451), invasive angiography was less likely to demonstrate normal coronary arteries (20 vs. 56; hazard ratios [HRs]: 0.39 [95% confidence interval (CI): 0.23 to 0.68]; p < 0.001) but more likely to show obstructive coronary artery disease (283 vs. 230; HR: 1.29 [95% CI: 1.08 to 1.55]; p = 0.005) in those allocated to CCTA. More preventive therapies (283 vs. 74; HR: 4.03 [95% CI: 3.12 to 5.20]; p < 0.001) were initiated after CCTA, with each drug commencing at a median of 48 to 52 days after clinic attendance. From the median time for preventive therapy initiation (50 days), fatal and nonfatal myocardial infarction was halved in patients allocated to CCTA compared with those assigned to standard care (17 vs. 34; HR: 0.50 [95% CI: 0.28 to 0.88]; p = 0.020). Cumulative 6-month costs were slightly higher with CCTA: difference $462 (95% CI: $303 to $621). Conclusions In patients with suspected angina due to coronary heart disease, CCTA leads to more appropriate use of invasive angiography and alterations in preventive therapies that were associated with a halving of fatal and non-fatal myocardial infarction. (Scottish COmputed Tomography of the HEART Trial [SCOT-HEART]; NCT01149590) PMID:27081014

  18. Standard atomic volumes in double-stranded DNA and packing in protein–DNA interfaces

    PubMed Central

    Nadassy, Katalin; Tomás-Oliveira, Isabel; Alberts, Ian; Janin, Joël; Wodak, Shoshana J.

    2001-01-01

    Standard volumes for atoms in double-stranded B-DNA are derived using high resolution crystal structures from the Nucleic Acid Database (NDB) and compared with corresponding values derived from crystal structures of small organic compounds in the Cambridge Structural Database (CSD). Two different methods are used to compute these volumes: the classical Voronoi method, which does not depend on the size of atoms, and the related Radical Planes method which does. Results show that atomic groups buried in the interior of double-stranded DNA are, on average, more tightly packed than in related small molecules in the CSD. The packing efficiency of DNA atoms at the interfaces of 25 high resolution protein–DNA complexes is determined by computing the ratios between the volumes of interfacial DNA atoms and the corresponding standard volumes. These ratios are found to be close to unity, indicating that the DNA atoms at protein–DNA interfaces are as closely packed as in crystals of B-DNA. Analogous volume ratios, computed for buried protein atoms, are also near unity, confirming our earlier conclusions that the packing efficiency of these atoms is similar to that in the protein interior. In addition, we examine the number, volume and solvent occupation of cavities located at the protein–DNA interfaces and compare them with those in the protein interior. Cavities are found to be ubiquitous in the interfaces as well as inside the protein moieties. The frequency of solvent occupation of cavities is, however, higher in the interfaces, indicating that these interfaces are more hydrated than protein interiors. Lastly, we compare our results with those obtained using two different measures of shape complementarity of the analysed interfaces, and find that the correlation between our volume ratios and these measures, as well as between the measures themselves, is weak. Our results indicate that a tightly packed environment made up of DNA, protein and solvent atoms plays a significant role in protein–DNA recognition. PMID:11504874
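
    As a rough illustration of the volume calculation underlying such packing analyses, the sketch below computes bounded Voronoi cell volumes for a 3D point set with SciPy. It is a simplification under stated assumptions: the points are a jittered toy lattice rather than atomic coordinates, cells on the surface of the cloud are skipped, and the chemical atom typing and Radical Planes variant used in the study are not reproduced.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def voronoi_cell_volumes(points):
    """Volumes of bounded Voronoi cells for a 3D point set.

    Unbounded cells (at the surface of the point cloud) get NaN; in practice
    buried atoms are surrounded by neighbours, so their cells are finite.
    """
    vor = Voronoi(points)
    volumes = np.full(len(points), np.nan)
    for i, region_index in enumerate(vor.point_region):
        region = vor.regions[region_index]
        if -1 in region or len(region) == 0:      # open cell: skip
            continue
        volumes[i] = ConvexHull(vor.vertices[region]).volume
    return volumes

# Toy example: a jittered cubic lattice standing in for atomic coordinates (in Angstroms).
rng = np.random.default_rng(1)
grid = np.stack(np.meshgrid(*[np.arange(6.0)] * 3), axis=-1).reshape(-1, 3)
atoms = grid + 0.1 * rng.standard_normal(grid.shape)

vols = voronoi_cell_volumes(atoms)
print("mean volume of interior cells: %.2f A^3" % np.nanmean(vols))
```

    Dividing such per-atom volumes by tabulated standard volumes would give the volume ratios discussed in the record, with ratios near unity indicating crystal-like packing.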

  19. Applications of Micro-CT scanning in medicine and dentistry: Microstructural analyses of a Wistar Rat mandible and a urinary tract stone

    NASA Astrophysics Data System (ADS)

    Latief, F. D. E.; Sari, D. S.; Fitri, L. A.

    2017-08-01

    High-resolution tomographic imaging by means of x-ray micro-computed tomography (μCT) has been widely utilized for morphological evaluations in dentistry and medicine. The use of μCT follows a standard procedure: image acquisition, reconstruction, processing, evaluation using image analysis, and reporting of results. This paper discusses methods of μCT using a specific scanning device, the Bruker SkyScan 1173 High Energy Micro-CT. We present a description of the general workflow, information on terminology for the measured parameters and corresponding units, and further analyses that can potentially be conducted with this technology. Brief qualitative and quantitative analyses, including basic image processing (VOI selection and thresholding) and measurement of several morphometrical variables (total VOI volume, object volume, percentage of total volume, total VOI surface, object surface, object surface/volume ratio, object surface density, structure thickness, structure separation, total porosity) were conducted on two samples, the mandible of a Wistar rat and a urinary tract stone, to illustrate the abilities of this device and its accompanying software package. The results of these analyses for both samples are reported, along with a discussion of the types of analyses that are possible using digital images obtained with a μCT scanning device, paying particular attention to non-diagnostic ex vivo research applications.
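
    The morphometric quantities listed above (total VOI volume, object volume, percent object volume, porosity) reduce to simple voxel counting once a VOI has been selected and thresholded. The sketch below illustrates that arithmetic on a synthetic volume; the voxel size, threshold, and array shapes are invented for the example, and the SkyScan analysis software itself is not reproduced.

```python
import numpy as np

# Synthetic stand-in for a reconstructed micro-CT volume: a porous block whose
# voxel size is assumed to be 10 micrometres (an arbitrary illustrative value).
rng = np.random.default_rng(42)
voxel_size_um = 10.0
volume = rng.normal(loc=100, scale=20, size=(128, 128, 128))
volume[rng.random(volume.shape) < 0.15] = 20     # carve out ~15% "pores"

# Select a cubic volume of interest (VOI) and apply a global threshold, as in
# the standard acquisition -> reconstruction -> processing -> evaluation flow.
voi = volume[16:112, 16:112, 16:112]
threshold = 60                                   # grey-value cut between pore and solid
binary = voi > threshold                         # True = object (solid) voxels

voxel_vol_mm3 = (voxel_size_um / 1000.0) ** 3
tv = binary.size * voxel_vol_mm3                 # total VOI volume (TV)
ov = binary.sum() * voxel_vol_mm3                # object volume (OV)

print(f"total VOI volume   TV = {tv:.2f} mm^3")
print(f"object volume      OV = {ov:.2f} mm^3")
print(f"percent object volume  OV/TV = {100 * ov / tv:.1f} %")
print(f"total porosity         = {100 * (1 - ov / tv):.1f} %")
```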

  20. Exploring a Black Body Source as an Absolute Radiometric Calibration Standard and Comparison with a NIST Traced Lamp Standard

    NASA Technical Reports Server (NTRS)

    Green, Robert O.; Chrien, Thomas; Sarture, Chuck

    2001-01-01

    Radiometric calibration of the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) is required for the scientific research and application objectives pursued with the spectroscopic measurements. Specifically, calibration is required for: inter-comparison of AVIRIS data measured at different locations and at different times; analysis of AVIRIS data with data measured by other instruments; and analysis of AVIRIS data in conjunction with computer models. The primary effect of radiometric calibration is conversion of AVIRIS instrument response values (digitized numbers, or DN) to units of absolute radiance. For example, a figure shows the instrument response spectrum measured by AVIRIS over a portion of Rogers Dry Lake, California, and another figure shows the same spectrum calibrated to radiance. Only the calibrated spectrum may be quantitatively analyzed for science research and application objectives. Since the initial development of the AVIRIS instrument, radiometric calibration has been based upon a 1000-W irradiance lamp with a calibration traced to the National Institute of Standards and Technology (NIST). There are several advantages to this irradiance-lamp calibration approach. First, the considerable effort of NIST backs up the calibration. Second, by changing the distance to the lamp, the output can closely span the radiance levels measured by AVIRIS. Third, this type of standard is widely used. Fourth, these calibrated lamps are comparatively inexpensive. Conversely, there are several disadvantages to this approach as well. First, the lamp is not a primary standard. Second, the lamp output characteristics may change in an unknown manner through time. Third, it is difficult to assess, constrain, or improve the calibration uncertainty delivered with the lamp. In an attempt to explore the effect of, and potentially address, some of these disadvantages, a set of analyses and measurements comparing an irradiance lamp with a black-body source has been completed. This research is ongoing, and the current set of measurements, analyses, and results are presented in this paper.
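
    The appeal of a black-body source is that its spectral radiance is fixed by Planck's law once its temperature is known, rather than by a transferred lamp calibration. The sketch below evaluates Planck's law over an AVIRIS-like 0.4-2.5 micrometre range for an assumed source temperature; the temperature and wavelength grid are illustrative only.

```python
import numpy as np

# Physical constants (SI).
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_um, temp_k):
    """Blackbody spectral radiance L(lambda, T) in W / (m^2 sr um) from Planck's law."""
    lam = wavelength_um * 1e-6                       # micrometres -> metres
    L = (2 * h * c**2 / lam**5) / (np.exp(h * c / (lam * k * temp_k)) - 1.0)
    return L * 1e-6                                  # per metre -> per micrometre

# AVIRIS-like spectral range (0.4-2.5 um) and an assumed source temperature.
wavelengths = np.linspace(0.4, 2.5, 211)
T_source = 1200.0                                    # kelvin, illustrative only
radiance = planck_radiance(wavelengths, T_source)

for w in (0.5, 1.0, 2.0):
    i = np.argmin(np.abs(wavelengths - w))
    print(f"L({wavelengths[i]:.2f} um, {T_source:.0f} K) = {radiance[i]:.3e} W m^-2 sr^-1 um^-1")
```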

  1. A Windows application for computing standardized mortality ratios and standardized incidence ratios in cohort studies based on calculation of exact person-years at risk.

    PubMed

    Geiss, Karla; Meyer, Martin

    2013-09-01

    Standardized mortality ratios and standardized incidence ratios are widely used in cohort studies to compare mortality or incidence in a study population to that in the general population on an age-time-specific basis, but their computation is not included in standard statistical software packages. Here we present a user-friendly Microsoft Windows program for computing standardized mortality ratios and standardized incidence ratios based on calculation of exact person-years at risk stratified by sex, age and calendar time. The program offers flexible import of different file formats for input data and easy handling of general population reference rate tables, such as mortality or incidence tables exported from cancer registry databases. The application of the program is illustrated with two examples using empirical data from the Bavarian Cancer Registry. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
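
    The core calculation such a program automates is straightforward once person-years have been stratified: expected events are the sum over strata of person-years times the reference rate, and the SMR (or SIR) is observed over expected. The sketch below shows this with invented numbers and a Byar approximation for the confidence interval; it is not the published program.

```python
import numpy as np
import pandas as pd

# Person-years at risk in the cohort, stratified by sex, age group and calendar period
# (toy numbers; a real program derives these exactly from entry/exit dates).
cohort = pd.DataFrame({
    "sex":    ["m", "m", "f", "f"],
    "age":    ["50-59", "60-69", "50-59", "60-69"],
    "period": ["2000-2004"] * 4,
    "person_years": [1200.0, 800.0, 1500.0, 900.0],
})

# Reference (general-population) mortality rates per 100,000 person-years (invented).
reference = pd.DataFrame({
    "sex":    ["m", "m", "f", "f"],
    "age":    ["50-59", "60-69", "50-59", "60-69"],
    "period": ["2000-2004"] * 4,
    "rate_per_1e5": [600.0, 1500.0, 350.0, 900.0],
})

observed_deaths = 52

merged = cohort.merge(reference, on=["sex", "age", "period"])
expected = (merged["person_years"] * merged["rate_per_1e5"] / 1e5).sum()

smr = observed_deaths / expected
# Approximate 95% CI assuming the observed count is Poisson (Byar's approximation).
lo = observed_deaths * (1 - 1 / (9 * observed_deaths) - 1.96 / (3 * np.sqrt(observed_deaths))) ** 3 / expected
hi = (observed_deaths + 1) * (1 - 1 / (9 * (observed_deaths + 1)) + 1.96 / (3 * np.sqrt(observed_deaths + 1))) ** 3 / expected

print(f"expected deaths = {expected:.1f}")
print(f"SMR = {smr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```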

  2. No Evidence for Extensions to the Standard Cosmological Model.

    PubMed

    Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna

    2017-09-08

    We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (lnB=-7.8), nonzero scalar-to-tensor ratio (lnB=-4.3), running of the spectral index (lnB=-4.7), curvature (lnB=-3.6), nonstandard numbers of neutrinos (lnB=-3.1), nonstandard neutrino masses (lnB=-3.2), nonstandard lensing potential (lnB=-4.6), evolving dark energy (lnB=-3.2), sterile neutrinos (lnB=-6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (lnB=-10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (lnB∼+2).

  3. No Evidence for Extensions to the Standard Cosmological Model

    NASA Astrophysics Data System (ADS)

    Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna

    2017-09-01

    We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (lnB = -7.8), nonzero scalar-to-tensor ratio (lnB = -4.3), running of the spectral index (lnB = -4.7), curvature (lnB = -3.6), nonstandard numbers of neutrinos (lnB = -3.1), nonstandard neutrino masses (lnB = -3.2), nonstandard lensing potential (lnB = -4.6), evolving dark energy (lnB = -3.2), sterile neutrinos (lnB = -6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (lnB = -10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (lnB ∼ +2).

  4. The evaluation of a novel haptic-enabled virtual reality approach for computer-aided cephalometry.

    PubMed

    Medellín-Castillo, H I; Govea-Valladares, E H; Pérez-Guerrero, C N; Gil-Valladares, J; Lim, Theodore; Ritchie, James M

    2016-07-01

    In oral and maxillofacial surgery, conventional radiographic cephalometry is one of the standard auxiliary tools for diagnosis and surgical planning. While contemporary computer-assisted cephalometric systems and methodologies support cephalometric analysis, they tend to be neither practical nor intuitive for practitioners. This is particularly the case for 3D methods since the associated landmarking process is difficult and time consuming. In addition to this, there are no 3D cephalometry norms or standards defined; therefore new landmark selection methods are required which will help facilitate their establishment. This paper presents and evaluates a novel haptic-enabled landmarking approach to overcome some of the difficulties and disadvantages of the current landmarking processes used in 2D and 3D cephalometry. In order to evaluate this new system's feasibility and performance, 21 dental surgeons (comprising 7 Novices, 7 Semi-experts and 7 Experts) performed a range of case studies using haptic-enabled 2D, 2½D and 3D digital cephalometric analyses. The results compared the 2D, 2½D and 3D cephalometric values, errors and standard deviations for each case study and associated group of participants and revealed that 3D cephalometry significantly reduced landmarking errors and variability compared to 2D methods. Through enhancing the process by providing a sense of touch, the haptic-enabled 3D digital cephalometric approach was found to be feasible and more intuitive than its counterparts, as well as effective at reducing errors, the variability of the measurements taken and the associated task completion times. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. The validity of transverse intermaxillary analysis by traditional PA cephalometry compared with cone-beam computed tomography.

    PubMed

    Cheung, Gordon; Goonewardene, Mithran Suresh; Islam, Syed Mohammed Shamsul; Murray, Kevin; Koong, Bernard

    2013-05-01

    To assess the validity of using jugale (J) and Antegonion (Ag) on Posterior-Anterior cephalograms (PAC) as landmarks for transverse intermaxillary analysis when compared with Cone Beam Computed Tomography (CBCT). Conventional PAC and CBCT images were taken of 28 dry skulls. Craniometric measurements between the bilateral landmarks, Antegonion and Jugale, were obtained from the skulls using a microscribe and recorded as the base standard. The corresponding landmarks were identified and measured on CBCT and PAC and compared with the base standard measurements. The accuracy and reliability of the measurements were statistically evaluated and the validity was assessed by comparing the ability of the two image modalities to accurately diagnose an arbitrarily selected J-J/Ag-Ag ratio. All measurements were repeated at least 7 weeks apart. Intra-class correlations (ICC) and Bland-Altman plots were used to analyse the data. All three methods were shown to be reliable as all had a mean error of less than 0.5 mm between repeated measurements. When compared with the base standard, CBCT measurements were shown to have higher agreement (ICC: 0.861-0.964) compared with measurements taken from PAC (ICC: 0.794-0.796). When the arbitrary J-J/Ag-Ag ratio was assessed, 18 per cent of cases were incorrectly diagnosed with a transverse discrepancy on the PAC compared with CBCT, which incorrectly diagnosed 8.7 per cent.
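
    For readers unfamiliar with the agreement statistics used here, the sketch below computes Bland-Altman limits of agreement and a from-scratch ICC(2,1) for one modality against a base standard. The measurement values are simulated, not the study's data.

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two measurement methods."""
    diff = a - b
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.

    `data` is an (n_subjects, n_raters) array.
    """
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)
    col_means = data.mean(axis=0)
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)
    mse = ((data - row_means[:, None] - col_means[None, :] + grand) ** 2).sum() / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented example: bilateral Ag-Ag distances (mm) from the physical base standard
# and from one imaging modality, for 10 skulls.
rng = np.random.default_rng(7)
base = rng.normal(80, 4, size=10)
modality = base + rng.normal(0.2, 0.8, size=10)      # small bias and noise

bias, lo, hi = bland_altman(modality, base)
print(f"Bland-Altman bias = {bias:.2f} mm, limits of agreement [{lo:.2f}, {hi:.2f}] mm")
print(f"ICC(2,1) = {icc_2_1(np.column_stack([base, modality])):.3f}")
```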

  6. Science Teacher Efficacy and Extrinsic Factors Toward Professional Development Using Video Games in a Design-Based Research Model: The Next Generation of STEM Learning

    NASA Astrophysics Data System (ADS)

    Annetta, Leonard A.; Frazier, Wendy M.; Folta, Elizabeth; Holmes, Shawn; Lamb, Richard; Cheng, Meng-Tzu

    2013-02-01

    Design-based research principles guided the study of 51 secondary-science teachers in the second year of a 3-year professional development project. The project entailed the creation of student-centered, inquiry-based science video games. A professional development model appropriate for infusing innovative technologies into standards-based curricula was employed to determine how science teachers' attitudes and efficacy were impacted while designing science-based video games. The study's mixed-method design ascertained teacher efficacy on five factors (General computer use, Science Learning, Inquiry Teaching and Learning, Synchronous chat/text, and Playing Video Games) related to technology and gaming, using a web-based survey. Qualitative data in the form of online blog posts were gathered during the project to assist in the triangulation and assessment of teacher efficacy. Data analyses consisted of an Analysis of Variance and serial coding of teacher reflective responses. Results indicated that participants who used computers daily had higher efficacy for inquiry-based teaching methods and for science teaching and learning. Additional emergent findings revealed possible motivating factors for efficacy. This professional development project was focused on inquiry as a pedagogical strategy, standards-based science learning as a means to develop content knowledge, and creating video games as technological knowledge. The project was consistent with the Technological Pedagogical Content Knowledge (TPCK) framework, in which the overlap of the three components indicates the development of an integrated understanding of the suggested relationships. Findings provide suggestions for the development of standards-based science education software, its integration into the curriculum, and strategies for implementing technology into teaching practices.

  7. 77 FR 37733 - Technical Standard Order (TSO)-C68a, Airborne Automatic Dead Reckoning Computer Equipment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-22

    ..., Airborne Automatic Dead Reckoning Computer Equipment Utilizing Aircraft Heading and Doppler Ground Speed.... ACTION: Notice of intent to cancel Technical Standard Order (TSO)-C68a, Airborne automatic dead reckoning... dead reckoning computer equipment utilizing aircraft heading and Doppler ground speed and drift angle...

  8. Computer Ethics: A Slow Fade from Black and White to Shades of Gray

    ERIC Educational Resources Information Center

    Kraft, Theresa A.; Carlisle, Judith

    2011-01-01

    The expanded use of teaching case based analysis based on current events and news stories relating to computer ethics improves student engagement, encourages creativity and fosters an active learning environment. Professional ethics standards, accreditation standards for computer curriculum, ethics theories, resources for ethics on the internet,…

  9. Computer Security and the Data Encryption Standard. Proceedings of the Conference on Computer Security and the Data Encryption Standard.

    ERIC Educational Resources Information Center

    Branstad, Dennis K., Ed.

    The 15 papers and summaries of presentations in this collection provide technical information and guidance offered by representatives from federal agencies and private industry. Topics discussed include physical security, risk assessment, software security, computer network security, and applications and implementation of the Data Encryption…

  10. LINCS: Livermore's network architecture. [Octopus computing network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fletcher, J.G.

    1982-01-01

    Octopus, a local computing network that has been evolving at the Lawrence Livermore National Laboratory for over fifteen years, is currently undergoing a major revision. The primary purpose of the revision is to consolidate and redefine the variety of conventions and formats, which have grown up over the years, into a single standard family of protocols, the Livermore Interactive Network Communication Standard (LINCS). This standard treats the entire network as a single distributed operating system such that access to a computing resource is obtained in a single way, whether that resource is local (on the same computer as the accessing process) or remote (on another computer). LINCS encompasses not only communication but also such issues as the relationship of customer to server processes and the structure, naming, and protection of resources. The discussion includes: an overview of the Livermore user community and computing hardware, the functions and structure of each of the seven layers of LINCS protocol, the reasons why we have designed our own protocols and why we are dissatisfied by the directions that current protocol standards are taking.

  11. National Irrigation Water Quality Program data-synthesis data base

    USGS Publications Warehouse

    Seiler, Ralph L.; Skorupa, Joseph P.

    2001-01-01

    Under the National Irrigation Water Quality Program (NIWQP) of the U.S. Department of the Interior, researchers investigated contamination caused by irrigation drainage in 26 areas in the Western United States from 1986 to 1993. From 1992 to 1995, a comprehensive relational data base was built to organize data collected during the 26-area investigations. The data base provided the basis for analysis and synthesis of these data to identify common features of contaminated areas and hence dominant biologic, geologic, climatic, chemical, and physiographic factors that have resulted in contamination of water and biota in irrigated areas in the Western United States. Included in the data base are geologic, hydrologic, climatological, chemical, and cultural data that describe the 26 study areas in 14 Western States. The data base contains information on 1,264 sites from which water and bottom sediment were collected. It also contains chemical data from 6,903 analyses of surface water, 914 analyses of ground water, 707 analyses of inorganic constituents in bottom sediments, 223 analyses of organochlorine pesticides in bottom sediments, 8,217 analyses of inorganic constituents in biota, and 1,088 analyses of organic constituents in biota. The data base is available to the public and can be obtained at the NIWQP homepage http://www.usbr.gov/niwqp as dBase III tables for personal-computer systems or as American Standard Code for Information Interchange (ASCII) structured query language (SQL) command and data files for SQL data bases.

  12. A computationally efficient parallel Levenberg-Marquardt algorithm for highly parameterized inverse model analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Youzuo; O'Malley, Daniel; Vesselinov, Velimir V.

    Inverse modeling seeks model parameters given a set of observations. However, for practical problems, the number of measurements is often large and the model parameters are numerous, so conventional methods for inverse modeling can be computationally expensive. We have developed a new, computationally efficient parallel Levenberg-Marquardt method for solving inverse modeling problems with a highly parameterized model space. Levenberg-Marquardt methods require the solution of a linear system of equations which can be prohibitively expensive to compute for moderate to large-scale problems. Our novel method projects the original linear problem down to a Krylov subspace, such that the dimensionality of the problem can be significantly reduced. Furthermore, we store the Krylov subspace computed when using the first damping parameter and recycle the subspace for the subsequent damping parameters. The efficiency of our new inverse modeling algorithm is significantly improved using these computational techniques. We apply this new inverse modeling method to invert for random transmissivity fields in 2D and a random hydraulic conductivity field in 3D. Our algorithm is fast enough to solve for the distributed model parameters (transmissivity) in the model domain. The algorithm is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). By comparing with Levenberg-Marquardt methods using standard linear inversion techniques such as QR or SVD methods, our Levenberg-Marquardt method yields a speed-up ratio on the order of ~10^1 to ~10^2 in a multi-core computational environment. Furthermore, our new inverse modeling method is a powerful tool for characterizing subsurface heterogeneity for moderate- to large-scale problems.
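
    The central trick, solving the damped Levenberg-Marquardt step iteratively in a Krylov subspace instead of factorizing the normal equations, can be illustrated with SciPy's LSQR, which accepts the damping parameter directly. The sketch below applies it to a toy linear inverse problem; the fixed damping schedule is an assumption, and the subspace recycling across damping parameters described in the record is omitted.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

def lm_step(jacobian, residual, damping):
    """One Levenberg-Marquardt update solved iteratively in a Krylov subspace.

    Solves min ||J dx + r||^2 + damping^2 ||dx||^2 via LSQR instead of forming
    and factorizing the (possibly huge) normal equations J^T J + damping^2 I.
    """
    result = lsqr(jacobian, -residual, damp=damping, atol=1e-10, btol=1e-10)
    return result[0]

# Toy highly parameterized linear inverse problem: y = J x_true + noise.
rng = np.random.default_rng(3)
n_obs, n_par = 500, 200
J = rng.standard_normal((n_obs, n_par))
x_true = rng.standard_normal(n_par)
y = J @ x_true + 0.01 * rng.standard_normal(n_obs)

x = np.zeros(n_par)
for damping in (1.0, 0.1, 0.01):          # a simple fixed damping schedule
    r = J @ x - y                          # current residual
    x = x + lm_step(J, r, damping)
    print(f"damping {damping:>4}: misfit {np.linalg.norm(J @ x - y):.4f}")
```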

  13. A computationally efficient parallel Levenberg-Marquardt algorithm for highly parameterized inverse model analyses

    DOE PAGES

    Lin, Youzuo; O'Malley, Daniel; Vesselinov, Velimir V.

    2016-09-01

    Inverse modeling seeks model parameters given a set of observations. However, for practical problems, the number of measurements is often large and the model parameters are numerous, so conventional methods for inverse modeling can be computationally expensive. We have developed a new, computationally efficient parallel Levenberg-Marquardt method for solving inverse modeling problems with a highly parameterized model space. Levenberg-Marquardt methods require the solution of a linear system of equations which can be prohibitively expensive to compute for moderate to large-scale problems. Our novel method projects the original linear problem down to a Krylov subspace, such that the dimensionality of the problem can be significantly reduced. Furthermore, we store the Krylov subspace computed when using the first damping parameter and recycle the subspace for the subsequent damping parameters. The efficiency of our new inverse modeling algorithm is significantly improved using these computational techniques. We apply this new inverse modeling method to invert for random transmissivity fields in 2D and a random hydraulic conductivity field in 3D. Our algorithm is fast enough to solve for the distributed model parameters (transmissivity) in the model domain. The algorithm is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). By comparing with Levenberg-Marquardt methods using standard linear inversion techniques such as QR or SVD methods, our Levenberg-Marquardt method yields a speed-up ratio on the order of ~10^1 to ~10^2 in a multi-core computational environment. Furthermore, our new inverse modeling method is a powerful tool for characterizing subsurface heterogeneity for moderate- to large-scale problems.

  14. Towards a framework for developing semantic relatedness reference standards.

    PubMed

    Pakhomov, Serguei V S; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B; Ruggieri, Alexander; Chute, Christopher G

    2011-04-01

    Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the "moderate" range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. Copyright © 2010 Elsevier Inc. All rights reserved.

  15. An Overview of Preliminary Computational and Experimental Results for the Semi-Span Super-Sonic Transport (S4T) Wind-Tunnel Model

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Perry, Boyd, III; Florance, James R.; Sanetrik, Mark D.; Wieseman, Carol D.; Stevens, William L.; Funk, Christie J.; Hur, Jiyoung; Christhilf, David M.; Coulson, David A.

    2011-01-01

    A summary of computational and experimental aeroelastic and aeroservoelastic (ASE) results for the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model is presented. A broad range of analyses and multiple ASE wind-tunnel tests of the S4T have been performed in support of the ASE element in the Supersonics Program, part of NASA's Fundamental Aeronautics Program. The computational results to be presented include linear aeroelastic and ASE analyses, nonlinear aeroelastic analyses using an aeroelastic CFD code, and rapid aeroelastic analyses using CFD-based reduced-order models (ROMs). Experimental results from two closed-loop wind-tunnel tests performed at NASA Langley's Transonic Dynamics Tunnel (TDT) will be presented as well.

  16. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS

    PubMed Central

    Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2016-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions. PMID:27896971

  17. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    PubMed

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.

  18. Automated technical validation--a real time expert system for decision support.

    PubMed

    de Graeve, J S; Cambus, J P; Gruson, A; Valdiguié, P M

    1996-04-15

    Dealing daily with various machines and various control specimens produces a large amount of data that cannot be processed manually. To support decision-making, we wrote specific software that handles traditional QC, patient data (mean of normals, delta check) and criteria related to the analytical equipment (flags and alarms). Four machines (3 Ektachem 700 and 1 Hitachi 911) analysing 25 common chemical tests are controlled. Every day, three different control specimens are run on the various pieces of equipment, with a fourth added once a week (regional survey). The data are collected on a 486 microcomputer connected to the central computer. For every parameter, the standard deviation is compared with the published acceptable limits and the Westgard rules are evaluated. The mean of normals is continuously monitored. The final decision triggers either an audible alarm and a print-out of the cause of rejection or, if no alarm occurs, the daily print-out of recorded data, with or without Levey-Jennings graphs.
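
    Two of the Westgard rules such software typically evaluates (1-3s and 2-2s) are easy to express directly, as in the sketch below; the control values, target mean, and SD are invented, and the original system's full rule set and alarm handling are not reproduced.

```python
import numpy as np

def westgard_flags(values, target_mean, target_sd):
    """Flag control results violating the 1-3s or 2-2s Westgard rules."""
    z = (np.asarray(values, float) - target_mean) / target_sd
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:
            flags.append((i, "1-3s", values[i]))    # single result beyond +/-3 SD
        if i > 0 and abs(zi) > 2 and abs(z[i - 1]) > 2 and np.sign(zi) == np.sign(z[i - 1]):
            flags.append((i, "2-2s", values[i]))    # two consecutive results beyond 2 SD, same side
    return flags

# Example: daily glucose control results (mmol/L) against an assumed target of 5.50 +/- 0.15.
controls = [5.52, 5.48, 5.61, 5.84, 5.83, 5.49, 6.02]
for index, rule, value in westgard_flags(controls, target_mean=5.50, target_sd=0.15):
    print(f"run {index}: value {value} violates rule {rule}")
```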

  19. Digital 3D Microstructure Analysis of Concrete using X-Ray Micro Computed Tomography SkyScan 1173: A Preliminary Study

    NASA Astrophysics Data System (ADS)

    Latief, F. D. E.; Mohammad, I. H.; Rarasati, A. D.

    2017-11-01

    Digital imaging of a concrete sample using high-resolution tomographic imaging by means of X-Ray Micro Computed Tomography (μ-CT) has been conducted to assess the characteristics of the sample's structure. The standard procedure of image acquisition, reconstruction and image processing using a particular scanning device, i.e. the Bruker SkyScan 1173 High Energy Micro-CT, is elaborated. A brief qualitative and quantitative analysis was performed on the sample to give a basic idea of the capabilities of the system and the bundled software package. Calculations of total VOI volume, object volume, percent object volume, total VOI surface, object surface, object surface/volume ratio, object surface density, structure thickness, structure separation and total porosity were conducted and analysed. This paper serves as a brief description of how the device can produce the preferred image quality and of how the bundled software packages support qualitative and quantitative analysis.

  20. Data Assimilation - Advances and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
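
    As a minimal illustration of one of the update techniques mentioned (the ensemble Kalman filter), the sketch below performs a single stochastic, perturbed-observation analysis step for a two-component toy state with one observed component. The observation operator, covariances, and ensemble size are all assumptions made for the example.

```python
import numpy as np

def enkf_update(ensemble, observation, obs_operator, obs_cov, rng):
    """Stochastic (perturbed-observation) ensemble Kalman filter analysis step.

    ensemble: (n_members, n_state) prior samples of the state.
    obs_operator: (n_obs, n_state) linear observation operator H.
    """
    n_members = ensemble.shape[0]
    Hx = ensemble @ obs_operator.T                         # predicted observations
    X = ensemble - ensemble.mean(axis=0)                   # state anomalies
    Y = Hx - Hx.mean(axis=0)                               # predicted-obs anomalies
    P_hy = X.T @ Y / (n_members - 1)                       # state/obs cross covariance
    P_yy = Y.T @ Y / (n_members - 1) + obs_cov             # innovation covariance
    K = P_hy @ np.linalg.inv(P_yy)                         # Kalman gain
    perturbed = observation + rng.multivariate_normal(np.zeros(len(observation)), obs_cov, size=n_members)
    return ensemble + (perturbed - Hx) @ K.T               # updated (analysis) ensemble

rng = np.random.default_rng(0)
truth = np.array([1.0, -2.0])
H = np.array([[1.0, 0.0]])                                 # only the first component is observed
R = np.array([[0.05]])
prior = rng.multivariate_normal([0.0, 0.0], np.eye(2), size=200)
obs = H @ truth + rng.multivariate_normal([0.0], R)

posterior = enkf_update(prior, obs, H, R, rng)
print("prior mean:    ", prior.mean(axis=0))
print("posterior mean:", posterior.mean(axis=0))
```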

  1. Computer Games in Pre-School Settings: Didactical Challenges when Commercial Educational Computer Games Are Implemented in Kindergartens

    ERIC Educational Resources Information Center

    Vangsnes, Vigdis; Gram Okland, Nils Tore; Krumsvik, Rune

    2012-01-01

    This article focuses on the didactical implications when commercial educational computer games are used in Norwegian kindergartens by analysing the dramaturgy and the didactics of one particular game and the game in use in a pedagogical context. Our justification for analysing the game by using dramaturgic theory is that we consider the game to be…

  2. Computational Aeroelastic Analyses of a Low-Boom Supersonic Configuration

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph

    2015-01-01

    An overview of NASA's Commercial Supersonic Technology (CST) Aeroservoelasticity (ASE) element is provided with a focus on recent computational aeroelastic analyses of a low-boom supersonic configuration developed by Lockheed-Martin and referred to as the N+2 configuration. The overview includes details of the computational models developed to date including a linear finite element model (FEM), linear unsteady aerodynamic models, unstructured CFD grids, and CFD-based aeroelastic analyses. In addition, a summary of the work involving the development of aeroelastic reduced-order models (ROMs) and the development of an aero-propulso-servo-elastic (APSE) model is provided.

  3. Analyses of ACPL thermal/fluid conditioning system

    NASA Technical Reports Server (NTRS)

    Stephen, L. A.; Usher, L. H.

    1976-01-01

    Results of engineering analyses are reported. Initial computations were made using a modified control transfer function in which the system's performance was characterized parametrically using an analytical model. The analytical model was revised to represent the latest expansion chamber fluid manifold design, and system performance predictions were made. The parameters that were independently varied in these computations are listed. The system predictions used to characterize performance are primarily transient computer plots comparing the deviation between the average chamber temperature and the chamber temperature requirement. Additional computer plots were prepared. Results of parametric computations with the latest fluid manifold design are included.

  4. [Results of the marketing research study "Acceptance of physician's office computer systems"].

    PubMed

    Steinhausen, D; Brinkmann, F; Engelhard, A

    1998-01-01

    We report on a market research study of the acceptance of computer systems in doctors' surgeries. 11,000 returned questionnaires from doctors--users and nonusers--were analysed. We found that most of the doctors used their computers in a limited way, i.e. as a device for accounting. Concerning the level of utilisation, there are differences between men and women, West and East, and young and old. In this study we also analysed the computer-use behaviour of gynaecologists. As a result, two thirds of all nonusers do not intend to use a computer in the future.

  5. After heat distribution of a mobile nuclear power plant

    NASA Technical Reports Server (NTRS)

    Parker, W. G.; Vanbibber, L. E.; Tang, Y. S.

    1971-01-01

    A computer program was developed to analyze the transient afterheat temperature and pressure response of a mobile gas-cooled reactor power plant following impact. The program considers (in addition to the standard modes of heat transfer) fission product decay and transport, metal-water reactions, core and shield melting and displacement, and pressure and containment vessel stress response. Analyses were performed for eight cases (both deformed and undeformed models) to verify operability of the program options. The results indicated that for a 350 psi (241 N/sq cm) initial internal pressure, the containment vessel can survive over 100,000 seconds following impact before creep rupture occurs. Recommendations were developed as to directions for redesign to extend containment vessel life.

  6. A polynomial chaos approach to the analysis of vehicle dynamics under uncertainty

    NASA Astrophysics Data System (ADS)

    Kewlani, Gaurav; Crawford, Justin; Iagnemma, Karl

    2012-05-01

    The ability of ground vehicles to quickly and accurately analyse their dynamic response to a given input is critical to their safety and efficient autonomous operation. In field conditions, significant uncertainty is associated with terrain and/or vehicle parameter estimates, and this uncertainty must be considered in the analysis of vehicle motion dynamics. Here, polynomial chaos approaches that explicitly consider parametric uncertainty during modelling of vehicle dynamics are presented. They are shown to be computationally more efficient than the standard Monte Carlo scheme, and experimental results compared with the simulation results performed on ANVEL (a vehicle simulator) indicate that the method can be utilised for efficient and accurate prediction of vehicle motion in realistic scenarios.
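
    The contrast drawn with Monte Carlo can be made concrete with a one-dimensional toy: a Hermite polynomial chaos expansion of a nonlinear response to a standard-normal input recovers the mean and variance from a handful of deterministic quadrature evaluations, while plain Monte Carlo needs many model runs. The response function and truncation order below are invented for illustration and have nothing to do with the paper's vehicle models or ANVEL.

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def g(xi):
    """Toy nonlinear response (e.g. a peak load) driven by one standard-normal input."""
    return np.exp(0.3 * xi) + 0.5 * xi**2

# --- Polynomial chaos expansion in probabilists' Hermite polynomials He_n ---
order = 6
nodes, weights = He.hermegauss(30)                 # Gauss-Hermite(E) quadrature rule
norm = sqrt(2 * pi)                                # weights integrate exp(-x^2/2), not the pdf
coeffs = []
for n in range(order + 1):
    basis_n = He.hermeval(nodes, [0] * n + [1])    # He_n evaluated at the nodes
    c_n = np.sum(weights * g(nodes) * basis_n) / (norm * factorial(n))
    coeffs.append(c_n)

pce_mean = coeffs[0]
pce_var = sum(c**2 * factorial(n) for n, c in enumerate(coeffs) if n > 0)

# --- Plain Monte Carlo reference ---
rng = np.random.default_rng(0)
samples = g(rng.standard_normal(200_000))

print(f"PCE  mean {pce_mean:.4f}  variance {pce_var:.4f}   (a few deterministic evaluations)")
print(f"MC   mean {samples.mean():.4f}  variance {samples.var():.4f}   (200,000 model runs)")
```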

  7. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach to problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.

  8. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.

  9. Computational tools for comparative phenomics; the role and promise of ontologies

    PubMed Central

    Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert

    2012-01-01

    A major aim of the biological sciences is to gain an understanding of human physiology and disease. One important step towards such a goal is the discovery of the function of genes that will lead to better understanding of the physiology and pathophysiology of organisms, ultimately providing better understanding, diagnosis, and therapy. Our increasing ability to phenotypically characterise genetic variants of model organisms, coupled with systematic and hypothesis-driven mutagenesis, is resulting in a wealth of information that could potentially provide insight into the functions of all genes in an organism. The challenge we are now facing is to develop computational methods that can integrate and analyse such data. The introduction of formal ontologies that make their semantics explicit and accessible to automated reasoning promises the tantalizing possibility of standardizing biomedical knowledge, allowing for novel, powerful queries that bridge multiple domains, disciplines, species and levels of granularity. We review recent computational approaches that facilitate the integration of experimental data from model organisms with clinical observations in humans. These methods foster novel cross-species analysis approaches, thereby enabling comparative phenomics and leading to the potential of translating basic discoveries from the model systems into diagnostic and therapeutic advances at the clinical level. PMID:22814867

  10. Remote Measurement of Heat Flux from Power Plant Cooling Lakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garrett, Alfred J.; Kurzeja, Robert J.; Villa-Aleman, Eliel

    2013-06-01

    Laboratory experiments have demonstrated a correlation between the rate of heat loss q" from an experimental fluid to the air above and the standard deviation σ of the thermal variability in images of the fluid surface. These experimental results imply that q" can be derived directly from thermal imagery by computing σ. This paper analyses thermal imagery collected over two power plant cooling lakes to determine if the same relationship exists. Turbulent boundary layer theory predicts a linear relationship between q" and σ when both forced (wind driven) and free (buoyancy driven) convection are present. Datasets derived from ground- and helicopter-based imagery collections had correlation coefficients between σ and q" of 0.45 and 0.76, respectively. Values of q" computed from a function of σ and friction velocity u* derived from turbulent boundary layer theory had higher correlations with measured values of q" (0.84 and 0.89). Finally, this research may be applicable to the problem of calculating losses of heat from the ocean to the atmosphere during high-latitude cold-air outbreaks because it does not require the information typically needed to compute sensible, evaporative, and thermal radiation energy losses to the atmosphere.

  11. Ultrafast Comparison of Personal Genomes via Precomputed Genome Fingerprints

    PubMed Central

    Glusman, Gustavo; Mauldin, Denise E.; Hood, Leroy E.; Robinson, Max

    2017-01-01

    We present an ultrafast method for comparing personal genomes. We transform the standard genome representation (lists of variants relative to a reference) into “genome fingerprints” via locality sensitive hashing. The resulting genome fingerprints can be meaningfully compared even when the input data were obtained using different sequencing technologies, processed using different pipelines, represented in different data formats and relative to different reference versions. Furthermore, genome fingerprints are robust to up to 30% missing data. Because of their reduced size, computation on the genome fingerprints is fast and requires little memory. For example, we could compute all-against-all pairwise comparisons among the 2504 genomes in the 1000 Genomes data set in 67 s at high quality (21 μs per comparison, on a single processor), and achieved a lower quality approximation in just 11 s. Efficient computation enables scaling up a variety of important genome analyses, including quantifying relatedness, recognizing duplicative sequenced genomes in a set, population reconstruction, and many others. The original genome representation cannot be reconstructed from its fingerprint, effectively decoupling genome comparison from genome interpretation; the method thus has significant implications for privacy-preserving genome analytics. PMID:29018478
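
    The published fingerprinting scheme is specific (locality sensitive hashing of a genome's variant representation), but the general benefit of comparing small fixed-size signatures instead of full variant lists can be illustrated with a generic MinHash sketch, as below. The hashing scheme, signature length, and synthetic variant sets are assumptions; this is not the authors' algorithm.

```python
import numpy as np

def minhash_signature(variants, n_hashes=128, seed=0):
    """Reduce a set of variant keys (e.g. 'chr1:12345:A>G') to a fixed-size signature.

    This is a generic MinHash sketch, not the published fingerprinting scheme;
    it only illustrates why comparing small signatures is much faster than
    comparing full variant lists.
    """
    rng = np.random.default_rng(seed)
    salts = rng.integers(1, 2**61, size=n_hashes)
    hashed = np.array([[hash((salt, v)) & 0x7FFFFFFFFFFFFFFF for v in variants] for salt in salts])
    return hashed.min(axis=1)

def signature_similarity(sig_a, sig_b):
    """Fraction of matching minima, an estimate of the Jaccard similarity of the variant sets."""
    return float(np.mean(sig_a == sig_b))

# Two synthetic "genomes": overlapping variant sets with some private variants each.
shared = {f"chr1:{p}:A>G" for p in range(10_000)}
genome_a = shared | {f"chr2:{p}:C>T" for p in range(2_000)}
genome_b = shared | {f"chr3:{p}:G>A" for p in range(2_000)}

sig_a = minhash_signature(sorted(genome_a))
sig_b = minhash_signature(sorted(genome_b))
true_jaccard = len(genome_a & genome_b) / len(genome_a | genome_b)
print(f"estimated similarity {signature_similarity(sig_a, sig_b):.2f}  (true Jaccard {true_jaccard:.2f})")
```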

  12. Computational domain discretization in numerical analysis of flow within granular materials

    NASA Astrophysics Data System (ADS)

    Sosnowski, Marcin

    2018-06-01

    The discretization of computational domain is a crucial step in Computational Fluid Dynamics (CFD) because it influences not only the numerical stability of the analysed model but also the agreement of obtained results and real data. Modelling flow in packed beds of granular materials is a very challenging task in terms of discretization due to the existence of narrow spaces between spherical granules contacting tangentially in a single point. Standard approach to this issue results in a low quality mesh and unreliable results in consequence. Therefore the common method is to reduce the diameter of the modelled granules in order to eliminate the single-point contact between the individual granules. The drawback of such method is the adulteration of flow and contact heat resistance among others. Therefore an innovative method is proposed in the paper: single-point contact is extended to a cylinder-shaped volume contact. Such approach eliminates the low quality mesh elements and simultaneously introduces only slight distortion to the flow as well as contact heat transfer. The performed analysis of numerous test cases prove the great potential of the proposed method of meshing the packed beds of granular materials.

  13. A new computer program for mass screening of visual defects in preschool children.

    PubMed

    Briscoe, D; Lifshitz, T; Grotman, M; Kushelevsky, A; Vardi, H; Weizman, S; Biedner, B

    1998-04-01

    To test the effectiveness of a PC computer program for detecting vision disorders which could be used by non-trained personnel, and to determine the prevalence of visual impairment in a sample population of preschool children in the city of Beer-Sheba, Israel. 292 preschool children, aged 4-6 years, were examined in the kindergarten setting, using the computer system and "gold standard" tests. Visual acuity and stereopsis were tested and compared using Snellen type symbol charts and random dot stereograms respectively. The sensitivity, specificity, positive predictive value, negative predictive value, and kappa test were evaluated. A computer pseudo Worth four dot test was also performed but could not be compared with the standard Worth four dot test owing to the inability of many children to count. Agreement between computer and gold standard tests was 83% and 97.3% for visual acuity and stereopsis respectively. The sensitivity of the computer stereogram was only 50%, but it had a specificity of 98.9%, whereas the sensitivity and specificity of the visual acuity test were 81.5% and 83% respectively. The positive predictive value of both tests was about 63%. 27.7% of children tested had a visual acuity of 6/12 or less and stereopsis was absent in 28% using standard tests. Impairment of fusion was found in 5% of children using the computer pseudo Worth four dot test. The computer program was found to be stimulating, rapid, and easy to perform. The wide availability of computers in schools and at home allows it to be used as an additional screening tool by non-trained personnel, such as teachers and parents, but it is not a replacement for standard testing.
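
    For readers unfamiliar with the agreement statistics quoted above, here is a minimal sketch (an editorial illustration, not the authors' software) of how sensitivity, specificity, predictive values and Cohen's kappa follow from a 2x2 confusion matrix:

      def screening_metrics(tp, fp, fn, tn):
          # Agreement statistics for a screening test against a gold standard,
          # computed from the cells of a 2x2 confusion matrix.
          n = tp + fp + fn + tn
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          ppv = tp / (tp + fp)                  # positive predictive value
          npv = tn / (tn + fn)                  # negative predictive value
          observed = (tp + tn) / n
          expected = ((tp + fp) / n) * ((tp + fn) / n) + ((fn + tn) / n) * ((fp + tn) / n)
          kappa = (observed - expected) / (1 - expected)
          return {"sensitivity": sensitivity, "specificity": specificity,
                  "ppv": ppv, "npv": npv, "agreement": observed, "kappa": kappa}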

  14. Robust tuning of robot control systems

    NASA Technical Reports Server (NTRS)

    Minis, I.; Uebel, M.

    1992-01-01

    The computed torque control problem is examined for a robot arm with flexible, geared, joint drive systems which are typical in many industrial robots. The standard computed torque algorithm is not directly applicable to this class of manipulators because of the dynamics introduced by the joint drive system. The proposed approach to computed torque control combines a computed torque algorithm with a torque controller at each joint. Three such control schemes are proposed. The first scheme uses the joint torque control system currently implemented on the robot arm and a novel form of the computed torque algorithm. The other two use the standard computed torque algorithm and a novel torque control system based on model-following techniques. Standard tasks and performance indices are used to evaluate the performance of the controllers. Both numerical simulations and experiments are used in evaluation. The study shows that all three proposed systems lead to improved tracking performance over a conventional PD controller.
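
    As background to the terminology in this abstract, the textbook rigid-body computed-torque law can be sketched as follows; the callables M, C and g and the gain matrices Kp, Kd are placeholders, and the flexible, geared joint drives that motivate the record are deliberately not modelled.

      import numpy as np

      def computed_torque(q, dq, q_des, dq_des, ddq_des, M, C, g, Kp, Kd):
          # M(q): inertia matrix, C(q, dq): Coriolis/centrifugal vector,
          # g(q): gravity vector -- all assumed, user-supplied callables.
          e = q_des - q                          # position tracking error
          de = dq_des - dq                       # velocity tracking error
          v = ddq_des + Kd @ de + Kp @ e         # outer-loop acceleration command
          return M(q) @ v + C(q, dq) + g(q)      # inner-loop inverse dynamics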

  15. States Move toward Computer Science Standards. Policy Update. Vol. 23, No. 17

    ERIC Educational Resources Information Center

    Tilley-Coulson, Eve

    2016-01-01

    While educators and parents recognize computer science as a key skill for career readiness, only five states have adopted learning standards in this area. Tides are changing, however, as the Every Student Succeeds Act (ESSA) recognizes with its call on states to provide a "well-rounded education" for students, to include computer science…

  16. Analytic and heuristic processing influences on adolescent reasoning and decision-making.

    PubMed

    Klaczynski, P A

    2001-01-01

    The normative/descriptive gap is the discrepancy between actual reasoning and traditional standards for reasoning. The relationship between age and the normative/descriptive gap was examined by presenting adolescents with a battery of reasoning and decision-making tasks. Middle adolescents (N = 76) performed closer to normative ideals than early adolescents (N = 66), although the normative/descriptive gap was large for both groups. Correlational analyses revealed that (1) normative responses correlated positively with each other, (2) nonnormative responses were positively interrelated, and (3) normative and nonnormative responses were largely independent. Factor analyses suggested that performance was based on two processing systems. The "analytic" system operates on "decontextualized" task representations and underlies conscious, computational reasoning. The "heuristic" system operates on "contextualized," content-laden representations and produces "cognitively cheap" responses that sometimes conflict with traditional norms. Analytic processing was more clearly linked to age and to intelligence than heuristic processing. Implications for cognitive development, the competence/performance issue, and rationality are discussed.

  17. A marked correlation function for constraining modified gravity models

    NASA Astrophysics Data System (ADS)

    White, Martin

    2016-11-01

    Future large scale structure surveys will provide increasingly tight constraints on our cosmological model. These surveys will report results on the distance scale and growth rate of perturbations through measurements of Baryon Acoustic Oscillations and Redshift-Space Distortions. It is interesting to ask: what further analyses should become routine, so as to test as-yet-unknown models of cosmic acceleration? Models which aim to explain the accelerated expansion rate of the Universe by modifications to General Relativity often invoke screening mechanisms which can imprint a non-standard density dependence on their predictions. This suggests density-dependent clustering as a `generic' constraint. This paper argues that a density-marked correlation function provides a density-dependent statistic which is easy to compute and report and requires minimal additional infrastructure beyond what is routinely available to such survey analyses. We give one realization of this idea and study it using low order perturbation theory. We encourage groups developing modified gravity theories to see whether such statistics provide discriminatory power for their models.

  18. Methods for Specifying Scientific Data Standards and Modeling Relationships with Applications to Neuroscience

    PubMed Central

    Rübel, Oliver; Dougherty, Max; Prabhat; Denes, Peter; Conant, David; Chang, Edward F.; Bouchard, Kristofer

    2016-01-01

    Neuroscience continues to experience a tremendous growth in data; in terms of the volume and variety of data, the velocity at which data is acquired, and in turn the veracity of data. These challenges are a serious impediment to sharing of data, analyses, and tools within and across labs. Here, we introduce BRAINformat, a novel data standardization framework for the design and management of scientific data formats. The BRAINformat library defines application-independent design concepts and modules that together create a general framework for standardization of scientific data. We describe the formal specification of scientific data standards, which facilitates sharing and verification of data and formats. We introduce the concept of Managed Objects, enabling semantic components of data formats to be specified as self-contained units, supporting modular and reusable design of data format components and file storage. We also introduce the novel concept of Relationship Attributes for modeling and use of semantic relationships between data objects. Based on these concepts we demonstrate the application of our framework to design and implement a standard format for electrophysiology data and show how data standardization and relationship-modeling facilitate data analysis and sharing. The format uses HDF5, enabling portable, scalable, and self-describing data storage and integration with modern high-performance computing for data-driven discovery. The BRAINformat library is open source, easy-to-use, and provides detailed user and developer documentation and is freely available at: https://bitbucket.org/oruebel/brainformat. PMID:27867355
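
    To make the storage idea concrete, here is a small, hypothetical h5py sketch of self-describing HDF5 storage with attribute-based links between objects; the group names, attributes and "relationship" convention are invented for illustration and are not the BRAINformat specification.

      import numpy as np
      import h5py  # HDF5 bindings for Python

      with h5py.File("ephys_example.h5", "w") as f:
          rec = f.create_group("recording_001")
          rec.attrs["subject_id"] = "subject_A"
          rec.attrs["sampling_rate_hz"] = 30000.0
          data = rec.create_dataset("voltage",
                                    data=np.random.randn(32, 30000),  # 32 ch, 1 s
                                    compression="gzip")
          data.attrs["units"] = "microvolts"
          # A simple relationship-style attribute pointing at a related object.
          data.attrs["electrode_table"] = "/recording_001/electrodes"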

  19. Comparison of myocardial perfusion imaging between the new high-speed gamma camera and the standard anger camera.

    PubMed

    Tanaka, Hirokazu; Chikamori, Taishiro; Hida, Satoshi; Uchida, Kenji; Igarashi, Yuko; Yokoyama, Tsuyoshi; Takahashi, Masaki; Shiba, Chie; Yoshimura, Mana; Tokuuye, Koichi; Yamashina, Akira

    2013-01-01

    Cadmium-zinc-telluride (CZT) solid-state detectors have been recently introduced into the field of myocardial perfusion imaging. The aim of this study was to prospectively compare the diagnostic performance of the CZT high-speed gamma camera (Discovery NM 530c) with that of the standard 3-head gamma camera in the same group of patients. The study group consisted of 150 consecutive patients who underwent a 1-day stress-rest (99m)Tc-sestamibi or tetrofosmin imaging protocol. Image acquisition was performed first on a standard gamma camera with a 15-min scan time each for stress and for rest. All scans were immediately repeated on a CZT camera with a 5-min scan time for stress and a 3-min scan time for rest, using list mode. The correlations between the CZT camera and the standard camera for perfusion and function analyses were strong within narrow Bland-Altman limits of agreement. Using list mode analysis, image quality for stress was rated as good or excellent in 97% of the 3-min scans, and in 100% of the ≥4-min scans. For CZT scans at rest, similarly, image quality was rated as good or excellent in 94% of the 1-min scans, and in 100% of the ≥2-min scans. The novel CZT camera provides excellent image quality, which is equivalent to standard myocardial single-photon emission computed tomography, despite a short scan time of less than half of the standard time.
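
    For reference, the Bland-Altman limits of agreement mentioned above are straightforward to compute from paired per-patient measurements; this generic sketch is an editorial illustration, not the study's analysis code.

      import numpy as np

      def bland_altman_limits(method_a, method_b):
          # Bias and 95% limits of agreement between two measurement methods,
          # e.g. per-patient perfusion scores from the CZT and standard cameras.
          a = np.asarray(method_a, dtype=float)
          b = np.asarray(method_b, dtype=float)
          diff = a - b
          bias = diff.mean()
          sd = diff.std(ddof=1)
          return bias, bias - 1.96 * sd, bias + 1.96 * sd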

  20. Eigensolver for a Sparse, Large Hermitian Matrix

    NASA Technical Reports Server (NTRS)

    Tisdale, E. Robert; Oyafuso, Fabiano; Klimeck, Gerhard; Brown, R. Chris

    2003-01-01

    A parallel-processing computer program finds a few eigenvalues in a sparse Hermitian matrix that contains as many as 100 million diagonal elements. This program finds the eigenvalues faster, using less memory, than do other, comparable eigensolver programs. This program implements a Lanczos algorithm in the American National Standards Institute/ International Organization for Standardization (ANSI/ISO) C computing language, using the Message Passing Interface (MPI) standard to complement an eigensolver in PARPACK. [PARPACK (Parallel Arnoldi Package) is an extension, to parallel-processing computer architectures, of ARPACK (Arnoldi Package), which is a collection of Fortran 77 subroutines that solve large-scale eigenvalue problems.] The eigensolver runs on Beowulf clusters of computers at the Jet Propulsion Laboratory (JPL).
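
    A rough sense of the Lanczos/ARPACK workflow can be had on a single machine with SciPy's eigsh (which wraps ARPACK); the matrix below is a tiny random stand-in, not the 100-million-element problems or the MPI/PARPACK setup described in the record.

      import scipy.sparse as sp
      from scipy.sparse.linalg import eigsh  # ARPACK: implicitly restarted Lanczos

      n = 2000
      A = sp.random(n, n, density=1e-3, format="csr")
      A = A + A.T                              # symmetrize (real Hermitian case)
      vals, vecs = eigsh(A, k=5, which="SA")   # five smallest-algebraic eigenvalues
      print(vals)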

  1. Computational fluid dynamics analysis in support of the simplex turbopump design

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa W.; Benjamin, Theodore G.; Cornelison, Joni W.; Ruf, Joseph H.; Williams, Robert W.

    1994-01-01

    Simplex is a turbopump that is being developed at NASA/Marshall Space Flight Center (MSFC) by an in-house team. The turbopump consists of a single-stage centrifugal impeller, vaned-diffuser pump powered by a single-stage, axial, supersonic, partial admission turbine. The turbine is driven by warm gaseous oxygen tapped off of the hybrid motor to which it will be coupled. Rolling element bearings are cooled by the pumping fluid. Details of the configuration and operating conditions are given by Marsh. CFD has been used extensively to verify one-dimensional (1D) predictions, assess aerodynamic and hydrodynamic designs, and to provide flow environments. The complete primary flow path of the pump-end and the hot gas path of the turbine, excluding the inlet torus, have been analyzed. All CFD analyses conducted for the Simplex turbopump employed the pressure-based Finite Difference Navier-Stokes (FDNS) code using a standard kappa-epsilon turbulence model with wall functions. More detailed results are presented by Garcia et al. To support the team, loading and temperature results for the turbine rotor were provided as inputs to structural and thermal analyses, and blade loadings from the inducer were provided for structural analyses.

  2. Using exceedance probabilities to detect anomalies in routinely recorded animal health data, with particular reference to foot-and-mouth disease in Viet Nam.

    PubMed

    Richards, K K; Hazelton, M L; Stevenson, M A; Lockhart, C Y; Pinto, J; Nguyen, L

    2014-10-01

    The widespread availability of computer hardware and software for recording and storing disease event information means that, in theory, we have the necessary information to carry out detailed analyses of factors influencing the spatial distribution of disease in animal populations. However, the reliability of such analyses depends on data quality, with anomalous records having the potential to introduce significant bias and lead to inappropriate decision making. In this paper we promote the use of exceedance probabilities as a tool for detecting anomalies when applying hierarchical spatio-temporal models to animal health data. We illustrate this methodology through a case study of data on outbreaks of foot-and-mouth disease (FMD) in Viet Nam for the period 2006-2008. A flexible binomial logistic regression was employed to model the number of FMD-infected communes within each province of the country. Standard analyses of the residuals from this model failed to identify problems, but exceedance probabilities identified provinces in which the number of reported FMD outbreaks was unexpectedly low. This finding is interesting given that these provinces are on major cattle movement pathways through Viet Nam. Copyright © 2014 Elsevier Ltd. All rights reserved.
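
    The exceedance-probability idea is simple to state in code once posterior draws are available from whatever hierarchical model has been fitted; the sketch below assumes hypothetical MCMC samples and cut-offs and is not the authors' implementation.

      import numpy as np

      def exceedance_probability(posterior_draws, threshold):
          # Fraction of MCMC draws above a threshold, i.e. P(quantity > threshold).
          draws = np.asarray(posterior_draws, dtype=float)
          return float((draws > threshold).mean())

      def flag_anomalies(prob_by_province, low=0.025, high=0.975):
          # Provinces whose exceedance probability is extreme in either direction,
          # e.g. far fewer (or more) reported outbreaks than the model expects.
          return {name: p for name, p in prob_by_province.items()
                  if p < low or p > high}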

  3. Growth characterisation of intra-thoracic organs of children on CT scans.

    PubMed

    Coulongeat, François; Jarrar, Mohamed-Salah; Thollon, Lionel; Serre, Thierry

    2013-01-01

    This paper analyses the geometry of intra-thoracic organs from computed tomography (CT) scans performed on 20 children aged from 4 months to 16 years. The aim is to find the most reliable measurements to characterise the growth of heart and lungs from CT data. Standard measurements available on chest radiographies are compared with original measurements only available on CT scans. These measurements should characterise the growth of organs as well as the changes in their position relative to the thorax. Measurements were considered as functions of age. Quadratic regression models were fitted to the data. Goodness of fit of the models was then evaluated. Positions of organs relative to the thorax have a high variability compared with their changes with age. The length and volume of the heart and lungs as well as the diameter of the thorax fit well to the models of growth. It could be interesting to study these measurements with a larger sample size in order to define growth standards.

  4. Cohort mortality study of garment industry workers exposed to formaldehyde: update and internal comparisons.

    PubMed

    Meyers, Alysha R; Pinkerton, Lynne E; Hein, Misty J

    2013-09-01

    To further evaluate the association between formaldehyde and leukemia, we extended follow-up through 2008 for a cohort mortality study of 11,043 US formaldehyde-exposed garment workers. We computed standardized mortality ratios and standardized rate ratios stratified by year of first exposure, exposure duration, and time since first exposure. Associations between exposure duration and rates of leukemia and myeloid leukemia were further examined using Poisson regression models. Compared to the US population, myeloid leukemia mortality was elevated but overall leukemia mortality was not. In internal analyses, overall leukemia mortality increased with increasing exposure duration and this trend was statistically significant. We continue to see limited evidence of an association between formaldehyde and leukemia. However, the extended follow-up did not strengthen previously observed associations. In addition to continued epidemiologic research, we recommend further research to evaluate the biological plausibility of a causal relation between formaldehyde and leukemia. Copyright © 2013 Wiley Periodicals, Inc.
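
    For orientation, a standardized mortality ratio is simply observed deaths divided by the deaths expected from reference rates; the sketch below uses invented stratum names and numbers purely for illustration and is not the study's analysis.

      def standardized_mortality_ratio(observed_deaths, person_years, reference_rates):
          # SMR = observed / expected, where expected deaths are obtained by
          # applying stratum-specific reference rates (e.g. by age, sex and
          # calendar period) to the cohort's person-years in the same strata.
          expected = sum(person_years[stratum] * reference_rates[stratum]
                         for stratum in person_years)
          return observed_deaths / expected

      # Hypothetical illustration only (not figures from the study):
      # 12 observed deaths against 8.5 expected gives an SMR of about 1.4.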

  5. Generation of a head phantom according to the 95th percentile Chinese population data for evaluating the specific absorption rate by wireless communication devices.

    PubMed

    Ma, Yu; Wang, Yuduo; Shao, Qing; Li, Congsheng; Wu, Tongning

    2014-03-01

    A Chinese head phantom (CHP) is constructed for evaluating the specific absorption rate (SAR) produced by wireless transceivers. The dimensions of the head phantom are within a 4 % difference of the 95th percentile data from China's standard. The shell's thickness and the configuration of the pinna are the same as those of the specific anthropomorphic mannequin (SAM). Three computable models of mobile phones are generated and used in the SAR simulations with the SAM and the CHP. The results show that the simulated SAR values from the SAM head are similar to those from the CHP, and the morphological reason for this is analysed. The authors also discuss the conservativeness of the two head phantoms. The CHP can be used in inter-laboratory evaluation of the SAR uncertainty. It can also provide information on the SAR variability due to physical differences, which will benefit the maintenance and harmonisation of the standards.

  6. Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.

    PubMed

    Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars

    2015-07-15

    Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments

    NASA Technical Reports Server (NTRS)

    Manning, Ted A.; Lawrence, Scott L.

    2014-01-01

    As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.

  8. Portable Computer Keyboard For Use With One Hand

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L.

    1992-01-01

    Data-entry device held in one hand and operated with five fingers. Contains seven keys. Letters, numbers, punctuation, and cursor commands keyed into computer by pressing keys in various combinations. Device called "data egg" used where standard typewriter keyboard unusable or unavailable. Contains micro-processor and 32-Kbyte memory. Captures text and transmits it to computer. Concept extended to computer mouse. Especially useful to handicapped or bedridden people who find it difficult or impossible to operate standard keyboards.

  9. Influence of Finite Element Software on Energy Release Rates Computed Using the Virtual Crack Closure Technique

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Goetze, Dirk; Ransom, Jonathon (Technical Monitor)

    2006-01-01

    Strain energy release rates were computed along straight delamination fronts of Double Cantilever Beam, End-Notched Flexure and Single Leg Bending specimens using the Virtual Crack Closure Technique (VCCT). The results were based on finite element analyses using ABAQUS and ANSYS and were calculated from the finite element results using the same post-processing routine to assure a consistent procedure. Mixed-mode strain energy release rates obtained from post-processing finite element results were in good agreement for all element types used and all specimens modeled. Compared to previous studies, the models made of solid twenty-node hexahedral elements and solid eight-node incompatible mode elements yielded excellent results. For both codes, models made of standard brick elements and elements with reduced integration did not correctly capture the distribution of the energy release rate across the width of the specimens for the models chosen. The results suggested that element types with similar formulation yield matching results independent of the finite element software used. For comparison, mixed-mode strain energy release rates were also calculated within ABAQUS/Standard using the VCCT for ABAQUS add-on. For all specimens modeled, mixed-mode strain energy release rates obtained from ABAQUS finite element results using post-processing were almost identical to results calculated using the VCCT for ABAQUS add-on.

  10. Materials constitutive models for nonlinear analysis of thermally cycled structures

    NASA Technical Reports Server (NTRS)

    Kaufman, A.; Hunt, L. E.

    1982-01-01

    Effects of inelastic materials models on computed stress-strain solutions for thermally loaded structures were studied by performing nonlinear (elastoplastic creep) and elastic structural analyses on a prismatic, double edge wedge specimen of IN 100 alloy that was subjected to thermal cycling in fluidized beds. Four incremental plasticity creep models (isotropic, kinematic, combined isotropic kinematic, and combined plus transient creep) were exercised for the problem by using the MARC nonlinear, finite element computer program. Maximum total strain ranges computed from the elastic and nonlinear analyses agreed within 5 percent. Mean cyclic stresses, inelastic strain ranges, and inelastic work were significantly affected by the choice of inelastic constitutive model. The computing time per cycle for the nonlinear analyses was more than five times that required for the elastic analysis.

  11. A combined computational-experimental analyses of selected metabolic enzymes in Pseudomonas species.

    PubMed

    Perumal, Deepak; Lim, Chu Sing; Chow, Vincent T K; Sakharkar, Kishore R; Sakharkar, Meena K

    2008-09-10

    Comparative genomic analysis has revolutionized our ability to predict the metabolic subsystems that occur in newly sequenced genomes, and to explore the functional roles of the set of genes within each subsystem. These computational predictions can considerably reduce the volume of experimental studies required to assess basic metabolic properties of multiple bacterial species. However, experimental validations are still required to resolve the apparent inconsistencies in the predictions by multiple resources. Here, we present combined computational-experimental analyses on eight completely sequenced Pseudomonas species. Comparative pathway analyses reveal that several pathways within the Pseudomonas species show high plasticity and versatility. Potential bypasses in 11 metabolic pathways were identified. We further confirmed the presence of the enzyme O-acetyl homoserine (thiol) lyase (EC: 2.5.1.49) in P. syringae pv. tomato that revealed inconsistent annotations in KEGG and in the recently published SYSTOMONAS database. These analyses connect and integrate systematic data generation, computational data interpretation, and experimental validation and represent a synergistic and powerful means for conducting biological research.

  12. Rapid Global Fitting of Large Fluorescence Lifetime Imaging Microscopy Datasets

    PubMed Central

    Warren, Sean C.; Margineanu, Anca; Alibhai, Dominic; Kelly, Douglas J.; Talbot, Clifford; Alexandrov, Yuriy; Munro, Ian; Katan, Matilda

    2013-01-01

    Fluorescence lifetime imaging (FLIM) is widely applied to obtain quantitative information from fluorescence signals, particularly using Förster Resonant Energy Transfer (FRET) measurements to map, for example, protein-protein interactions. Extracting FRET efficiencies or population fractions typically entails fitting data to complex fluorescence decay models but such experiments are frequently photon constrained, particularly for live cell or in vivo imaging, and this leads to unacceptable errors when analysing data on a pixel-wise basis. Lifetimes and population fractions may, however, be more robustly extracted using global analysis to simultaneously fit the fluorescence decay data of all pixels in an image or dataset to a multi-exponential model under the assumption that the lifetime components are invariant across the image (dataset). This approach is often considered to be prohibitively slow and/or computationally expensive but we present here a computationally efficient global analysis algorithm for the analysis of time-correlated single photon counting (TCSPC) or time-gated FLIM data based on variable projection. It makes efficient use of both computer processor and memory resources, requiring less than a minute to analyse time series and multiwell plate datasets with hundreds of FLIM images on standard personal computers. This lifetime analysis takes account of repetitive excitation, including fluorescence photons excited by earlier pulses contributing to the fit, and is able to accommodate time-varying backgrounds and instrument response functions. We demonstrate that this global approach allows us to readily fit time-resolved fluorescence data to complex models including a four-exponential model of a FRET system, for which the FRET efficiencies of the two species of a bi-exponential donor are linked, and polarisation-resolved lifetime data, where a fluorescence intensity and bi-exponential anisotropy decay model is applied to the analysis of live cell homo-FRET data. A software package implementing this algorithm, FLIMfit, is available under an open source licence through the Open Microscopy Environment. PMID:23940626

  13. An efficient pseudomedian filter for tiling microrrays.

    PubMed

    Royce, Thomas E; Carriero, Nicholas J; Gerstein, Mark B

    2007-06-07

    Tiling microarrays are becoming an essential technology in the functional genomics toolbox. They have been applied to the tasks of novel transcript identification, elucidation of transcription factor binding sites, detection of methylated DNA and several other applications in several model organisms. These experiments are being conducted at increasingly finer resolutions as the microarray technology enjoys increasingly greater feature densities. The increased densities naturally lead to increased data analysis requirements. Specifically, the most widely employed algorithm for tiling array analysis involves smoothing observed signals by computing pseudomedians within sliding windows, an O(n^2 log n) calculation in each window. This poor time complexity is an issue for tiling array analysis and could prove to be a real bottleneck as tiling microarray experiments become grander in scope and finer in resolution. We therefore implemented Monahan's HLQEST algorithm that reduces the runtime complexity for computing the pseudomedian of n numbers to O(n log n) from O(n^2 log n). For a representative tiling microarray dataset, this modification reduced the smoothing procedure's runtime by nearly 90%. We then leveraged the fact that elements within sliding windows remain largely unchanged in overlapping windows (as one slides across genomic space) to further reduce computation by an additional 43%. This was achieved by the application of skip lists to maintaining a sorted list of values from window to window. This sorted list could be maintained with simple O(log n) inserts and deletes. We illustrate the favorable scaling properties of our algorithms with both time complexity analysis and benchmarking on synthetic datasets. Tiling microarray analyses that rely upon a sliding window pseudomedian calculation can require many hours of computation. We have eased this requirement significantly by implementing efficient algorithms that scale well with genomic feature density. This result not only speeds the current standard analyses, but also makes possible ones where many iterations of the filter may be required, such as might be required in a bootstrap or parameter estimation setting. Source code and executables are available at http://tiling.gersteinlab.org/pseudomedian/.
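
    To illustrate what is being computed (not how HLQEST or the skip-list optimisation works), here is the straightforward sliding-window pseudomedian, i.e. the median of all pairwise Walsh averages within each window; the window size and data layout are arbitrary assumptions.

      import numpy as np

      def pseudomedian(values):
          # Hodges-Lehmann pseudomedian: median of all pairwise (Walsh) averages.
          # This is the straightforward quadratic-cost version that HLQEST and the
          # skip-list bookkeeping described above are designed to avoid.
          x = np.asarray(values, dtype=float)
          i, j = np.triu_indices(len(x))       # all pairs with i <= j
          return float(np.median((x[i] + x[j]) / 2.0))

      def sliding_pseudomedian(signal, window=11):
          # Smooth a probe-intensity track with a sliding-window pseudomedian.
          half = window // 2
          out = np.full(len(signal), np.nan)
          for k in range(half, len(signal) - half):
              out[k] = pseudomedian(signal[k - half:k + half + 1])
          return out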

  14. An efficient pseudomedian filter for tiling microrrays

    PubMed Central

    Royce, Thomas E; Carriero, Nicholas J; Gerstein, Mark B

    2007-01-01

    Background Tiling microarrays are becoming an essential technology in the functional genomics toolbox. They have been applied to the tasks of novel transcript identification, elucidation of transcription factor binding sites, detection of methylated DNA and several other applications in several model organisms. These experiments are being conducted at increasingly finer resolutions as the microarray technology enjoys increasingly greater feature densities. The increased densities naturally lead to increased data analysis requirements. Specifically, the most widely employed algorithm for tiling array analysis involves smoothing observed signals by computing pseudomedians within sliding windows, an O(n^2 log n) calculation in each window. This poor time complexity is an issue for tiling array analysis and could prove to be a real bottleneck as tiling microarray experiments become grander in scope and finer in resolution. Results We therefore implemented Monahan's HLQEST algorithm that reduces the runtime complexity for computing the pseudomedian of n numbers to O(n log n) from O(n^2 log n). For a representative tiling microarray dataset, this modification reduced the smoothing procedure's runtime by nearly 90%. We then leveraged the fact that elements within sliding windows remain largely unchanged in overlapping windows (as one slides across genomic space) to further reduce computation by an additional 43%. This was achieved by the application of skip lists to maintaining a sorted list of values from window to window. This sorted list could be maintained with simple O(log n) inserts and deletes. We illustrate the favorable scaling properties of our algorithms with both time complexity analysis and benchmarking on synthetic datasets. Conclusion Tiling microarray analyses that rely upon a sliding window pseudomedian calculation can require many hours of computation. We have eased this requirement significantly by implementing efficient algorithms that scale well with genomic feature density. This result not only speeds the current standard analyses, but also makes possible ones where many iterations of the filter may be required, such as might be required in a bootstrap or parameter estimation setting. Source code and executables are available at . PMID:17555595

  15. The role of family-related factors in the effects of the UP4FUN school-based family-focused intervention targeting screen time in 10- to 12-year-old children: the ENERGY project.

    PubMed

    Van Lippevelde, Wendy; Bere, Elling; Verloigne, Maïté; van Stralen, Maartje M; De Bourdeaudhuij, Ilse; Lien, Nanna; Vik, Frøydis Nordgård; Manios, Yannis; Grillenberger, Monika; Kovács, Eva; ChinAPaw, Mai J M; Brug, Johannes; Maes, Lea

    2014-08-18

    Screen-related behaviours are highly prevalent in schoolchildren. Considering the adverse health effects and the relation of obesity and screen time in childhood, efforts to affect screen use in children are warranted. Parents have been identified as an important influence on children's screen time and therefore should be involved in prevention programmes. The aim was to examine the mediating role of family-related factors on the effects of the school-based family-focused UP4FUN intervention aimed at screen time in 10- to 12-year-old European children (n child-parent dyads = 1940). A randomised controlled trial was conducted to test the six-week UP4FUN intervention in 10- to 12-year-old children and one of their parents in five European countries in 2011 (n child-parent dyads = 1940). Self-reported data of children were used to assess their TV and computer/game console time per day, and parents reported their physical activity, screen time and family-related factors associated with screen behaviours (availability, permissiveness, monitoring, negotiation, rules, avoiding negative role modeling, and frequency of physically active family excursions). Mediation analyses were performed using multi-level regression analyses (child-school-country). Almost all TV-specific and half of the computer-specific family-related factors were associated with children's screen time. However, the measured family-related factors did not mediate intervention effects on children's TV and computer/game console use, because the intervention was not successful in changing these family-related factors. Future screen-related interventions should aim to effectively target the home environment and parents' practices related to children's use of TV and computers to decrease children's screen time. The study is registered in the International Standard Randomised Controlled Trial Number Register (registration number: ISRCTN34562078).

  16. Television viewing, computer game play and book reading during meals are predictors of meal skipping in a cross-sectional sample of 12-, 14- and 16-year-olds.

    PubMed

    Custers, Kathleen; Van den Bulck, Jan

    2010-04-01

    To examine whether television viewing, computer game playing or book reading during meals predicts meal skipping with the aim of watching television, playing computer games or reading books (media meal skipping). A cross-sectional study was conducted using a standardized self-administered questionnaire. Analyses were controlled for age, gender and BMI. Data were obtained from a random sample of adolescents in Flanders, Belgium. Seven hundred and ten participants aged 12, 14 and 16 years. Of the participants, 11.8 % skipped meals to watch television, 10.5 % skipped meals to play computer games and 8.2 % skipped meals to read books. Compared with those who did not use these media during meals, the risk of skipping meals in order to watch television was significantly higher for those children who watched television during meals (2.9 times higher in those who watched television during at least one meal a day). The risk of skipping meals for computer game playing was 9.5 times higher in those who played computer games weekly or more while eating, and the risk of meal skipping in order to read books was 22.9 times higher in those who read books during meals less than weekly. The more meals the respondents ate with the entire family, the less likely they were to skip meals to watch television. The use of media during meals predicts meal skipping for using that same medium. Family meals appear to be inversely related to meal skipping for television viewing.

  17. Computer-Delivered and Web-Based Interventions to Improve Depression, Anxiety, and Psychological Well-Being of University Students: A Systematic Review and Meta-Analysis

    PubMed Central

    Morriss, Richard; Glazebrook, Cris

    2014-01-01

    Background Depression and anxiety are common mental health difficulties experienced by university students and can impair academic and social functioning. Students are limited in seeking help from professionals. As university students are highly connected to digital technologies, Web-based and computer-delivered interventions could be used to improve students’ mental health. The effectiveness of these intervention types requires investigation to identify whether these are viable prevention strategies for university students. Objective The intent of the study was to systematically review and analyze trials of Web-based and computer-delivered interventions to improve depression, anxiety, psychological distress, and stress in university students. Methods Several databases were searched using keywords relating to higher education students, mental health, and eHealth interventions. The eligibility criteria for studies included in the review were: (1) the study aimed to improve symptoms relating to depression, anxiety, psychological distress, and stress, (2) the study involved computer-delivered or Web-based interventions accessed via computer, laptop, or tablet, (3) the study was a randomized controlled trial, and (4) the study was trialed on higher education students. Trials were reviewed and outcome data analyzed through random effects meta-analyses for each outcome and each type of trial arm comparison. Cochrane Collaboration risk of bias tool was used to assess study quality. Results A total of 17 trials were identified, in which seven were the same three interventions on separate samples; 14 reported sufficient information for meta-analysis. The majority (n=13) were website-delivered and nine interventions were based on cognitive behavioral therapy (CBT). A total of 1795 participants were randomized and 1480 analyzed. Risk of bias was considered moderate, as many publications did not sufficiently report their methods and seven explicitly conducted completers’ analyses. In comparison to the inactive control, sensitivity meta-analyses supported intervention in improving anxiety (pooled standardized mean difference [SMD] −0.56; 95% CI −0.77 to −0.35, P<.001), depression (pooled SMD −0.43; 95% CI −0.63 to −0.22, P<.001), and stress (pooled SMD −0.73; 95% CI −1.27 to −0.19, P=.008). In comparison to active controls, sensitivity analyses did not support either condition for anxiety (pooled SMD −0.18; 95% CI −0.98 to 0.62, P=.66) or depression (pooled SMD −0.28; 95% CI −0.75 to 0.20, P=.25). In contrast to a comparison intervention, neither condition was supported in sensitivity analyses for anxiety (pooled SMD −0.10; 95% CI −0.39 to 0.18, P=.48) or depression (pooled SMD 0.33; 95% CI −0.43 to 1.09, P=.40). Conclusions The findings suggest Web-based and computer-delivered interventions can be effective in improving students’ depression, anxiety, and stress outcomes when compared to inactive controls, but some caution is needed when compared to other trial arms and methodological issues were noticeable. Interventions need to be trialed on more heterogeneous student samples and would benefit from user evaluation. Future trials should address methodological considerations to improve reporting of trial quality and address post-intervention skewed data. PMID:24836465
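
    For readers who want to see how pooled SMDs of this kind are typically obtained, a generic DerSimonian-Laird random-effects pooling routine is sketched below; it is not the authors' analysis code, and the inputs are assumed to be per-trial SMDs with their sampling variances.

      import numpy as np

      def pooled_smd_random_effects(smds, variances):
          # DerSimonian-Laird random-effects pooling of per-trial standardized
          # mean differences (smds) and their sampling variances.
          d = np.asarray(smds, dtype=float)
          v = np.asarray(variances, dtype=float)
          w = 1.0 / v                                # fixed-effect weights
          d_fixed = np.sum(w * d) / np.sum(w)
          q = np.sum(w * (d - d_fixed) ** 2)         # Cochran's Q
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - (len(d) - 1)) / c)    # between-study variance
          w_star = 1.0 / (v + tau2)                  # random-effects weights
          pooled = np.sum(w_star * d) / np.sum(w_star)
          se = np.sqrt(1.0 / np.sum(w_star))
          return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)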

  18. Cybersecurity Workforce Development and the Protection of Critical Infrastructure

    DTIC Science & Technology

    2017-03-31

    communications products, and limited travel for site visits and conferencing. The CSCC contains a developed web-based coordination site, computer ...the CSCC. The Best Practices Analyst position maintains a list of best practices, computer-related patches, and standard operating procedures (SOP...involved in conducting vulnerability assessments of computer networks. To adequately exercise and experiment with industry standard software, it was

  19. Precision of quantitative computed tomography texture analysis using image filtering: A phantom study for scanner variability.

    PubMed

    Yasaka, Koichiro; Akai, Hiroyuki; Mackin, Dennis; Court, Laurence; Moros, Eduardo; Ohtomo, Kuni; Kiryu, Shigeru

    2017-05-01

    Quantitative computed tomography (CT) texture analyses for images with and without filtration are gaining attention to capture the heterogeneity of tumors. The aim of this study was to investigate how quantitative texture parameters using image filtering vary among different CT scanners using a phantom developed for radiomics studies. A phantom, consisting of 10 different cartridges with various textures, was scanned under 6 different scanning protocols using four CT scanners from four different vendors. CT texture analyses were performed for both unfiltered images and filtered images (using a Laplacian of Gaussian spatial band-pass filter) featuring fine, medium, and coarse textures. Forty-five regions of interest were placed for each cartridge (x) in a specific scan image set (y), and the average of the texture values (T(x,y)) was calculated. The interquartile range (IQR) of T(x,y) among the 6 scans was calculated for a specific cartridge (IQR(x)), while the IQR of T(x,y) among the 10 cartridges was calculated for a specific scan (IQR(y)), and the median IQR(y) was then calculated for the 6 scans (as the control IQR, IQRc). The median of their quotient (IQR(x)/IQRc) among the 10 cartridges was defined as the variability index (VI). The VI was relatively small for the mean in unfiltered images (0.011) and for standard deviation (0.020-0.044) and entropy (0.040-0.044) in filtered images. Skewness and kurtosis in filtered images featuring medium and coarse textures were relatively variable across different CT scanners, with VIs of 0.638-0.692 and 0.430-0.437, respectively. Quantitative CT texture parameters thus range from robust to highly variable across different scanners, and this behavior should be taken into consideration.
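
    The variability index defined in the abstract maps directly onto a few lines of array code; this sketch assumes the per-cartridge, per-scan mean texture values are already arranged in a 10 x 6 array and is offered only as a restatement of the definition, not as the study's software.

      import numpy as np

      def variability_index(T):
          # T is an (n_cartridges, n_scans) array of mean texture values T(x, y),
          # here 10 x 6 as in the phantom experiment described above.
          q75, q25 = np.percentile(T, [75, 25], axis=1)
          iqr_x = q75 - q25                    # spread across scans, per cartridge
          q75, q25 = np.percentile(T, [75, 25], axis=0)
          iqr_y = q75 - q25                    # spread across cartridges, per scan
          iqr_c = np.median(iqr_y)             # control IQR
          return float(np.median(iqr_x / iqr_c))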

  20. BROCCOLI: Software for fast fMRI analysis on many-core CPUs and GPUs

    PubMed Central

    Eklund, Anders; Dufort, Paul; Villani, Mattias; LaConte, Stephen

    2014-01-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU, and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm3 brain template in 4–6 s, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/). PMID:24672471

  1. How to make your own response boxes: A step-by-step guide for the construction of reliable and inexpensive parallel-port response pads from computer mice.

    PubMed

    Voss, Andreas; Leonhart, Rainer; Stahl, Christoph

    2007-11-01

    Psychological research is based in large parts on response latencies, which are often registered by keypresses on a standard computer keyboard. Recording response latencies with a standard keyboard is problematic because keypresses are buffered within the keyboard hardware before they are signaled to the computer, adding error variance to the recorded latencies. This can be circumvented by using external response pads connected to the computer's parallel port. In this article, we describe how to build inexpensive, reliable, and easy-to-use response pads with six keys from two standard computer mice that can be connected to the PC's parallel port. We also address the problem of recording data from the parallel port with different software packages under Microsoft's Windows XP.

  2. A Status Review of the Commercial Supersonic Technology (CST) Aeroservoelasticity (ASE) Project

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Funk, Christy; Keller, Donald F.; Ringertz, Ulf

    2016-01-01

    An overview of recent progress regarding the computational aeroelastic and aeroservoelastic (ASE) analyses of a low-boom supersonic configuration is presented. The overview includes details of the computational models developed to date with a focus on unstructured CFD grids, computational aeroelastic analyses, sonic boom propagation studies that include static aeroelastic effects, and gust loads analyses. In addition, flutter boundaries using aeroelastic Reduced-Order Models (ROMs) are presented at various Mach numbers of interest. Details regarding a collaboration with the Royal Institute of Technology (KTH, Stockholm, Sweden) to design, fabricate, and test a full-span aeroelastic wind-tunnel model are also presented.

  3. Diagnostic Performance of (18)F-Fluorodeoxyglucose in 162 Small Pulmonary Nodules Incidentally Detected in Subjects Without a History of Malignancy.

    PubMed

    Calcagni, Maria Lucia; Taralli, Silvia; Cardillo, Giuseppe; Graziano, Paolo; Ialongo, Pasquale; Mattoli, Maria Vittoria; Di Franco, Davide; Caldarella, Carmelo; Carleo, Francesco; Indovina, Luca; Giordano, Alessandro

    2016-04-01

    Solitary pulmonary nodule (SPN) still represents a diagnostic challenge. The aim of our study was to evaluate the diagnostic performance of (18)F-fluorodeoxyglucose positron emission tomography-computed tomography in one of the largest samples of small SPNs, incidentally detected in subjects without a history of malignancy (nonscreening population) and undetermined at computed tomography. One-hundred and sixty-two small (>0.8 to 1.5 cm) and, for comparison, 206 large nodules (>1.5 to 3 cm) were retrospectively evaluated. Diagnostic performance of (18)F-fluorodeoxyglucose visual analysis, receiver-operating characteristic (ROC) analysis for maximum standardized uptake value (SUVmax), and Bayesian analysis was assessed using histology or radiological follow-up as a gold standard. In 162 small nodules, (18)F-fluorodeoxyglucose visual and ROC analyses (SUVmax = 1.3) provided 72.6% and 77.4% sensitivity and 88.0% and 82.0% specificity, respectively. The prevalence of malignancy was 38%; Bayesian analysis provided 78.8% positive and 16.0% negative posttest probabilities of malignancy. In 206 large nodules, (18)F-fluorodeoxyglucose visual and ROC analyses (SUVmax = 1.9) provided 89.5% and 85.1% sensitivity and 70.8% and 79.2% specificity, respectively. The prevalence of malignancy was 65%; Bayesian analysis provided 85.0% positive and 21.6% negative posttest probabilities of malignancy. In both groups, malignant nodules had a significantly higher SUVmax (p < 0.0001) than benign nodules. Only in the small group, malignant nodules were significantly larger (p = 0.0054) than benign ones. (18)F-fluorodeoxyglucose can be clinically relevant to rule in and rule out malignancy in undetermined small SPNs, incidentally detected in a nonscreening population with intermediate pretest probability of malignancy, as well as in larger ones. Visual analysis can be considered an optimal diagnostic criterion, adequately detecting a wide range of malignant nodules with different metabolic activity. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
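
    The quoted post-test probabilities follow from Bayes' rule applied to the reported sensitivity, specificity and prevalence; the small sketch below reproduces the small-nodule figures (roughly 79% and 16%) as a consistency check and is not the study's software.

      def posttest_probabilities(sensitivity, specificity, pretest):
          # Bayes' rule for a binary test: probability of malignancy after a
          # positive and after a negative result, given the pretest prevalence.
          p_pos = sensitivity * pretest + (1 - specificity) * (1 - pretest)
          post_positive = sensitivity * pretest / p_pos
          p_neg = (1 - sensitivity) * pretest + specificity * (1 - pretest)
          post_negative = (1 - sensitivity) * pretest / p_neg
          return post_positive, post_negative

      # Small-nodule figures from the abstract: sensitivity 0.726, specificity
      # 0.88, prevalence 0.38 -> approximately 0.79 and 0.16.
      print(posttest_probabilities(0.726, 0.88, 0.38))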

  4. Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction

    PubMed Central

    Gallistel, C. R.; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-01-01

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer. PMID:24637442

  5. Automated, quantitative cognitive/behavioral screening of mice: for genetics, pharmacology, animal cognition and undergraduate instruction.

    PubMed

    Gallistel, C R; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-02-26

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.

  6. Matrix Summaries Improve Research Reports: Secondary Analyses Using Published Literature

    ERIC Educational Resources Information Center

    Zientek, Linda Reichwein; Thompson, Bruce

    2009-01-01

    Correlation matrices and standard deviations are the building blocks of many of the commonly conducted analyses in published research, and AERA and APA reporting standards recommend their inclusion when reporting research results. The authors argue that the inclusion of correlation/covariance matrices, standard deviations, and means can enhance…

  7. A Web-Based Framework for Visualizing Industrial Spatiotemporal Distribution Using Standard Deviational Ellipse and Shifting Routes of Gravity Centers

    NASA Astrophysics Data System (ADS)

    Song, Y.; Gui, Z.; Wu, H.; Wei, Y.

    2017-09-01

    Analysing spatiotemporal distribution patterns and the dynamics of different industries can help us learn the macro-level developing trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis process is a challenging task which requires an easy-to-understand information presentation mechanism and a powerful computational technology to support the visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to enable such a visual analytics requirement. The framework uses the standard deviational ellipse (SDE) and shifting routes of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset in Mainland China from year 1960 to 2015, which contains fine-grained location information (i.e., coordinates of each individual enterprise), to demonstrate the feasibility of this framework. The experiment results show that the developed visual analytics method is helpful for understanding the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with a large data volume, such as crime and disease.
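
    As a minimal illustration of the two statistics named in the title, the gravity centre and a standard deviational ellipse can be computed from point coordinates as below; this uses a covariance eigen-decomposition formulation of the SDE (one common variant) and ignores the Apache Spark parallelisation described in the record.

      import numpy as np

      def gravity_centre_and_sde(x, y):
          # Mean centre plus the orientation and semi-axis lengths of a standard
          # deviational ellipse, via the eigen-decomposition of the coordinate
          # covariance matrix.
          pts = np.column_stack([x, y]).astype(float)
          centre = pts.mean(axis=0)                          # gravity centre
          eigvals, eigvecs = np.linalg.eigh(np.cov(pts, rowvar=False))
          order = np.argsort(eigvals)[::-1]
          semi_axes = np.sqrt(eigvals[order])                # std. dev. per axis
          major = eigvecs[:, order[0]]
          angle_deg = np.degrees(np.arctan2(major[1], major[0]))
          return centre, semi_axes, angle_deg

    Computing the centre separately for each year's points and joining the successive centres gives the kind of shifting route of gravity centres referred to in the title.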

  8. The validity and intra-tester reliability of markerless motion capture to analyse kinematics of the BMX Supercross gate start.

    PubMed

    Grigg, Josephine; Haakonssen, Eric; Rathbone, Evelyne; Orr, Robin; Keogh, Justin W L

    2017-11-13

    The aim of this study was to quantify the validity and intra-tester reliability of a novel method of kinematic measurement. The measurement target was the joint angles of an athlete performing a BMX Supercross (SX) gate start action through the first 1.2 s of movement in situ on a BMX SX ramp using a standard gate start procedure. The method employed GoPro® Hero 4 Silver (GoPro Inc., USA) cameras capturing data at 120 fps and 720p on a 'normal' lens setting. Kinovea 0.8.15 (Kinovea.org, France) was used for analysis. Tracking data was exported and angles computed in Matlab (Mathworks®, USA). The gold standard 3D method for joint angle measurement could not safely be employed in this environment, so a rigid angle was used. Validity was measured to be within 2°. Intra-tester reliability was measured by the same tester performing the analysis twice with an average of 55 days between analyses. Intra-tester reliability was high, with an absolute error <6° and <9 frames (0.075 s) across all angles and time points for key positions, respectively. The methodology is valid within 2° and reliable within 6° for the calculation of joint angles in the first ~1.25 s.
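
    For completeness, the joint angles at the heart of this method reduce to the angle between two segment vectors defined by three tracked points; the marker names and coordinates in the sketch are hypothetical, and the record's Kinovea/Matlab workflow is not reproduced here.

      import numpy as np

      def joint_angle(proximal, joint, distal):
          # Planar joint angle (degrees) from three tracked 2D points, e.g.
          # hypothetical hip, knee and ankle markers digitised in the video.
          a = np.asarray(proximal, dtype=float) - np.asarray(joint, dtype=float)
          b = np.asarray(distal, dtype=float) - np.asarray(joint, dtype=float)
          cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
          return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

      # e.g. joint_angle((0.10, 1.00), (0.12, 0.55), (0.35, 0.20)) -> knee angle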

  9. Software agents for the dissemination of remote terrestrial sensing data

    NASA Technical Reports Server (NTRS)

    Toomey, Christopher N.; Simoudis, Evangelos; Johnson, Raymond W.; Mark, William S.

    1994-01-01

    Remote terrestrial sensing (RTS) data is constantly being collected from a variety of space-based and earth-based sensors. The collected data, and especially 'value-added' analyses of the data, are finding growing application for commercial, government, and scientific purposes. The scale of this data collection and analysis is truly enormous; e.g., by 1995, the amount of data available in just one sector, NASA space science, will reach 5 petabytes. Moreover, the amount of data, and the value of analyzing the data, are expected to increase dramatically as new satellites and sensors become available (e.g., NASA's Earth Observing System satellites). Lockheed and other companies are beginning to provide data and analysis commercially. A critical issue for the exploitation of collected data is the dissemination of data and value-added analyses to a diverse and widely distributed customer base. Customers must be able to use their computational environment (eventually the National Information Infrastructure) to obtain timely and complete information, without having to know the details of where the relevant data resides and how it is accessed. Customers must be able to routinely use standard, widely available (and, therefore, low cost) analyses, while also being able to readily create, on demand, highly customized analyses to make crucial decisions. The diversity of user needs creates a difficult software problem: how can users easily state their needs, while the computational environment assumes the responsibility of finding (or creating) relevant information, and then delivering the results in a form that users understand? A software agent is a self-contained, active software module that contains an explicit representation of its operational knowledge. This explicit representation allows agents to examine their own capabilities in order to modify their goals to meet changing needs and to take advantage of dynamic opportunities. In addition, the explicit representation allows agents to advertise their capabilities and results to other agents, thereby allowing the collection of agents to reuse each other's work.

  10. Minimum information about a single amplified genome (MISAG) and a metagenome-assembled genome (MIMAG) of bacteria and archaea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowers, Robert M.; Kyrpides, Nikos C.; Stepanauskas, Ramunas

    The number of genomes from uncultivated microbes will soon surpass the number of isolate genomes in public databases (Hugenholtz, Skarshewski, & Parks, 2016). Technological advancements in high-throughput sequencing and assembly, including single-cell genomics and the computational extraction of genomes from metagenomes (GFMs), are largely responsible. Here we propose community standards for reporting the Minimum Information about a Single-Cell Genome (MIxS-SCG) and Minimum Information about Genomes extracted From Metagenomes (MIxS-GFM) specific for Bacteria and Archaea. The standards have been developed in the context of the International Genomics Standards Consortium (GSC) community (Field et al., 2014) and can be viewed as a supplement to other GSC checklists including the Minimum Information about a Genome Sequence (MIGS), Minimum information about a Metagenomic Sequence(s) (MIMS) (Field et al., 2008) and Minimum Information about a Marker Gene Sequence (MIMARKS) (P. Yilmaz et al., 2011). Community-wide acceptance of MIxS-SCG and MIxS-GFM for Bacteria and Archaea will enable broad comparative analyses of genomes from the majority of taxa that remain uncultivated, improving our understanding of microbial function, ecology, and evolution.

  11. High School Class for Gifted Pupils in Physics and Sciences and Pupils' Skills Measured by Standard and Pisa Test

    NASA Astrophysics Data System (ADS)

    Djordjevic, G. S.; Pavlovic-Babic, D.

    2010-01-01

    The "High school class for students with special abilities in physics" was founded in Nis, Serbia (www.pmf.ni.ac.yu/f_odeljenje) in 2003. The basic aim of this project has been introducing a broadened curriculum of physics, mathematics, computer science, as well as chemistry and biology. Now, six years after establishing of this specialized class, and 3 years after the previous report, we present analyses of the pupils' skills in solving rather problem oriented test, as PISA test, and compare their results with the results of pupils who study under standard curricula. More precisely results are compared to the progress results of the pupils in a standard Grammar School and the corresponding classes of the Mathematical Gymnasiums in Nis. Analysis of achievement data should clarify what are benefits of introducing in school system track for gifted students. Additionally, item analysis helps in understanding and improvement of learning strategies' efficacy. We make some conclusions and remarks that may be useful for the future work that aims to increase pupils' intrinsic and instrumental motivation for physics and sciences, as well as to increase the efficacy of teaching physics and science.

  12. Demagnetization Analysis in Excel (DAIE) - An open source workbook in Excel for viewing and analyzing demagnetization data from paleomagnetic discrete samples and u-channels

    NASA Astrophysics Data System (ADS)

    Sagnotti, Leonardo

    2013-04-01

    Modern rock magnetometers and stepwise demagnetization procedures produce large datasets, which require versatile and fast software for display and analysis. Various software packages for paleomagnetic analyses have recently been developed to overcome the problems linked to the limited capability and loss of operability of early codes written in obsolete computer languages and/or for platforms not compatible with modern 64-bit processors. The Demagnetization Analysis in Excel (DAIE) workbook is a new software tool designed to make the analysis of demagnetization data easy and accessible within an application (Microsoft Excel) that is widely used and available on both the Microsoft Windows and Mac OS X operating systems. The widespread use of Excel should guarantee a long working life, since the compatibility and functionality of current Excel files will most likely be maintained as new processors and operating systems are developed. DAIE is designed for viewing and analyzing stepwise demagnetization data from both discrete and u-channel samples. DAIE consists of a single file and has an open modular structure organized in 10 distinct worksheets. The standard demagnetization diagrams and various parameters in common use are shown on the same worksheet, together with selectable parameters and user choices. The characteristic remanence components may be computed by principal component analysis (PCA) on a selected interval of demagnetization steps. PCA results can be saved sample by sample, or automatically by applying the selected choices to all the samples included in the file. The DAIE open structure allows easy personalization, development, and improvement. The workbook has the following features, which may be valuable for various users:
    - operability on nearly all computers and platforms;
    - easy input of demagnetization data by "copy and paste" from ASCII files;
    - easy export of computed parameters and demagnetization plots;
    - complete control of the whole workflow and the possibility for any user to extend the workbook;
    - a modular structure with distinct worksheets for each type of analysis and plot, making implementation and personalization easier;
    - suitability for educational purposes, since all the computations and analyses are easily traceable and accessible;
    - automatic and fast analysis of large batches of demagnetization data, such as those measured on u-channel samples.
    The DAIE workbook and the "User manual" are available for download on a dedicated web site (http://roma2.rm.ingv.it/en/facilities/software/49/daie).
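
    A minimal sketch of the PCA step described above, assuming demagnetization vector endpoints already expressed as Cartesian (north, east, down) components; the function, the anchored-to-origin option and the MAD formulation follow one common paleomagnetic convention and are not DAIE's actual spreadsheet formulas.

    ```python
    # Illustrative only: best-fit characteristic direction through the vector
    # endpoints of a selected interval of demagnetization steps.
    import numpy as np

    def pca_direction(xyz, anchor_origin=False):
        """Return (declination, inclination, MAD) in degrees for an (n, 3) array."""
        pts = np.asarray(xyz, float)
        centre = np.zeros(3) if anchor_origin else pts.mean(axis=0)
        demeaned = pts - centre
        # principal axes of the endpoint cloud via SVD
        _, s, vt = np.linalg.svd(demeaned, full_matrices=False)
        north, east, down = vt[0]                 # first principal direction
        dec = np.degrees(np.arctan2(east, north)) % 360.0
        inc = np.degrees(np.arcsin(down))
        # maximum angular deviation (one common formulation)
        mad = np.degrees(np.arctan2(np.sqrt(s[1] ** 2 + s[2] ** 2), s[0]))
        return dec, inc, mad
    ```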

  13. Standard interface: Twin-coaxial converter

    NASA Technical Reports Server (NTRS)

    Lushbaugh, W. A.

    1976-01-01

    The network operations control center standard interface has been adopted as a standard computer interface for all future minicomputer based subsystem development for the Deep Space Network. Discussed is an intercomputer communications link using a pair of coaxial cables. This unit is capable of transmitting and receiving digital information at distances up to 600 m with complete ground isolation between the communicating devices. A converter is described that allows a computer equipped with the standard interface to use the twin coaxial link.

  14. Quantification of experimental venous thrombus resolution by longitudinal nanogold-enhanced micro-computed tomography.

    PubMed

    Grover, Steven P; Saha, Prakash; Jenkins, Julia; Mukkavilli, Arun; Lyons, Oliver T; Patel, Ashish S; Sunassee, Kavitha; Modarai, Bijan; Smith, Alberto

    2015-12-01

    The assessment of thrombus size following treatments directed at preventing thrombosis or enhancing its resolution has generally relied on physical or histological methods. This cross-sectional design imposes the need for increased numbers of animals for experiments. Micro-computed tomography (microCT) has been used to detect the presence of venous thrombus in experimental models but has yet to be used in a quantitative manner. In this study, we investigate the use of contrast-enhanced microCT for the longitudinal assessment of experimental venous thrombus resolution. Thrombi induced by stenosis of the inferior vena cava in mice were imaged by contrast-enhanced microCT at 1, 7 and 14 days post-induction (n=18). Thrombus volumes were determined longitudinally by segmentation and 3D volume reconstruction of microCT scans and by standard end-point histological analysis at day 14. An additional group of thrombi was analysed solely by histology at 1, 7 and 14 days post-induction (n=15). IVC-resident thrombus was readily detectable by contrast-enhanced microCT. MicroCT-derived measurements of thrombus volume correlated well with time-matched histological analyses (ICC=0.75, P<0.01). Thrombus volumes measured by microCT were significantly greater than those derived from histological analysis (P<0.001). Intra- and inter-observer analyses were highly correlated (ICC=0.99 and 0.91 respectively, P<0.0001). Further histological analysis revealed noticeable levels of contrast agent extravasation into the thrombus that was associated with the presence of neovascular channels, macrophages and intracellular iron deposits. Contrast-enhanced microCT represents a reliable and reproducible method for the longitudinal assessment of venous thrombus resolution, providing powerful paired data. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Study between ANB angle and Wits appraisal in cone beam computed tomography (CBCT)

    PubMed Central

    Cibrián, Rosa; Gandia, Jose L.; Paredes, Vanessa

    2013-01-01

    Objectives: To analyse the ANB and Wits values and to study correlations between these two measurements and other measurements in diagnosing the anteroposterior maxillo-mandibular relationship with CBCT. Study Design: Ninety patients who had previously undergone a CBCT scan (i-CAT®) as a diagnostic record were selected. A 3D cephalometry was designed using one software package, InVivo5®. This cephalometry included 3 reference planes, 3 angle measurements and 1 linear measurement. The mean and standard deviation of each measurement were assessed. Pearson's correlation coefficients were then computed to analyse the significance of each relationship. Results: When classifying the sample according to the anteroposterior relationship, the values obtained for ANB (Class I: 53%; Class II: 37%; Class III: 10%) and Wits (Class I: 35%; Class II: 56%; Class III: 9%) did not coincide, except for the Class III group. However, of the patients classified differently (Class I and Class II) by ANB and Wits, a high percentage of individuals (n=22; 49%) had a mesofacial pattern with a mandibular plane angle within normal values. Correlations were found between ANB and Wits (r=0.262), the occlusal plane angle and ANB (r=0.426), and the mandibular plane angle and Wits (r=0.242). No correlation was found between either Wits or ANB and the age of the individuals. Conclusions: ANB and Wits must both be included in 3D cephalometric analyses, as both are necessary for a more accurate diagnosis of the maxillo-mandibular relationship of the patients. Key words: Cone beam computed tomography, ANB, Wits, cephalometrics. PMID:23722136
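
    A minimal sketch of the correlation step, assuming the 3D measurements have been exported to a hypothetical CSV file with the column names shown; scipy's pearsonr is used here for illustration.

    ```python
    # Illustrative only: Pearson correlations between the cephalometric measurements.
    import pandas as pd
    from scipy.stats import pearsonr

    df = pd.read_csv("cephalometry_3d.csv")   # hypothetical export of the 3D measurements
    pairs = [("ANB", "Wits"), ("occlusal_plane", "ANB"), ("mandibular_plane", "Wits")]
    for x, y in pairs:
        r, p = pearsonr(df[x], df[y])
        print(f"{x} vs {y}: r = {r:.3f}, p = {p:.4f}")
    ```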

  16. Data Management Standards in Computer-aided Acquisition and Logistic Support (CALS)

    NASA Technical Reports Server (NTRS)

    Jefferson, David K.

    1990-01-01

    Viewgraphs and discussion on data management standards in computer-aided acquisition and logistic support (CALS) are presented. CALS is intended to reduce cost, increase quality, and improve timeliness of weapon system acquisition and support by greatly improving the flow of technical information. The phase 2 standards, industrial environment, are discussed. The information resource dictionary system (IRDS) is described.

  17. Principal component analysis in construction of 3D human knee joint models using a statistical shape model method.

    PubMed

    Tsai, Tsung-Yuan; Li, Jing-Sheng; Wang, Shaobai; Li, Pingyue; Kwon, Young-Min; Li, Guoan

    2015-01-01

    The statistical shape model (SSM) method that uses 2D images of the knee joint to predict the three-dimensional (3D) joint surface model has been reported in the literature. In this study, we constructed a SSM database using 152 human computed tomography (CT) knee joint models, including the femur, tibia and patella and analysed the characteristics of each principal component of the SSM. The surface models of two in vivo knees were predicted using the SSM and their 2D bi-plane fluoroscopic images. The predicted models were compared to their CT joint models. The differences between the predicted 3D knee joint surfaces and the CT image-based surfaces were 0.30 ± 0.81 mm, 0.34 ± 0.79 mm and 0.36 ± 0.59 mm for the femur, tibia and patella, respectively (average ± standard deviation). The computational time for each bone of the knee joint was within 30 s using a personal computer. The analysis of this study indicated that the SSM method could be a useful tool to construct 3D surface models of the knee with sub-millimeter accuracy in real time. Thus, it may have a broad application in computer-assisted knee surgeries that require 3D surface models of the knee.
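
    A minimal sketch of the core SSM construction described above, assuming the training surfaces have already been brought into vertex correspondence; the function names are illustrative, and the registration and 2D fluoroscopic fitting steps of the study are not shown.

    ```python
    # Illustrative only: mean shape plus principal shape modes from corresponding
    # vertex coordinates of the training surfaces.
    import numpy as np

    def build_ssm(shapes, n_modes=10):
        """shapes: (n_subjects, n_vertices, 3) arrays with vertex correspondence."""
        X = np.asarray(shapes, float).reshape(len(shapes), -1)   # flatten each shape
        mean_shape = X.mean(axis=0)
        _, s, vt = np.linalg.svd(X - mean_shape, full_matrices=False)
        modes = vt[:n_modes]                                     # principal shape modes
        variances = (s[:n_modes] ** 2) / (len(shapes) - 1)
        return mean_shape, modes, variances

    def synthesize(mean_shape, modes, coeffs):
        """New surface as the mean shape plus a weighted sum of shape modes."""
        return (mean_shape + coeffs @ modes).reshape(-1, 3)
    ```

    With 152 training knees, a modest number of leading modes captures most of the anatomical variation; fitting the mode coefficients to the bi-plane fluoroscopic images is the step the study adds on top of this.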

  18. Development of Automatic Control of Bayer Plant Digestion

    NASA Astrophysics Data System (ADS)

    Riffaud, J. P.

    Supervisory computer control has been achieved in Alcan's Bayer Plants at Arvida, Quebec, Canada. The purpose of the automatic control system is to stabilize, and consequently increase, the alumina/caustic ratio within the digester train and in the blow-off liquor. Measurements of the electrical conductivity of the liquor are obtained from electrodeless conductivity meters. These signals, along with several others, are scanned by the computer and converted to engineering units using specific relationships which are updated periodically for calibration purposes. At regular time intervals, ratio values are compared to target values and adjustments are made to the bauxite flow entering the digesters. Dead time compensation included in the control algorithm enables faster corrections. Modification of the production rate is achieved through careful timing of various flow changes. Calibration of the conductivity meters is achieved by sampling at intervals the liquor flowing through them and analysing it with a thermometric titrator. Calibration of the thermometric titrator is done at intervals with a standard solution. Calculations for both calibrations are performed by computer from data entered by the analyst. The computer was used for on-line data collection, modelling of the digester system, calculation of disturbances and simulation of control strategies before implementing the most successful strategy in the Plant. Control of the ratio has been improved by the integrated system, resulting in increased Plant productivity.
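
    A heavily simplified, hypothetical sketch of one supervisory scan as described above; the calibration relationship, controller gain and dead-time handling are assumptions for illustration and are far simpler than the plant's actual algorithm.

    ```python
    # Illustrative only: convert a conductivity reading to a ratio, compare with
    # the target and trim the bauxite flow.
    def ratio_from_conductivity(conductivity, gain=0.0010, offset=0.35):
        """Hypothetical calibration curve, periodically updated from titrator results."""
        return offset + gain * conductivity

    def supervisory_scan(conductivity, target_ratio, bauxite_flow,
                         in_transit_correction, controller_gain=40.0):
        """One scan: proportional trim of the bauxite flow toward the target ratio.
        Subtracting the correction still 'in transit' through the digester train is
        a crude stand-in for dead-time compensation."""
        ratio = ratio_from_conductivity(conductivity)
        error = target_ratio - ratio
        correction = controller_gain * error - in_transit_correction
        return bauxite_flow + correction, in_transit_correction + correction
    ```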

  19. Objective definition of rosette shape variation using a combined computer vision and data mining approach.

    PubMed

    Camargo, Anyela; Papadopoulou, Dimitra; Spyropoulou, Zoi; Vlachonasios, Konstantinos; Doonan, John H; Gay, Alan P

    2014-01-01

    Computer-vision-based measurements of phenotypic variation have implications for crop improvement and food security because they are intrinsically objective. It should therefore be possible to use such approaches to select robust genotypes. However, plants are morphologically complex and identification of meaningful traits from automatically acquired image data is not straightforward. Bespoke algorithms can be designed to capture and/or quantitate specific features, but this approach is inflexible and is not generally applicable to a wide range of traits. In this paper, we have used industry-standard computer vision techniques to extract a wide range of features from images of genetically diverse Arabidopsis rosettes growing under non-stimulated conditions, and then used statistical analysis to identify those features that provide good discrimination between ecotypes. This analysis indicates that almost all the observed shape variation can be described by 5 principal components. We describe an easily implemented pipeline including image segmentation, feature extraction and statistical analysis. This pipeline provides a cost-effective and inherently scalable method to parameterise and analyse variation in rosette shape. The acquisition of images does not require any specialised equipment, and the computer routines for image processing and data analysis have been implemented using open source software. Source code for the data analysis is written in R. The equations used to calculate the image descriptors are also provided.
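
    The paper's pipeline is implemented in R with open-source tools; the sketch below is an illustrative Python analogue of the same idea (crude threshold segmentation, region-level shape descriptors, then PCA across plants), with hypothetical file names.

    ```python
    # Illustrative only: segment each rosette, extract simple shape descriptors,
    # then summarise variation across plants with PCA.
    import glob
    import numpy as np
    from skimage import io, filters, measure
    from sklearn.decomposition import PCA

    def rosette_features(path):
        img = io.imread(path, as_gray=True)
        mask = img > filters.threshold_otsu(img)              # crude segmentation
        props = max(measure.regionprops(mask.astype(int)), key=lambda r: r.area)
        return [props.area, props.perimeter, props.eccentricity,
                props.solidity, props.extent]

    paths = sorted(glob.glob("rosettes/*.png"))               # assumes several images
    features = np.array([rosette_features(p) for p in paths])
    scores = PCA(n_components=2).fit_transform(features)      # a few PCs capture most variation
    ```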

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harriman, D.A.; Sargent, B.P.

    Groundwater quality was evaluated in seven confined aquifers and the water table aquifer in east-central New Jersey based on 237 analyses of samples collected in 1981-82, and 225 older analyses. Investigation of the effect of land use on water quality and several sampling network proposals for the region are reported. Iron (Fe) and manganese (Mn) concentrations exceed US EPA drinking water standards in some wells screened in the Potomac-Raritan-Magothy aquifer system. Sodium (Na) concentrations in samples from three wells more than 800 ft deep in the Englishtown aquifer exceed the standard. Iron and Mn concentrations in this aquifer may also exceed the standards. Iron concentrations in the Wenonah-Mount Laurel aquifer exceed the standard. Based on 15 analyses of water from the Vincetown aquifer, Mn is the only constituent that exceeds the drinking water standard. In the Manasquan aquifer, 4 of the 16 Na determinations exceed the standard, and 8 of 16 Fe determinations exceed the standard. Water quality in the Atlantic City 800-ft sand is generally satisfactory. However, 12 Fe and 1 of 12 Mn determinations exceed the standards. For the Rio Grande water-bearing zone, 1 of 3 Fe determinations exceed the standard. The Kirkwood-Cohansey aquifer system was the most thoroughly sampled (249 chemical analyses from 209 wells). Dissolved solids, chloride, Fe, nitrate, and Mn concentrations exceed drinking water standards in some areas. 76 refs., 36 figs., 12 tabs.

  1. Method of estimating flood-frequency parameters for streams in Idaho

    USGS Publications Warehouse

    Kjelstrom, L.C.; Moffatt, R.L.

    1981-01-01

    Skew coefficients for the log-Pearson type III distribution are generalized on the basis of some similarity of floods in the Snake River basin and other parts of Idaho. Generalized skew coefficients aid in shaping flood-frequency curves because skew coefficients computed from gaging stations having relatively short periods of peak flow records can be unreliable. Generalized skew coefficients can be obtained for a gaging station from one of three maps in this report. The map to be used depends on whether (1) snowmelt floods are dominant (generally when more than 20 percent of the drainage area is above 6,000 feet altitude), (2) rainstorm floods are dominant (generally when the mean altitude is less than 3,000 feet), or (3) either snowmelt or rainstorm floods can be the annual maximum discharge. For the latter case, frequency curves constructed using separate arrays of each type of runoff can be combined into one curve, which, for some stations, is significantly different from the frequency curve constructed using only annual maximum discharges. For 269 gaging stations, flood-frequency curves that include the generalized skew coefficients in the computation of the log-Pearson type III equation tend to fit the data better than previous analyses. Frequency curves for ungaged sites can be derived by estimating three statistics of the log-Pearson type III distribution. The mean and standard deviation of logarithms of annual maximum discharges are estimated by regression equations that use basin characteristics as independent variables. Skew coefficient estimates are the generalized skews. The log-Pearson type III equation is then applied with the three estimated statistics to compute the discharge at selected exceedance probabilities. Standard errors at the 2-percent exceedance probability range from 41 to 90 percent. (USGS)
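
    A minimal sketch of the final computation step, assuming the three log-Pearson type III statistics (mean, standard deviation and generalized skew of the log10 annual peaks) have already been estimated for a site; scipy's pearson3 distribution supplies the frequency factor, and the numbers shown are illustrative only.

    ```python
    # Illustrative only: discharge at a selected annual exceedance probability
    # from the three log-Pearson type III statistics of the log10 annual peaks.
    from scipy.stats import pearson3

    def lp3_discharge(mean_log_q, std_log_q, skew, exceed_prob):
        """Discharge (same units as the peaks) at the given exceedance probability."""
        log_q = pearson3.ppf(1.0 - exceed_prob, skew, loc=mean_log_q, scale=std_log_q)
        return 10.0 ** log_q

    # e.g. the 2-percent exceedance (50-year) flood for made-up statistics
    print(lp3_discharge(mean_log_q=3.1, std_log_q=0.25, skew=0.2, exceed_prob=0.02))
    ```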

  2. Cost-effectiveness of computer-assisted training in cognitive-behavioral therapy as an adjunct to standard care for addiction.

    PubMed

    Olmstead, Todd A; Ostrow, Cary D; Carroll, Kathleen M

    2010-08-01

    To determine the cost-effectiveness, from clinic and patient perspectives, of a computer-based version of cognitive-behavioral therapy (CBT4CBT) as an addition to regular clinical practice for substance dependence. PARTICIPANTS, DESIGN AND MEASUREMENTS: This cost-effectiveness study is based on a randomized clinical trial in which 77 individuals seeking treatment for substance dependence at an outpatient community setting were randomly assigned to treatment as usual (TAU) or TAU plus biweekly access to computer-based training in CBT (TAU plus CBT4CBT). The primary patient outcome measure was the total number of drug-free specimens provided during treatment. Incremental cost-effectiveness ratios (ICERs) and cost-effectiveness acceptability curves (CEACs) were used to determine the cost-effectiveness of TAU plus CBT4CBT relative to TAU alone. Results are presented from both the clinic and patient perspectives and are shown to be robust to (i) sensitivity analyses and (ii) a secondary objective patient outcome measure. The per patient cost of adding CBT4CBT to standard care was $39 ($27) from the clinic (patient) perspective. From the clinic (patient) perspective, TAU plus CBT4CBT is likely to be cost-effective when the threshold value to decision makers of an additional drug-free specimen is greater than approximately $21 ($15), and TAU alone is likely to be cost-effective when the threshold value is less than approximately $21 ($15). The ICERs for TAU plus CBT4CBT also compare favorably to ICERs reported elsewhere for other empirically validated therapies, including contingency management. TAU plus CBT4CBT appears to be a good value from both the clinic and patient perspectives. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
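
    A minimal sketch of the ICER comparison described above; the cost and outcome numbers are illustrative placeholders, not the study's data.

    ```python
    # Illustrative only: incremental cost-effectiveness ratio (ICER).
    def icer(cost_new, cost_old, effect_new, effect_old):
        """Extra cost per additional unit of effect (e.g., per drug-free specimen)."""
        return (cost_new - cost_old) / (effect_new - effect_old)

    # TAU plus CBT4CBT vs TAU alone, clinic perspective (made-up numbers)
    print(icer(cost_new=1039.0, cost_old=1000.0, effect_new=7.9, effect_old=6.0))
    ```

    Decision makers then compare the ICER with the threshold value they place on one additional drug-free specimen, which is the comparison the acceptability curves summarize.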

  3. Fractional Flow Reserve and Coronary Computed Tomographic Angiography: A Review and Critical Analysis.

    PubMed

    Hecht, Harvey S; Narula, Jagat; Fearon, William F

    2016-07-08

    Invasive fractional flow reserve (FFR) is now the gold standard for intervention. Noninvasive functional imaging analyses derived from coronary computed tomographic angiography (CTA) offer alternatives for evaluating lesion-specific ischemia. CT-FFR, CT myocardial perfusion imaging, and transluminal attenuation gradient/corrected contrast opacification have been studied using invasive FFR as the gold standard. CT-FFR has demonstrated significant improvement in specificity and positive predictive value compared with CTA alone for predicting FFR of ≤0.80, as well as decreasing the frequency of nonobstructive invasive coronary angiography. High-risk plaque characteristics have also been strongly implicated in abnormal FFR. Myocardial computed tomographic perfusion is an alternative method with promising results; it involves more radiation and contrast. Transluminal attenuation gradient/corrected contrast opacification is more controversial and may be more related to vessel diameter than stenosis. Important considerations remain: (1) improvement of CTA quality to decrease unevaluable studies, (2) is the diagnostic accuracy of CT-FFR sufficient? (3) can CT-FFR guide intervention without invasive FFR confirmation? (4) what are the long-term outcomes of CT-FFR-guided treatment and how do they compare with other functional imaging-guided paradigms? (5) what degree of stenosis on CTA warrants CT-FFR? (6) how should high-risk plaque be incorporated into treatment decisions? (7) how will CT-FFR influence other functional imaging test utilization, and what will be the effect on the practice of cardiology? (8) will a workstation-based CT-FFR be mandatory? Rapid progress to date suggests that CTA-based lesion-specific ischemia will be the gatekeeper to the cardiac catheterization laboratory and will transform the world of intervention. © 2016 American Heart Association, Inc.

  4. Modeling functional Magnetic Resonance Imaging (fMRI) experimental variables in the Ontology of Experimental Variables and Values (OoEVV)

    PubMed Central

    Burns, Gully A.P.C.; Turner, Jessica A.

    2015-01-01

    Neuroimaging data is raw material for cognitive neuroscience experiments, leading to scientific knowledge about human neurological and psychological disease, language, perception, attention and ultimately, cognition. The structure of the variables used in the experimental design defines the structure of the data gathered in the experiments; this in turn structures the interpretative assertions that may be presented as experimental conclusions. Representing these assertions and the experimental data which support them in a computable way means that they could be used in logical reasoning environments, i.e. for automated meta-analyses, or linking hypotheses and results across different levels of neuroscientific experiments. Therefore, a crucial first step in being able to represent neuroimaging results in a clear, computable way is to develop representations for the scientific variables involved in neuroimaging experiments. These representations should be expressive, computable, valid, extensible, and easy-to-use. They should also leverage existing semantic standards to interoperate easily with other systems. We present an ontology design pattern called the Ontology of Experimental Variables and Values (OoEVV). This is designed to provide a lightweight framework to capture mathematical properties of data, with appropriate ‘hooks’ to permit linkage to other ontology-driven projects (such as the Ontology of Biomedical Investigations, OBI). We instantiate the OoEVV system with a small number of functional Magnetic Resonance Imaging datasets, to demonstrate the system’s ability to describe the variables of a neuroimaging experiment. OoEVV is designed to be compatible with the XCEDE neuroimaging data standard for data collection terminology, and with the Cognitive Paradigm Ontology (CogPO) for specific reasoning elements of neuroimaging experimental designs. PMID:23684873

  5. Stability and change in screen-based sedentary behaviours and associated factors among Norwegian children in the transition between childhood and adolescence

    PubMed Central

    2012-01-01

    Background In order to inform interventions to prevent sedentariness, more longitudinal studies are needed focusing on stability and change over time in multiple sedentary behaviours. This paper investigates patterns of stability and change in TV/DVD use, computer/electronic game use and total screen time (TST) and factors associated with these patterns among Norwegian children in the transition between childhood and adolescence. Methods The baseline of this longitudinal study took place in September 2007 and included 975 students from 25 control schools of an intervention study, the HEalth In Adolescents (HEIA) study. The first follow-up took place in May 2008 and the second follow-up in May 2009, with 885 students participating at all time points (average age at baseline = 11.2, standard deviation ± 0.3). Time used for/spent on TV/DVD and computer/electronic games was self-reported, and a TST variable (hours/week) was computed. Tracking analyses based on absolute and rank measures, as well as regression analyses to assess factors associated with change in TST and with tracking high TST were conducted. Results Time spent on all sedentary behaviours investigated increased in both genders. Findings based on absolute and rank measures revealed a fair to moderate level of tracking over the 2 year period. High parental education was inversely related to an increase in TST among females. In males, self-efficacy related to barriers to physical activity and living with married or cohabitating parents were inversely related to an increase in TST. Factors associated with tracking high vs. low TST in the multinomial regression analyses were low self-efficacy and being of an ethnic minority background among females, and low self-efficacy, being overweight/obese and not living with married or cohabitating parents among males. Conclusions Use of TV/DVD and computer/electronic games increased with age and tracked over time in this group of 11-13 year old Norwegian children. Interventions targeting these sedentary behaviours should thus be introduced early. The identified modifiable and non-modifiable factors associated with change in TST and tracking of high TST should be taken into consideration when planning such interventions. PMID:22309715

  6. Designing for deeper learning in a blended computer science course for middle school students

    NASA Astrophysics Data System (ADS)

    Grover, Shuchi; Pea, Roy; Cooper, Stephen

    2015-04-01

    The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. Prior computing experiences (as measured by a pretest) and math ability were found to be strong predictors of learning outcomes.

  7. SSME main combustion chamber and nozzle flowfield analysis

    NASA Technical Reports Server (NTRS)

    Farmer, R. C.; Wang, T. S.; Smith, S. D.; Prozan, R. J.

    1986-01-01

    An investigation is presented of the computational fluid dynamics (CFD) tools which would accurately analyze main combustion chamber and nozzle flow. The importance of combustion phenomena and local variations in mixture ratio are fully appreciated; however, the computational aspects of the gas dynamics involved were the sole issues addressed. The CFD analyses made are first compared with conventional nozzle analyses to determine the accuracy for steady flows, and then transient analyses are discussed.

  8. Some Observations on Damage Tolerance Analyses in Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Dawicke, David S.; Hampton, Roy W.

    2017-01-01

    AIAA standards S080 and S081 are applicable for certification of metallic pressure vessels (PV) and composite overwrap pressure vessels (COPV), respectively. These standards require damage tolerance analyses with a minimum reliable detectable flaw/crack and demonstration of safe life four times the service life with these cracks at the worst-case location in the PVs and oriented perpendicular to the maximum principal tensile stress. The standards require consideration of semi-elliptical surface cracks in the range of aspect ratios (crack depth a to half of the surface length c, i.e., (a/c) of 0.2 to 1). NASA-STD-5009 provides the minimum reliably detectable standard crack sizes (90/95 probability of detection, POD) for several non-destructive evaluation (NDE) methods (eddy current (ET), penetrant (PT), radiography (RT) and ultrasonic (UT)) for the two limits of the aspect ratio range required by the AIAA standards. This paper tries to answer the questions: can the safe life analysis consider only the life for the crack sizes at the two required limits, or endpoints, of the (a/c) range for the NDE method used, or does the analysis need to consider values within that range? What would be an appropriate method to interpolate 90/95 POD crack sizes at intermediate (a/c) values? Several procedures to develop combinations of a and c within the specified range are explored. A simple linear relationship between a and c is chosen to compare the effects of seven different approaches to determine combinations of a_j and c_j that are between the (a/c) endpoints. Two of the seven are selected for evaluation: Approach I, the simple linear relationship, and a more conservative option, Approach III. For each of these two Approaches, the lives are computed for initial semi-elliptic crack configurations in a plate subjected to remote tensile fatigue loading with an R-ratio of 0.1, for an assumed material evaluated using NASGRO® version 8.1. These calculations demonstrate that for this loading, using Approach I and the initial detectable crack sizes at the (a/c) endpoints in 5009 specified for the ET and UT NDE methods, the smallest life is not at the two required limits of the (a/c) range, but rather is at an intermediate configuration in the range (a/c) of 0.4 to 0.6. Similar analyses using both Approach I and III with the initial detectable crack size at the (a/c) endpoints in 5009 for PT NDE showed the smallest life may be at an (a/c) endpoint or an intermediate (a/c), depending upon which Approach is used. As such, analyses that interrogate only the two (a/c) values of 0.2 and 1 may result in unconservative life predictions. The standard practice may need to be revised based on these results.
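
    A minimal sketch of the kind of interpolation explored as Approach I, assuming a simple linear relation between the two NDE-detectable endpoint cracks; the endpoint sizes below are illustrative and are not the NASA-STD-5009 values.

    ```python
    # Illustrative only: intermediate (a_j, c_j) crack combinations between the
    # endpoint cracks at a/c = 0.2 and a/c = 1.0, whose lives are then checked
    # individually.
    import numpy as np

    def intermediate_cracks(endpoint_low, endpoint_high, n=9):
        """endpoint_low/high: (a, c) pairs at a/c = 0.2 and a/c = 1.0."""
        (a1, c1), (a2, c2) = endpoint_low, endpoint_high
        t = np.linspace(0.0, 1.0, n)
        a = a1 + t * (a2 - a1)
        c = c1 + t * (c2 - c1)          # simple linear relation between a and c
        return np.column_stack([a, c, a / c])

    # made-up eddy-current-like sizes (inches); not the NASA-STD-5009 values
    print(intermediate_cracks((0.020, 0.100), (0.050, 0.050)))
    ```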

  9. Monitoring and modeling to predict Escherichia coli at Presque Isle Beach 2, City of Erie, Erie County, Pennsylvania

    USGS Publications Warehouse

    Zimmerman, Tammy M.

    2006-01-01

    The Lake Erie shoreline in Pennsylvania spans nearly 40 miles and is a valuable recreational resource for Erie County. Nearly 7 miles of the Lake Erie shoreline lies within Presque Isle State Park in Erie, Pa. Concentrations of Escherichia coli (E. coli) bacteria at permitted Presque Isle beaches occasionally exceed the single-sample bathing-water standard, resulting in unsafe swimming conditions and closure of the beaches. E. coli concentrations and other water-quality and environmental data collected at Presque Isle Beach 2 during the 2004 and 2005 recreational seasons were used to develop models using tobit regression analyses to predict E. coli concentrations. All variables statistically related to E. coli concentrations were included in the initial regression analyses, and after several iterations, only those explanatory variables that made the models significantly better at predicting E. coli concentrations were included in the final models. Regression models were developed using data from 2004, 2005, and the combined 2-year dataset. Variables in the 2004 model and the combined 2004-2005 model were log10 turbidity, rain weight, wave height (calculated), and wind direction. Variables in the 2005 model were log10 turbidity and wind direction. Explanatory variables not included in the final models were water temperature, streamflow, wind speed, and current speed; model results indicated these variables did not meet significance criteria at the 95-percent confidence level (probabilities were greater than 0.05). The predicted E. coli concentrations produced by the models were used to develop probabilities that concentrations would exceed the single-sample bathing-water standard for E. coli of 235 colonies per 100 milliliters. Analysis of the exceedence probabilities helped determine a threshold probability for each model, chosen such that the correct number of exceedences and nonexceedences was maximized and the number of false positives and false negatives was minimized. Future samples with computed exceedence probabilities higher than the selected threshold probability, as determined by the model, will likely exceed the E. coli standard and a beach advisory or closing may need to be issued; computed exceedence probabilities lower than the threshold probability will likely indicate the standard will not be exceeded. Additional data collected each year can be used to test and possibly improve the model. This study will aid beach managers in more rapidly determining when waters are not safe for recreational use and, subsequently, when to issue beach advisories or closings.
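
    A minimal sketch of how a predicted concentration is turned into an exceedence probability and compared against a threshold probability, assuming a normally distributed model error on the log10 scale; the residual standard error, threshold value and function names are illustrative assumptions rather than the study's fitted tobit model.

    ```python
    # Illustrative only: probability that the true E. coli concentration exceeds
    # the 235 col/100 mL single-sample standard, given a model prediction.
    from math import log10
    from scipy.stats import norm

    def exceedence_probability(predicted_log10, model_se):
        standard_log10 = log10(235.0)
        return 1.0 - norm.cdf((standard_log10 - predicted_log10) / model_se)

    def advisory_needed(predicted_log10, model_se, threshold=0.30):
        """True if the computed probability exceeds the chosen threshold probability."""
        return exceedence_probability(predicted_log10, model_se) > threshold

    print(advisory_needed(predicted_log10=2.1, model_se=0.4))
    ```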

  10. A dc model for power switching transistors suitable for computer-aided design and analysis

    NASA Technical Reports Server (NTRS)

    Wilson, P. M.; George, R. T., Jr.; Owen, H. A.; Wilson, T. G.

    1979-01-01

    A model for bipolar junction power switching transistors whose parameters can be readily obtained by the circuit design engineer, and which can be conveniently incorporated into standard computer-based circuit analysis programs is presented. This formulation results from measurements which may be made with standard laboratory equipment. Measurement procedures, as well as a comparison between actual and computed results, are presented.

  11. Utilization of computer technology by science teachers in public high schools and the impact of standardized testing

    NASA Astrophysics Data System (ADS)

    Priest, Richard Harding

    A significant percentage of high school science teachers are not using computers to teach their students or prepare them for standardized testing. A survey of high school science teachers was conducted to determine how they are having students use computers in the classroom, why science teachers are not using computers in the classroom, which variables were relevant to their not using computers, and what the effects of standardized testing are on the use of technology in the high school science classroom. A self-administered questionnaire was developed to measure these aspects of computer integration and demographic information. A follow-up telephone interview survey of a portion of the original sample was conducted to clarify questions, correct misunderstandings, and draw out more holistic descriptions from the subjects. The primary method used to analyze the quantitative data was frequency distributions. Multiple regression analysis was used to investigate the relationships between the barriers and facilitators and the dimensions of instructional use, frequency, and importance of the use of computers. All high school science teachers in a large urban/suburban school district were sent surveys. A response rate of 58% resulted from two mailings of the survey. It was found that factors contributing to science teachers not using computers included too few up-to-date computers in their classrooms and other educational commitments and duties that do not leave them enough time to prepare lessons that include technology. While a high percentage of science teachers thought their school and district administrations were supportive of technology, they also believed more inservice technology training and follow-up activities to support that training are needed and more software needs to be created. The majority of the science teachers do not use the computer to help students prepare for standardized tests because they believe they can prepare students more efficiently without a computer. Nearly half of the teachers, however, gave lack of time to prepare instructional materials and lack of a means to project a computer image to the whole class as reasons they do not use computers. A significant percentage thought science standardized testing was having a negative effect on computer use.

  12. Computational Analyses of Offset Stream Nozzles for Noise Reduction

    NASA Technical Reports Server (NTRS)

    Dippold, Vance, III; Foster, Lancert; Wiese, Michael

    2007-01-01

    The Wind computational fluid dynamics code was used to perform a series of simulations on two offset stream nozzle concepts for jet noise reduction. The first concept used an S-duct to direct the secondary stream to the lower side of the nozzle. The second concept used vanes to turn the secondary flow downward. The analyses were completed in preparation of tests conducted in the NASA Glenn Research Center Aeroacoustic Propulsion Laboratory. The offset stream nozzles demonstrated good performance and reduced the amount of turbulence on the lower side of the jet plume. The computer analyses proved instrumental in guiding the development of the final test configurations and giving insight into the flow mechanics of offset stream nozzles. The computational predictions were compared with flowfield results from the jet rig testing and showed excellent agreement.

  13. Unix becoming healthcare's standard operating system.

    PubMed

    Gardner, E

    1991-02-11

    An unfamiliar buzzword is making its way into healthcare executives' vocabulary, as well as their computer systems. Unix is being touted by many industry observers as the most likely candidate to be a standard operating system for minicomputers, mainframes and computer networks.

  14. CYBER-205 Devectorizer

    NASA Technical Reports Server (NTRS)

    Lakeotes, Christopher D.

    1990-01-01

    DEVECT (CYBER-205 Devectorizer) is CYBER-205 FORTRAN source-language-preprocessor computer program reducing vector statements to standard FORTRAN. In addition, DEVECT has many other standard and optional features simplifying conversion of vector-processor programs for CYBER 200 to other computers. Written in FORTRAN IV.

  15. Illinois Occupational Skill Standards: Information Technology Operate Cluster.

    ERIC Educational Resources Information Center

    Illinois Occupational Skill Standards and Credentialing Council, Carbondale.

    This document contains Illinois Occupational Skill Standards for occupations in the Information Technology Operate Cluster (help desk support, computer maintenance and technical support technician, systems operator, application and computer support specialist, systems administrator, network administrator, and database administrator). The skill…

  16. ARES (Automated Residential Energy Standard) 1.2: User's guide, in support of proposed interim energy conservation voluntary performance standards for new non-federal residential buildings: Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The ARES (Automated Residential Energy Standard) User's Guide is designed to help the user successfully operate the ARES computer program. This guide assumes that the user is familiar with basic PC skills such as using a keyboard and loading a disk drive. The ARES computer program was designed to assist building code officials in creating a residential energy standard based on local climate and costs.

  17. Computational Fluid Dynamics Assessment Associated with Transcatheter Heart Valve Prostheses: A Position Paper of the ISO Working Group.

    PubMed

    Wei, Zhenglun Alan; Sonntag, Simon Johannes; Toma, Milan; Singh-Gryzbon, Shelly; Sun, Wei

    2018-04-19

    The governing international standard for the development of prosthetic heart valves is International Organization for Standardization (ISO) 5840. This standard requires the assessment of the thrombus potential of transcatheter heart valve substitutes using an integrated thrombus evaluation. Besides experimental flow field assessment and ex vivo flow testing, computational fluid dynamics is a critical component of this integrated approach. This position paper is intended to provide and discuss best practices for the setup of a computational model, numerical solving, post-processing, data evaluation and reporting, as it relates to transcatheter heart valve substitutes. This paper is not intended to be a review of current computational technology; instead, it represents the position of the ISO working group consisting of experts from academia and industry with regards to considerations for computational fluid dynamic assessment of transcatheter heart valve substitutes.

  18. ParallelStructure: A R Package to Distribute Parallel Runs of the Population Genetics Program STRUCTURE on Multi-Core Computers

    PubMed Central

    Besnier, Francois; Glover, Kevin A.

    2013-01-01

    This software package provides an R-based framework to make use of multi-core computers when running analyses in the population genetics program STRUCTURE. It is especially addressed to those users of STRUCTURE dealing with numerous and repeated data analyses who could take advantage of an efficient script to automatically distribute STRUCTURE jobs among multiple processors. It also includes additional functions to divide analyses among combinations of populations within a single data set without the need to manually produce multiple projects, as is currently the case in STRUCTURE. The package consists of two main functions, MPI_structure() and parallel_structure(), as well as an example data file. We compared computing-time performance for these example data on two computer architectures and showed that the use of the present functions can result in several-fold improvements in computation time. ParallelStructure is freely available at https://r-forge.r-project.org/projects/parallstructure/. PMID:23923012

  19. Distributed geospatial model sharing based on open interoperability standards

    USGS Publications Warehouse

    Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin

    2009-01-01

    Numerous geospatial computational models have been developed based on sound principles and published in journals or presented in conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required for Web Processing Service (WPS) standards, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared on model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.

  20. Higgs boson mass in the standard model at two-loop order and beyond

    DOE PAGES

    Martin, Stephen P.; Robertson, David G.

    2014-10-01

    We calculate the mass of the Higgs boson in the standard model in terms of the underlying Lagrangian parameters at complete 2-loop order with leading 3-loop corrections. A computer program implementing the results is provided. The program also computes and minimizes the standard model effective potential in Landau gauge at 2-loop order with leading 3-loop corrections.

  1. Generalized environmental control and life support system computer program (G189A) configuration control. [computer subroutine libraries for shuttle orbiter analyses

    NASA Technical Reports Server (NTRS)

    Blakely, R. L.

    1973-01-01

    A G189A simulation of the shuttle orbiter EC/LSS was prepared and used to study payload support capabilities. Two master program libraries of the G189A computer program were prepared for the NASA/JSC computer system. Several new component subroutines were added to the G189A program library and many existing subroutines were revised to improve their capabilities. A number of special analyses were performed in support of a NASA/JSC shuttle orbiter EC/LSS payload support capability study.

  2. MOVES-Matrix and distributed computing for microscale line source dispersion analysis.

    PubMed

    Liu, Haobing; Xu, Xiaodan; Rodgers, Michael O; Xu, Yanzhi Ann; Guensler, Randall L

    2017-07-01

    MOVES and AERMOD are the U.S. Environmental Protection Agency's recommended models for use in project-level transportation conformity and hot-spot analysis. However, the structure and algorithms involved in running MOVES make analyses cumbersome and time-consuming. Likewise, the modeling setup process in AERMOD, including extensive data requirements and required input formats, leads to a high potential for analysis error in dispersion modeling. This study presents a distributed computing method for line source dispersion modeling that integrates MOVES-Matrix, a high-performance emission modeling tool, with the microscale dispersion models CALINE4 and AERMOD. MOVES-Matrix was prepared by iteratively running MOVES across all possible combinations of vehicle source type, fuel, operating conditions, and environmental parameters to create a huge multi-dimensional emission rate lookup matrix. AERMOD and CALINE4 are connected with MOVES-Matrix in a distributed computing cluster using a series of Python scripts. This streamlined system built on MOVES-Matrix generates exactly the same emission rates and concentration results as using MOVES with AERMOD and CALINE4, which are regulatory models approved by the U.S. EPA for conformity analysis, but the approach is more than 200 times faster than using the MOVES graphical user interface. Because AERMOD requires detailed meteorological input, which is difficult to obtain, this study also recommends using CALINE4 as a screening tool for identifying potential areas that may exceed air quality standards before using AERMOD (and identifying areas that are exceedingly unlikely to exceed air quality standards). The CALINE4 worst-case method yields consistently higher concentration results than AERMOD for all comparisons in this paper, as expected given the nature of the meteorological data employed.
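
    A toy sketch of the MOVES-Matrix idea, a pre-computed multi-dimensional emission-rate lookup that replaces repeated MOVES runs; the dimensions, bin counts and placeholder values below are invented for illustration and are far smaller than the real matrix.

    ```python
    # Illustrative only: O(1) emission-rate lookup in place of a full MOVES run.
    import numpy as np

    # axes (hypothetical): temperature bin, humidity bin, source type, operating-mode bin
    rates = np.random.rand(12, 10, 13, 23)          # placeholder rates, g/hr

    def emission_rate(temp_bin, humid_bin, source_idx, opmode_bin):
        """Look up the pre-computed rate for one combination of conditions."""
        return rates[temp_bin, humid_bin, source_idx, opmode_bin]

    def link_emissions(temp_bin, humid_bin, source_idx, opmode_hours):
        """A link's emissions: sum of rate x activity over its operating-mode distribution."""
        return sum(emission_rate(temp_bin, humid_bin, source_idx, m) * h
                   for m, h in opmode_hours.items())

    print(link_emissions(3, 5, 4, {0: 1.5, 11: 0.5, 16: 0.2}))
    ```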

  3. Triplet supertree heuristics for the tree of life

    PubMed Central

    Lin, Harris T; Burleigh, J Gordon; Eulenstein, Oliver

    2009-01-01

    Background There is much interest in developing fast and accurate supertree methods to infer the tree of life. Supertree methods combine smaller input trees with overlapping sets of taxa to make a comprehensive phylogenetic tree that contains all of the taxa in the input trees. The intrinsically hard triplet supertree problem takes a collection of input species trees and seeks a species tree (supertree) that maximizes the number of triplet subtrees that it shares with the input trees. However, the utility of this supertree problem has been limited by a lack of efficient and effective heuristics. Results We introduce fast hill-climbing heuristics for the triplet supertree problem that perform a step-wise search of the tree space, where each step is guided by an exact solution to an instance of a local search problem. To realize time-efficient heuristics, we designed the first nontrivial algorithms for two standard search problems, which greatly improve on the time complexity of the best known (naïve) solutions by factors of n and n², respectively (where n is the number of taxa in the supertree). These algorithms enable large-scale supertree analyses based on the triplet supertree problem that were previously not possible. We implemented hill-climbing heuristics that are based on our new algorithms, and in analyses of two published supertree data sets, we demonstrate that our new heuristics outperform other standard supertree methods in maximizing the number of triplets shared with the input trees. Conclusion With our new heuristics, the triplet supertree problem is now computationally more tractable for large-scale supertree analyses, and it provides a potentially more accurate alternative to existing supertree methods. PMID:19208181

  4. Russian norms for name agreement, image agreement for the colorized version of the Snodgrass and Vanderwart pictures and age of acquisition, conceptual familiarity, and imageability scores for modal object names.

    PubMed

    Tsaparina, Diana; Bonin, Patrick; Méot, Alain

    2011-12-01

    The aim of the present study was to provide Russian normative data for the Snodgrass and Vanderwart (Behavior Research Methods, Instruments, & Computers, 28, 516-536, 1980) colorized pictures (Rossion & Pourtois, Perception, 33, 217-236, 2004). The pictures were standardized on name agreement, image agreement, conceptual familiarity, imageability, and age of acquisition. Objective word frequency and objective visual complexity measures are also provided for the most common names associated with the pictures. Comparative analyses between our results and the norms obtained in other, similar studies are reported. The Russian norms may be downloaded from the Psychonomic Society supplemental archive.

  5. Analytic theory of photoacoustic wave generation from a spheroidal droplet.

    PubMed

    Li, Yong; Fang, Hui; Min, Changjun; Yuan, Xiaocong

    2014-08-25

    In this paper, we develop an analytic theory for describing the photoacoustic wave generation from a spheroidal droplet and derive the first complete analytic solution. Our derivation is based on solving the photoacoustic Helmholtz equation in spheroidal coordinates with the separation-of-variables method. As the verification, besides carrying out the asymptotic analyses which recover the standard solutions for a sphere, an infinite cylinder and an infinite layer, we also confirm that the partial transmission and reflection model previously demonstrated for these three geometries still stands. We expect that this analytic solution will find broad practical uses in interpreting experiment results, considering that its building blocks, the spheroidal wave functions (SWFs), can be numerically calculated by the existing computer programs.
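
    For orientation, a hedged sketch of the standard setting (textbook photoacoustic theory rather than the paper's specific result): in the frequency domain the pressure obeys an inhomogeneous Helmholtz equation driven by the optical heating function, and separability in prolate spheroidal coordinates yields expansions in spheroidal wave functions. The symbols below (β thermal expansion coefficient, C_p specific heat, c = kd/2 with interfocal distance d) and the choice of the outgoing radial function are assumptions of this illustration.

    ```latex
    % Photoacoustic Helmholtz equation in the frequency domain (standard form):
    \left(\nabla^{2} + k^{2}\right)\tilde{p}(\mathbf{r},\omega)
      = -\,\frac{i\omega\beta}{C_{p}}\,\tilde{H}(\mathbf{r},\omega)

    % Separated solution in prolate spheroidal coordinates (\xi, \eta, \varphi):
    \tilde{p}(\xi,\eta,\varphi)
      = \sum_{m,n} A_{mn}\, S_{mn}(c,\eta)\, R^{(3)}_{mn}(c,\xi)\, e^{im\varphi},
    \qquad c = \frac{k d}{2}
    ```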

  6. Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography

    PubMed Central

    Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji

    2013-01-01

    OBJECTIVES To compare the accuracy of pulmonary lobar volumetry using the conventional number of segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thicknesses. We calculated the lobar volume and the emphysematous lobar volume < −950 HU of each lobe using (i) the slice-by-slice method (reference standard), (ii) number of segments method, and (iii) semi-automatic and (iv) automatic computer-aided diagnosis. We determined Pearson correlation coefficients between the reference standard and the three other methods for lobar volumes and emphysematous lobar volumes. We also compared the relative errors among the three measurement methods. RESULTS Both semi-automatic and automatic computer-aided diagnosis results were more strongly correlated with the reference standard than the number of segments method. The correlation coefficients for automatic computer-aided diagnosis were slightly lower than those for semi-automatic computer-aided diagnosis because there was one outlier among 50 cases (2%) in the right upper lobe and two outliers among 50 cases (4%) in the other lobes. The number of segments method relative error was significantly greater than those for semi-automatic and automatic computer-aided diagnosis (P < 0.001). The computational time for automatic computer-aided diagnosis was 1/2 to 2/3 of that for semi-automatic computer-aided diagnosis. CONCLUSIONS A novel lobar volumetry computer-aided diagnosis system could more precisely measure lobar volumes than the conventional number of segments method. Because semi-automatic computer-aided diagnosis and automatic computer-aided diagnosis were complementary, in clinical use, it would be more practical to first measure volumes by automatic computer-aided diagnosis, and then use semi-automatic measurements if automatic computer-aided diagnosis failed. PMID:23526418

  7. Interface standards for computer equipment

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The ability to configure data systems using modules provided by independent manufacturers is complicated by the wide range of electrical, mechanical, and functional characteristics exhibited within the equipment provided by different manufacturers of computers, peripherals, and terminal devices. A number of international organizations were and still are involved in the creation of standards that enable devices to be interconnected with minimal difficulty, usually involving only a cable or data bus connection that is defined by the standard. The elements covered by an interface standard are covered and the most prominent interface standards presently in use are identified and described.

  8. Meeting report from the first meetings of the Computational Modeling in Biology Network (COMBINE)

    PubMed Central

    Le Novère, Nicolas; Hucka, Michael; Anwar, Nadia; Bader, Gary D; Demir, Emek; Moodie, Stuart; Sorokin, Anatoly

    2011-01-01

    The Computational Modeling in Biology Network (COMBINE) is an initiative to coordinate the development of the various community standards and formats in computational systems biology and related fields. This report summarizes the activities pursued at the first annual COMBINE meeting held in Edinburgh on October 6-9 2010 and the first HARMONY hackathon, held in New York on April 18-22 2011. The first of those meetings hosted 81 attendees. Discussions covered both the official COMBINE standards (BioPAX, SBGN and SBML) and emerging efforts, as well as interoperability between the different formats. The second meeting, oriented towards software developers, welcomed 59 participants and witnessed many technical discussions, development of improved standards support in community software systems and conversion between the standards. Both meetings were resounding successes and showed that the field is now mature enough to develop representation formats and related standards in a coordinated manner. PMID:22180826

  9. Meeting report from the first meetings of the Computational Modeling in Biology Network (COMBINE).

    PubMed

    Le Novère, Nicolas; Hucka, Michael; Anwar, Nadia; Bader, Gary D; Demir, Emek; Moodie, Stuart; Sorokin, Anatoly

    2011-11-30

    The Computational Modeling in Biology Network (COMBINE) is an initiative to coordinate the development of the various community standards and formats in computational systems biology and related fields. This report summarizes the activities pursued at the first annual COMBINE meeting held in Edinburgh on October 6-9 2010 and the first HARMONY hackathon, held in New York on April 18-22 2011. The first of those meetings hosted 81 attendees. Discussions covered both the official COMBINE standards (BioPAX, SBGN and SBML) and emerging efforts, as well as interoperability between the different formats. The second meeting, oriented towards software developers, welcomed 59 participants and witnessed many technical discussions, development of improved standards support in community software systems and conversion between the standards. Both meetings were resounding successes and showed that the field is now mature enough to develop representation formats and related standards in a coordinated manner.

  10. Assessment of modal-pushover-based scaling procedure for nonlinear response history analysis of ordinary standard bridges

    USGS Publications Warehouse

    Kalkan, E.; Kwong, N.

    2012-01-01

    The earthquake engineering profession is increasingly utilizing nonlinear response history analyses (RHA) to evaluate seismic performance of existing structures and proposed designs of new structures. One of the main ingredients of nonlinear RHA is a set of ground motion records representing the expected hazard environment for the structure. When recorded motions do not exist (as is the case in the central United States) or when high-intensity records are needed (as is the case in San Francisco and Los Angeles), ground motions from other tectonically similar regions need to be selected and scaled. The modal-pushover-based scaling (MPS) procedure was recently developed to determine scale factors for a small number of records such that the scaled records provide accurate and efficient estimates of “true” median structural responses. The adjective “accurate” refers to the discrepancy between the benchmark responses and those computed from the MPS procedure. The adjective “efficient” refers to the record-to-record variability of responses. In this paper, the accuracy and efficiency of the MPS procedure are evaluated by applying it to four types of existing Ordinary Standard bridges typical of reinforced concrete bridge construction in California. These bridges are the single-bent overpass, multi-span bridge, curved bridge, and skew bridge. As compared with benchmark analyses of unscaled records using a larger catalog of ground motions, it is demonstrated that the MPS procedure provided an accurate estimate of the engineering demand parameters (EDPs) accompanied by significantly reduced record-to-record variability of the EDPs. Thus, it is a useful tool for scaling ground motions as input to nonlinear RHAs of Ordinary Standard bridges.
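
    The two evaluation criteria described here, accuracy (closeness of the median EDP to the benchmark) and efficiency (record-to-record variability), can be summarized in a few lines of numpy. This is a generic sketch of those summary statistics, not the MPS scaling procedure itself; `edp_scaled` and `edp_benchmark` are hypothetical arrays of peak responses from nonlinear RHAs.

    ```python
    import numpy as np

    def accuracy_and_efficiency(edp_scaled, edp_benchmark):
        """Compare EDPs from a small scaled record set against a large benchmark set.

        accuracy   : ratio of medians (1.0 means the scaled set reproduces the
                     benchmark median response exactly)
        efficiency : record-to-record dispersion of the scaled set, taken here as
                     the standard deviation of log responses (lognormal convention)
        """
        edp_scaled = np.asarray(edp_scaled, dtype=float)
        edp_benchmark = np.asarray(edp_benchmark, dtype=float)
        accuracy = np.median(edp_scaled) / np.median(edp_benchmark)
        efficiency = np.std(np.log(edp_scaled), ddof=1)
        return accuracy, efficiency
    ```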

  11. Documentation for assessment of modal pushover-based scaling procedure for nonlinear response history analysis of "ordinary standard" bridges

    USGS Publications Warehouse

    Kalkan, Erol; Kwong, Neal S.

    2010-01-01

    The earthquake engineering profession is increasingly utilizing nonlinear response history analyses (RHA) to evaluate seismic performance of existing structures and proposed designs of new structures. One of the main ingredients of nonlinear RHA is a set of ground-motion records representing the expected hazard environment for the structure. When recorded motions do not exist (as is the case for the central United States), or when high-intensity records are needed (as is the case for San Francisco and Los Angeles), ground motions from other tectonically similar regions need to be selected and scaled. The modal-pushover-based scaling (MPS) procedure recently was developed to determine scale factors for a small number of records, such that the scaled records provide accurate and efficient estimates of 'true' median structural responses. The adjective 'accurate' refers to the discrepancy between the benchmark responses and those computed from the MPS procedure. The adjective 'efficient' refers to the record-to-record variability of responses. Herein, the accuracy and efficiency of the MPS procedure are evaluated by applying it to four types of existing 'ordinary standard' bridges typical of reinforced-concrete bridge construction in California. These bridges are the single-bent overpass, multi span bridge, curved-bridge, and skew-bridge. As compared to benchmark analyses of unscaled records using a larger catalog of ground motions, it is demonstrated that the MPS procedure provided an accurate estimate of the engineering demand parameters (EDPs) accompanied by significantly reduced record-to-record variability of the responses. Thus, the MPS procedure is a useful tool for scaling ground motions as input to nonlinear RHAs of 'ordinary standard' bridges.

  12. LaRC design analysis report for National Transonic Facility for 304 stainless steel tunnel shell. Volume 1S: Finite difference analysis of cone/cylinder junction

    NASA Technical Reports Server (NTRS)

    Ramsey, J. W., Jr.; Taylor, J. T.; Wilson, J. F.; Gray, C. E., Jr.; Leatherman, A. D.; Rooker, J. R.; Allred, J. W.

    1976-01-01

    The results of extensive computer (finite element, finite difference and numerical integration), thermal, fatigue, and special analyses of critical portions of a large pressurized, cryogenic wind tunnel (National Transonic Facility) are presented. The computer models, loading and boundary conditions are described. Graphic capability was used to display model geometry, section properties, and stress results. A stress criterion is presented for evaluating the results of the analyses. Thermal analyses were performed for major critical and typical areas. Fatigue analyses of the entire tunnel circuit are presented.

  13. Advanced quantitative methods in correlating sarcopenic muscle degeneration with lower extremity function biometrics and comorbidities

    PubMed Central

    Gíslason, Magnús; Sigurðsson, Sigurður; Guðnason, Vilmundur; Harris, Tamara; Carraro, Ugo; Gargiulo, Paolo

    2018-01-01

    Sarcopenic muscular degeneration has been consistently identified as an independent risk factor for mortality in aging populations. Recent investigations have realized the quantitative potential of computed tomography (CT) image analysis to describe skeletal muscle volume and composition; however, the optimum approach to assessing these data remains debated. Current literature reports average Hounsfield unit (HU) values and/or segmented soft tissue cross-sectional areas to investigate muscle quality. However, standardized methods for CT analyses and their utility as a comorbidity index remain undefined, and no existing studies compare these methods to the assessment of entire radiodensitometric distributions. The primary aim of this study was to present a comparison of nonlinear trimodal regression analysis (NTRA) parameters of entire radiodensitometric muscle distributions against extant CT metrics and their correlation with lower extremity function (LEF) biometrics (normal/fast gait speed, timed up-and-go, and isometric leg strength) and biochemical and nutritional parameters, such as total solubilized cholesterol (SCHOL) and body mass index (BMI). Data were obtained from 3,162 subjects, aged 66–96 years, from the population-based AGES-Reykjavik Study. 1-D k-means clustering was employed to discretize each biometric and comorbidity dataset into twelve subpopulations, in accordance with Sturges’ Formula for Class Selection. Dataset linear regressions were performed against eleven NTRA distribution parameters and standard CT analyses (fat/muscle cross-sectional area and average HU value). Parameters from NTRA and CT standards were analogously assembled by age and sex. Analysis of specific NTRA parameters with standard CT results showed linear correlation coefficients greater than 0.85, but multiple regression analysis of correlative NTRA parameters yielded a correlation coefficient of 0.99 (P<0.005). These results highlight the specificities of each muscle quality metric to LEF biometrics, SCHOL, and BMI, and particularly highlight the value of the connective tissue regime in this regard. PMID:29513690
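
    The discretization step described here, choosing a class count from Sturges' formula and then partitioning each one-dimensional biometric with k-means, can be sketched as below. This is a generic illustration using scikit-learn's KMeans, not the study's code; rounding conventions for Sturges' formula vary, and the function names are assumptions.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def sturges_classes(n_samples):
        """Sturges' formula: k = 1 + log2(N), rounded to the nearest integer."""
        return int(round(1 + np.log2(n_samples)))

    def discretize_1d(values, n_classes=None, seed=0):
        """Partition a 1-D biometric/comorbidity variable into k subpopulations."""
        values = np.asarray(values, dtype=float).reshape(-1, 1)
        k = n_classes or sturges_classes(len(values))
        labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(values)
        return labels, k
    ```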

  14. Advanced quantitative methods in correlating sarcopenic muscle degeneration with lower extremity function biometrics and comorbidities.

    PubMed

    Edmunds, Kyle; Gíslason, Magnús; Sigurðsson, Sigurður; Guðnason, Vilmundur; Harris, Tamara; Carraro, Ugo; Gargiulo, Paolo

    2018-01-01

    Sarcopenic muscular degeneration has been consistently identified as an independent risk factor for mortality in aging populations. Recent investigations have realized the quantitative potential of computed tomography (CT) image analysis to describe skeletal muscle volume and composition; however, the optimum approach to assessing these data remains debated. Current literature reports average Hounsfield unit (HU) values and/or segmented soft tissue cross-sectional areas to investigate muscle quality. However, standardized methods for CT analyses and their utility as a comorbidity index remain undefined, and no existing studies compare these methods to the assessment of entire radiodensitometric distributions. The primary aim of this study was to present a comparison of nonlinear trimodal regression analysis (NTRA) parameters of entire radiodensitometric muscle distributions against extant CT metrics and their correlation with lower extremity function (LEF) biometrics (normal/fast gait speed, timed up-and-go, and isometric leg strength) and biochemical and nutritional parameters, such as total solubilized cholesterol (SCHOL) and body mass index (BMI). Data were obtained from 3,162 subjects, aged 66-96 years, from the population-based AGES-Reykjavik Study. 1-D k-means clustering was employed to discretize each biometric and comorbidity dataset into twelve subpopulations, in accordance with Sturges' Formula for Class Selection. Dataset linear regressions were performed against eleven NTRA distribution parameters and standard CT analyses (fat/muscle cross-sectional area and average HU value). Parameters from NTRA and CT standards were analogously assembled by age and sex. Analysis of specific NTRA parameters with standard CT results showed linear correlation coefficients greater than 0.85, but multiple regression analysis of correlative NTRA parameters yielded a correlation coefficient of 0.99 (P<0.005). These results highlight the specificities of each muscle quality metric to LEF biometrics, SCHOL, and BMI, and particularly highlight the value of the connective tissue regime in this regard.

  15. Comparability of a Paper-Based Language Test and a Computer-Based Language Test.

    ERIC Educational Resources Information Center

    Choi, Inn-Chull; Kim, Kyoung Sung; Boo, Jaeyool

    2003-01-01

    Utilizing the Test of English Proficiency developed by Seoul National University (TEPS), this study examined the comparability of the paper-based language test and the computer-based language test through content and construct validation, employing content analyses based on corpus linguistic techniques in addition to such statistical analyses as…

  16. Large Eddy Simulations (LES) and Direct Numerical Simulations (DNS) for the computational analyses of high speed reacting flows

    NASA Technical Reports Server (NTRS)

    Givi, Peyman; Madnia, Cyrus K.; Steinberger, C. J.; Frankel, S. H.

    1992-01-01

    The principal objective is to extend the boundaries within which large eddy simulations (LES) and direct numerical simulations (DNS) can be applied in computational analyses of high speed reacting flows. A summary of work accomplished during the last six months is presented.

  17. Invited commentary: G-computation--lost in translation?

    PubMed

    Vansteelandt, Stijn; Keiding, Niels

    2011-04-01

    In this issue of the Journal, Snowden et al. (Am J Epidemiol. 2011;173(7):731-738) give a didactic explanation of G-computation as an approach for estimating the causal effect of a point exposure. The authors of the present commentary reinforce the idea that their use of G-computation is equivalent to a particular form of model-based standardization, whereby reference is made to the observed study population, a technique that epidemiologists have been applying for several decades. They comment on the use of standardized versus conditional effect measures and on the relative predominance of the inverse probability-of-treatment weighting approach as opposed to G-computation. They further propose a compromise approach, doubly robust standardization, that combines the benefits of both of these causal inference techniques and is not more difficult to implement.
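
    As a minimal sketch of the G-computation / model-based standardization idea discussed in the commentary (not the commentators' or Snowden et al.'s code): fit an outcome regression, predict each subject's outcome with the exposure set to 1 and to 0, and average the predictions over the observed study population. The variable names and the logistic outcome model are assumptions for illustration.

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    def g_computation(df, outcome="y", exposure="a", confounders=("c1", "c2")):
        """Marginal risk difference by standardization to the observed population."""
        formula = f"{outcome} ~ {exposure} + " + " + ".join(confounders)
        model = smf.logit(formula, data=df).fit(disp=0)

        # Counterfactual datasets: everyone exposed vs. everyone unexposed.
        exposed = df.assign(**{exposure: 1})
        unexposed = df.assign(**{exposure: 0})

        risk1 = model.predict(exposed).mean()
        risk0 = model.predict(unexposed).mean()
        return risk1 - risk0
    ```

    Averaging over the observed covariate distribution is what makes the resulting contrast a standardized (marginal) effect rather than a conditional one, which is exactly the distinction the commentary draws.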

  18. Improved programs for DNA and protein sequence analysis on the IBM personal computer and other standard computer systems.

    PubMed Central

    Mount, D W; Conrad, B

    1986-01-01

    We have previously described programs for a variety of types of sequence analysis (1-4). These programs have now been integrated into a single package. They are written in the standard C programming language and run on virtually any computer system with a C compiler, such as the IBM/PC and other computers running under the MS/DOS and UNIX operating systems. The programs are widely distributed and may be obtained from the authors as described below. PMID:3753780

  19. Open Systems Interconnection.

    ERIC Educational Resources Information Center

    Denenberg, Ray

    1985-01-01

    Discusses the need for standards allowing computer-to-computer communication and gives examples of technical issues. The seven-layer framework of the Open Systems Interconnection (OSI) Reference Model is explained and illustrated. Sidebars feature public data networks and Recommendation X.25, OSI standards, OSI layer functions, and a glossary.…

  20. 10 CFR Appendix II to Part 504 - Fuel Price Computation

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 504—Fuel Price Computation (a) Introduction. This appendix provides the equations and parameters... inflation indices must follow standard statistical procedures and must be fully documented within the... the weighted average fuel price must follow standard statistical procedures and be fully documented...

  1. A randomized, controlled trial of interactive, multimedia software for patient colonoscopy education.

    PubMed

    Shaw, M J; Beebe, T J; Tomshine, P A; Adlis, S A; Cass, O W

    2001-02-01

    The purpose of our study was to assess the effectiveness of computer-assisted instruction (CAI) in patients having colonoscopies. We conducted a randomized, controlled trial in a large, multispecialty clinic. Eighty-six patients were referred for colonoscopies. The interventions were standard education versus standard education plus CAI, and the outcome measures were anxiety, comprehension, and satisfaction. Computer-assisted instruction had no effect on patients' anxiety. The group receiving CAI demonstrated better overall comprehension (p < 0.001). However, comprehension of certain aspects of serious complications and appropriate postsedation behavior was unaffected by educational method. Patients in the CAI group were more likely to indicate satisfaction with the amount of information provided when compared with their standard education counterparts (p = 0.001). Overall satisfaction was unaffected by educational method. Computer-assisted instruction for colonoscopy provided better comprehension and greater satisfaction with the adequacy of education than standard education. Computer-assisted instruction helps physicians meet their educational responsibilities with no decrement to the interpersonal aspects of the patient-physician relationship.

  2. RECOLA2: REcursive Computation of One-Loop Amplitudes 2

    NASA Astrophysics Data System (ADS)

    Denner, Ansgar; Lang, Jean-Nicolas; Uccirati, Sandro

    2018-03-01

    We present the Fortran95 program RECOLA2 for the perturbative computation of next-to-leading-order transition amplitudes in the Standard Model of particle physics and extended Higgs sectors. New theories are implemented via model files in the 't Hooft-Feynman gauge in the conventional formulation of quantum field theory and in the Background-Field method. The present version includes model files for the Two-Higgs-Doublet Model and the Higgs-Singlet Extension of the Standard Model. We support standard renormalization schemes for the Standard Model as well as many commonly used renormalization schemes in extended Higgs sectors. Within these models the computation of next-to-leading-order polarized amplitudes and squared amplitudes, optionally summed over spin and colour, is fully automated for any process. RECOLA2 allows the computation of colour- and spin-correlated leading-order squared amplitudes that are needed in the dipole subtraction formalism. RECOLA2 is publicly available for download at http://recola.hepforge.org.

  3. Meeting report from the fourth meeting of the Computational Modeling in Biology Network (COMBINE)

    PubMed Central

    Waltemath, Dagmar; Bergmann, Frank T.; Chaouiya, Claudine; Czauderna, Tobias; Gleeson, Padraig; Goble, Carole; Golebiewski, Martin; Hucka, Michael; Juty, Nick; Krebs, Olga; Le Novère, Nicolas; Mi, Huaiyu; Moraru, Ion I.; Myers, Chris J.; Nickerson, David; Olivier, Brett G.; Rodriguez, Nicolas; Schreiber, Falk; Smith, Lucian; Zhang, Fengkai; Bonnet, Eric

    2014-01-01

    The Computational Modeling in Biology Network (COMBINE) is an initiative to coordinate the development of community standards and formats in computational systems biology and related fields. This report summarizes the topics and activities of the fourth edition of the annual COMBINE meeting, held in Paris during September 16-20 2013, and attended by a total of 96 people. This edition pioneered a first day devoted to modeling approaches in biology, which attracted a broad audience of scientists thanks to a panel of renowned speakers. During subsequent days, discussions were held on many subjects including the introduction of new features in the various COMBINE standards, new software tools that use the standards, and outreach efforts. Significant emphasis went into work on extensions of the SBML format, and also into community-building. This year’s edition once again demonstrated that the COMBINE community is thriving, and still manages to help coordinate activities between different standards in computational systems biology.

  4. Secure corridor for infraacetabular screws in acetabular fracture fixation-a 3-D radiomorphometric analysis of 124 pelvic CT datasets.

    PubMed

    Arlt, Stephan; Noser, Hansrudi; Wienke, Andreas; Radetzki, Florian; Hofmann, Gunther Olaf; Mendel, Thomas

    2018-05-21

    Acetabular fracture surgery is directed toward anatomical reduction and stable fixation to allow for the early functional rehabilitation of an injured hip joint. Recent biomechanical investigations have shown the superiority of using an additional screw in the infraacetabular (IA) region, thereby transfixing the separated columns to strengthen the construct by closing the periacetabular fixation frame. However, the inter-individual existence and variance concerning secure IA screw corridors are poorly understood. This computer-aided 3-D radiomorphometric study examined 124 CT Digital Imaging and Communications in Medicine (DICOM) datasets of intact human pelves (248 acetabula) to visualize the spatial IA corridors as the sum of all intraosseous screw positions. DICOM files were pre-processed using the Amira® 4.2 visualization software. Final corridor computation was accomplished using a custom-made software algorithm. The volumetric measurement data of each corridor were calculated for further statistical analyses. Correlations between the volumetric values and the biometric data were investigated. Furthermore, the influence of hip dysplasia on the IA corridor configuration was analyzed. The IA corridors consistently showed a double-cone shape with the isthmus located at the acetabular fovea. In 97% of male and 91% of female acetabula, a corridor for a 3.5-mm screw could be found. The number of IA corridors was significantly lower in females for screw diameters ≥ 4.5 mm. The mean 3.5-mm screw corridor volume was 16 cm³ in males and 9.2 cm³ in female pelves. Corridor volumes were significantly positively correlated with body height and weight and with the diameter of Köhler's teardrop on standard AP pelvic X-rays. No correlation was observed between hip dysplasia and the IA corridor extent. IA corridors are consistently smaller in females. However, 3.5-mm small fragment screws may still be used as the standard implant because sex-specific differences are significant only with screw diameters ≥ 4.5 mm. Congenital hip dysplasia does not affect secure IA screw insertion. The described method allows 3-D shape analyses with highly reliable results. The visualization of secure IA corridors may support the spatial awareness of surgeons. Volumetric data allow the reliable assessment of individual IA corridors using standard AP X-ray views, which aids preoperative planning.

  5. Strategies to work with HLA data in human populations for histocompatibility, clinical transplantation, epidemiology and population genetics: HLA-NET methodological recommendations.

    PubMed

    Sanchez-Mazas, A; Vidan-Jeras, B; Nunes, J M; Fischer, G; Little, A-M; Bekmane, U; Buhler, S; Buus, S; Claas, F H J; Dormoy, A; Dubois, V; Eglite, E; Eliaou, J F; Gonzalez-Galarza, F; Grubic, Z; Ivanova, M; Lie, B; Ligeiro, D; Lokki, M L; da Silva, B Martins; Martorell, J; Mendonça, D; Middleton, D; Voniatis, D Papioannou; Papasteriades, C; Poli, F; Riccio, M E; Vlachou, M Spyropoulou; Sulcebe, G; Tonks, S; Nevessignsky, M Toungouz; Vangenot, C; van Walraven, A-M; Tiercy, J-M

    2012-12-01

    HLA-NET (a European COST Action) aims at networking researchers working in bone marrow transplantation, epidemiology and population genetics to improve the molecular characterization of the HLA genetic diversity of human populations, with an expected strong impact on both public health and fundamental research. Such improvements involve finding consensual strategies to characterize human populations and samples and report HLA molecular typings and ambiguities; proposing user-friendly access to databases and computer tools and defining minimal requirements related to ethical aspects. The overall outcome is the provision of population genetic characterizations and comparisons in a standard way by all interested laboratories. This article reports the recommendations of four working groups (WG1-4) of the HLA-NET network at the mid-term of its activities. WG1 (Population definitions and sampling strategies for population genetics' analyses) recommends avoiding outdated racial classifications and population names (e.g. 'Caucasian') and using instead geographic and/or cultural (e.g. linguistic) criteria to describe human populations (e.g. 'pan-European'). A standard 'HLA-NET POPULATION DATA QUESTIONNAIRE' has been finalized and is available for the whole HLA community. WG2 (HLA typing standards for population genetics analyses) recommends retaining maximal information when reporting HLA typing results. Rather than using the National Marrow Donor Program coding system, all ambiguities should be provided by listing all allele pairs required to explain each genotype, according to the formats proposed in 'HLA-NET GUIDELINES FOR REPORTING HLA TYPINGS'. The group also suggests taking into account a preliminary list of alleles defined by polymorphisms outside the peptide-binding sites that may affect population genetic statistics because of significant frequencies. WG3 (Bioinformatic strategies for HLA population data storage and analysis) recommends the use of programs capable of dealing with ambiguous data, such as the 'gene[rate]' computer tools to estimate frequencies, test for Hardy-Weinberg equilibrium and selective neutrality on data containing any number and kind of ambiguities. WG4 (Ethical issues) proposes to adopt thorough general principles for any HLA population study to ensure that it conforms to (inter)national legislation or recommendations/guidelines. All HLA-NET guidelines and tools are available through its website http://hla-net.eu. © 2012 Blackwell Publishing Ltd.

  6. Strategies to work with HLA data in human populations for histocompatibility, clinical transplantation, epidemiology and population genetics: HLA-NET methodological recommendations

    PubMed Central

    Sanchez-Mazas, A; Vidan-Jeras, B; Nunes, J M; Fischer, G; Little, A-M; Bekmane, U; Buhler, S; Buus, S; Claas, F H J; Dormoy, A; Dubois, V; Eglite, E; Eliaou, J F; Gonzalez-Galarza, F; Grubic, Z; Ivanova, M; Lie, B; Ligeiro, D; Lokki, M L; da Silva, B Martins; Martorell, J; Mendonça, D; Middleton, D; Voniatis, D Papioannou; Papasteriades, C; Poli, F; Riccio, M E; Vlachou, M Spyropoulou; Sulcebe, G; Tonks, S; Nevessignsky, M Toungouz; Vangenot, C; van Walraven, A-M; Tiercy, J-M

    2012-01-01

    HLA-NET (a European COST Action) aims at networking researchers working in bone marrow transplantation, epidemiology and population genetics to improve the molecular characterization of the HLA genetic diversity of human populations, with an expected strong impact on both public health and fundamental research. Such improvements involve finding consensual strategies to characterize human populations and samples and report HLA molecular typings and ambiguities; proposing user-friendly access to databases and computer tools and defining minimal requirements related to ethical aspects. The overall outcome is the provision of population genetic characterizations and comparisons in a standard way by all interested laboratories. This article reports the recommendations of four working groups (WG1-4) of the HLA-NET network at the mid-term of its activities. WG1 (Population definitions and sampling strategies for population genetics’ analyses) recommends avoiding outdated racial classifications and population names (e.g. ‘Caucasian’) and using instead geographic and/or cultural (e.g. linguistic) criteria to describe human populations (e.g. ‘pan-European’). A standard ‘HLA-NET POPULATION DATA QUESTIONNAIRE’ has been finalized and is available for the whole HLA community. WG2 (HLA typing standards for population genetics analyses) recommends retaining maximal information when reporting HLA typing results. Rather than using the National Marrow Donor Program coding system, all ambiguities should be provided by listing all allele pairs required to explain each genotype, according to the formats proposed in ‘HLA-NET GUIDELINES FOR REPORTING HLA TYPINGS’. The group also suggests taking into account a preliminary list of alleles defined by polymorphisms outside the peptide-binding sites that may affect population genetic statistics because of significant frequencies. WG3 (Bioinformatic strategies for HLA population data storage and analysis) recommends the use of programs capable of dealing with ambiguous data, such as the ‘gene[rate]’ computer tools to estimate frequencies, test for Hardy–Weinberg equilibrium and selective neutrality on data containing any number and kind of ambiguities. WG4 (Ethical issues) proposes to adopt thorough general principles for any HLA population study to ensure that it conforms to (inter)national legislation or recommendations/guidelines. All HLA-NET guidelines and tools are available through its website http://hla-net.eu. PMID:22533604

  7. Cognitive context detection in UAS operators using eye-gaze patterns on computer screens

    NASA Astrophysics Data System (ADS)

    Mannaru, Pujitha; Balasingam, Balakumar; Pattipati, Krishna; Sibley, Ciara; Coyne, Joseph

    2016-05-01

    In this paper, we demonstrate the use of eye-gaze metrics of unmanned aerial systems (UAS) operators as effective indices of their cognitive workload. Our analyses are based on an experiment where twenty participants performed pre-scripted UAS missions of three different difficulty levels by interacting with two custom-designed graphical user interfaces (GUIs) displayed side by side. First, we compute several eye-gaze metrics, traditional eye movement metrics as well as newly proposed ones, and analyze their effectiveness as cognitive classifiers. Most of the eye-gaze metrics are computed by dividing the computer screen into "cells". Then, we perform several analyses in order to select metrics for effective cognitive context classification related to our specific application; the objectives of these analyses are to (i) identify appropriate ways to divide the screen into cells; (ii) select appropriate metrics for training and classification of cognitive features; and (iii) identify a suitable classification method.
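
    A minimal sketch of the cell-based metric computation described here (dividing the screen into a grid and accumulating gaze samples per cell); the grid size, screen resolution, and dwell-time definition below are assumptions for illustration, not the study's parameters.

    ```python
    import numpy as np

    def gaze_cell_metrics(x, y, screen_w=1920, screen_h=1080, n_cols=8, n_rows=6,
                          sample_rate_hz=60.0):
        """Return per-cell gaze-sample counts and dwell times for one operator.

        x, y : arrays of gaze coordinates in pixels (one sample per element).
        """
        x = np.clip(np.asarray(x, dtype=float), 0, screen_w - 1)
        y = np.clip(np.asarray(y, dtype=float), 0, screen_h - 1)
        col = (x / screen_w * n_cols).astype(int)
        row = (y / screen_h * n_rows).astype(int)
        counts = np.zeros((n_rows, n_cols), dtype=int)
        np.add.at(counts, (row, col), 1)          # samples falling in each cell
        dwell_seconds = counts / sample_rate_hz   # approximate dwell time per cell
        return counts, dwell_seconds
    ```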

  8. Methodological issues in assessing changes in costs pre- and post-medication switch: a schizophrenia study example.

    PubMed

    Faries, Douglas E; Nyhuis, Allen W; Ascher-Svanum, Haya

    2009-05-27

    Schizophrenia is a severe, chronic, and costly illness that adversely impacts patients' lives and health care payer budgets. Cost comparisons of treatment regimens are, therefore, important to health care payers and researchers. Pre-Post analyses ("mirror-image"), where outcomes prior to a medication switch are compared to outcomes post-switch, are commonly used in such research. However, medication changes often occur during a costly crisis event. Patients may relapse, be hospitalized, have a medication change, and then spend a period of time with intense use of costly resources (post-medication switch). While many advantages and disadvantages of Pre-Post methodology have been discussed, issues regarding the attributability of costs incurred around the time of medication switching have not been fully investigated. Medical resource use data, including medications and acute-care services (hospitalizations, partial hospitalizations, emergency department) were collected for patients with schizophrenia who switched antipsychotics (n = 105) during a 1-year randomized, naturalistic, antipsychotic cost-effectiveness schizophrenia trial. Within-patient changes in total costs per day were computed during the pre- and post-medication change periods. In addition to the standard Pre-Post analysis comparing costs pre- and post-medication change, we investigated the sensitivity of results to varying assumptions regarding the attributability of acute care service costs occurring just after a medication switch that were likely due to initial medication failure. Fifty-six percent of all costs incurred during the first week on the newly initiated antipsychotic were likely due to treatment failure with the previous antipsychotic. Standard analyses suggested an average increase in cost-per-day for each patient of $2.40 after switching medications. However, sensitivity analyses removing costs incurred post-switch that were potentially due to the failure of the initial medication suggested decreases in costs in the range of $4.77 to $9.69 per day post-switch. Pre-Post cost analyses are sensitive to the approach used to handle acute-service costs occurring just after a medication change. Given the importance of quality economic research on the cost of switching treatments, thorough sensitivity analyses should be performed to identify the impact of crisis events around the time of medication change.
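
    A small pandas sketch of the core Pre-Post computation and the sensitivity analysis described here: within-patient cost per day before and after the switch, recomputed after excluding a short post-switch window whose acute-care costs are attributed to failure of the prior medication. The column names and the length of the exclusion window are assumptions, not the study's data layout.

    ```python
    import pandas as pd

    def pre_post_cost_change(costs, switch_dates, exclude_days=0):
        """Mean within-patient change in cost per day after a medication switch.

        costs        : DataFrame with columns ['patient_id', 'date', 'cost'].
        switch_dates : Series mapping patient_id -> date of the medication switch.
        exclude_days : post-switch days whose costs are attributed to failure of
                       the prior medication and therefore dropped (sensitivity run).
        """
        changes = []
        for pid, grp in costs.groupby("patient_id"):
            switch = switch_dates[pid]
            pre = grp[grp["date"] < switch]
            post = grp[grp["date"] >= switch + pd.Timedelta(days=exclude_days)]
            if len(pre) == 0 or len(post) == 0:
                continue
            pre_days = (switch - pre["date"].min()).days or 1
            post_days = (post["date"].max() - post["date"].min()).days or 1
            changes.append(post["cost"].sum() / post_days - pre["cost"].sum() / pre_days)
        return sum(changes) / len(changes)
    ```

    Running the function with `exclude_days=0` reproduces the standard Pre-Post contrast; increasing it probes how much of the apparent post-switch cost is really a carry-over from the failing prior treatment.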

  9. Hypnotics and driving safety: meta-analyses of randomized controlled trials applying the on-the-road driving test.

    PubMed

    Verster, Joris C; Veldhuijzen, Dieuwke S; Patat, Alain; Olivier, Berend; Volkerts, Edmund R

    2006-01-01

    Many people who use hypnotics are outpatients and are likely to drive a car the day after drug intake. The purpose of these meta-analyses was to determine whether or not this is safe. Placebo-controlled, randomized, double-blind trials were selected if using the on-the-road driving test to determine driving ability the day following one or two nights of treatment administration. Primary outcome measure of the driving test was the Standard Deviation of Lateral Position (SDLP); i.e., the weaving of the car. Fixed effects model meta-analyses were performed. Effect size (ES) was computed using mean standardized (weighted) difference scores between treatment and corresponding placebo SDLP values. Ten studies, published from 1984 to 2002 (207 subjects), were included in the meta-analyses. The morning following bedtime administration, i.e. 10-11 hours after dosing, significant driving impairment was found for the recommended dose of various benzodiazepine hypnotics (ES=0.42; 95% Confidence Interval (CI)=0.14 to 0.71). Twice the recommended dose impaired driving both in the morning (ES=0.68; CI=0.39 to 0.97) and afternoon, i.e. 16-17 hours after dosing (ES=0.57; CI=0.26 to 0.88). Zopiclone 7.5 mg also impaired driving in the morning (ES=0.89; CI=0.54 to 1.23). Zaleplon (10 and 20 mg) and zolpidem (10 mg) did not affect driving performance the morning after dosing. Following middle-of-the-night administration, significantly impaired driving performance was found for zopiclone 7.5 mg (ES=1.51, CI=0.85 to 2.17), zolpidem 10 mg (ES=0.66, CI=0.13 to 1.19) and zolpidem 20 mg (ES=1.16, CI=0.60 to 1.72). Zaleplon (10 and 20 mg) did not affect driving performance. The analyses show that driving a car the morning following nocturnal treatment with benzodiazepines and zopiclone is unsafe, whereas the recommended dose of zolpidem (10 mg) and zaleplon (10 mg) do not affect driving ability.
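
    The pooling used here, a fixed-effect combination of standardized mean differences with a 95% confidence interval, can be sketched in a few lines. The inverse-variance weighting shown is the textbook approach and is offered as an illustration, not a reproduction of the authors' analysis.

    ```python
    import numpy as np

    def fixed_effect_pooled_smd(effect_sizes, variances):
        """Inverse-variance fixed-effect pooling of standardized mean differences."""
        es = np.asarray(effect_sizes, dtype=float)
        w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
        pooled = np.sum(w * es) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se
        return pooled, (ci_low, ci_high)
    ```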

  10. A computer program for multiple decrement life table analyses.

    PubMed

    Poole, W K; Cooley, P C

    1977-06-01

    Life table analysis has traditionally been the tool of choice for analyzing the distribution of "survival" times when a parametric form for the survival curve cannot reasonably be assumed. Chiang, in two papers [1,2], formalized the theory of life table analyses in a Markov chain framework and derived maximum likelihood estimates of the relevant parameters for the analyses. He also discussed how the techniques could be generalized to consider competing risks and follow-up studies. Although various computer programs exist for doing different types of life table analysis [3], to date there has not been a generally available, well documented computer program to carry out multiple decrement analyses, either by Chiang's or any other method. This paper describes such a program, developed by Research Triangle Institute. A user's manual, available at printing cost, supplements the contents of this paper with a discussion of the formulae used in the program listing.
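
    A minimal sketch of the basic life-table bookkeeping such a program automates, shown for the single-decrement case for brevity (multiple-decrement estimators, as in Chiang's formulation, additionally partition the interval deaths by cause). The variable names and the half-interval withdrawal adjustment are standard conventions, not the program's specific formulae.

    ```python
    def life_table(entering, deaths, withdrawals):
        """Actuarial life table: interval death probabilities and cumulative survival.

        entering    : number alive at the start of each interval
        deaths      : deaths within each interval
        withdrawals : losses to follow-up, assumed at risk for half the interval
        """
        survival = 1.0
        rows = []
        for n, d, w in zip(entering, deaths, withdrawals):
            effective_n = n - w / 2.0          # standard actuarial adjustment
            q = d / effective_n                # conditional probability of death
            survival *= (1.0 - q)
            rows.append({"at_risk": effective_n, "q": q, "cum_survival": survival})
        return rows
    ```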

  11. C-MOS bulk metal design handbook. [LSI standard cell (circuits)

    NASA Technical Reports Server (NTRS)

    Edge, T. M.

    1977-01-01

    The LSI standard cell array technique was used in the fabrication of more than 20 CMOS custom arrays. This technique consists of a series of computer programs and design automation techniques referred to as the Computer Aided Design And Test (CADAT) system that automatically translate a partitioned logic diagram into a set of instructions for driving an automatic plotter which generates precision mask artwork for complex LSI arrays of CMOS standard cells. The standard cell concept for producing LSI arrays begins with the design, layout, and validation of a group of custom circuits called standard cells. Once validated, these cells are given identification or pattern numbers and are permanently stored. To use one of these cells in a logic design, the user calls for the desired cell by pattern number. The Place, Route in Two Dimension (PR2D) computer program is then used to automatically generate the metalization and/or tunnels to interconnect the standard cells into the required function. Data sheets that describe the function, artwork, and performance of each of the standard cells, the general procedure for implementation of logic in CMOS standard cells, and additional detailed design information are presented.

  12. Tool for Statistical Analysis and Display of Landing Sites

    NASA Technical Reports Server (NTRS)

    Wawrzyniak, Geoffrey; Kennedy, Brian; Knocke, Philip; Michel, John

    2006-01-01

    MarsLS is a software tool for analyzing the statistical dispersion of spacecraft landing sites and displaying the results of its analyses. Originally intended for the Mars Exploration Rover (MER) mission, MarsLS is also applicable to landing sites on Earth and non-MER sites on Mars. MarsLS is a collection of interdependent MATLAB scripts that utilize the MATLAB graphical-user-interface software environment to display landing-site data on calibrated image-maps of the Martian or other terrain. The landing-site data comprise latitude/longitude pairs generated by Monte Carlo runs of other computer programs that simulate entry, descent, and landing. Using these data, MarsLS can compute a landing-site ellipse, a standard means of depicting the area within which the spacecraft can be expected to land with a given probability. MarsLS incorporates several features for the user's convenience, including capabilities for drawing lines and ellipses, overlaying kilometer or latitude/longitude grids, drawing and/or specifying lines and/or points, entering notes, defining and/or displaying polygons to indicate hazards or areas of interest, and evaluating hazardous and/or scientifically interesting areas. As part of such an evaluation, MarsLS can compute the probability of landing in a specified polygonal area.
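
    The landing-site ellipse mentioned here is conventionally derived from the sample covariance of the Monte Carlo latitude/longitude dispersion: the ellipse axes follow from the covariance eigenvalues scaled by a chi-square quantile for the chosen probability. The sketch below illustrates that standard construction; it is not MarsLS code, and it treats latitude/longitude as locally planar coordinates.

    ```python
    import numpy as np
    from scipy.stats import chi2

    def landing_ellipse(lat, lon, probability=0.99):
        """Return (center, semi-axes, orientation in radians) of a dispersion ellipse."""
        pts = np.column_stack([np.asarray(lon, dtype=float), np.asarray(lat, dtype=float)])
        center = pts.mean(axis=0)
        cov = np.cov(pts, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)            # ascending eigenvalues
        scale = chi2.ppf(probability, df=2)               # 2-D probability content
        semi_axes = np.sqrt(scale * eigvals)              # minor, major semi-axes
        angle = np.arctan2(eigvecs[1, 1], eigvecs[0, 1])  # orientation of major axis
        return center, semi_axes, angle
    ```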

  13. Reviews Book: Marie Curie: A Biography Book: Fast Car Physics Book: Beautiful Invisible Equipment: Fun Fly Stick Science Kit Book: Quantum Theory Cannot Hurt You Book: Chaos: The Science of Predictable Random Motion Book: Seven Wonders of the Universe Book: Special Relativity Equipment: LabVIEWTM 2009 Education Edition Places to Visit: Edison and Ford Winter Estates Places to Visit: The Computer History Museum Web Watch

    NASA Astrophysics Data System (ADS)

    2011-07-01

    WE RECOMMEND
    Fun Fly Stick Science Kit: Fun fly stick introduces electrostatics to youngsters
    Special Relativity: Text makes a useful addition to the study of relativity as an undergraduate
    LabVIEW™ 2009 Education Edition: LabVIEW sets industry standard for gathering and analysing data, signal processing, instrumentation design and control, and automation and robotics
    Edison and Ford Winter Estates: Thomas Edison's home is open to the public
    The Computer History Museum: Take a walk through technology history at this computer museum
    WORTH A LOOK
    Fast Car Physics: Book races through physics
    Beautiful Invisible: The main subject of this book is theoretical physics
    Quantum Theory Cannot Hurt You: A guide to physics on the large and small scale
    Chaos: The Science of Predictable Random Motion: Book explores the mathematics behind chaotic behaviour
    Seven Wonders of the Universe: A textual trip through the wonderful universe
    HANDLE WITH CARE
    Marie Curie: A Biography: Book fails to capture Curie's science
    WEB WATCH
    Web clips to liven up science lessons

  14. Design and analysis of roll cage

    NASA Astrophysics Data System (ADS)

    Angadi, Gurusangappa; Chetan, S.

    2018-04-01

    Wildland fire fighting vehicles are used to extinguish fires in forests, and in this process the vehicles face falling objects such as rocks, tree branches and other debris. Uneven terrain conditions, such as cliff edges and rough surfaces, can also cause the vehicle to roll over, and these events can injure both the driver and the operator. Vehicle rollover is a common incident that causes fatal injuries to the operator and is second only to crash accidents. In order to reduce the injury level and prevent continuous rollover of the vehicle, it is necessary to equip the vehicle with a suitable roll cage that meets the applicable standards. In the present work, a roll cage for the pump operator of a wildland fire fighting vehicle is designed and analysed in a computer-simulated environment for an operator seated outside the cabin. The design and test procedures follow NFPA 1906 (wildland fire apparatus) and are carried out in HyperWorks while maintaining the SAE J1194.1983 standard. G-load, roof crush and pendulum impact analyses are carried out on the roll cage to ensure the safety of the design; these load cases are considered representative of the situations faced in forest terrain. In these test procedures the roll cage is analysed for stresses and deformation under the various load cases, and the results are compared with the limits specified in SAE J1194.1983.

  15. The CUAHSI Water Data Center: Enabling Data Publication, Discovery and Re-use

    NASA Astrophysics Data System (ADS)

    Seul, M.; Pollak, J.

    2014-12-01

    The CUAHSI Water Data Center (WDC) supports a standards-based, services-oriented architecture for time-series data and provides a separate service to publish spatial data layers as shape files. Two new services that the WDC offers are a cloud-based server (Cloud HydroServer) for publishing data and a web-based client for data discovery. The Cloud HydroServer greatly simplifies data publication by eliminating the need for scientists to set up an SQL Server database, a requirement that has proven to be a significant barrier, and ensures greater reliability and continuity of service. Uploaders have been developed to simplify the metadata documentation process. The web-based data client eliminates the need for installing a program to be used as a client and works across all computer operating systems. The services provided by the WDC are a foundation for big data use, re-use, and meta-analyses. Using data transmission standards enables far more effective data sharing and discovery; the standards used by the WDC are part of a global set of standards that should enable scientists to access unprecedented amounts of data to address larger-scale research questions than was previously possible. A central mission of the WDC is to ensure these services meet the needs of the water science community and are effective at advancing water science.

  16. Standard care quality determines treatment outcomes in control groups of HAART-adherence intervention studies: implications for the interpretation and comparison of intervention effects.

    PubMed

    de Bruin, Marijn; Viechtbauer, Wolfgang; Hospers, Harm J; Schaalma, Herman P; Kok, Gerjo

    2009-11-01

    Clinical trials of behavioral interventions seek to enhance evidence-based health care. However, if the quality of standard care provided to control conditions varies between studies and affects outcomes, intervention effects cannot be directly interpreted or compared. The objective of the present study was to examine whether standard care quality (SCQ) could be reliably assessed, varies between studies of highly active antiretroviral HIV-adherence interventions, and is related to the proportion of patients achieving an undetectable viral load ("success rate"). Databases were searched for relevant articles. Authors of selected studies retrospectively completed a checklist with standard care activities, which were coded to compute SCQ scores. The relationship between SCQ and the success rates was examined using meta-regression. The main outcome measures were Cronbach's alpha, variability in SCQ, and the relation between SCQ and success rate. Reliability of the SCQ instrument was high (Cronbach's alpha = .91). SCQ scores ranged from 3.7 to 27.8 (total range = 0-30) and were highly predictive of success rate (p = .002). Variation in SCQ provided to control groups may substantially influence effect sizes of behavior change interventions. Future trials should therefore assess and report SCQ, and meta-analyses should control for variability in SCQ, thereby producing more accurate estimates of the effectiveness of behavior change interventions. PsycINFO Database Record (c) 2009 APA, all rights reserved.

  17. De-MA: a web Database for electron Microprobe Analyses to assist EMP lab manager and users

    NASA Astrophysics Data System (ADS)

    Allaz, J. M.

    2012-12-01

    Lab managers and users of electron microprobe (EMP) facilities require comprehensive, yet flexible documentation structures, as well as an efficient scheduling mechanism. A single on-line database system for managing reservations, and providing information on standards, quantitative and qualitative setups (element mapping, etc.), and X-ray data has been developed for this purpose. This system is particularly useful in multi-user facilities where experience ranges from beginners to the highly experienced. New users and occasional facility users will find these tools extremely useful in developing and maintaining high quality, reproducible, and efficient analyses. This user-friendly database is available through the web, and uses MySQL as a database and PHP/HTML as script language (dynamic website). The database includes several tables for standards information, X-ray lines, X-ray element mapping, PHA, element setups, and agenda. It is configurable for up to five different EMPs in a single lab, each of them having up to five spectrometers and as many diffraction crystals as required. The installation should be done on a web server supporting PHP/MySQL, although installation on a personal computer is possible using third-party freeware to create a local Apache server, and to enable PHP/MySQL. Since it is web-based, any user outside the EMP lab can access this database anytime through any web browser and on any operating system. The access can be secured using a general password protection (e.g. htaccess). The web interface consists of 6 main menus. (1) "Standards" lists standards defined in the database, and displays detailed information on each (e.g. material type, name, reference, comments, and analyses). Images such as EDS spectra or BSE can be associated with a standard. (2) "Analyses" lists typical setups to use for quantitative analyses, allows calculation of mineral composition based on a mineral formula, or calculation of mineral formula based on a fixed amount of oxygen, or of cation (using an analysis in element or oxide weight-%); this latter includes re-calculation of H2O/CO2 based on stoichiometry, and oxygen correction for F and Cl. Another option offers a list of any available standards and possible peak or background interferences for a series of elements. (3) "X-ray maps" lists the different setups recommended for element mapping using WDS, and a map calculator to facilitate maps setups and to estimate the total mapping time. (4) "X-ray data" lists all x-ray lines for a specific element (K, L, M, absorption edges, and satellite peaks) in term of energy, wavelength and peak position. A check for possible interferences on peak or background is also possible. Theoretical x-ray peak positions for each crystal are calculated based on the 2d spacing of each crystal and the wavelength of each line. (5) "Agenda" menu displays the reservation dates for each month and for each EMP lab defined. It also offers a reservation request option, this request being sent by email to the EMP manager for approval. (6) Finally, "Admin" is password restricted, and contains all necessary options to manage the database through user-friendly forms. The installation of this database is made easy and knowledge of HTML, PHP, or MySQL is unnecessary to install, configure, manage, or use it. A working database is accessible at http://cub.geoloweb.ch.

  18. Algorithms for Computation of Fundamental Properties of Seawater. Endorsed by Unesco/SCOR/ICES/IAPSO Joint Panel on Oceanographic Tables and Standards and SCOR Working Group 51. Unesco Technical Papers in Marine Science, No. 44.

    ERIC Educational Resources Information Center

    Fofonoff, N. P.; Millard, R. C., Jr.

    Algorithms for computation of fundamental properties of seawater, based on the practical salinity scale (PSS-78) and the international equation of state for seawater (EOS-80), are compiled in the present report for implementing and standardizing computer programs for oceanographic data processing. Sample FORTRAN subprograms and tables are given…

  19. A novel computer system for the evaluation of nasolabial morphology, symmetry and aesthetics after cleft lip and palate treatment. Part 2: Comparative anthropometric analysis of patients with repaired unilateral complete cleft lip and palate and healthy individuals.

    PubMed

    Pietruski, Piotr; Majak, Marcin; Pawlowska, Elzbieta; Skiba, Adam; Antoszewski, Boguslaw

    2017-04-01

    The aim of this study was to use a novel system, 'Analyse It Doc' (A.I.D.), for a complex anthropometric analysis of the nasolabial region in patients with repaired unilateral complete cleft lip and palate and in healthy individuals. A set of standardized facial photographs in frontal, lateral and submental views was taken of 50 non-cleft controls (mean age 20.6 years) and 42 patients with repaired unilateral complete cleft lip and palate (mean age 19.57 years). Then, based on linear, angular and area measurements taken from the digital photographs with the aid of the A.I.D. system, a photogrammetric analysis of intergroup differences in nasolabial morphology and symmetry was conducted. Patients with cleft lip and palate differed from the controls in terms of more than half of the analysed angular measurements and proportion indices derived from linear and area measurements of the nasolabial region. The findings presented herein imply that despite primary surgical repair, patients with unilateral complete cleft lip and palate still show some degree of nasolabial dysmorphology. Furthermore, the study demonstrated that the novel computer system is suitable for a reliable, simple and time-efficient anthropometric analysis in a clinical setting. Copyright © 2017 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  20. Utilization of Solar Dynamics Observatory space weather digital image data for comparative analysis with application to Baryon Oscillation Spectroscopic Survey

    NASA Astrophysics Data System (ADS)

    Shekoyan, V.; Dehipawala, S.; Liu, Ernest; Tulsee, Vivek; Armendariz, R.; Tremberger, G.; Holden, T.; Marchese, P.; Cheung, T.

    2012-10-01

    Digital solar image data is available to users with access to standard, mass-market software. Many scientific projects utilize the Flexible Image Transport System (FITS) format, which requires specialized software typically used in astrophysical research. Data in the FITS format includes photometric and spatial calibration information, which may not be useful to researchers working with self-calibrated, comparative approaches. This project examines the advantages of using mass-market software with readily downloadable image data from the Solar Dynamics Observatory for comparative analysis, as opposed to the use of specialized software capable of reading data in the FITS format. Comparative analyses of brightness statistics that describe the solar disk in the study of magnetic energy using algorithms included in mass-market software have been shown to give results similar to analyses using FITS data. The entanglement of magnetic energy associated with solar eruptions, as well as the development of such eruptions, has been characterized successfully using mass-market software. The proposed algorithm would help to establish a publicly accessible computing network that could assist in exploratory studies of all FITS data. The advances in computer, cell phone and tablet technology could incorporate such an approach readily for the enhancement of high school and first-year college space weather education on a global scale. Application to ground-based data such as that contained in the Baryon Oscillation Spectroscopic Survey is discussed.
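
    For reference, a brief sketch of the FITS-based workflow the comparison refers to: reading a solar image with astropy and computing simple whole-disk brightness statistics. The file name is a placeholder and the thresholding used to isolate the disk is an assumption for illustration, not the project's algorithm.

    ```python
    import numpy as np
    from astropy.io import fits

    def disk_brightness_stats(path="sdo_aia_171.fits", background_level=0.0):
        """Mean, standard deviation, and skewness of on-disk pixel values."""
        with fits.open(path) as hdul:
            hdu = next(h for h in hdul if h.data is not None)   # first HDU with an image
            data = hdu.data.astype(float)
        on_disk = data[data > background_level]     # crude disk mask by thresholding
        mean = on_disk.mean()
        std = on_disk.std()
        skew = ((on_disk - mean) ** 3).mean() / std ** 3
        return {"mean": mean, "std": std, "skew": skew}
    ```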

  1. How to design a single-cell RNA-sequencing experiment: pitfalls, challenges and perspectives.

    PubMed

    Dal Molin, Alessandra; Di Camillo, Barbara

    2018-01-31

    The sequencing of the transcriptome of single cells, or single-cell RNA-sequencing, has now become the dominant technology for the identification of novel cell types in heterogeneous cell populations or for the study of stochastic gene expression. In recent years, various experimental methods and computational tools for analysing single-cell RNA-sequencing data have been proposed. However, most of them are tailored to different experimental designs or biological questions, and in many cases their performance has not yet been benchmarked, making it difficult for a researcher to choose the optimal single-cell transcriptome sequencing (scRNA-seq) experiment and analysis workflow. In this review, we aim to provide an overview of the currently available experimental and computational methods developed to handle single-cell RNA-sequencing data and, based on their peculiarities, we suggest possible analysis frameworks depending on specific experimental designs. We also evaluate challenges, open questions and future perspectives in the field. In particular, we go through the different steps of scRNA-seq experimental protocols such as cell isolation, messenger RNA capture, reverse transcription, amplification and use of quantitative standards such as spike-ins and Unique Molecular Identifiers (UMIs). We then analyse the current methodological challenges related to preprocessing, alignment, quantification, normalization, batch effect correction and methods to control for confounding effects. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
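    As a concrete illustration of one preprocessing step mentioned above, the sketch below applies a simple library-size (counts-per-million) normalization with a log1p transform to a toy cells-by-genes UMI count matrix. This is only one of many normalization schemes in the scRNA-seq literature and is not prescribed by the review; numpy and the toy matrix are assumptions of the example.

```python
import numpy as np

def cpm_log_normalize(counts):
    """Counts-per-million library-size normalization followed by log1p.

    `counts` is a cells x genes matrix of UMI or read counts; this is a
    generic scheme, not a method endorsed by the review above.
    """
    counts = np.asarray(counts, dtype=float)
    library_sizes = counts.sum(axis=1, keepdims=True)
    cpm = counts / np.clip(library_sizes, 1.0, None) * 1e6
    return np.log1p(cpm)

# Toy example: 3 cells x 4 genes.
toy = np.array([[10, 0, 5, 5],
                [100, 20, 0, 80],
                [1, 1, 1, 1]])
print(cpm_log_normalize(toy).round(2))
```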

  2. Efficacy of combined antiparasitic therapy with praziquantel and albendazole for neurocysticercosis: a double-blind, randomised controlled trial

    PubMed Central

    Garcia, Hector H; Gonzales, Isidro; Lescano, Andres G; Bustos, Javier A; Zimic, Mirko; Escalante, Diego; Saavedra, Herbert; Gavidia, Martin; Rodriguez, Lourdes; Najar, Enrique; Umeres, Hugo; Pretell, E Javier

    2014-01-01

    Background: Neurocysticercosis causes a substantial burden of seizure disorders worldwide. Treatment with either praziquantel or albendazole has suboptimum efficacy. We aimed to establish whether combination of these drugs would increase cysticidal efficacy and whether complete cyst resolution results in fewer seizures. We added an increased dose albendazole group to establish a potential effect of increased albendazole concentrations. Methods: In this double-blind, placebo-controlled, phase 3 trial, patients with viable intraparenchymal neurocysticercosis were randomly assigned to receive 10 days of combined albendazole (15 mg/kg per day) plus praziquantel (50 mg/kg per day), standard albendazole (15 mg/kg per day), or increased dose albendazole (22·5 mg/kg per day). Randomisation was done with a computer generated schedule balanced within four strata based on number of cysts and concomitant antiepileptic drug. Patients and investigators were masked to group assignment. The primary outcome was complete cyst resolution on 6-month MRI. Enrolment was stopped after interim analysis because of parasiticidal superiority of one treatment group. Analysis excluded patients lost to follow-up before the 6-month MRI. This trial is registered with ClinicalTrials.gov, number NCT00441285. Findings: Between March 3, 2010 and Nov 14, 2011, 124 patients were randomly assigned to study groups (41 to receive combined albendazole plus praziquantel [39 analysed], 43 standard albendazole [41 analysed], and 40 increased albendazole [38 analysed]). 25 (64%) of 39 patients in the combined treatment group had complete resolution of brain cysts compared with 15 (37%) of 41 patients in the standard albendazole group (rate ratio [RR] 1·75, 95% CI 1·10–2·79, p=0·014). 20 (53%) of 38 patients in the increased albendazole group had complete cyst resolution at 6-month MRI compared with 15 (37%) of 41 patients in the standard albendazole group (RR 1·44, 95% CI 0·87–2·38, p=0·151). No significant differences in adverse events were reported between treatment groups (18 in combined treatment group, 11 in standard albendazole group, and 19 in increased albendazole group). Interpretation: Combination of albendazole plus praziquantel increases the parasiticidal effect in patients with multiple brain cysticercosis cysts without increased side-effects. A more efficacious parasiticidal regime without increased treatment-associated side-effects should improve the treatment and long term prognosis of patients with neurocysticercosis. Funding: National Institute of Neurological Disorders and Stroke (NINDS), National Institutes of Health. PMID:24999157

  3. Efficacy of combined antiparasitic therapy with praziquantel and albendazole for neurocysticercosis: a double-blind, randomised controlled trial.

    PubMed

    Garcia, Hector H; Gonzales, Isidro; Lescano, Andres G; Bustos, Javier A; Zimic, Mirko; Escalante, Diego; Saavedra, Herbert; Gavidia, Martin; Rodriguez, Lourdes; Najar, Enrique; Umeres, Hugo; Pretell, E Javier

    2014-08-01

    Neurocysticercosis causes a substantial burden of seizure disorders worldwide. Treatment with either praziquantel or albendazole has suboptimum efficacy. We aimed to establish whether combination of these drugs would increase cysticidal efficacy and whether complete cyst resolution results in fewer seizures. We added an increased dose albendazole group to establish a potential effect of increased albendazole concentrations. In this double-blind, placebo-controlled, phase 3 trial, patients with viable intraparenchymal neurocysticercosis were randomly assigned to receive 10 days of combined albendazole (15 mg/kg per day) plus praziquantel (50 mg/kg per day), standard albendazole (15 mg/kg per day), or increased dose albendazole (22·5 mg/kg per day). Randomisation was done with a computer generated schedule balanced within four strata based on number of cysts and concomitant antiepileptic drug. Patients and investigators were masked to group assignment. The primary outcome was complete cyst resolution on 6-month MRI. Enrolment was stopped after interim analysis because of parasiticidal superiority of one treatment group. Analysis excluded patients lost to follow-up before the 6-month MRI. This trial is registered with ClinicalTrials.gov, number NCT00441285. Between March 3, 2010 and Nov 14, 2011, 124 patients were randomly assigned to study groups (41 to receive combined albendazole plus praziquantel [39 analysed], 43 standard albendazole [41 analysed], and 40 increased albendazole [38 analysed]). 25 (64%) of 39 patients in the combined treatment group had complete resolution of brain cysts compared with 15 (37%) of 41 patients in the standard albendazole group (rate ratio [RR] 1·75, 95% CI 1·10-2·79, p=0·014). 20 (53%) of 38 patients in the increased albendazole group had complete cyst resolution at 6-month MRI compared with 15 (37%) of 41 patients in the standard albendazole group (RR 1·44, 95% CI 0·87-2·38, p=0·151). No significant differences in adverse events were reported between treatment groups (18 in combined treatment group, 11 in standard albendazole group, and 19 in increased albendazole group). Combination of albendazole plus praziquantel increases the parasiticidal effect in patients with multiple brain cysticercosis cysts without increased side-effects. A more efficacious parasiticidal regime without increased treatment-associated side-effects should improve the treatment and long term prognosis of patients with neurocysticercosis. National Institute of Neurological Disorders and Stroke (NINDS), National Institutes of Health. Copyright © 2014 Elsevier Ltd. All rights reserved.
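    The reported rate ratio and its confidence interval can be checked from the published counts with the standard normal approximation on the log scale (here the ratio is a ratio of proportions); the sketch below is a reader-side verification, not the trial's own analysis code.

```python
from math import exp, log, sqrt

def risk_ratio_ci(events1, n1, events2, n2, z=1.96):
    """Ratio of proportions with a normal-approximation CI on the log scale."""
    rr = (events1 / n1) / (events2 / n2)
    se = sqrt(1 / events1 - 1 / n1 + 1 / events2 - 1 / n2)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Combined therapy (25 of 39) versus standard albendazole (15 of 41), as reported.
rr, lo, hi = risk_ratio_ci(25, 39, 15, 41)
print(round(rr, 2), round(lo, 2), round(hi, 2))   # approximately 1.75, 1.10, 2.79
```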

  4. A Methodological Analysis of Randomized Clinical Trials of Computer-Assisted Therapies for Psychiatric Disorders: Toward Improved Standards for an Emerging Field

    PubMed Central

    Kiluk, Brian D.; Sugarman, Dawn E.; Nich, Charla; Gibbons, Carly J.; Martino, Steve; Rounsaville, Bruce J.; Carroll, Kathleen M.

    2013-01-01

    Objective: Computer-assisted therapies offer a novel, cost-effective strategy for providing evidence-based therapies to a broad range of individuals with psychiatric disorders. However, the extent to which the growing body of randomized trials evaluating computer-assisted therapies meets current standards of methodological rigor for evidence-based interventions is not clear. Method: A methodological analysis of randomized clinical trials of computer-assisted therapies for adult psychiatric disorders, published between January 1990 and January 2010, was conducted. Seventy-five studies that examined computer-assisted therapies for a range of axis I disorders were evaluated using a 14-item methodological quality index. Results: Results indicated marked heterogeneity in study quality. No study met all 14 basic quality standards, and three met 13 criteria. Consistent weaknesses were noted in evaluation of treatment exposure and adherence, rates of follow-up assessment, and conformity to intention-to-treat principles. Studies utilizing weaker comparison conditions (e.g., wait-list controls) had poorer methodological quality scores and were more likely to report effects favoring the computer-assisted condition. Conclusions: While several well-conducted studies have indicated promising results for computer-assisted therapies, this emerging field has not yet achieved a level of methodological quality equivalent to those required for other evidence-based behavioral therapies or pharmacotherapies. Adoption of more consistent standards for methodological quality in this field, with greater attention to potential adverse events, is needed before computer-assisted therapies are widely disseminated or marketed as evidence based. PMID:21536689

  5. Numerical Viscous Flow Analysis of an Advanced Semispan Diamond-Wing Model at High-Lift Conditions

    NASA Technical Reports Server (NTRS)

    Ghaffari, F.; Biedron, R. T.; Luckring, J. M.

    2002-01-01

    Turbulent Navier-Stokes computational results are presented for an advanced diamond wing semispan model at low speed, high-lift conditions. The numerical results are obtained in support of a wind-tunnel test that was conducted in the National Transonic Facility (NTF) at the NASA Langley Research Center. The model incorporated a generic fuselage and was mounted on the tunnel sidewall using a constant width standoff. The analyses include: (1) the numerical simulation of the NTF empty-tunnel flow characteristics; (2) semispan high-lift model with the standoff in the tunnel environment; (3) semispan high-lift model with the standoff and viscous sidewall in free air; and (4) semispan high-lift model without the standoff in free air. The computations were performed at conditions that correspond to a nominal approach and landing configuration. The wing surface pressure distributions computed for the model in both the tunnel and in free air agreed well with the corresponding experimental data, and they both indicated small increments due to the wall interference effects. However, the wall interference effects were found to be more pronounced in the measured and computed total lift, drag and pitching moment due to standard induced up-flow effects. Although the magnitudes of the computed forces and moment were slightly off compared to the measured data, the increments due to the wall interference effects were predicted well. The numerical predictions are also presented on the combined effects of the tunnel sidewall boundary layer and the standoff geometry on the fuselage fore-body pressure distributions and the resulting impact on the overall configuration longitudinal aerodynamic characteristics.

  6. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    PubMed

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.

  7. Language-Agnostic Reproducible Data Analysis Using Literate Programming

    PubMed Central

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir. PMID:27711123
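    The core mechanical step of literate programming, extracting the executable code from the literate source, can be illustrated in a few lines. The sketch below uses the classic noweb-style chunk markers (`<<name>>=` ... `@`) purely as an example; Lir defines its own source format, which is not reproduced here.

```python
import re

def extract_chunks(literate_text):
    """Collect named code chunks from a noweb-style literate document.

    Chunks start with `<<name>>=` and end with a line containing only `@`.
    This noweb convention is used here purely for illustration; Lir's
    actual literate source format differs.
    """
    pattern = re.compile(r"^<<(.+?)>>=\n(.*?)^@\s*$", re.S | re.M)
    chunks = {}
    for name, body in pattern.findall(literate_text):
        chunks.setdefault(name, []).append(body)
    # Concatenate chunks that share a name, preserving their order.
    return {name: "".join(parts) for name, parts in chunks.items()}

doc = """We first load the data.
<<analysis>>=
data = [1, 2, 3]
@
Then we summarise it.
<<analysis>>=
print(sum(data) / len(data))
@
"""
print(extract_chunks(doc)["analysis"])
```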

  8. When does a physical system compute?

    PubMed

    Horsman, Clare; Stepney, Susan; Wagner, Rob C; Kendon, Viv

    2014-09-08

    Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not; leading to confusion over novel computational devices, and even claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution . We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a 'computational entity', and its critical role in defining when computing is taking place in physical systems.

  9. When does a physical system compute?

    PubMed Central

    Horsman, Clare; Stepney, Susan; Wagner, Rob C.; Kendon, Viv

    2014-01-01

    Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not; leading to confusion over novel computational devices, and even claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a ‘computational entity’, and its critical role in defining when computing is taking place in physical systems. PMID:25197245

  10. Computational biology for ageing

    PubMed Central

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.

    2011-01-01

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530

  11. The influence of computational assumptions on analysing abdominal aortic aneurysm haemodynamics.

    PubMed

    Ene, Florentina; Delassus, Patrick; Morris, Liam

    2014-08-01

    The variation in computational assumptions for analysing abdominal aortic aneurysm haemodynamics can influence the desired output results and computational cost. Such assumptions for abdominal aortic aneurysm modelling include static/transient pressures, steady/transient flows and rigid/compliant walls. Six computational methods and these various assumptions were simulated and compared within a realistic abdominal aortic aneurysm model with and without intraluminal thrombus. A full transient fluid-structure interaction was required to analyse the flow patterns within the compliant abdominal aortic aneurysms models. Rigid wall computational fluid dynamics overestimates the velocity magnitude by as much as 40%-65% and the wall shear stress by 30%-50%. These differences were attributed to the deforming walls which reduced the outlet volumetric flow rate for the transient fluid-structure interaction during the majority of the systolic phase. Static finite element analysis accurately approximates the deformations and von Mises stresses when compared with transient fluid-structure interaction. Simplifying the modelling complexity reduces the computational cost significantly. In conclusion, the deformation and von Mises stress can be approximately found by static finite element analysis, while for compliant models a full transient fluid-structure interaction analysis is required for acquiring the fluid flow phenomenon. © IMechE 2014.

  12. General review of the MOSTAS computer code for wind turbines

    NASA Technical Reports Server (NTRS)

    Dungundji, J.; Wendell, J. H.

    1981-01-01

    The MOSTAS computer code for wind turbine analysis is reviewed, and techniques and methods used in its analyses are described. Impressions of its strengths and weaknesses, and recommendations for its application, modification, and further development are made. Basic techniques used in wind turbine stability and response analyses for systems with constant and periodic coefficients are reviewed.

  13. Cost-effectiveness of a motivational intervention for alcohol-involved youth in a hospital emergency department.

    PubMed

    Neighbors, Charles J; Barnett, Nancy P; Rohsenow, Damaris J; Colby, Suzanne M; Monti, Peter M

    2010-05-01

    Brief interventions in the emergency department targeting risk-taking youth show promise to reduce alcohol-related injury. This study models the cost-effectiveness of a motivational interviewing-based intervention relative to brief advice to stop alcohol-related risk behaviors (standard care). Average cost-effectiveness ratios were compared between conditions. In addition, a cost-utility analysis examined the incremental cost of motivational interviewing per quality-adjusted life year gained. Microcosting methods were used to estimate marginal costs of motivational interviewing and standard care as well as two methods of patient screening: standard emergency-department staff questioning and proactive outreach by counseling staff. Average cost-effectiveness ratios were computed for drinking and driving, injuries, vehicular citations, and negative social consequences. Using estimates of the marginal effect of motivational interviewing in reducing drinking and driving, estimates of traffic fatality risk from drinking-and-driving youth, and national life tables, the societal costs per quality-adjusted life year saved by motivational interviewing relative to standard care were also estimated. Alcohol-attributable traffic fatality risks were estimated using national databases. Intervention costs per participant were $81 for standard care, $170 for motivational interviewing with standard screening, and $173 for motivational interviewing with proactive screening. The cost-effectiveness ratios for motivational interviewing were more favorable than standard care across all study outcomes and better for men than women. The societal cost per quality-adjusted life year of motivational interviewing was $8,795. Sensitivity analyses indicated that results were robust in terms of variability in parameter estimates. This brief intervention represents a good societal investment compared with other commonly adopted medical interventions.
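    The two summary quantities used above, an average cost-effectiveness ratio and an incremental cost per quality-adjusted life year, are simple ratios. The sketch below uses the per-participant intervention costs reported in the abstract, but the effect and QALY values are hypothetical placeholders, so the printed numbers do not reproduce the study's results.

```python
def average_ce_ratio(cost, effect):
    """Average cost-effectiveness ratio: cost per unit of outcome achieved."""
    return cost / effect

def incremental_cost_per_qaly(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-utility ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Per-participant intervention costs reported in the abstract.
cost_standard_care = 81.0
cost_mi = 170.0

# Hypothetical placeholder effects (e.g. drink-driving episodes averted) and
# QALY gains; the study derived its figures from traffic-fatality risk models.
effect_standard_care, effect_mi = 0.05, 0.15
qaly_standard_care, qaly_mi = 0.004, 0.016

print(average_ce_ratio(cost_standard_care, effect_standard_care))   # 1620.0
print(average_ce_ratio(cost_mi, effect_mi))                         # ~1133.3
print(incremental_cost_per_qaly(cost_mi, cost_standard_care,
                                qaly_mi, qaly_standard_care))        # ~7416.7
```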

  14. Applications of cost-effectiveness methodologies in behavioral medicine.

    PubMed

    Kaplan, Robert M; Groessl, Erik J

    2002-06-01

    In 1996, the Panel on Cost-Effectiveness in Health and Medicine developed standards for cost-effectiveness analysis. The standards include the use of a societal perspective, that treatments be evaluated in comparison with the best available alternative (rather than with no care at all), and that health benefits be expressed in standardized units. Guidelines for cost accounting were also offered. Among 24,562 references on cost-effectiveness in Medline between 1995 and 2000, only a handful were relevant to behavioral medicine. Only 19 studies published between 1983 and 2000 met criteria for further evaluation. Among analyses that were reported, only 2 studies were found consistent with the Panel's criteria for high-quality analyses, although more recent studies were more likely to meet methodological standards. There are substantial opportunities to advance behavioral medicine by performing standardized cost-effectiveness analyses.

  15. Semi-analytical discontinuous Galerkin finite element method for the calculation of dispersion properties of guided waves in plates.

    PubMed

    Hebaz, Salah-Eddine; Benmeddour, Farouk; Moulin, Emmanuel; Assaad, Jamal

    2018-01-01

    The development of reliable guided waves inspection systems is conditioned by an accurate knowledge of their dispersive properties. The semi-analytical finite element method has been proven to be very practical for modeling wave propagation in arbitrary cross-section waveguides. However, when it comes to computations on complex geometries to a given accuracy, it still has a major drawback: the high consumption of resources. Recently, discontinuous Galerkin finite element method (DG-FEM) has been found advantageous over the standard finite element method when applied as well in the frequency domain. In this work, a high-order method for the computation of Lamb mode characteristics in plates is proposed. The problem is discretised using a class of DG-FEM, namely, the interior penalty methods family. The analytical validation is performed through the homogeneous isotropic case with traction-free boundary conditions. Afterwards, functionally graded material plates are analysed and a numerical example is presented. It was found that the obtained results are in good agreement with those found in the literature.

  16. Exploiting the chaotic behaviour of atmospheric models with reconfigurable architectures

    NASA Astrophysics Data System (ADS)

    Russell, Francis P.; Düben, Peter D.; Niu, Xinyu; Luk, Wayne; Palmer, T. N.

    2017-12-01

    Reconfigurable architectures are becoming mainstream: Amazon, Microsoft and IBM are supporting such architectures in their data centres. The computationally intensive nature of atmospheric modelling is an attractive target for hardware acceleration using reconfigurable computing. Performance of hardware designs can be improved through the use of reduced-precision arithmetic, but maintaining appropriate accuracy is essential. We explore reduced-precision optimisation for simulating chaotic systems, targeting atmospheric modelling, in which even minor changes in arithmetic behaviour will cause simulations to diverge quickly. The possibility of equally valid simulations having differing outcomes means that standard techniques for comparing numerical accuracy are inappropriate. We use the Hellinger distance to compare statistical behaviour between reduced-precision CPU implementations to guide reconfigurable designs of a chaotic system, then analyse accuracy, performance and power efficiency of the resulting implementations. Our results show that with only a limited loss in accuracy corresponding to less than 10% uncertainty in input parameters, the throughput and energy efficiency of a single-precision chaotic system implemented on a Xilinx Virtex-6 SX475T Field Programmable Gate Array (FPGA) can be more than doubled.
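    The Hellinger distance used above has a simple closed form for discrete distributions, H(P, Q) = sqrt(0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2). The sketch below compares histograms of a model variable from a reference run and a perturbed run; the synthetic data and binning are assumptions of the example, not the paper's setup.

```python
import numpy as np

def hellinger(p_counts, q_counts):
    """Hellinger distance between two discrete distributions given as counts."""
    p = np.asarray(p_counts, dtype=float)
    q = np.asarray(q_counts, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

# Histogram the same model variable from a reference and a perturbed run
# (synthetic stand-ins here for double- and reduced-precision simulations).
rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.00, 100_000)
perturbed = rng.normal(0.0, 1.02, 100_000)
bins = np.linspace(-5.0, 5.0, 51)
h_ref, _ = np.histogram(reference, bins=bins)
h_per, _ = np.histogram(perturbed, bins=bins)
print(hellinger(h_ref, h_per))   # small value => statistically similar behaviour
```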

  17. Upper quadrant postural changes of school children in response to interaction with different information technologies.

    PubMed

    Briggs, Andrew; Straker, Leon; Greig, Alison

    2004-06-10

    The objective of this study was to quantitatively analyse the sitting posture of school children interacting with both old (book) and new (laptop and desktop computers) information technologies to test the hypothesis that posture is affected by the type of information technology (IT) used. A mixed model design was used to test the effect of IT type (within subjects) and age and gender (between subjects). The sitting posture of 32 children aged 4-17 years was measured whilst they read from a book, laptop, and desktop computer at a standard school chair and desk. Video images were captured and then digitized to calculate mean angles for head tilt, neck flexion, trunk flexion, and gaze angle. Posture was found to be influenced by IT type (p < 0.001), age (p < 0.001) and gender (p = 0.024) and significantly correlated to the stature of the participants. Measurement of resting posture and the maximal range of motion of the upper and lower cervical spines in the sagittal plane was also undertaken. The biophysical impact and the suitability of the three different information technologies are discussed.

  18. Usefulness of hemocytometer as a counting chamber in a computer assisted sperm analyzer (CASA)

    USGS Publications Warehouse

    Eljarah, A.; Chandler, J.; Jenkins, J.A.; Chenevert, J.; Alcanal, A.

    2013-01-01

    Several methods are used to determine sperm cell concentration, such as the haemocytometer, spectrophotometer, electronic cell counter and computer-assisted semen analysers (CASA). The utility of CASA systems has been limited due to the lack of characterization of individual systems and the absence of standardization among laboratories. The aims of this study were to: 1) validate and establish setup conditions for the CASA system utilizing the haemocytometer as a counting chamber, and 2) compare the different methods used for the determination of sperm cell concentration in bull semen. Two ejaculates were collected and the sperm cell concentration was determined using a spectrophotometer and a haemocytometer. For the Hamilton-Thorn method, the haemocytometer was used as a counting chamber. Sperm concentration was determined three times per ejaculate sample. No significant difference (P > 0.05) was found between the counting methods, or between the haemocytometer count and the spectrophotometer. Based on the results of this study, we concluded that the haemocytometer can be used in computerized semen analysis systems as a substitute for the commercially available disposable counting chambers, therefore avoiding disadvantageously high costs and slower procedures.

  19. Density conversion factor determined using a cone-beam computed tomography unit NewTom QR-DVT 9000.

    PubMed

    Lagravère, M O; Fang, Y; Carey, J; Toogood, R W; Packota, G V; Major, P W

    2006-11-01

    The purpose of this study was to determine a conversion coefficient for Hounsfield Units (HU) to material density (g cm(-3)) obtained from cone-beam computed tomography (CBCT-NewTom QR-DVT 9000) data. Six cylindrical models of materials with different densities were made and scanned using the NewTom QR-DVT 9000 Volume Scanner. The raw data were converted into DICOM format and analysed using Merge eFilm and AMIRA to determine the HU of different areas of the models. There was no significant difference (P = 0.846) between the HU given by each piece of software. A linear regression was performed using the density, rho (g cm(-3)), as the dependent variable in terms of the HU (H). The regression equation obtained was rho = 0.002H-0.381 with an R2 value of 0.986. The standard error of the estimation is 27.104 HU in the case of the Hounsfield Units and 0.064 g cm(-3) in the case of density. CBCT provides an effective option for determination of material density expressed as Hounsfield Units.
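    The reported regression can be applied directly to convert scanner Hounsfield values to density; the sketch below simply encodes the published equation, and the example HU values are illustrative. The reported standard errors (about 27 HU and 0.064 g cm(-3)) indicate the uncertainty attached to any converted value.

```python
def hu_to_density(hu):
    """Density (g/cm^3) from a NewTom QR-DVT 9000 Hounsfield value, using the
    regression reported in the abstract: rho = 0.002*H - 0.381 (R^2 = 0.986)."""
    return 0.002 * hu - 0.381

# Illustrative Hounsfield values only.
for hu in (200, 700, 1200):
    print(hu, round(hu_to_density(hu), 3))   # 0.019, 1.019, 2.019 g/cm^3
```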

  20. Standard practices for the implementation of computer software

    NASA Technical Reports Server (NTRS)

    Irvine, A. P. (Editor)

    1978-01-01

    A standard approach to the development of computer programs is provided that covers the life cycle of software development from the planning and requirements phase through the software acceptance testing phase. All documents necessary to provide the required visibility into the software life cycle process are discussed in detail.

  1. 77 FR 53962 - Technical Standard Order (TSO)-C68a, Airborne Automatic Dead Reckoning Computer Equipment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-04

    ..., Airborne Automatic Dead Reckoning Computer Equipment Utilizing Aircraft Heading and Doppler Ground Speed.... ACTION: Notice of cancellation of Technical Standard Order (TSO)-C68a, Airborne Automatic Dead Reckoning... . SUPPLEMENTARY INFORMATION: Background Doppler radar is a semiautomatic self-contained dead reckoning navigation...

  2. 78 FR 47014 - Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION... Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses, with clarifications... Electrical and Electronic Engineers (IEEE) Standard 828-2005, ``IEEE Standard for Software Configuration...

  3. Computer Technology Standards of Learning for Virginia's Public Schools

    ERIC Educational Resources Information Center

    Virginia Department of Education, 2005

    2005-01-01

    The Computer/Technology Standards of Learning identify and define the progressive development of essential knowledge and skills necessary for students to access, evaluate, use, and create information using technology. They provide a framework for technology literacy and demonstrate a progression from physical manipulation skills for the use of…

  4. Relative resilience to noise of standard and sequential approaches to measurement-based quantum computation

    NASA Astrophysics Data System (ADS)

    Gallagher, C. B.; Ferraro, A.

    2018-05-01

    A possible alternative to the standard model of measurement-based quantum computation (MBQC) is offered by the sequential model of MBQC—a particular class of quantum computation via ancillae. Although these two models are equivalent under ideal conditions, their relative resilience to noise in practical conditions is not yet known. We analyze this relationship for various noise models in the ancilla preparation and in the entangling-gate implementation. The comparison of the two models is performed utilizing both the gate infidelity and the diamond distance as figures of merit. Our results show that in the majority of instances the sequential model outperforms the standard one in regard to a universal set of operations for quantum computation. Further investigation is made into the performance of sequential MBQC in experimental scenarios, thus setting benchmarks for possible cavity-QED implementations.
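    As a minimal illustration of one figure of merit named above, the sketch below computes the average gate infidelity of a noisy unitary relative to a target unitary using the standard closed form F_avg = (d + |Tr(U^dagger V)|^2) / (d * (d + 1)). The single-qubit over-rotation noise model is an assumption of the example, and the diamond-distance comparison (which requires semidefinite programming) is not shown.

```python
import numpy as np

def average_gate_infidelity(u_target, v_actual):
    """1 - average gate fidelity between two unitaries of dimension d,
    using F_avg = (d + |Tr(U^dagger V)|^2) / (d * (d + 1))."""
    d = u_target.shape[0]
    overlap = np.abs(np.trace(u_target.conj().T @ v_actual)) ** 2
    return 1.0 - (d + overlap) / (d * (d + 1))

def rz(theta):
    """Single-qubit rotation about the Z axis."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

# Target gate versus the same gate with a small over-rotation error
# (an illustrative noise model, not one taken from the paper).
print(average_gate_infidelity(rz(np.pi / 2), rz(np.pi / 2 + 0.01)))
```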

  5. Testing in semiparametric models with interaction, with applications to gene-environment interactions.

    PubMed

    Maity, Arnab; Carroll, Raymond J; Mammen, Enno; Chatterjee, Nilanjan

    2009-01-01

    Motivated by the problem of testing for genetic effects on complex traits in the presence of gene-environment interaction, we develop score tests in general semiparametric regression problems that involve a Tukey-style 1-degree-of-freedom form of interaction between parametrically and non-parametrically modelled covariates. We find that the score test in this type of model, as recently developed by Chatterjee and co-workers in the fully parametric setting, is biased and requires undersmoothing to be valid in the presence of non-parametric components. Moreover, in the presence of repeated outcomes, the asymptotic distribution of the score test depends on the estimation of functions which are defined as solutions of integral equations, making implementation difficult and computationally taxing. We develop profiled score statistics which are unbiased and asymptotically efficient and can be performed by using standard bandwidth selection methods. In addition, to overcome the difficulty of solving functional equations, we give easy interpretations of the target functions, which in turn allow us to develop estimation procedures that can be easily implemented by using standard computational methods. We present simulation studies to evaluate type I error and power of the method proposed compared with a naive test that does not consider interaction. Finally, we illustrate our methodology by analysing data from a case-control study of colorectal adenoma that was designed to investigate the association between colorectal adenoma and the candidate gene NAT2 in relation to smoking history.

  6. Determination of biogenic amines in chocolate by ion chromatographic separation and pulsed integrated amperometric detection with implemented wave-form at Au disposable electrode.

    PubMed

    Pastore, Paolo; Favaro, Gabriella; Badocco, Denis; Tapparo, Andrea; Cavalli, Silvano; Saccani, Giovanna

    2005-12-09

    A rapid and selective cation exchange chromatographic method coupled to integrated pulsed amperometric detection (PAD) has been developed to quantify biogenic amines in chocolate. The method is based on gradient elution of aqueous methanesulfonic acid with post-column addition of strong base to obtain suitable conditions for amperometric detection. A potential waveform able to maintain long-term performance of the Au disposable electrode was set up. Total analysis time is less than 20 min. Concentration levels of dopamine, serotonin, tyramine, histamine and 2-phenylethylamine were measured, after extraction with perchloric acid from 2 g samples previously defatted twice with petroleum ether. The method was used to determine the analytes in real chocolate matrices and their quantification was made with the standard addition method. Only dopamine, histamine and serotonin were found in the analysed real samples. Repeatabilities of their signals, computed on their amounts in the real samples, were 5% for all of them. Repeatabilities of tyramine and phenethylamine were relative to standard additions to real samples (close to 1 mg/l in the extract) and were 7 and 3%, respectively. Detection limits were computed from 3s of the baseline noise combined with the calibration plot regression parameters. They were satisfactorily low for all amines: 3 mg/kg for dopamine, 2 mg/kg for tyramine, 1 mg/kg for histamine, 2 mg/kg for serotonin, and 3 mg/kg for 2-phenylethylamine.

  7. Resolution, sensitivity, and in vivo application of high-resolution computed tomography for titanium-coated polymethyl methacrylate (PMMA) dental implants.

    PubMed

    Cuijpers, Vincent M J I; Jaroszewicz, Jacub; Anil, Sukumaran; Al Farraj Aldosari, Abdullah; Walboomers, X Frank; Jansen, John A

    2014-03-01

    The aims of this study were (i) to determine the spatial resolution and sensitivity of micro- versus nano-computed tomography (CT) techniques and (ii) to validate micro- versus nano-CT in a dog dental implant model, comparative to histological analysis. To determine spatial resolution and sensitivity, standardized reference samples containing standardized nano- and microspheres were prepared in polymer and ceramic matrices. Thereafter, 10 titanium-coated polymer dental implants (3.2 mm in Ø by 4 mm in length) were placed in the mandible of Beagle dogs. Both micro- and nano-CT, as well as histological analyses, were performed. The reference samples confirmed the high resolution of the nano-CT system, which was capable of revealing sub-micron structures embedded in radiodense matrices. The dog implantation study and subsequent statistical analysis showed equal values for bone area and bone-implant contact measurements between micro-CT and histology. However, because of the limited sample size and field of view, nano-CT was not rendering reliable data representative of the entire bone-implant specimen. Micro-CT analysis is an efficient tool to quantitate bone healing parameters at the bone-implant interface, especially when using titanium-coated PMMA implants. Nano-CT is not suitable for such quantification, but reveals complementary morphological information rivaling histology, yet with the advantage of a 3D visualization. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.

  8. Ground-water quality in east-central New Jersey, and a plan for sampling networks

    USGS Publications Warehouse

    Harriman, D.A.; Sargent, B.P.

    1985-01-01

    Groundwater quality was evaluated in seven confined aquifers and the water table aquifer in east-central New Jersey based on 237 analyses of samples collected in 1981-82, and 225 older analyses. Investigation of the effect of land use on water quality and several sampling network proposals for the region are reported. Generally, water in the confined aquifers is of satisfactory quality for human consumption and most other uses. Iron (Fe) and manganese (Mn) concentrations exceed U.S. EPA drinking water standards in some wells screened in the Potomac-Raritan-Magothy aquifer system. Sodium (Na) concentrations in samples from three wells more than 800 ft deep in the Englishtown aquifer exceed the standard. Iron and Mn concentrations in this aquifer may also exceed the standards. Iron concentrations in the Wenonah-Mount Laurel aquifer exceed the standard. Based on 15 analyses of water from the Vincetown aquifer, Mn is the only constituent that exceeds the drinking water standard. In the Manasquan aquifer, 4 of the 16 Na determinations exceed the standard, and 8 of 16 Fe determinations exceed the standard. Water quality in the Atlantic City 800-ft sand is generally satisfactory. However, 12 Fe and 1 of 12 Mn determinations exceed the standards. For the Rio Grande water-bearing zone, 1 of 3 Fe determinations exceeds the standard. The Kirkwood-Cohansey aquifer system (the water table aquifer) was the most thoroughly sampled (249 chemical analyses from 209 wells). Dissolved solids, chloride, Fe, nitrate, and Mn concentrations exceed drinking water standards in some areas. The results of chi-square tests of constituent distributions based on analyses from 158 wells in the water table aquifer indicate that calcium is higher in industrial and commercial areas, and that Mg, chloride, and nitrate-plus-nitrite are higher in residential areas. (Author's abstract)

  9. Model-Invariant Hybrid Computations of Separated Flows for RCA Standard Test Cases

    NASA Technical Reports Server (NTRS)

    Woodruff, Stephen

    2016-01-01

    NASA's Revolutionary Computational Aerosciences (RCA) subproject has identified several smooth-body separated flows as standard test cases to emphasize the challenge these flows present for computational methods and their importance to the aerospace community. Results of computations of two of these test cases, the NASA hump and the FAITH experiment, are presented. The computations were performed with the model-invariant hybrid LES-RANS formulation, implemented in the NASA code VULCAN-CFD. The model-invariant formulation employs gradual LES-RANS transitions and compensation for model variation to provide more accurate and efficient hybrid computations. Comparisons revealed that the LES-RANS transitions employed in these computations were sufficiently gradual that the compensating terms were unnecessary. Agreement with experiment was achieved only after reducing the turbulent viscosity to mitigate the effect of numerical dissipation. The stream-wise evolution of peak Reynolds shear stress was employed as a measure of turbulence dynamics in separated flows useful for evaluating computations.

  10. EUPAN enables pan-genome studies of a large number of eukaryotic genomes.

    PubMed

    Hu, Zhiqiang; Sun, Chen; Lu, Kuang-Chen; Chu, Xixia; Zhao, Yue; Lu, Jinyuan; Shi, Jianxin; Wei, Chaochun

    2017-08-01

    Pan-genome analyses are routinely carried out for bacteria to interpret the within-species gene presence/absence variations (PAVs). However, pan-genome analyses are rare for eukaryotes due to the large sizes and higher complexities of their genomes. Here we propose EUPAN, a eukaryotic pan-genome analysis toolkit, enabling automatic large-scale eukaryotic pan-genome analyses and detection of gene PAVs at a relatively low sequencing depth. In previous studies, we demonstrated the effectiveness and high accuracy of EUPAN in the pan-genome analysis of 453 rice genomes, in which we also revealed widespread gene PAVs among individual rice genomes. Moreover, EUPAN can be directly applied to the current re-sequencing projects primarily focusing on single nucleotide polymorphisms. EUPAN is implemented in Perl, R and C++. It is supported under Linux and is best suited to a computer cluster with an LSF or SLURM job scheduling system. EUPAN together with its standard operating procedure (SOP) is freely available for non-commercial use (CC BY-NC 4.0) at http://cgm.sjtu.edu.cn/eupan/index.html. Contact: ccwei@sjtu.edu.cn or jianxin.shi@sjtu.edu.cn. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  11. [Trauma and accident documentation in Germany compared with elsewhere in Europe].

    PubMed

    Probst, C; Richter, M; Haasper, C; Lefering, R; Otte, D; Oestern, H J; Krettek, C; Hüfner, T

    2008-07-01

    The role of trauma documentation has grown continuously since the 1970s. Prevention and management of injuries were adapted according to the results of many analyses. Since 1993 there have been two different trauma databases in Germany: the German trauma registry (TR) and the database of the Accident Research Unit (UFO). Modern computer applications improved the data processing. Our study analysed the pros and cons of each system and compared them with those of our European neighbours. We compared the TR and the UFO databases with respect to aims and goals, advantages and disadvantages, and current status. Results were reported as means +/- standard errors of the mean. The level of significance was set at P<0.05. There were differences between the two databases concerning number and types of items, aims and goals, and demographics. The TR documents care for severely injured patients and the clinical course of different types of accidents. The UFO describes traffic accidents, accident conditions, and interrelations. The German and British systems are similar, and the French system shows interesting differences. The German trauma documentation systems focus on different points. Therefore both can be used for substantiated analyses of different hypotheses. Certain intersections of both databases may help to answer very special questions in the future.

  12. On the equivalence of case-crossover and time series methods in environmental epidemiology.

    PubMed

    Lu, Yun; Zeger, Scott L

    2007-04-01

    The case-crossover design was introduced in epidemiology 15 years ago as a method for studying the effects of a risk factor on a health event using only cases. The idea is to compare a case's exposure immediately prior to or during the case-defining event with that same person's exposure at otherwise similar "reference" times. An alternative approach to the analysis of daily exposure and case-only data is time series analysis. Here, log-linear regression models express the expected total number of events on each day as a function of the exposure level and potential confounding variables. In time series analyses of air pollution, smooth functions of time and weather are the main confounders. Time series and case-crossover methods are often viewed as competing methods. In this paper, we show that case-crossover using conditional logistic regression is a special case of time series analysis when there is a common exposure such as in air pollution studies. This equivalence provides computational convenience for case-crossover analyses and a better understanding of time series models. Time series log-linear regression accounts for overdispersion of the Poisson variance, while case-crossover analyses typically do not. This equivalence also permits model checking for case-crossover data using standard log-linear model diagnostics.
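    The time-series side of the comparison is a Poisson log-linear regression of daily event counts on exposure and confounders. The sketch below fits such a model by iteratively reweighted least squares on synthetic data; it assumes numpy, uses a crude seasonal term in place of smooth functions of time and weather, and does not include the overdispersion adjustment or the conditional-logistic case-crossover fit discussed above.

```python
import numpy as np

def poisson_loglinear_irls(X, y, n_iter=25):
    """Fit E[y] = exp(X @ beta) by iteratively reweighted least squares."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu          # working response
        w = mu                           # IRLS weights for the log link
        xtw = X.T * w
        beta = np.linalg.solve(xtw @ X, xtw @ z)
    return beta

# Synthetic daily counts: intercept, an exposure series, a crude seasonal term.
rng = np.random.default_rng(1)
n_days = 730
exposure = rng.normal(size=n_days)
season = np.sin(2 * np.pi * np.arange(n_days) / 365.25)
X = np.column_stack([np.ones(n_days), exposure, season])
y = rng.poisson(np.exp(0.5 + 0.10 * exposure + 0.30 * season))
print(poisson_loglinear_irls(X, y))      # estimates near (0.5, 0.10, 0.30)
```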

  13. MATISSE: A novel tool to access, visualize and analyse data from planetary exploration missions

    NASA Astrophysics Data System (ADS)

    Zinzi, A.; Capria, M. T.; Palomba, E.; Giommi, P.; Antonelli, L. A.

    2016-04-01

    The increasing number and complexity of planetary exploration space missions require new tools to access, visualize and analyse data in order to improve their scientific return. The ASI Science Data Center (ASDC) addresses this need with the web tool MATISSE (Multi-purpose Advanced Tool for the Instruments of the Solar System Exploration), which allows the visualization of single observations or real-time computed high-order products, directly projected onto a three-dimensional model of the selected target body. With MATISSE it is no longer necessary to download huge quantities of data or to write specific code for every instrument analysed, which greatly encourages studies based on joint analysis of different datasets. In addition, the extremely high-resolution output, which can be used offline with free Python-based software, together with files readable by specific GIS software, makes it a valuable tool for further processing the data at the best spatial accuracy available. MATISSE's modular structure permits the addition of new missions or tasks and, thanks to dedicated future developments, it would be possible to make it compliant with the Planetary Virtual Observatory standards currently under definition. In this context, an interface to the NASA ODE REST API has recently been developed, through which public data repositories can be accessed.

  14. The paddle move commonly used in magic tricks as a means for analysing the perceptual limits of combined motion trajectories.

    PubMed

    Hergovich, Andreas; Gröbl, Kristian; Carbon, Claus-Christian

    2011-01-01

    Following Gustav Kuhn's inspiring technique of using magicians' acts as a source of insight into cognitive sciences, we used the 'paddle move' for testing the psychophysics of combined movement trajectories. The paddle move is a standard technique in magic consisting of a combined rotating and tilting movement. Careful control of the mutual speed parameters of the two movements makes it possible to inhibit the perception of the rotation, letting the 'magic' effect emerge--a sudden change of the tilted object. By using 3-D animated computer graphics we analysed the interaction of different angular speeds and the object shape/size parameters in evoking this motion disappearance effect. An angular speed of 540 degrees s(-1) (1.5 rev. s(-1)) sufficed to inhibit the perception of the rotary movement with the smallest object showing the strongest effect. 90.7% of the 172 participants were not able to perceive the rotary movement at an angular speed of 1125 degrees s(-1) (3.125 rev. s(-1)). Further analysis by multiple linear regression revealed major influences on the effectiveness of the magic trick of object height and object area, demonstrating the applicability of analysing key factors of magic tricks to reveal limits of the perceptual system.

  15. Calculation of the ELISA's cut-off based on the change-point analysis method for detection of Trypanosoma cruzi infection in Bolivian dogs in the absence of controls.

    PubMed

    Lardeux, Frédéric; Torrico, Gino; Aliaga, Claudia

    2016-07-04

    In ELISAs, sera of individuals infected by Trypanosoma cruzi show absorbance values above a cut-off value. The cut-off is generally computed by means of formulas that need absorbance readings of negative (and sometimes positive) controls, which are included in the titer plates amongst the unknown samples. When no controls are available, other techniques should be employed such as change-point analysis. The method was applied to Bolivian dog sera processed by ELISA to diagnose T. cruzi infection. In each titer plate, the change-point analysis estimated a step point which correctly discriminated among known positive and known negative sera, unlike some of the six usual cut-off formulas tested. To analyse the ELISAs results, the change-point method was as good as the usual cut-off formula of the form "mean + 3 standard deviation of negative controls". Change-point analysis is therefore an efficient alternative method to analyse ELISA absorbance values when no controls are available.
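    The sketch below contrasts the conventional control-based cut-off with a simple change-point (step-point) estimate computed from the plate's absorbances alone. The change-point criterion used here, splitting the sorted values where the pooled within-segment variance is smallest, is a generic illustration and not necessarily the exact algorithm used in the study; the synthetic absorbance values are also assumptions.

```python
import numpy as np

def cutoff_mean_3sd(negative_controls):
    """Conventional ELISA cut-off: mean + 3 standard deviations of negatives."""
    neg = np.asarray(negative_controls, dtype=float)
    return neg.mean() + 3.0 * neg.std(ddof=1)

def cutoff_changepoint(absorbances):
    """Step-point cut-off with no controls: split the sorted absorbances at the
    index minimising the summed within-segment variance (generic criterion)."""
    x = np.sort(np.asarray(absorbances, dtype=float))
    best_k, best_cost = 1, np.inf
    for k in range(1, len(x)):
        cost = k * x[:k].var() + (len(x) - k) * x[k:].var()
        if cost < best_cost:
            best_k, best_cost = k, cost
    return 0.5 * (x[best_k - 1] + x[best_k])   # midpoint between the two groups

# Synthetic plate: mostly seronegative-like sera plus a seropositive-like cluster.
rng = np.random.default_rng(2)
plate = np.concatenate([rng.normal(0.15, 0.03, 60), rng.normal(0.90, 0.20, 20)])
negatives = rng.normal(0.15, 0.03, 16)
print(round(cutoff_mean_3sd(negatives), 3))
print(round(cutoff_changepoint(plate), 3))
```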

  16. JCPDS-ICDD Research Associateship (Cooperative Program with NBS/NIST)

    PubMed Central

    Wong-Ng, W.; McMurdie, H. F.; Hubbard, C. R.; Mighell, A. D.

    2001-01-01

    The Research Associateship program of the Joint Committee on Powder Diffraction-International Centre for Diffraction Data (JCPDS-ICDD, now known as the ICDD) at NBS/NIST was a long-standing (over 35 years), successful industry-government cooperation. The main mission of the Associateship was to publish high-quality x-ray reference patterns to be included in the Powder Diffraction File (PDF). The PDF is a continuing compilation of patterns gathered from many sources, compiled and published by the ICDD. As a result of this collaboration, more than 1500 high-quality powder diffraction patterns, which have had a significant impact on the scientific community, were reported. In addition, various research collaborations with NBS/NIST also led to the development of several standard reference materials (SRMs) for instrument calibration and quantitative analyses, and of computer software for data collection, calibration and reduction, for the editorial process of powder pattern publication, for analysis of powder data, and for quantitative analyses. This article summarizes information concerning the JCPDS-ICDD organization, the Powder Diffraction File (PDF), and the history and accomplishments of the JCPDS-ICDD Research Associateship. PMID:27500061

  17. Prediction of Unsteady Flows in Turbomachinery Using the Linearized Euler Equations on Deforming Grids

    NASA Technical Reports Server (NTRS)

    Clark, William S.; Hall, Kenneth C.

    1994-01-01

    A linearized Euler solver for calculating unsteady flows in turbomachinery blade rows due to both incident gusts and blade motion is presented. The model accounts for blade loading, blade geometry, shock motion, and wake motion. Assuming that the unsteadiness in the flow is small relative to the nonlinear mean solution, the unsteady Euler equations can be linearized about the mean flow. This yields a set of linear variable coefficient equations that describe the small amplitude harmonic motion of the fluid. These linear equations are then discretized on a computational grid and solved using standard numerical techniques. For transonic flows, however, one must use a linear discretization which is a conservative linearization of the non-linear discretized Euler equations to ensure that shock impulse loads are accurately captured. Other important features of this analysis include a continuously deforming grid which eliminates extrapolation errors and hence, increases accuracy, and a new numerically exact, nonreflecting far-field boundary condition treatment based on an eigenanalysis of the discretized equations. Computational results are presented which demonstrate the computational accuracy and efficiency of the method and demonstrate the effectiveness of the deforming grid, far-field nonreflecting boundary conditions, and shock capturing techniques. A comparison of the present unsteady flow predictions to other numerical, semi-analytical, and experimental methods shows excellent agreement. In addition, the linearized Euler method presented requires one or two orders-of-magnitude less computational time than traditional time marching techniques making the present method a viable design tool for aeroelastic analyses.

  18. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
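    Two commonly cited approximations of the kind reviewed above can be written in a couple of lines: a rough standard deviation from the reported range (dividing by 4 is a frequently used rule of thumb for moderate sample sizes) and a mean from the median and quartiles. The specific formulas below come from the wider imputation literature rather than from this review, which compares several such variants, so they are shown only as illustrations.

```python
def sd_from_range(minimum, maximum, divisor=4.0):
    """Rough SD approximation from the range; a divisor of 4 is a common
    rule of thumb for moderate sample sizes (other divisors are also used)."""
    return (maximum - minimum) / divisor

def mean_from_quartiles(q1, median, q3):
    """Approximate mean from the median and quartiles, (q1 + median + q3) / 3,
    reasonable when the distribution is not too heavily skewed."""
    return (q1 + median + q3) / 3.0

# Example: a trial reporting only median 12 (IQR 8-20) and range 2-45.
print(sd_from_range(2, 45))             # 10.75
print(mean_from_quartiles(8, 12, 20))   # about 13.33
```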

  19. Analysis and meta-analysis of single-case designs: an introduction.

    PubMed

    Shadish, William R

    2014-04-01

    The last 10 years have seen great progress in the analysis and meta-analysis of single-case designs (SCDs). This special issue includes five articles that provide an overview of current work on that topic, including standardized mean difference statistics, multilevel models, Bayesian statistics, and generalized additive models. Each article analyzes a common example across articles and presents syntax or macros for how to do them. These articles are followed by commentaries from single-case design researchers and journal editors. This introduction briefly describes each article and then discusses several issues that must be addressed before we can know what analyses will eventually be best to use in SCD research. These issues include modeling trend, modeling error covariances, computing standardized effect size estimates, assessing statistical power, incorporating more accurate models of outcome distributions, exploring whether Bayesian statistics can improve estimation given the small samples common in SCDs, and the need for annotated syntax and graphical user interfaces that make complex statistics accessible to SCD researchers. The article then discusses reasons why SCD researchers are likely to incorporate statistical analyses into their research more often in the future, including changing expectations and contingencies regarding SCD research from outside SCD communities, changes and diversity within SCD communities, corrections of erroneous beliefs about the relationship between SCD research and statistics, and demonstrations of how statistics can help SCD researchers better meet their goals. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  20. Space shuttle propulsion parameter estimation using optimal estimation techniques

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A regression analysis was performed on tabular aerodynamic data to provide a representative aerodynamic model for coefficient estimation; this also reduced the storage requirements for the "normal" model used to check out the estimation algorithms. The results of the regression analyses are presented. The computer routines for the filter portion of the estimation algorithm were developed, and the SRB predictive program was brought up on the computer. For the filter program, approximately 54 routines were developed. The routines were highly subsegmented to facilitate overlaying program segments within the partitioned storage space on the computer.

  1. Distributed management of scientific projects - An analysis of two computer-conferencing experiments at NASA

    NASA Technical Reports Server (NTRS)

    Vallee, J.; Gibbs, B.

    1976-01-01

    Between August 1975 and March 1976, two NASA projects with geographically separated participants used a computer-conferencing system developed by the Institute for the Future for portions of their work. Monthly usage statistics for the system were collected in order to examine the group and individual participation figures for all conferences. The conference transcripts were analysed to derive observations about the use of the medium. In addition to the results of these analyses, the attitudes of users and the major components of the costs of computer conferencing are discussed.

  2. Computational Environment for Modeling and Analysing Network Traffic Behaviour Using the Divide and Recombine Framework

    ERIC Educational Resources Information Center

    Barthur, Ashrith

    2016-01-01

    There are two essential goals of this research. The first goal is to design and construct a computational environment that is used for studying large and complex datasets in the cybersecurity domain. The second goal is to analyse the Spamhaus blacklist query dataset which includes uncovering the properties of blacklisted hosts and understanding…

  3. Vision 20/20: Automation and advanced computing in clinical radiation oncology.

    PubMed

    Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  4. Effects of mouse slant and desktop position on muscular and postural stresses, subject preference and performance in women aged 18-40 years.

    PubMed

    Gaudez, Clarisse; Cail, François

    2016-11-01

    This study compared muscular and postural stresses, performance and subject preference in women aged 18-40 years using a standard mouse, a vertical mouse and a slanted mouse in three different computer workstation positions. Four tasks were analysed: pointing, pointing-clicking, pointing-clicking-dragging and grasping-pointing the mouse after typing. Flexor digitorum superficialis (FDS) and extensor carpi radialis (ECR) activities were greater using the standard mouse compared to the vertical or slanted mouse. In all cases, the wrist position remained in the comfort zone recommended by standard ISO 11228-3. The vertical mouse was less comfortable and more difficult to use than the other two mice. FDS and ECR activities, shoulder abduction and wrist extension were greater when the mouse was placed next to the keyboard. Performance and subject preference were better with the unrestricted mouse positioning on the desktop. Grasping the mouse after typing was the task that caused the greatest stress. Practitioner Summary: In women, the slanted mouse and the unrestricted mouse positioning on the desktop provide a good blend of stresses, performance and preference. Unrestricted mouse positioning requires no keyboard, which is rare in practice. Placing the mouse in front of the keyboard, rather than next to it, reduced the physical load.

  5. PET-CT in oncological patients: analysis of informal care costs in cost-benefit assessment.

    PubMed

    Orlacchio, Antonio; Ciarrapico, Anna Micaela; Schillaci, Orazio; Chegai, Fabrizio; Tosti, Daniela; D'Alba, Fabrizio; Guazzaroni, Manlio; Simonetti, Giovanni

    2014-04-01

    The authors analysed the impact of nonmedical costs (travel, loss of productivity) in an economic analysis of PET-CT (positron-emission tomography-computed tomography) performed with standard contrast-enhanced CT protocols (CECT). From October to November 2009, a total of 100 patients referred to our institute were administered a questionnaire to evaluate the nonmedical costs of PET-CT. In addition, the medical costs (equipment maintenance and depreciation, consumables and staff) related to PET-CT performed with CECT and PET-CT with low-dose nonenhanced CT and separate CECT were also estimated. The medical costs were 919.3 euro for PET-CT with separate CECT, and 801.3 euro for PET-CT with CECT. Therefore, savings of approximately 13% are possible. Moreover, savings in nonmedical costs can be achieved by reducing the number of hospital visits required by patients undergoing diagnostic imaging. Nonmedical costs heavily affect patients' finances as well as having an indirect impact on national health expenditure. Our results show that PET-CT performed with standard dose CECT in a single session provides benefits in terms of both medical and nonmedical costs.
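
    As a quick check on the reported figure, the relative saving in medical costs follows directly from the two estimates quoted above:

    ```latex
    \text{saving} = \frac{919.3 - 801.3}{919.3} \approx 0.128 \approx 13\%
    ```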

  6. Using CT Data to Improve the Quantitative Analysis of 18F-FBB PET Neuroimages

    PubMed Central

    Segovia, Fermín; Sánchez-Vañó, Raquel; Górriz, Juan M.; Ramírez, Javier; Sopena-Novales, Pablo; Testart Dardel, Nathalie; Rodríguez-Fernández, Antonio; Gómez-Río, Manuel

    2018-01-01

    18F-FBB PET is a neuroimaging modality that is being increasingly used to assess brain amyloid deposits in potential patients with Alzheimer's disease (AD). In this work, we analyze the usefulness of these data to distinguish between AD and non-AD patients. A dataset with 18F-FBB PET brain images from 94 subjects diagnosed with AD and other disorders was evaluated by means of multiple analyses based on t-test, ANOVA, Fisher Discriminant Analysis and Support Vector Machine (SVM) classification. In addition, we propose to calculate amyloid standardized uptake values (SUVs) using only gray-matter voxels, which can be estimated using Computed Tomography (CT) images. This approach allows assessing potential brain amyloid deposits along with the gray matter loss and takes advantage of the structural information provided by most of the scanners used for PET examination, which allow simultaneous PET and CT data acquisition. The results obtained in this work suggest that SUVs calculated according to the proposed method allow AD and non-AD subjects to be more accurately differentiated than using SUVs calculated with standard approaches. PMID:29930505
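
    A minimal sketch of the gray-matter-restricted SUV idea described above (array names, units, and the segmentation step are assumptions, not taken from the paper):

    ```python
    import numpy as np

    def suv_gray_matter(pet: np.ndarray, gm_mask: np.ndarray,
                        injected_dose_kbq: float, body_weight_g: float) -> float:
        """pet: 3-D activity-concentration image (kBq/mL); gm_mask: boolean 3-D
        gray-matter mask derived from the co-registered CT. SUV is the mean
        activity in GM voxels normalised by injected dose per unit body weight."""
        mean_activity = pet[gm_mask].mean()   # average only gray-matter voxels
        return mean_activity / (injected_dose_kbq / body_weight_g)
    ```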

  7. Drought Analysis for Kuwait Using Standardized Precipitation Index

    PubMed Central

    2014-01-01

    Implementation of adequate measures to assess and monitor droughts is recognized as a major challenge facing researchers involved in water resources management. The objective of this study is to assess hydrologic drought characteristics from the historical rainfall records of Kuwait, an arid environment, by employing the Standardized Precipitation Index (SPI) criterion. A long record of monthly total precipitation data from January 1967 to December 2009 is used for the assessment. The computation of the SPI series is performed for intermediate and long time scales of 3, 6, 12, and 24 months. The drought severity and duration are also estimated. The bivariate probability distribution for these two drought characteristics is constructed by using the Clayton copula. It is shown that the drought SPI series for the time scales examined have no systematic trend component but a seasonal pattern related to the rainfall data. The results are used to perform univariate and bivariate frequency analyses for the drought events. The study will help in evaluating the risk of future droughts in the region, assessing their consequences for the economy, environment, and society, and adopting measures for mitigating the effects of droughts. PMID:25386598
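
    A minimal sketch of an SPI calculation along these lines (gamma fit by maximum likelihood, no correction for zero-precipitation months, which matters in arid records; these simplifications are assumptions, not the study's exact procedure):

    ```python
    import numpy as np
    from scipy import stats

    def spi(monthly_precip: np.ndarray, scale: int = 3) -> np.ndarray:
        """monthly_precip: 1-D array of monthly totals; scale: aggregation
        window in months (e.g. 3, 6, 12, 24)."""
        agg = np.convolve(monthly_precip, np.ones(scale), mode="valid")  # rolling sums
        a, loc, b = stats.gamma.fit(agg, floc=0)         # two-parameter gamma fit
        cdf = stats.gamma.cdf(agg, a, loc=loc, scale=b)  # non-exceedance probability
        return stats.norm.ppf(cdf)                       # standard normal deviate = SPI
    ```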

  8. Recognizing and exploring the right questions with climate data: An example of better understanding ENSO in climate projections

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.; Buja, L.; Gutowski, W. J., Jr.; Halley-Gotway, J.; Kaatz, L.; Yates, D. N.

    2017-12-01

    Coordinated, multi-model climate change projection archives have already led to a flourishing of new climate impact applications. Collections and online tools for the computation of derived indicators have attracted many non-specialist users and decision-makers and have facilitated their exploration of the potential effects of future weather and climate changes on their systems. Guided by a set of standardized steps and analyses, many can now use model output and determine basic model-based changes. But because each application and decision context is different, the question remains whether such a small collection of standardized tools can faithfully and comprehensively represent the critical physical context of change. We use the example of the El Niño-Southern Oscillation, the largest and most broadly recognized mode of variability in the climate system, to explore the difference in impact contexts between a quasi-blind, protocol-bound use of climate information and a flexible, scientifically guided one. More use-oriented diagnostics of the model data, as well as different strategies for getting data into decision environments, are explored.

  9. Functional Competency Development Model for Academic Personnel Based on International Professional Qualification Standards in Computing Field

    ERIC Educational Resources Information Center

    Tumthong, Suwut; Piriyasurawong, Pullop; Jeerangsuwan, Namon

    2016-01-01

    This research proposes a functional competency development model for academic personnel based on international professional qualification standards in computing field and examines the appropriateness of the model. Specifically, the model consists of three key components which are: 1) functional competency development model, 2) blended training…

  10. Computed tomography guided localization of clinically occult breast carcinoma-the ''N'' skin guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopans, D.B.; Meyer, J.E.

    1982-10-01

    Standard computed tomography (CT) can be used for the three-dimensional localization of clinically occult suspicious breast lesions whose exact position cannot be determined by standard mammographic views. A method is described that facilitates accurate preoperative needle localization using CT guidance, once the position of these lesions is defined.

  11. 77 FR 13294 - Announcing Approval of Federal Information Processing Standard (FIPS) Publication 180-4, Secure...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-06

    ... hash algorithms in many computer network applications. On February 11, 2011, NIST published a notice in... Information Security Management Act (FISMA) of 2002 (Pub. L. 107-347), the Secretary of Commerce is authorized to approve Federal Information Processing Standards (FIPS). NIST activities to develop computer...

  12. A computational imaging target specific detectivity metric

    NASA Astrophysics Data System (ADS)

    Preece, Bradley L.; Nehmetallah, George

    2017-05-01

    Due to the large quantity of low-cost, high-speed computational processing available today, computational imaging (CI) systems are expected to have a major role in next-generation multifunctional cameras. The purpose of this work is to quantify the performance of these CI systems in a standardized manner. Due to the diversity of CI system designs that are available today or proposed for the near future, there are significant challenges in modeling and calculating a standardized detection signal-to-noise ratio (SNR) to measure the performance of these systems. In this paper, we develop a path forward for a standardized detectivity metric for CI systems. The detectivity metric is designed to evaluate the performance of a CI system searching for a specific known target or signal of interest, and is defined as the optimal linear matched-filter SNR, similar to the Hotelling SNR, calculated in computational space with special considerations for standardization. The detectivity metric is therefore designed to be flexible, in order to handle various types of CI systems and specific targets, while keeping the complexity and assumptions of the systems to a minimum.
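
    In generic notation (assumed here, not the paper's), the optimal linear matched-filter SNR for a known target signature s in noise with covariance Σ, the quantity the detectivity metric is built around, is:

    ```latex
    \mathrm{SNR}^2 = s^{\mathsf{T}} \Sigma^{-1} s,
    \qquad
    w \propto \Sigma^{-1} s \quad \text{(the matched filter itself)}
    ```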

  13. Impacts: NIST Building and Fire Research Laboratory (technical and societal)

    NASA Astrophysics Data System (ADS)

    Raufaste, N. J.

    1993-08-01

    The Building and Fire Research Laboratory (BFRL) of the National Institute of Standards and Technology (NIST) is dedicated to the life cycle quality of constructed facilities. The report describes major effects of BFRL's program on building and fire research. Contents of the document include: structural reliability; nondestructive testing of concrete; structural failure investigations; seismic design and construction standards; rehabilitation codes and standards; alternative refrigerants research; HVAC simulation models; thermal insulation; residential equipment energy efficiency; residential plumbing standards; computer image evaluation of building materials; corrosion-protection for reinforcing steel; prediction of the service lives of building materials; quality of construction materials laboratory testing; roofing standards; simulating fires with computers; fire safety evaluation system; fire investigations; soot formation and evolution; cone calorimeter development; smoke detector standards; standard for the flammability of children's sleepwear; smoldering insulation fires; wood heating safety research; in-place testing of concrete; communication protocols for building automation and control systems; computer simulation of the properties of concrete and other porous materials; cigarette-induced furniture fires; carbon monoxide formation in enclosure fires; halon alternative fire extinguishing agents; turbulent mixing research; materials fire research; furniture flammability testing; standard for the cigarette ignition resistance of mattresses; support of navy firefighter trainer program; and using fire to clean up oil spills.

  14. Manned systems utilization analysis (study 2.1). Volume 3: LOVES computer simulations, results, and analyses

    NASA Technical Reports Server (NTRS)

    Stricker, L. T.

    1975-01-01

    The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. It is indicated that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.

  15. A computer program for geochemical analysis of acid-rain and other low-ionic-strength, acidic waters

    USGS Publications Warehouse

    Johnsson, P.A.; Lord, D.G.

    1987-01-01

    ARCHEM, a computer program written in FORTRAN 77, is designed primarily for use in the routine geochemical interpretation of low-ionic-strength, acidic waters. On the basis of chemical analyses of the water, and either laboratory or field determinations of pH, temperature, and dissolved oxygen, the program calculates the equilibrium distribution of major inorganic aqueous species and of inorganic aluminum complexes. The concentration of the organic anion is estimated from the dissolved organic carbon concentration. Ionic ferrous iron is calculated from the dissolved oxygen concentration. Ionic balances and comparisons of computed with measured specific conductances are performed as checks on the analytical accuracy of chemical analyses. ARCHEM may be tailored easily to fit different sampling protocols, and may be run on multiple sample analyses. (Author's abstract)
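
    A minimal sketch of the ionic-balance check described above (the ion list and values are hypothetical; concentrations are assumed to be in meq/L, i.e. already charge-weighted):

    ```python
    def charge_balance_error(cations_meq: dict, anions_meq: dict) -> float:
        """Percent charge-balance error; values near zero indicate an internally
        consistent water analysis."""
        sum_cat = sum(cations_meq.values())
        sum_an = sum(anions_meq.values())
        return 100.0 * (sum_cat - sum_an) / (sum_cat + sum_an)

    # Hypothetical dilute acidic sample
    cations = {"H": 0.04, "Ca": 0.08, "Mg": 0.05, "Na": 0.06, "K": 0.01}
    anions = {"SO4": 0.12, "NO3": 0.05, "Cl": 0.06, "organic": 0.02}
    print(charge_balance_error(cations, anions))  # about -2.0 %
    ```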

  16. Standard chemotherapy with or without bevacizumab in advanced ovarian cancer: quality-of-life outcomes from the International Collaboration on Ovarian Neoplasms (ICON7) phase 3 randomised trial

    PubMed Central

    Stark, Dan; Nankivell, Matthew; Pujade-Lauraine, Eric; Kristensen, Gunnar; Elit, Lorraine; Stockler, Martin; Hilpert, Felix; Cervantes, Andrés; Brown, Julia; Lanceley, Anne; Velikova, Galina; Sabate, Eduardo; Pfisterer, Jacobus; Carey, Mark S; Beale, Philip; Qian, Wendi; Swart, Ann Marie; Oza, Amit; Perren, Tim

    2013-01-01

    Summary Background In the Gynecologic Cancer Intergroup International Collaboration on Ovarian Neoplasms 7 (ICON7) trial, bevacizumab improved progression-free survival in patients with ovarian cancer when used in combination with first-line chemotherapy and as a single-drug continuation treatment for 18 cycles. In a preliminary analysis of a high-risk subset of patients, there was also an improvement in overall survival. This study aims to describe the health-related quality-of-life (QoL) outcomes from ICON7. Methods ICON7 is a randomised, multicentre, open-label phase 3 trial. Between Dec 18, 2006, and Feb 16, 2009, after a surgical procedure aiming to debulk the disease, women with International Federation of Gynecology and Obstetrics (FIGO) high-risk stage I–IV epithelial ovarian cancer were randomly allocated (1:1) by computer program and block randomisation to receive either six cycles of standard chemotherapy (total 18 weeks) with carboplatin (area under the curve 5 or 6) and paclitaxel (175 mg/m2) alone or with bevacizumab (7·5 mg/kg) given intravenously with chemotherapy and continued as a single drug thereafter (total 54 weeks). The primary QoL endpoint was global QoL from the European Organisation for Research and Treatment of Cancer quality-of-life questionnaire–core 30 at week 54, analysed by ANOVA and adjusted for baseline score. Analyses were by intention to treat. The ICON7 trial has completed recruitment and remains in follow-up. This study is registered, number ISRCTN91273375. Findings 764 women were randomly assigned to the standard chemotherapy group and 764 to the bevacizumab group. At baseline, 684 (90%) of women in the standard chemotherapy group and 691 (90%) of those in the bevacizumab group had completed QoL questionnaires. At week 54, 502 (66%) women in the bevacizumab group and 388 (51%) women in the standard chemotherapy group provided QoL data. Overall, the mean global QoL score improved during chemotherapy by 7·2 points (SD 24·4) when analysed for all women with data at baseline and week 18. The mean global QoL score at 54 weeks was higher in the standard chemotherapy group than in the bevacizumab group (76·1 [SD 18·2] vs 69·7 [19·1] points; difference 6·4 points, 95% CI 3·7–9·0, p<0·0001). Interpretation Bevacizumab continuation treatment seems to be associated with a small but clinically significant decrement in QoL compared with standard treatment for women with ovarian cancer. The trade-off between the prolongation of progression-free survival and the quality of that period of time needs to be considered in clinical practice when making treatment decisions. Funding Roche and the National Institute for Health Research through the UK National Cancer Research Network. PMID:23333117

  17. Computer Programming Languages for Health Care

    PubMed Central

    O'Neill, Joseph T.

    1979-01-01

    This paper advocates the use of standard high level programming languages for medical computing. It recommends that U.S. Government agencies having health care missions implement coordinated policies that encourage the use of existing standard languages and the development of new ones, thereby enabling them and the medical computing community at large to share state-of-the-art application programs. Examples are based on a model that characterizes language and language translator influence upon the specification, development, test, evaluation, and transfer of application programs.

  18. 47 CFR 80.771 - Method of computing coverage.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Method of computing coverage. 80.771 Section 80... STATIONS IN THE MARITIME SERVICES Standards for Computing Public Coast Station VHF Coverage § 80.771 Method of computing coverage. Compute the +17 dBu contour as follows: (a) Determine the effective antenna...

  19. The Influence of Reconstruction Kernel on Bone Mineral and Strength Estimates Using Quantitative Computed Tomography and Finite Element Analysis.

    PubMed

    Michalski, Andrew S; Edwards, W Brent; Boyd, Steven K

    2017-10-17

    Quantitative computed tomography has been posed as an alternative imaging modality to investigate osteoporosis. We examined the influence of computed tomography convolution back-projection reconstruction kernels on the analysis of bone quantity and estimated mechanical properties in the proximal femur. Eighteen computed tomography scans of the proximal femur were reconstructed using both a standard smoothing reconstruction kernel and a bone-sharpening reconstruction kernel. Following phantom-based density calibration, we calculated typical bone quantity outcomes of integral volumetric bone mineral density, bone volume, and bone mineral content. Additionally, we performed finite element analysis in a standard sideways fall on the hip loading configuration. Significant differences for all outcome measures, except integral bone volume, were observed between the 2 reconstruction kernels. Volumetric bone mineral density measured using images reconstructed by the standard kernel was significantly lower (6.7%, p < 0.001) when compared with images reconstructed using the bone-sharpening kernel. Furthermore, the whole-bone stiffness and the failure load measured in images reconstructed by the standard kernel were significantly lower (16.5%, p < 0.001, and 18.2%, p < 0.001, respectively) when compared with the image reconstructed by the bone-sharpening kernel. These data suggest that for future quantitative computed tomography studies, a standardized reconstruction kernel will maximize reproducibility, independent of the use of a quantitative calibration phantom. Copyright © 2017 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.

  20. CAD-based stand-alone spacecraft radiation exposure analysis system: An application of the early man-tended Space Station

    NASA Technical Reports Server (NTRS)

    Appleby, M. H.; Golightly, M. J.; Hardy, A. C.

    1993-01-01

    Major improvements have been completed in the approach to analyses and simulation of spacecraft radiation shielding and exposure. A computer-aided design (CAD)-based system has been developed for determining the amount of shielding provided by a spacecraft and simulating transmission of an incident radiation environment to any point within or external to the vehicle. Shielding analysis is performed using a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design programs such as a Mars transfer habitat, a pressurized lunar rover, and the redesigned International Space Station. Results of analysis performed for the Space Station astronaut exposure assessment are provided to demonstrate the applicability and versatility of the system.

  1. Control of Technology Transfer at JPL

    NASA Technical Reports Server (NTRS)

    Oliver, Ronald

    2006-01-01

    Controlled Technology: 1) Design: preliminary or critical design data, schematics, technical flow charts, SNV code/diagnostics, logic flow diagrams, wirelist, ICDs, detailed specifications or requirements. 2) Development: constraints, computations, configurations, technical analyses, acceptance criteria, anomaly resolution, detailed test plans, detailed technical proposals. 3) Production: process or how-to: assemble, operate, repair, maintain, modify. 4) Manufacturing: technical instructions, specific parts, specific materials, specific qualities, specific processes, specific flow. 5) Operations: how-to operate, contingency or standard operating plans, Ops handbooks. 6) Repair: repair instructions, troubleshooting schemes, detailed schematics. 7) Test: specific procedures, data, analysis, detailed test plan and retest plans, detailed anomaly resolutions, detailed failure causes and corrective actions, troubleshooting, trended test data, flight readiness data. 8) Maintenance: maintenance schedules and plans, methods for regular upkeep, overhaul instructions. 9) Modification: modification instructions, upgrade kit parts, including software.

  2. Applicability of IHE/Continua components for PHR systems: learning from experiences.

    PubMed

    Urbauer, Philipp; Sauermann, Stefan; Frohner, Matthias; Forjan, Mathias; Pohn, Birgit; Mense, Alexander

    2015-04-01

    Capturing personal health data using smartphones, PCs or other devices, and the reuse of the data in personal health records (PHR) is becoming more and more attractive for modern health-conscious populations. This paper analyses interoperability specifications targeting standards-based communication of computer systems and personal health devices (e.g. blood pressure monitor) in healthcare from initiatives like Integrating the Healthcare Enterprise (IHE) and Continua Health Alliance driven by industry and healthcare professionals. Furthermore it identifies certain contradictions and gaps in the specifications and suggests possible solutions. Despite these shortcomings, the specifications allow fully functional implementations of PHR systems. Henceforth, both big business and small and medium-sized enterprises (SMEs) can actively contribute to the widespread use of large-scale interoperable PHR systems. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Technical Report Series on Global Modeling and Data Assimilation. Volume 12; Comparison of Satellite Global Rainfall Algorithms

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); Chang, Alfred T. C.; Chiu, Long S.

    1997-01-01

    Seventeen months of rainfall data (August 1987-December 1988) from nine satellite rainfall algorithms (Adler, Chang, Kummerow, Prabhakara, Huffman, Spencer, Susskind, and Wu) were analyzed to examine the uncertainty of satellite-derived rainfall estimates. The variability among algorithms, measured as the standard deviation computed from the ensemble of algorithms, shows regions of high algorithm variability tend to coincide with regions of high rain rates. Histograms of pattern correlation (PC) between algorithms suggest a bimodal distribution, with separation at a PC-value of about 0.85. Applying this threshold as a criteria for similarity, our analyses show that algorithms using the same sensor or satellite input tend to be similar, suggesting the dominance of sampling errors in these satellite estimates.
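
    A minimal sketch of a pattern-correlation calculation of the kind used above (whether the study used the centred or uncentred form is not stated, so the centred version here is an assumption):

    ```python
    import numpy as np

    def pattern_correlation(field_a: np.ndarray, field_b: np.ndarray) -> float:
        """Centred (anomaly) correlation between two gridded rain-rate fields."""
        a = field_a.ravel() - field_a.mean()
        b = field_b.ravel() - field_b.mean()
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    ```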

  4. Newton-based optimization for Kullback-Leibler nonnegative tensor factorizations

    DOE PAGES

    Plantenga, Todd; Kolda, Tamara G.; Hansen, Samantha

    2015-04-30

    Tensor factorizations with nonnegativity constraints have found application in analysing data from cyber traffic, social networks, and other areas. We consider application data best described as being generated by a Poisson process (e.g. count data), which leads to sparse tensors that can be modelled by sparse factor matrices. In this paper, we investigate efficient techniques for computing an appropriate canonical polyadic tensor factorization based on the Kullback–Leibler divergence function. We propose novel subproblem solvers within the standard alternating block variable approach. Our new methods exploit structure and reformulate the optimization problem as small independent subproblems. We employ bound-constrained Newton and quasi-Newton methods. Finally, we compare our algorithms against other codes, demonstrating superior speed for high accuracy results and the ability to quickly find sparse solutions.
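
    In generic notation (assumed, not necessarily the paper's), the Poisson-based objective behind such factorizations is the generalized Kullback-Leibler divergence between the data tensor X and the nonnegative CP model M; dropping terms that depend only on X gives:

    ```latex
    f(M) \;=\; \sum_{i} \bigl( m_i - x_i \log m_i \bigr),
    \qquad m_i > 0,\ x_i \ge 0,
    ```

    which is minimized subject to nonnegativity of the factor matrices and is equivalent to maximizing the Poisson log-likelihood.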

  5. Cloud Computing Fundamentals

    NASA Astrophysics Data System (ADS)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  6. Human factors in the Naval Air Systems Command: Computer based training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seamster, T.L.; Snyder, C.E.; Terranova, M.

    1988-01-01

    Military standards applied to private sector contracts have a substantial effect on the quality of Computer Based Training (CBT) systems procured for the Naval Air Systems Command. This study evaluated standards regulating the following areas in CBT development and procurement: interactive training systems, cognitive task analysis, and CBT hardware. The objective was to develop some high-level recommendations for evolving standards that will govern the next generation of CBT systems. One of the key recommendations is that there be an integration of the instructional systems development, the human factors engineering, and the software development standards. Recommendations were also made for task analysis and CBT hardware standards. (9 refs., 3 figs.)

  7. Discovery and analysis of time delay sources in the USGS personal computer data collection platform (PCDCP) system

    USGS Publications Warehouse

    White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.

    2014-01-01

    Intermagnet is an international oversight group which exists to establish a global network of geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard for that emerging geomagnetic product. The standard specifies that all data collected must have a time stamp accuracy of ±10 milliseconds of the top-of-the-second Coordinated Universal Time. Therefore, the U.S. Geological Survey Geomagnetism Program has designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. Tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as to measure the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to further validate such measurements. Current results of the measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for the horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.

  8. A new tool called DISSECT for analysing large genomic data sets using a Big Data approach

    PubMed Central

    Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert

    2015-01-01

    Large-scale genetic and genomic data are increasingly available and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex traits analysis, we present DISSECT. DISSECT is a new and freely available software that is able to exploit the distributed-memory parallel computational architectures of compute clusters, to perform a wide range of genomic and epidemiologic analyses, which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010

  9. Modeling Human-Computer Decision Making with Covariance Structure Analysis.

    ERIC Educational Resources Information Center

    Coovert, Michael D.; And Others

    Arguing that sufficient theory exists about the interplay between human information processing, computer systems, and the demands of various tasks to construct useful theories of human-computer interaction, this study presents a structural model of human-computer interaction and reports the results of various statistical analyses of this model.…

  10. Computer Instructional Aids for Undergraduate Control Education.

    ERIC Educational Resources Information Center

    Volz, Richard A.; And Others

    Engineering is coming to rely more and more heavily upon the computer for computations, analyses, and graphic displays which aid the design process. A general purpose simulation system, the Time-shared Automatic Control Laboratory (TACL), and a set of computer-aided design programs, Control Oriented Interactive Graphic Analysis and Design…

  11. A Multilingual Approach to Analysing Standardized Test Results: Immigrant Primary School Children and the Role of Languages Spoken in a Bi-/Multilingual Community

    ERIC Educational Resources Information Center

    De Angelis, Gessica

    2014-01-01

    The present study adopts a multilingual approach to analysing the standardized test results of primary school immigrant children living in the bi-/multilingual context of South Tyrol, Italy. The standardized test results are from the Invalsi test administered across Italy in 2009/2010. In South Tyrol, several languages are spoken on a daily basis…

  12. Quantifying relative importance: Computing standardized effects in models with binary outcomes

    USGS Publications Warehouse

    Grace, James B.; Johnson, Darren; Lefcheck, Jonathan S.; Byrnes, Jarrett E.K.

    2018-01-01

    Results from simulation studies show that both the LT and OE methods of standardization support a similarly-broad range of coefficient comparisons. The LT method estimates effects that reflect underlying latent-linear propensities, while the OE method computes a linear approximation for the effects of predictors on binary responses. The contrast between assumptions for the two methods is reflected in persistently weaker standardized effects associated with OE standardization. Reliance on standard deviations for standardization (the traditional approach) is critically examined and shown to introduce substantial biases when predictors are non-Gaussian. The use of relevant ranges in place of standard deviations has the capacity to place LT and OE standardized coefficients on a more comparable scale. As ecologists address increasingly complex hypotheses, especially those that involve comparing the influences of different controlling factors (e.g., top-down versus bottom-up or biotic versus abiotic controls), comparable coefficients become a necessary component for evaluations.
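
    A minimal sketch of latent-theoretical (LT) standardization for a logistic regression (variable names are assumptions; the relevant-range variant discussed above would replace the SD terms with chosen ranges):

    ```python
    import numpy as np
    import statsmodels.api as sm

    def lt_standardized_coefs(X: np.ndarray, y: np.ndarray) -> np.ndarray:
        """Scale raw logit coefficients by predictor SDs and by the SD of the
        latent response, sqrt(var(linear predictor) + pi^2/3) for the logit link."""
        Xc = sm.add_constant(X)
        fit = sm.Logit(y, Xc).fit(disp=0)
        eta = Xc @ fit.params                           # linear predictor
        sd_latent = np.sqrt(eta.var() + np.pi ** 2 / 3) # latent-response SD
        return fit.params[1:] * X.std(axis=0) / sd_latent
    ```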

  13. Automated standardization technique for an inductively-coupled plasma emission spectrometer

    USGS Publications Warehouse

    Garbarino, John R.; Taylor, Howard E.

    1982-01-01

    The manifold assembly subsystem described permits real-time computer-controlled standardization and quality control of a commercial inductively-coupled plasma atomic emission spectrometer. The manifold assembly consists of a branch-structured glass manifold, a series of microcomputer-controlled solenoid valves, and a reservoir for each standard. Automated standardization involves selective actuation of each solenoid valve that permits a specific mixed standard solution to be pumped to the nebulizer of the spectrometer. Quality control is based on the evaluation of results obtained for a mixed standard containing 17 analytes, which is measured periodically with unknown samples. An inaccurate standard evaluation triggers restandardization of the instrument according to a predetermined protocol. Interaction of the computer-controlled manifold assembly hardware with the spectrometer system is outlined. Evaluation of the automated standardization system with respect to reliability, simplicity, flexibility, and efficiency is compared to the manual procedure. © 1982.

  14. User Guide to RockJock - A Program for Determining Quantitative Mineralogy from X-Ray Diffraction Data

    USGS Publications Warehouse

    Eberl, D.D.

    2003-01-01

    RockJock is a computer program that determines quantitative mineralogy in powdered samples by comparing the integrated X-ray diffraction (XRD) intensities of individual minerals in complex mixtures to the intensities of an internal standard. Analysis without an internal standard (standardless analysis) also is an option. This manual discusses how to prepare and X-ray samples and mineral standards for these types of analyses and describes the operation of the program. Carefully weighed samples containing an internal standard (zincite) are ground in a McCrone mill. Randomly oriented preparations then are X-rayed, and the X-ray data are entered into the RockJock program. Minerals likely to be present in the sample are chosen from a list of standards, and the calculation is begun. The program then automatically fits the sum of stored XRD patterns of pure standard minerals (the calculated pattern) to the measured pattern by varying the fraction of each mineral standard pattern, using the Solver function in Microsoft Excel to minimize a degree of fit parameter between the calculated and measured pattern. The calculation analyzes the pattern (usually 20 to 65 degrees two-theta) to find integrated intensities for the minerals. Integrated intensities for each mineral then are determined from the proportion of each mineral standard pattern required to give the best fit. These integrated intensities then are compared to the integrated intensity of the internal standard, and the weight percentages of the minerals are calculated. The results are presented as a list of minerals with their corresponding weight percent. To some extent, the quality of the analysis can be checked because each mineral is analyzed independently, and, therefore, the sum of the analysis should approach 100 percent. Also, the method has been shown to give good results with artificial mixtures. The program is easy to use, but does require an understanding of mineralogy, of X-ray diffraction practice, and an elementary knowledge of the Excel program.
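
    A minimal sketch of the core fitting idea (here using nonnegative least squares rather than the Excel Solver routine RockJock actually employs; array names are assumptions):

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def fit_pattern(standards: np.ndarray, measured: np.ndarray):
        """standards: (n_two_theta, n_minerals) pure-phase XRD patterns;
        measured: (n_two_theta,) pattern of the unknown mixture.
        Returns the nonnegative mixing fractions and a degree-of-fit value."""
        fractions, residual = nnls(standards, measured)
        degree_of_fit = residual / np.linalg.norm(measured)  # smaller is better
        return fractions, degree_of_fit
    ```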

  15. Defining objective clusters for rabies virus sequences using affinity propagation clustering

    PubMed Central

    Fischer, Susanne; Freuling, Conrad M.; Pfaff, Florian; Bodenhofer, Ulrich; Höper, Dirk; Fischer, Mareike; Marston, Denise A.; Fooks, Anthony R.; Mettenleiter, Thomas C.; Conraths, Franz J.; Homeier-Bachmann, Timo

    2018-01-01

    Rabies is caused by lyssaviruses and is one of the oldest known zoonoses. In recent years, more than 21,000 nucleotide sequences of rabies viruses (RABV), from the prototype species rabies lyssavirus, have been deposited in public databases. Subsequent phylogenetic analyses in combination with metadata suggest geographic distributions of RABV. However, these analyses experience some technical difficulties in defining verifiable criteria for cluster allocations in phylogenetic trees, inviting a more rational approach. We therefore applied a relatively new mathematical clustering algorithm named ‘affinity propagation clustering’ (AP) to propose a standardized sub-species classification utilizing full-genome RABV sequences. AP has the advantages that it is computationally fast and works for any meaningful measure of similarity between data samples, and it has previously been applied successfully in bioinformatics for the analysis of microarray and gene expression data; cluster analysis of sequences, however, is still in its infancy. Existing (516) and original (46) full-genome RABV sequences were used to demonstrate the application of AP for RABV clustering. On a global scale, AP proposed four clusters, i.e. New World, Arctic/Arctic-like, Cosmopolitan, and Asian, as previously assigned by phylogenetic studies. By combining AP with established phylogenetic analyses, it is possible to resolve phylogenetic relationships between verifiably determined clusters and sequences. This workflow will be useful in confirming cluster distributions in a uniform, transparent manner, not only for RABV but also for other comparative sequence analyses. PMID:29357361
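
    A minimal sketch of AP clustering on a precomputed sequence-similarity matrix (inputs and parameters are assumptions, not the study's settings):

    ```python
    import numpy as np
    from sklearn.cluster import AffinityPropagation

    def ap_clusters(similarity: np.ndarray, random_state: int = 0):
        """similarity: (n, n) symmetric matrix, e.g. pairwise identities of
        aligned full-genome sequences; larger values mean more similar."""
        ap = AffinityPropagation(affinity="precomputed", random_state=random_state)
        labels = ap.fit_predict(similarity)
        return labels, ap.cluster_centers_indices_  # cluster labels and exemplars
    ```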

  16. The moderating role of emotional competence in suicidal ideation among Chinese university students.

    PubMed

    Kwok, Sylvia Y C L

    2014-04-01

    To explore the relationship among perceived family functioning, emotional competence and suicidal ideation and to examine the moderating role of emotional competence in suicidal ideation. Previous studies have highlighted that poor family relationships and emotional symptoms are significant predictors of suicidal ideation. However, the roles of perceived family functioning and emotional competence in predicting suicidal ideation have not been given adequate attention. A cross-sectional survey using convenience sampling. A questionnaire was administered to 302 university students from February-April in 2011 in Hong Kong. The means, standard deviations and Cronbach's alphas of the variables were computed. Pearson correlation analyses and hierarchical regression analyses were performed. Hierarchical regression analyses showed that perceived high family functioning and emotional competence were significant negative predictors of suicidal ideation. Further analyses showed that parental concern, parental control and creative use of emotions were significant predictors of suicidal ideation. Emotional competence, specifically creative use of emotions, was found to moderate the relationship between perceived family functioning and suicidal ideation. The findings support the family ecological framework and provide evidence for emotional competence as a resilience factor that buffers low family functioning on suicidal ideation. Suggested measures to decrease suicidal ideation include enhancing parental concern, lessening parental control, developing students' awareness, regulation and management of their own emotions, fostering empathy towards others' emotional expression, enhancing social skills in sharing and influencing others' emotions and increasing the positive use of emotions for the evaluation and generation of new ideas. © 2013 John Wiley & Sons Ltd.
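
    A minimal sketch of the moderation test described above (hypothetical column names; a significant interaction term indicates that emotional competence moderates the family functioning-ideation relationship):

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    def moderation_model(df: pd.DataFrame):
        """df columns: ideation, family_functioning, emotional_competence."""
        return smf.ols(
            "ideation ~ family_functioning * emotional_competence", data=df
        ).fit()
    ```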

  17. Standard payload computer for the international space station

    NASA Astrophysics Data System (ADS)

    Knott, Karl; Taylor, Chris; Koenig, Horst; Schlosstein, Uwe

    1999-01-01

    This paper describes the development and application of a Standard PayLoad Computer (SPLC) which is being applied by the majority of ESA payloads accommodated on the International Space Station (ISS). The strategy of adopting a standard computer leads to a radical rethink in the payload data handling procurement process. Traditionally, this has been based on a proprietary development with repeating costs for qualification, spares, expertise and maintenance for each new payload. Implementations have also tended to be unique, with very little opportunity for reuse or utilisation of previous developments. While this may to some extent have been justified for short duration one-off missions, the availability of a standard, long term space infrastructure calls for a quite different approach. To support a large number of concurrent payloads, the ISS implementation relies heavily on standardisation, and this is particularly true in the area of payloads. Physical accommodation, data interfaces, protocols, component quality, operational requirements and maintenance including spares provisioning must all conform to a common set of standards. The data handling system and associated computer used by each payload must also comply with these common requirements, and thus it makes little sense to instigate multiple developments for the same task. The opportunity exists to provide a single computer suitable for all payloads, but with only a one-off development and qualification cost. If this is combined with the benefits of multiple procurement, centralised spares and maintenance, there is potential for great savings to be made by all those concerned in the payload development process. In response to the above drivers, the SPLC is based on the following concepts:
    • A one-off development and qualification process
    • A modular computer, configurable according to the payload developer's needs from a list of space-qualified items
    • An 'open system' which may be added to by payload developers
    • Core software providing a suite of common communications services, including a verified protocol implementation required to communicate with the ISS
    • A standardized ground support equipment and accompanying software development environment
    • The use of commercial hardware and software standards and products.

  18. [Hardware for graphics systems].

    PubMed

    Goetz, C

    1991-02-01

    In all personal computer applications, be it for private or professional use, the decision of which "brand" of computer to buy is of central importance. In the USA Apple computers are mainly used in universities, while in Europe computers of the so-called "industry standard" by IBM (or clones thereof) have been increasingly used for many years. Independently of any brand name considerations, the computer components purchased must meet the current (and projected) needs of the user. Graphic capabilities and standards, processor speed, the use of co-processors, as well as input and output devices such as "mouse", printers and scanners are discussed. This overview is meant to serve as a decision aid. Potential users are given a short but detailed summary of current technical features.

  19. Computers and Data Processing. Subject Bibliography.

    ERIC Educational Resources Information Center

    United States Government Printing Office, Washington, DC.

    This annotated bibliography of U.S. Government publications contains over 90 entries on topics including telecommunications standards, U.S. competitiveness in high technology industries, computer-related crimes, capacity management of information technology systems, the application of computer technology in the Soviet Union, computers and…

  20. A PCA-Based method for determining craniofacial relationship and sexual dimorphism of facial shapes.

    PubMed

    Shui, Wuyang; Zhou, Mingquan; Maddock, Steve; He, Taiping; Wang, Xingce; Deng, Qingqiong

    2017-11-01

    Previous studies have used principal component analysis (PCA) to investigate the craniofacial relationship, as well as sex determination using facial factors. However, few studies have investigated the extent to which the choice of principal components (PCs) affects the analysis of craniofacial relationship and sexual dimorphism. In this paper, we propose a PCA-based method for visual and quantitative analysis, using 140 samples of 3D heads (70 male and 70 female), produced from computed tomography (CT) images. There are two parts to the method. First, skull and facial landmarks are manually marked to guide the model's registration so that dense corresponding vertices occupy the same relative position in every sample. Statistical shape spaces of the skull and face in dense corresponding vertices are constructed using PCA. Variations in these vertices, captured in every principal component (PC), are visualized to observe shape variability. The correlations of skull- and face-based PC scores are analysed, and linear regression is used to fit the craniofacial relationship. We compute the PC coefficients of a face based on this craniofacial relationship and the PC scores of a skull, and apply the coefficients to estimate a 3D face for the skull. To evaluate the accuracy of the computed craniofacial relationship, the mean and standard deviation of every vertex between the two models are computed, where these models are reconstructed using real PC scores and coefficients. Second, each PC in facial space is analysed for sex determination, for which support vector machines (SVMs) are used. We examined the correlation between PCs and sex, and explored the extent to which the choice of PCs affects the expression of sexual dimorphism. Our results suggest that skull- and face-based PCs can be used to describe the craniofacial relationship and that the accuracy of the method can be improved by using an increased number of face-based PCs. The results show that the accuracy of the sex classification is related to the choice of PCs. The highest sex classification rate is 91.43% using our method. Copyright © 2017 Elsevier Ltd. All rights reserved.
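
    A minimal sketch of the PCA-plus-regression pipeline described above (array shapes and the number of retained PCs are assumptions):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    def fit_craniofacial_map(skull_verts, face_verts, n_pcs=20):
        """skull_verts, face_verts: (n_samples, n_vertices * 3) flattened meshes
        in dense correspondence. Fits PCA spaces and a linear skull-to-face map."""
        pca_skull = PCA(n_components=n_pcs).fit(skull_verts)
        pca_face = PCA(n_components=n_pcs).fit(face_verts)
        reg = LinearRegression().fit(pca_skull.transform(skull_verts),
                                     pca_face.transform(face_verts))
        return pca_skull, pca_face, reg

    def estimate_face(skull, pca_skull, pca_face, reg):
        """Estimate face vertices for a new skull via the fitted relationship."""
        scores = reg.predict(pca_skull.transform(skull.reshape(1, -1)))
        return pca_face.inverse_transform(scores)
    ```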

  1. Elastic-plastic finite-element analyses of thermally cycled double-edge wedge specimens

    NASA Technical Reports Server (NTRS)

    Kaufman, A.; Hunt, L. E.

    1982-01-01

    Elastic-plastic stress-strain analyses were performed for double-edge wedge specimens subjected to thermal cycling in fluidized beds at 316 and 1088 C. Four cases involving different nickel-base alloys (IN 100, Mar M-200, NASA TAZ-8A, and Rene 80) were analyzed by using the MARC nonlinear, finite element computer program. Elastic solutions from MARC showed good agreement with previously reported solutions obtained by using the NASTRAN and ISO3DQ computer programs. Equivalent total strain ranges at the critical locations calculated by elastic analyses agreed within 3 percent with those calculated from elastic-plastic analyses. The elastic analyses always resulted in compressive mean stresses at the critical locations. However, elastic-plastic analyses showed tensile mean stresses for two of the four alloys and an increase in the compressive mean stress for the highest plastic strain case.

  2. Comparative performance analysis for computer aided lung nodule detection and segmentation on ultra-low-dose vs. standard-dose CT

    NASA Astrophysics Data System (ADS)

    Wiemker, Rafael; Rogalla, Patrik; Opfer, Roland; Ekin, Ahmet; Romano, Valentina; Bülow, Thomas

    2006-03-01

    The performance of computer aided lung nodule detection (CAD) and computer aided nodule volumetry is compared between standard-dose (70-100 mAs) and ultra-low-dose CT images (5-10 mAs). A direct quantitative performance comparison was possible, since for each patient both an ultra-low-dose and a standard-dose CT scan were acquired within the same examination session. The data sets were recorded with a multi-slice CT scanner at the Charite university hospital Berlin with 1 mm slice thickness. Our computer aided nodule detection and segmentation algorithms were deployed on both ultra-low-dose and standard-dose CT data without any dose-specific fine-tuning or preprocessing. As a reference standard 292 nodules from 20 patients were visually identified, each nodule both in ultra-low-dose and standard-dose data sets. The CAD performance was analyzed by virtue of multiple FROC curves for different lower thresholds of the nodule diameter. For nodules with a volume-equivalent diameter equal or larger than 4 mm (149 nodules pairs), we observed a detection rate of 88% at a median false positive rate of 2 per patient in standard-dose images, and 86% detection rate in ultra-low-dose images, also at 2 FPs per patient. Including even smaller nodules equal or larger than 2 mm (272 nodules pairs), we observed a detection rate of 86% in standard-dose images, and 84% detection rate in ultra-low-dose images, both at a rate of 5 FPs per patient. Moreover, we observed a correlation of 94% between the volume-equivalent nodule diameter as automatically measured on ultra-low-dose versus on standard-dose images, indicating that ultra-low-dose CT is also feasible for growth-rate assessment in follow-up examinations. The comparable performance of lung nodule CAD in ultra-low-dose and standard-dose images is of particular interest with respect to lung cancer screening of asymptomatic patients.

  3. Volunteer Clouds and Citizen Cyberscience for LHC Physics

    NASA Astrophysics Data System (ADS)

    Aguado Sanchez, Carlos; Blomer, Jakob; Buncic, Predrag; Chen, Gang; Ellis, John; Garcia Quintas, David; Harutyunyan, Artem; Grey, Francois; Lombrana Gonzalez, Daniel; Marquina, Miguel; Mato, Pere; Rantala, Jarno; Schulz, Holger; Segal, Ben; Sharma, Archana; Skands, Peter; Weir, David; Wu, Jie; Wu, Wenjing; Yadav, Rohit

    2011-12-01

    Computing for the LHC, and for HEP more generally, is traditionally viewed as requiring specialized infrastructure and software environments, and therefore not compatible with the recent trend in "volunteer computing", where volunteers supply free processing time on ordinary PCs and laptops via standard Internet connections. In this paper, we demonstrate that with the use of virtual machine technology, at least some standard LHC computing tasks can be tackled with volunteer computing resources. Specifically, by presenting volunteer computing resources to HEP scientists as a "volunteer cloud", essentially identical to a Grid or dedicated cluster from a job submission perspective, LHC simulations can be processed effectively. This article outlines both the technical steps required for such a solution and the implications for LHC computing as well as for LHC public outreach and for participation by scientists from developing regions in LHC research.

  4. Using Rule-Based Computer Programming to Unify Communication Rules Research.

    ERIC Educational Resources Information Center

    Sanford, David L.; Roach, J. W.

    This paper proposes the use of a rule-based computer programming language as a standard for the expression of rules, arguing that the adoption of a standard would enable researchers to communicate about rules in a consistent and significant way. Focusing on the formal equivalence of artificial intelligence (AI) programming to different types of…

  5. Regulation of flow computers for the measurement of biofuels

    NASA Astrophysics Data System (ADS)

    Almeida, R. O.; Aguiar Júnior, E. A.; Costa-Felix, R. P. B.

    2018-03-01

    This article aims to discuss the need to develop a standard or regulation applicable to flow computers in the measurement of biofuels. International standards and recommendations that may be adequate to fill this gap are presented, and at the end of the article a way to obtain a single document on the subject is proposed.

  6. Detailed T1-Weighted Profiles from the Human Cortex Measured in Vivo at 3 Tesla MRI.

    PubMed

    Ferguson, Bart; Petridou, Natalia; Fracasso, Alessio; van den Heuvel, Martijn P; Brouwer, Rachel M; Hulshoff Pol, Hilleke E; Kahn, René S; Mandl, René C W

    2018-04-01

    Studies into cortical thickness in psychiatric diseases based on T1-weighted MRI frequently report on aberrations in the cerebral cortex. Due to limitations in image resolution for studies conducted at conventional MRI field strengths (e.g. 3 Tesla (T)), this information cannot be used to establish which of the cortical layers may be implicated. Here we propose a new analysis method that computes one high-resolution average cortical profile per brain region, extracting myeloarchitectural information from T1-weighted MRI scans that are routinely acquired at a conventional field strength. To assess this new method, we acquired standard T1-weighted scans at 3 T and compared them with state-of-the-art ultra-high resolution T1-weighted scans optimised for intracortical myelin contrast acquired at 7 T. Average cortical profiles were computed for seven different brain regions. Besides a qualitative comparison between the 3 T scans, the 7 T scans, and results from the literature, we tested whether the results from dynamic time warping-based clustering are similar for the cortical profiles computed from 7 T and 3 T data. In addition, we quantitatively compared cortical profiles computed for V1, V2 and V7 for both 7 T and 3 T data using a priori information on their relative myelin concentration. Although qualitative comparisons show that, at an individual level, average profiles computed for 7 T have more pronounced features than 3 T profiles, the results from the quantitative analyses suggest that average cortical profiles computed from T1-weighted scans acquired at 3 T indeed contain myeloarchitectural information similar to profiles computed from the scans acquired at 7 T. The proposed method therefore provides a step forward in studying cortical myeloarchitecture in vivo at conventional magnetic field strength, both in health and disease.
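
    A minimal sketch of the dynamic time warping (DTW) distance underlying the profile clustering mentioned above; the profiles and their length are hypothetical, and this is the textbook DTW recursion rather than the study's own implementation.

    ```python
    import numpy as np

    def dtw_distance(a, b):
        """Classic dynamic-programming DTW distance between two 1-D profiles."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return cost[n, m]

    # Hypothetical average cortical profiles (arbitrary intensity units, 50 depth samples).
    rng = np.random.default_rng(0)
    profile_3t = np.linspace(0.6, 1.0, 50) + 0.02 * rng.standard_normal(50)
    profile_7t = np.linspace(0.6, 1.0, 50) + 0.02 * rng.standard_normal(50)
    print(f"DTW distance: {dtw_distance(profile_3t, profile_7t):.3f}")
    ```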

  7. Use of Noncontrast Computed Tomography and Computed Tomographic Perfusion in Predicting Intracerebral Hemorrhage After Intravenous Alteplase Therapy.

    PubMed

    Batchelor, Connor; Pordeli, Pooneh; d'Esterre, Christopher D; Najm, Mohamed; Al-Ajlan, Fahad S; Boesen, Mari E; McDougall, Connor; Hur, Lisa; Fainardi, Enrico; Shankar, Jai Jai Shiva; Rubiera, Marta; Khaw, Alexander V; Hill, Michael D; Demchuk, Andrew M; Sajobi, Tolulope T; Goyal, Mayank; Lee, Ting-Yim; Aviv, Richard I; Menon, Bijoy K

    2017-06-01

    Intracerebral hemorrhage is a feared complication of intravenous alteplase therapy in patients with acute ischemic stroke. We explore the use of multimodal computed tomography in predicting this complication. All patients were administered intravenous alteplase with/without intra-arterial therapy. An age- and sex-matched case-control design with classic and conditional logistic regression techniques was chosen for analyses. Outcome was parenchymal hemorrhage on 24- to 48-hour imaging. Exposure variables were imaging (noncontrast computed tomography hypoattenuation degree, relative volume of very low cerebral blood volume, relative volume of cerebral blood flow ≤7 mL/min per 100 g, relative volume of Tmax ≥16 s with all volumes standardized to z-axis coverage, mean permeability surface area product values within the Tmax ≥8 s volume, and mean permeability surface area product values within the ipsilesional hemisphere) and clinical variables (NIHSS [National Institutes of Health Stroke Scale], onset-to-imaging time, baseline systolic blood pressure, blood glucose, serum creatinine, treatment type, and reperfusion status). One hundred eighteen subjects (22 patients with parenchymal hemorrhage versus 96 without, median baseline NIHSS score of 15) were included in the final analysis. In multivariable regression, noncontrast computed tomography hypoattenuation grade (P<0.006) and computed tomography perfusion white matter relative volume of very low cerebral blood volume (P=0.04) were the only variables significantly associated with parenchymal hemorrhage on follow-up imaging (area under the curve, 0.73; 95% confidence interval, 0.63-0.83). Interrater reliability for noncontrast computed tomography hypoattenuation grade was moderate (κ=0.6). Baseline hypoattenuation on noncontrast computed tomography and very low cerebral blood volume on computed tomography perfusion are associated with development of parenchymal hemorrhage in patients with acute ischemic stroke receiving intravenous alteplase. © 2017 American Heart Association, Inc.
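
    The sketch below illustrates, on purely synthetic data, the kind of logistic-regression-plus-AUC analysis described above; the predictors, coefficients and sample are invented, and the scikit-learn model is a plain (unmatched) logistic regression rather than the conditional matched analysis used in the study.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 118  # same sample size as the study, but the data below are synthetic

    # Invented predictors: NCCT hypoattenuation grade (0-2) and relative volume of
    # very low cerebral blood volume; invented binary outcome: parenchymal hemorrhage.
    X = np.column_stack([rng.integers(0, 3, n), rng.random(n)])
    y = (0.8 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(0.0, 1.0, n) > 2.0).astype(int)

    model = LogisticRegression().fit(X, y)
    auc = roc_auc_score(y, model.predict_proba(X)[:, 1])  # apparent (in-sample) AUC
    print(f"apparent AUC: {auc:.2f}")
    ```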

  9. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali; Kagadis, George C.

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  10. 76 FR 13984 - Cloud Computing Forum & Workshop III

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Cloud Computing Forum... public workshop. SUMMARY: NIST announces the Cloud Computing Forum & Workshop III to be held on April 7... provide information on the NIST strategic and tactical Cloud Computing program, including progress on the...

  11. ABINIT: Plane-Wave-Based Density-Functional Theory on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Torrent, Marc

    2014-03-01

    For several years, a continuous effort has been made to adapt electronic structure codes based on Density-Functional Theory to future computing architectures. Among these codes, ABINIT is based on a plane-wave description of the wave functions, which allows systems of any kind to be treated. Porting such a code to petascale architectures poses difficulties related to the many-body nature of the DFT equations. To improve the performance of ABINIT - especially for standard LDA/GGA ground-state and response-function calculations - several strategies have been followed. A full multi-level MPI parallelisation scheme has been implemented, exploiting all possible levels and distributing both computation and memory; it allows the number of distributed processes to be increased and could not have been achieved without a strong restructuring of the code. The core algorithm used to solve the eigenproblem ("Locally Optimal Blocked Conjugate Gradient"), a Blocked-Davidson-like algorithm, is based on a distribution of processes combining plane waves and bands. In addition to the distributed-memory parallelisation, a full hybrid scheme has been implemented, using standard shared-memory directives (OpenMP/OpenACC) or porting some time-consuming code sections to Graphics Processing Units (GPU). As no simple performance model exists, the complexity of use has increased: the code efficiency strongly depends on the distribution of processes among the numerous levels. ABINIT is able to predict the performance of several process distributions and automatically choose the most favourable one. In addition, a substantial effort has been carried out to analyse the performance of the code on petascale architectures, showing which sections of the code have to be improved; they are all related to matrix algebra (diagonalisation, orthogonalisation). The different strategies employed to improve the code's scalability will be described. They are based on the exploration of new diagonalisation algorithms, as well as the use of external optimised libraries. Part of this work has been supported by the European PRACE project (Partnership for Advanced Computing in Europe) in the framework of its work package 8.

  12. Tracking voice change after thyroidectomy: application of spectral/cepstral analyses.

    PubMed

    Awan, Shaheen N; Helou, Leah B; Stojadinovic, Alexander; Solomon, Nancy Pearl

    2011-04-01

    This study evaluates the utility of perioperative spectral and cepstral acoustic analyses to monitor voice change after thyroidectomy. Perceptual and acoustic analyses were conducted on speech samples (sustained vowel /α/ and CAPE-V sentences) provided by 70 participants (36 women and 34 men) at four study time points: prior to thyroid surgery and 2 weeks, 3 months and 6 months after thyroidectomy. Repeated measures analyses of variance focused on the relative amplitude of the dominant harmonic in the voice signal (cepstral peak prominence, CPP), the ratio of low-to-high spectral energy, and their respective standard deviations (SD). Data were also examined for relationships between acoustic measures and perceptual ratings of overall severity of voice quality. Results showed that perceived overall severity and the acoustic measures of the CPP and its SD (CPPsd) computed from sentence productions were significantly reduced at 2 weeks post-thyroidectomy for 20 patients (29% of the sample) who had self-reported post-operative voice change. For this same group of patients, the CPP and CPPsd computed from sentence productions improved significantly from 2 weeks post-thyroidectomy to 6 months post-surgery. CPP and CPPsd also correlated well with perceived overall severity (r = -0.68 and -0.79, respectively). Measures of CPP from sustained vowel productions were not as effective as those from sentence productions in reflecting voice deterioration in the post-thyroidectomy patients at the 2-week post-surgery time point, correlated more weakly with perceived overall severity, and were not as effective in discriminating negative voice outcome (NegVO) from normal voice outcome (NormVO) patients as the results from the sentence-level stimuli. Results indicate that spectral/cepstral analysis methods can be used with continuous speech samples to provide important objective data documenting the effects of dysphonia in a post-thyroidectomy patient sample. When used in conjunction with the patient's self-report and other general measures of vocal dysfunction, the acoustic measures employed in this study contribute to a complete profile of the patient's vocal condition.
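
    A rough, hedged sketch of how a cepstral peak prominence value can be computed from a voice sample; the windowing, quefrency search range and synthetic test tone are illustrative choices, not the CAPE-V protocol or the exact algorithm used in the study.

    ```python
    import numpy as np

    def cepstral_peak_prominence(x, fs, f0_range=(60.0, 300.0)):
        """Rough CPP: height of the cepstral peak above a regression line fitted
        to the cepstrum over the searched quefrency range (simplified, dB-based)."""
        x = x * np.hanning(len(x))
        log_spectrum = 20.0 * np.log10(np.abs(np.fft.fft(x)) + 1e-12)
        cepstrum = np.abs(np.fft.ifft(log_spectrum))
        quefrency = np.arange(len(x)) / fs                      # seconds
        lo, hi = 1.0 / f0_range[1], 1.0 / f0_range[0]           # expected 1/F0 window
        idx = np.where((quefrency >= lo) & (quefrency <= hi))[0]
        peak = idx[np.argmax(cepstrum[idx])]
        slope, intercept = np.polyfit(quefrency[idx], cepstrum[idx], 1)
        return cepstrum[peak] - (slope * quefrency[peak] + intercept)

    fs = 16000
    t = np.arange(0, 0.5, 1.0 / fs)
    tone = np.sin(2 * np.pi * 120 * t) + 0.3 * np.sin(2 * np.pi * 240 * t)  # synthetic voiced tone
    print(f"CPP (arbitrary units): {cepstral_peak_prominence(tone, fs):.2f}")
    ```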

  13. Computation of backwater and discharge at width constrictions of heavily vegetated flood plains

    USGS Publications Warehouse

    Schneider, V.R.; Board, J.W.; Colson, B.E.; Lee, F.N.; Druffel, Leroy

    1977-01-01

    The U.S. Geological Survey cooperated with the Federal Highway Administration and the State Highway Departments of Mississippi, Alabama, and Louisiana to develop a proposed method for computing backwater and discharge at width constrictions of heavily vegetated flood plains. Data were collected at 20 single-opening sites for 31 floods. Flood-plain width varied from 4 to 14 times the bridge opening width. The recurrence intervals of peak discharge ranged from a 2-year flood to greater than a 100-year flood, with a median interval of 6 years. Measured backwater ranged from 0.39 to 3.16 feet. Backwater computed by the present standard Geological Survey method averaged 29 percent less than the measured values, and that computed by the currently used Federal Highway Administration method averaged 47 percent less than the measured values. Discharge computed by the Survey method averaged 21 percent more than the measured values. Analysis of the data showed that the flood-plain widths and the Manning roughness coefficients are larger than those used to develop the standard methods. A method to more accurately compute backwater and discharge was developed. The difference between the contracted and natural water-surface profiles computed using standard step-backwater procedures is defined as backwater. The energy loss term in the step-backwater procedure is computed as the product of the geometric mean of the energy slopes and the flow distance in the reach, a formulation derived from potential flow theory. The mean error was 1 percent when using the proposed method for computing backwater and 3 percent for computing discharge. (Woodard-USGS)
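
    A one-line illustration of the energy-loss term described above, with hypothetical energy slopes and reach length; the surrounding step-backwater machinery is omitted.

    ```python
    import math

    def reach_energy_loss(slope_up, slope_down, reach_length_ft):
        """Head loss over a reach: (geometric mean of the two energy slopes) x (flow distance)."""
        return math.sqrt(slope_up * slope_down) * reach_length_ft

    # Hypothetical energy slopes at the ends of a 500-ft reach.
    print(f"head loss: {reach_energy_loss(0.0008, 0.0012, 500.0):.3f} ft")
    ```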

  14. Inquiry-Based Learning Case Studies for Computing and Computing Forensic Students

    ERIC Educational Resources Information Center

    Campbell, Jackie

    2012-01-01

    Purpose: The purpose of this paper is to describe and discuss the use of specifically-developed, inquiry-based learning materials for Computing and Forensic Computing students. Small applications have been developed which require investigation in order to de-bug code, analyse data issues and discover "illegal" behaviour. The applications…

  15. Effectiveness of Adaptive Statistical Iterative Reconstruction for 64-Slice Dual-Energy Computed Tomography Pulmonary Angiography in Patients With a Reduced Iodine Load: Comparison With Standard Computed Tomography Pulmonary Angiography.

    PubMed

    Lee, Ji Won; Lee, Geewon; Lee, Nam Kyung; Moon, Jin Il; Ju, Yun Hye; Suh, Young Ju; Jeong, Yeon Joo

    2016-01-01

    The aim of the study was to assess the effectiveness of the adaptive statistical iterative reconstruction (ASIR) for dual-energy computed tomography pulmonary angiography (DE-CTPA) with a reduced iodine load. One hundred forty patients referred for chest CT were randomly divided into a DE-CTPA group with a reduced iodine load or a standard CTPA group. Quantitative and qualitative image qualities of virtual monochromatic spectral (VMS) images with filtered back projection (VMS-FBP) and those with 50% ASIR (VMS-ASIR) in the DE-CTPA group were compared. Image qualities of VMS-ASIR images in the DE-CTPA group and ASIR images in the standard CTPA group were also compared. All quantitative and qualitative indices, except attenuation value of pulmonary artery in the VMS-ASIR subgroup, were superior to those in the VMS-FBP subgroup (all P < 0.001). Noise and signal-to-noise ratio of VMS-ASIR images were superior to those of ASIR images in the standard CTPA group (P < 0.001 and P = 0.007, respectively). Regarding qualitative indices, noise was significantly lower in VMS-ASIR images of the DE-CTPA group than in ASIR images of the standard CTPA group (P = 0.001). The ASIR technique tends to improve the image quality of VMS imaging. Dual-energy computed tomography pulmonary angiography with ASIR can reduce contrast medium volume and produce images of comparable quality with those of standard CTPA.

  16. Image-based teleconsultation using smartphones or tablets: qualitative assessment of medical experts.

    PubMed

    Boissin, Constance; Blom, Lisa; Wallis, Lee; Laflamme, Lucie

    2017-02-01

    Mobile health has promising potential in improving healthcare delivery by facilitating access to expert advice. Enabling experts to review images on their smartphone or tablet may save valuable time. This study aims at assessing whether images viewed by medical specialists on handheld devices such as smartphones and tablets are perceived to be of comparable quality as when viewed on a computer screen. This was a prospective study comparing the perceived quality of 18 images on three different display devices (smartphone, tablet and computer) by 27 participants (4 burn surgeons and 23 emergency medicine specialists). The images, presented in random order, covered clinical (dermatological conditions, burns, ECGs and X-rays) and non-clinical subjects and their perceived quality was assessed using a 7-point Likert scale. Differences in devices' quality ratings were analysed using linear regression models for clustered data adjusting for image type and participants' characteristics (age, gender and medical specialty). Overall, the images were rated good or very good in most instances and more so for the smartphone (83.1%, mean score 5.7) and tablet (78.2%, mean 5.5) than for a standard computer (70.6%, mean 5.2). Both handheld devices had significantly higher ratings than the computer screen, even after controlling for image type and participants' characteristics. Nearly all experts expressed that they would be comfortable using smartphones (n=25) or tablets (n=26) for image-based teleconsultation. This study suggests that handheld devices could be a substitute for computer screens for teleconsultation by physicians working in emergency settings. Published by the BMJ Publishing Group Limited.

  17. Automated serum chloride analysis using the Apple computer

    PubMed Central

    Taylor, Paul J.; Bouska, Rosalie A.

    1988-01-01

    Chloride analysis employing a coulometric technique is a well-established method. However, the equipment needed is specialized and somewhat expensive. The purpose of this paper is to report the development of the hardware and software needed to perform this analysis using an Apple computer to control the coulometric titration, as well as to automate it and to print out the results. The Apple computer is used to control the flow of current in a circuit that includes silver and platinum electrodes, where the following reactions take place: Ag → Ag+ + e− (at the silver anode) and 2H2O + 2e− → 2OH− + H2 (at the platinum cathode). The generated silver ions then react with the chloride ion in the sample to form AgCl: Ag+ + Cl− → AgCl(s). When all of the chloride ion has been titrated, the concentration of silver ions in solution increases rapidly, which causes an increase in the current between two silver microelectrodes. This current is converted to a voltage and amplified by a simple circuit. This voltage is read by the analogue-to-digital converter. The computer stops the titration and calculates the chloride ion content of the sample. Thus, the computer controls the apparatus, records the data, reacts to the data to terminate the analysis, and prints out the results and messages to the analyst. Analysis of standards and reference sera indicates the method is rapid, accurate and precise. Application of this apparatus as a teaching aid for electronics to chemistry and medical students is also described. PMID:18925182
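
    A worked example of the stoichiometry behind the titration, using illustrative (not reported) instrument settings: the amount of Ag+ generated at constant current follows Faraday's law, and each Ag+ precipitates one Cl−.

    ```python
    FARADAY = 96485.0  # coulombs per mole of electrons

    def chloride_mmol_per_l(current_a, time_s, sample_volume_ml):
        """Chloride from a constant-current coulometric titration: moles of Ag+
        generated = I*t/F, and each Ag+ precipitates one Cl- as AgCl."""
        moles_cl = current_a * time_s / FARADAY
        return moles_cl / (sample_volume_ml / 1000.0) * 1000.0  # mmol/L

    # Illustrative settings: 5 mA for 19.3 s on a 10-microlitre serum sample.
    print(f"{chloride_mmol_per_l(0.005, 19.3, 0.010):.1f} mmol/L")
    ```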

  18. Assessing cultural validity in standardized tests in stem education

    NASA Astrophysics Data System (ADS)

    Gassant, Lunes

    This quantitative ex post facto study examined how race and gender, as elements of culture, influence the development of common misconceptions among STEM students. Primary data came from a standardized test: the Digital Logic Concept Inventory (DLCI) developed by Drs. Geoffrey L. Herman, Michael C. Louis, and Craig Zilles from the University of Illinois at Urbana-Champaign. The sample consisted of a cohort of 82 STEM students recruited from three universities in Northern Louisiana. Microsoft Excel and the Statistical Package for the Social Sciences (SPSS) were used for data computation. Two key concepts, several sub-concepts, and 19 misconceptions were tested through 11 items in the DLCI. Statistical analyses based on both Classical Test Theory (Spearman, 1904) and Item Response Theory (Lord, 1952) yielded similar results: some misconceptions in the DLCI can reliably be predicted by the race or the gender of the test taker. The research is significant because it has shown that some misconceptions in a STEM discipline attracted students of similar ethnic backgrounds differently, pointing to the existence of some cultural bias in the standardized test. The study therefore encourages further research into cultural validity in standardized tests. With culturally valid tests, it will be possible to increase the effectiveness of targeted teaching and learning strategies for STEM students from diverse ethnic backgrounds. To some extent, this dissertation has contributed to a better understanding of the gap between high enrollment rates and low graduation rates among African American students and also among other minority students in STEM disciplines.
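
    For orientation, the sketch below computes two Classical Test Theory item statistics (difficulty and point-biserial discrimination) on a hypothetical 82 x 11 response matrix; it is not the DLCI data and does not reproduce the study's IRT analysis.

    ```python
    import numpy as np

    # Hypothetical 0/1 response matrix: 82 test takers x 11 DLCI-sized items.
    rng = np.random.default_rng(1)
    responses = (rng.random((82, 11)) > 0.4).astype(int)

    # Classical Test Theory item statistics.
    difficulty = responses.mean(axis=0)          # proportion answering each item correctly
    rest_total = responses.sum(axis=1, keepdims=True) - responses
    discrimination = np.array([                  # point-biserial item-rest correlation
        np.corrcoef(responses[:, j], rest_total[:, j])[0, 1]
        for j in range(responses.shape[1])
    ])
    print(np.round(difficulty, 2))
    print(np.round(discrimination, 2))
    ```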

  19. Prominence of ichnologically influenced macroporosity in the karst Biscayne aquifer: Stratiform "super-K" zones

    USGS Publications Warehouse

    Cunningham, K.J.; Sukop, M.C.; Huang, H.; Alvarez, P.F.; Curran, H.A.; Renken, R.A.; Dixon, J.F.

    2009-01-01

    A combination of cyclostratigraphic, ichnologic, and borehole geophysical analyses of continuous core holes; tracer-test analyses; and lattice Boltzmann flow simulations was used to quantify biogenic macroporosity and permeability of the Biscayne aquifer, southeastern Florida. Biogenic macroporosity largely manifests as: (1) ichnogenic macroporosity primarily related to postdepositional burrowing activity by callianassid shrimp and fossilization of components of their complex burrow systems (Ophiomorpha); and (2) biomoldic macroporosity originating from dissolution of fossil hard parts, principally mollusk shells. Ophiomorpha-dominated ichnofabric provides the greatest contribution to hydrologic characteristics in the Biscayne aquifer in a 345 km2 study area. Stratiform tabular-shaped units of thalassinidean-associated macroporosity are commonly confined to the lower part of upward-shallowing high-frequency cycles, throughout aggradational cycles, and, in one case, they stack vertically within the lower part of a high-frequency cycle set. Broad continuity of many of the macroporous units concentrates groundwater flow in extremely permeable passageways, thus making the aquifer vulnerable to long-distance transport of contaminants. Ichnogenic macroporosity represents an alternative pathway for concentrated groundwater flow that differs considerably from standard karst flow-system paradigms, which describe groundwater movement through fractures and cavernous dissolution features. Permeabilities were calculated using lattice Boltzmann methods (LBMs) applied to computer renderings assembled from X-ray computed tomography scans of various biogenic macroporous limestone samples. The highest simulated LBM permeabilities were about five orders of magnitude greater than standard laboratory measurements using air-permeability methods, which are limited in their application to extremely permeable macroporous rock samples. Based on their close conformance to analytical solutions for pipe flow, LBMs offer a new means of obtaining accurate permeability values for such materials. We suggest that the stratiform ichnogenic groundwater flow zones have permeabilities even more extreme (~2-5 orders of magnitude higher) than the Jurassic "super-K" zones of the giant Ghawar oil field. The flow zones of the Pleistocene Biscayne aquifer provide examples of ichnogenic macroporosity for comparative analysis of origin and evolution in other carbonate aquifers, as well as petroleum reservoirs. © 2008 Geological Society of America.
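
    A hedged illustration of the pipe-flow benchmark mentioned above: for Poiseuille flow in an open cylindrical channel the analytical permeability is k = R^2/8, the kind of closed-form value against which lattice Boltzmann simulations can be checked. The burrow radius below is hypothetical.

    ```python
    def pipe_permeability_m2(radius_m):
        """Analytical permeability of a single open cylindrical channel (Poiseuille flow): k = R^2 / 8."""
        return radius_m ** 2 / 8.0

    # Hypothetical open burrow channel with a 1-mm radius.
    k = pipe_permeability_m2(1.0e-3)
    print(f"k = {k:.2e} m^2 (~{k / 9.87e-13:.2e} darcys)")
    ```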

  20. Identification and simulation of space-time variability of past hydrological drought events in the Limpopo River basin, southern Africa

    NASA Astrophysics Data System (ADS)

    Trambauer, P.; Maskey, S.; Werner, M.; Pappenberger, F.; van Beek, L. P. H.; Uhlenbrook, S.

    2014-08-01

    Droughts are widespread natural hazards and in many regions their frequency seems to be increasing. A finer-resolution version (0.05° × 0.05°) of the continental-scale hydrological model PCRaster Global Water Balance (PCR-GLOBWB) was set up for the Limpopo River basin, one of the most water-stressed basins on the African continent. An irrigation module was included to account for large irrigated areas of the basin. The finer resolution model was used to analyse hydrological droughts in the Limpopo River basin in the period 1979-2010 with a view to identifying severe droughts that have occurred in the basin. Evaporation, soil moisture, groundwater storage and runoff estimates from the model were derived at a spatial resolution of 0.05° (approximately 5 km) on a daily timescale for the entire basin. PCR-GLOBWB was forced with daily precipitation and temperature obtained from the ERA-Interim global atmospheric reanalysis product from the European Centre for Medium-Range Weather Forecasts. Two agricultural drought indicators were computed: the Evapotranspiration Deficit Index (ETDI) and the Root Stress Anomaly Index (RSAI). Hydrological drought was characterised using the Standardized Runoff Index (SRI) and the Groundwater Resource Index (GRI), which make use of the streamflow and groundwater storage resulting from the model. Other more widely used meteorological drought indicators, such as the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evaporation Index (SPEI), were also computed for different aggregation periods. Results show that a carefully set-up, process-based model that makes use of the best available input data can identify hydrological droughts even if the model is largely uncalibrated. The indicators considered are able to represent the most severe droughts in the basin and to some extent identify the spatial variability of droughts. Moreover, results show the importance of computing indicators that can be related to hydrological droughts, and how these add value to the identification of hydrological droughts and floods and the temporal evolution of events that would otherwise not have been apparent when considering only meteorological indicators. In some cases, meteorological indicators alone fail to capture the severity of the hydrological drought. Therefore, a combination of some of these indicators (e.g. SPEI-3, SRI-6 and SPI-12 computed together) is found to be a useful measure for identifying agricultural to long-term hydrological droughts in the Limpopo River basin. Additionally, it was possible to undertake a characterisation of the drought severity in the basin, indicated by its time of occurrence, duration and intensity.
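
    A minimal sketch of the standardisation step shared by SPI/SRI-type indices, on synthetic monthly runoff: aggregate over the chosen window, fit a gamma distribution, and map cumulative probabilities to standard-normal z-scores. The distribution choice and the synthetic record are illustrative assumptions, not the paper's exact procedure.

    ```python
    import numpy as np
    from scipy import stats

    def standardized_index(series, window):
        """SPI/SRI-style index: aggregate over `window` steps, fit a gamma
        distribution, then map cumulative probabilities to standard-normal z-scores."""
        x = np.convolve(series, np.ones(window), mode="valid")   # rolling sums
        shape, loc, scale = stats.gamma.fit(x, floc=0)
        cdf = stats.gamma.cdf(x, shape, loc=loc, scale=scale)
        return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

    rng = np.random.default_rng(0)
    monthly_runoff = rng.gamma(2.0, 30.0, size=12 * 32)   # hypothetical 1979-2010 record
    sri6 = standardized_index(monthly_runoff, window=6)
    print(f"driest 6-month period (SRI-6 minimum): {sri6.min():.2f}")
    ```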

  1. East Pacific Rise axial structure from a joint tomographic inversion of traveltimes picked on downward continued and standard shot gathers collected by 3D MCS surveying

    NASA Astrophysics Data System (ADS)

    Newman, Kori; Nedimović, Mladen; Delescluse, Matthias; Menke, William; Canales, J. Pablo; Carbotte, Suzanne; Carton, Helene; Mutter, John

    2010-05-01

    We present traveltime tomographic models along closely spaced (~250 m), strike-parallel profiles that flank the axis of the East Pacific Rise at 9°41' - 9°57' N. The data were collected during a 3D (multi-streamer) multichannel seismic (MCS) survey of the ridge. Four 6-km long hydrophone streamers were towed by the ship along three along-axis sail lines, yielding twelve possible profiles over which to compute tomographic models. Based on the relative location between source-receiver midpoints and targeted subsurface structures, we have chosen to compute models for four of those lines. MCS data provide for a high density of seismic ray paths with which to constrain the model. Potentially, travel times for ~250,000 source-receiver pairs can be picked over the 30 km length of each model. However, such data density does not enhance the model resolution, so, for computational efficiency, the data are decimated so that ~15,000 picks per profile are used. Downward continuation of the shot gathers simulates an experimental geometry in which the sources and receivers are positioned just above the sea floor. This allows the shallowest sampling refracted arrivals to be picked and incorporated into the inversion whereas they would otherwise not be usable with traditional first-arrival travel-time tomographic techniques. Some of the far-offset deep-penetrating 2B refractions cannot be picked on the downward continued gathers due to signal processing artifacts. For this reason, we run a joint inversion by also including 2B traveltime picks from standard shot gathers. Uppermost velocity structure (seismic layer 2A thickness and velocity) is primarily constrained from 1D inversion of the nearest offset (<500 m) source-receiver travel-time picks for each downward continued shot gather. Deeper velocities are then computed in a joint 2D inversion that uses all picks from standard and downward continued shot gathers and incorporates the 1D results into the starting model. The resulting velocity models extend ~1 km into the crust. Preliminary results show thicker layer 2A and faster layer 2A velocities at fourth order ridge segment boundaries. Additionally, layer 2A thickens north of 9° 52' N, which is consistent with earlier investigations of this ridge segment. Slower layer 2B velocities are resolved in the vicinity of documented hydrothermal vent fields. We anticipate that additional analyses of the results will yield further insight into fine scale variations in near-axis mid-ocean ridge structure.

  2. Sensory processing during viewing of cinematographic material: Computational modeling and functional neuroimaging

    PubMed Central

    Bordier, Cecile; Puja, Francesco; Macaluso, Emiliano

    2013-01-01

    The investigation of brain activity using naturalistic, ecologically-valid stimuli is becoming an important challenge for neuroscience research. Several approaches have been proposed, primarily relying on data-driven methods (e.g. independent component analysis, ICA). However, data-driven methods often require some post-hoc interpretation of the imaging results to draw inferences about the underlying sensory, motor or cognitive functions. Here, we propose using a biologically-plausible computational model to extract (multi-)sensory stimulus statistics that can be used for standard hypothesis-driven analyses (general linear model, GLM). We ran two separate fMRI experiments, which both involved subjects watching an episode of a TV-series. In Exp 1, we manipulated the presentation by switching on-and-off color, motion and/or sound at variable intervals, whereas in Exp 2, the video was played in the original version, with all the consequent continuous changes of the different sensory features intact. Both for vision and audition, we extracted stimulus statistics corresponding to spatial and temporal discontinuities of low-level features, as well as a combined measure related to the overall stimulus saliency. Results showed that activity in occipital visual cortex and the superior temporal auditory cortex co-varied with changes of low-level features. Visual saliency was found to further boost activity in extra-striate visual cortex plus posterior parietal cortex, while auditory saliency was found to enhance activity in the superior temporal cortex. Data-driven ICA analyses of the same datasets also identified “sensory” networks comprising visual and auditory areas, but without providing specific information about the possible underlying processes, e.g., these processes could relate to modality, stimulus features and/or saliency. We conclude that the combination of computational modeling and GLM enables the tracking of the impact of bottom–up signals on brain activity during viewing of complex and dynamic multisensory stimuli, beyond the capability of purely data-driven approaches. PMID:23202431

  3. Elastic-plastic finite-element analyses of thermally cycled single-edge wedge specimens

    NASA Technical Reports Server (NTRS)

    Kaufman, A.

    1982-01-01

    Elastic-plastic stress-strain analyses were performed for single-edge wedge alloys subjected to thermal cycling in fluidized beds. Three cases (NASA TAZ-8A alloy under one cycling condition and 316 stainless steel alloy under two cycling conditions) were analyzed by using the MARC nonlinear, finite-element computer program. Elastic solutions from MARC showed good agreement with previously reported solutions that used the NASTRAN and ISO3DQ computer programs. The NASA TAZ-8A case exhibited no plastic strains, and the elastic and elastic-plastic analyses gave identical results. Elastic-plastic analyses of the 316 stainless steel alloy showed plastic strain reversal with a shift of the mean stresses in the compressive direction. The maximum equivalent total strain ranges for these cases were 13 to 22 percent greater than that calculated from elastic analyses.

  4. 75 FR 64258 - Cloud Computing Forum & Workshop II

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-19

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Cloud Computing Forum... workshop. SUMMARY: NIST announces the Cloud Computing Forum & Workshop II to be held on November 4 and 5, 2010. This workshop will provide information on a Cloud Computing Roadmap Strategy as well as provide...

  5. Defining Computational Thinking for Mathematics and Science Classrooms

    ERIC Educational Resources Information Center

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-01-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new…

  6. Economic evaluations and usefulness of standardized nursing terminologies.

    PubMed

    Stone, Patricia W; Lee, Nam-Ju; Giannini, Melinna; Bakken, Suzanne

    2004-01-01

    To review the different types of economic analyses commonly found in the healthcare literature, discuss methodologic considerations in framing economic analyses, identify useful resources for economic evaluations, and describe the current and potential roles of standardized nursing terminologies in providing cost and outcome data for economic analysis. Data sources were the Advanced Billing Concepts Code, the Resource-Based Relative Value Scale, and the Nursing Outcomes Classification. Using case studies, the applicability of standardized nursing terminologies in cost-effectiveness analysis is demonstrated. While there is potential to inform specific questions, comparisons across analyses are limited because of the many different outcome measures. Including a standardized quality-of-life measure in nursing terminologies would allow for the calculation of accepted outcome measures and of dollars per quality-adjusted life year gained. The nurse's ability to assess and contribute to all aspects of rigorous economic evidence is an essential competency for responsible practice.
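
    As a small worked example of the economic quantity discussed above, the incremental cost-effectiveness ratio (ICER) is the extra cost divided by the extra QALYs gained; the figures below are hypothetical.

    ```python
    def icer(cost_new, qaly_new, cost_std, qaly_std):
        """Incremental cost-effectiveness ratio: extra dollars per QALY gained."""
        return (cost_new - cost_std) / (qaly_new - qaly_std)

    # Hypothetical nursing intervention versus usual care.
    print(f"${icer(12500.0, 4.1, 11000.0, 4.0):,.0f} per QALY gained")
    ```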

  7. Environmental Analysis

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Burns & McDonnell Engineering's environmental control study is assisted by programs in environmental analyses from NASA's Computer Software Management and Information Center (COSMIC). The company is engaged primarily in the design of such facilities as electrical utilities, industrial plants, wastewater treatment systems, dams and reservoirs, and aviation installations. The company also conducts environmental engineering analyses and advises clients on the environmental considerations of a particular construction project. The company makes use of many COSMIC computer programs, which have allowed substantial savings.

  8. Status of emerging standards for data definitions and transfer in the petroleum industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winczewski, L.M.

    1991-03-01

    Leading-edge hardware and software to store, retrieve, process, analyze, visualize, and interpret geoscience and petroleum data are improving continuously. A babel of definitions and formats for common industry data items limits the overall effectiveness of these computer-aided exploration and production tools. Custom data conversion required to load applications causes delays and exposes data content to error and degradation. Emerging industry-wide standards for management of geoscience and petroleum-related data are poised to overcome long-standing internal barriers to the full exploitation of these high-tech hardware/software systems. Industry technical organizations, such as AAPG, SEG, and API, have been actively pursuing industry-wide standards for data transfer, data definitions, and data models. These standard-defining groups are non-fee and solicit active participation from the entire petroleum community. The status of the most active of these groups is presented here. Data transfer standards are being pursued within AAPG (AAPG-B Data Transfer Standard), API (DLIS, for log data) and SEG (SEG-DEF, for seismic data). Converging data definitions, models, and glossaries are coming from the Petroleum Industry Data Dictionary Group (PIDD) and from subcommittees of the AAPG Computer Applications Committee. The National Computer Graphics Association is promoting development of standards for transfer of geographically oriented data. The API Well-Number standard is undergoing revision.

  9. Computing correct truncated excited state wavefunctions

    NASA Astrophysics Data System (ADS)

    Bacalis, N. C.; Xiong, Z.; Zang, J.; Karaoulanis, D.

    2016-12-01

    We demonstrate that, if a wave function's truncated expansion is small, then the standard excited states computational method, of optimizing one "root" of a secular equation, may lead to an incorrect wave function - despite the correct energy according to the theorem of Hylleraas, Undheim and McDonald - whereas our proposed method [J. Comput. Meth. Sci. Eng. 8, 277 (2008)] (independent of orthogonality to lower lying approximants) leads to correct reliable small truncated wave functions. The demonstration is done in He excited states, using truncated series expansions in Hylleraas coordinates, as well as standard configuration-interaction truncated expansions.
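
    A hedged sketch of the standard procedure the abstract refers to: solving the secular (generalized eigenvalue) problem H c = E S c in a small truncated, non-orthogonal basis and reading off its ordered roots. The matrices below are invented for illustration.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    # Invented 3x3 Hamiltonian and overlap matrices in a small non-orthogonal basis.
    H = np.array([[-2.90, -0.50,  0.10],
                  [-0.50, -2.10,  0.05],
                  [ 0.10,  0.05, -1.40]])
    S = np.array([[1.00, 0.20, 0.05],
                  [0.20, 1.00, 0.10],
                  [0.05, 0.10, 1.00]])

    # Ordered roots of the secular equation det(H - E S) = 0.  Each root is an upper
    # bound to the corresponding exact level (Hylleraas-Undheim-McDonald), but the
    # associated truncated vector need not be a good wave function, which is the
    # abstract's point.
    energies, vectors = eigh(H, S)
    print(np.round(energies, 4))
    ```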

  10. Prediction of lung density changes after radiotherapy by cone beam computed tomography response markers and pre-treatment factors for non-small cell lung cancer patients.

    PubMed

    Bernchou, Uffe; Hansen, Olfred; Schytte, Tine; Bertelsen, Anders; Hope, Andrew; Moseley, Douglas; Brink, Carsten

    2015-10-01

    This study investigates the ability of pre-treatment factors and response markers extracted from standard cone-beam computed tomography (CBCT) images to predict the lung density changes induced by radiotherapy for non-small cell lung cancer (NSCLC) patients. Density changes in follow-up computed tomography scans were evaluated for 135 NSCLC patients treated with radiotherapy. Early response markers were obtained by analysing changes in lung density in CBCT images acquired during the treatment course. The ability of pre-treatment factors and CBCT markers to predict lung density changes induced by radiotherapy was investigated. Age and CBCT markers extracted at 10th, 20th, and 30th treatment fraction significantly predicted lung density changes in a multivariable analysis, and a set of response models based on these parameters were established. The correlation coefficient for the models was 0.35, 0.35, and 0.39, when based on the markers obtained at the 10th, 20th, and 30th fraction, respectively. The study indicates that younger patients without lung tissue reactions early into their treatment course may have minimal radiation induced lung density increase at follow-up. Further investigations are needed to examine the ability of the models to identify patients with low risk of symptomatic toxicity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. PHYSICO: An UNIX based Standalone Procedure for Computation of Individual and Group Properties of Protein Sequences.

    PubMed

    Gupta, Parth Sarthi Sen; Banerjee, Shyamashree; Islam, Rifat Nawaz Ul; Mondal, Sudipta; Mondal, Buddhadev; Bandyopadhyay, Amal K

    2014-01-01

    In the genomic and proteomic era, efficient and automated analyses of the sequence properties of proteins have become an important task in bioinformatics. There are general public licensed (GPL) software tools that perform part of the job. However, computation of the mean properties of a large number of orthologous sequences is not possible with the above-mentioned GPL tools. Further, there is no GPL software or server that can calculate window-dependent sequence properties for a large number of sequences in a single run. With a view to overcoming these limitations, we have developed a standalone procedure, PHYSICO, which performs the various stages of computation in a single run, based on the type of input provided (either RAW-FASTA or BLOCK-FASTA format), and produces Excel output for: a) composition, class composition, mean molecular weight, isoelectric point, aliphatic index and GRAVY; b) column-based compositions, variability and difference matrix; and c) 25 kinds of window-dependent sequence properties. The program is fast, efficient, error-free and user-friendly. Calculation of the mean and standard deviation of homologous sequence sets, for comparison purposes when relevant, is another attribute of the program; a property seldom seen in existing GPL software. PHYSICO is freely available to non-commercial/academic users on formal request to the corresponding author (akbanerjee@biotech.buruniv.ac.in).
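
    As an illustration of one of the window-dependent properties listed above, the sketch below re-implements GRAVY and a sliding-window hydropathy profile from Kyte-Doolittle values; it is not PHYSICO's own code, and the test sequence is arbitrary.

    ```python
    # Kyte-Doolittle hydropathy values used for GRAVY and sliding-window profiles.
    KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5, "E": -3.5,
          "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8,
          "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2}

    def gravy(seq):
        """Grand average of hydropathy: mean Kyte-Doolittle value over the sequence."""
        return sum(KD[aa] for aa in seq) / len(seq)

    def windowed_gravy(seq, window=9):
        """Window-dependent hydropathy profile (one value per window position)."""
        return [gravy(seq[i:i + window]) for i in range(len(seq) - window + 1)]

    seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # arbitrary test sequence
    print(f"GRAVY = {gravy(seq):.3f}")
    print([round(v, 2) for v in windowed_gravy(seq)[:5]])
    ```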

  12. PHYSICO: An UNIX based Standalone Procedure for Computation of Individual and Group Properties of Protein Sequences

    PubMed Central

    Gupta, Parth Sarthi Sen; Banerjee, Shyamashree; Islam, Rifat Nawaz Ul; Mondal, Sudipta; Mondal, Buddhadev; Bandyopadhyay, Amal K

    2014-01-01

    In the genomic and proteomic era, efficient and automated analyses of the sequence properties of proteins have become an important task in bioinformatics. There are general public licensed (GPL) software tools that perform part of the job. However, computation of the mean properties of a large number of orthologous sequences is not possible with the above-mentioned GPL tools. Further, there is no GPL software or server that can calculate window-dependent sequence properties for a large number of sequences in a single run. With a view to overcoming these limitations, we have developed a standalone procedure, PHYSICO, which performs the various stages of computation in a single run, based on the type of input provided (either RAW-FASTA or BLOCK-FASTA format), and produces Excel output for: a) composition, class composition, mean molecular weight, isoelectric point, aliphatic index and GRAVY; b) column-based compositions, variability and difference matrix; and c) 25 kinds of window-dependent sequence properties. The program is fast, efficient, error-free and user-friendly. Calculation of the mean and standard deviation of homologous sequence sets, for comparison purposes when relevant, is another attribute of the program; a property seldom seen in existing GPL software. Availability: PHYSICO is freely available to non-commercial/academic users on formal request to the corresponding author (akbanerjee@biotech.buruniv.ac.in). PMID:24616564

  13. Unperturbed Schelling Segregation in Two or Three Dimensions

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2016-09-01

    Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969), are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation, but his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young, Individual Strategy and Social Structure: An Evolutionary Theory of Institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al., Proceedings of the 44th Annual ACM Symposium on Theory of Computing (STOC 2012), 2012; Barmpalias et al., 55th Annual IEEE Symposium on Foundations of Computer Science, Philadelphia, 2014; Barmpalias et al., J Stat Phys 158:806-852, 2015) has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. (STOC 2012).
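
    For intuition, the sketch below simulates one simple Schelling variant in two dimensions (discontent agents relocate to random empty cells); it is only a toy illustration and not the exact unperturbed process analysed rigorously in the papers above.

    ```python
    import random

    def schelling_step(grid, threshold=0.5):
        """One sweep of a toy 2-D Schelling model on a torus: an agent with fewer than
        `threshold` like-type occupied neighbours moves to a random empty cell."""
        n = len(grid)
        empties = [(i, j) for i in range(n) for j in range(n) if grid[i][j] == 0]
        for i in range(n):
            for j in range(n):
                t = grid[i][j]
                if t == 0:
                    continue
                nbrs = [grid[(i + di) % n][(j + dj) % n]
                        for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
                occupied = [v for v in nbrs if v != 0]
                if occupied and empties and sum(v == t for v in occupied) / len(occupied) < threshold:
                    i2, j2 = empties.pop(random.randrange(len(empties)))
                    grid[i][j], grid[i2][j2] = 0, t
                    empties.append((i, j))

    random.seed(0)
    N = 20
    grid = [[random.choice([0, 1, 2]) for _ in range(N)] for _ in range(N)]
    for _ in range(20):
        schelling_step(grid)   # clusters of like types grow sweep by sweep
    ```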

  14. Computerized image analysis of cell-cell interactions in human renal tissue by using multi-channel immunofluorescent confocal microscopy

    NASA Astrophysics Data System (ADS)

    Peng, Yahui; Jiang, Yulei; Liarski, Vladimir M.; Kaverina, Natalya; Clark, Marcus R.; Giger, Maryellen L.

    2012-03-01

    Analysis of interactions between B and T cells in tubulointerstitial inflammation is important for understanding human lupus nephritis. We developed a computer technique to perform this analysis and compared it with manual analysis. Multi-channel immunofluorescent-microscopy images were acquired from 207 regions of interest in 40 renal tissue sections of 19 patients diagnosed with lupus nephritis. Fresh-frozen renal tissue sections were stained with combinations of immunofluorescent antibodies to membrane proteins and counter-stained with a cell nuclear marker. Manual delineation of the antibodies was considered the reference standard. We first segmented cell nuclei and cell membrane markers, and then determined corresponding cell types based on the distances between cell nuclei and specific cell-membrane marker combinations. Subsequently, the distribution of the shortest distance from T cell nuclei to B cell nuclei was obtained and used as a surrogate indicator of cell-cell interactions. The computer and manual analysis results were concordant. The average absolute difference between the computer and manual analysis results was 1.1 ± 1.2% in the number of cell-cell distances of 3 μm or less as a percentage of the total number of cell-cell distances. Our computerized analysis of cell-cell distances could be used as a surrogate for quantifying cell-cell interactions, either as an automated and quantitative analysis or as an independent confirmation of manual analysis.
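
    A minimal sketch of the nearest-neighbour distance computation described above, using hypothetical nucleus coordinates and a k-d tree; the 3 μm summary fraction mirrors the comparison metric, but the data are synthetic.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    # Hypothetical nucleus centroids (micrometres) for T cells and B cells in one ROI.
    t_nuclei = rng.uniform(0, 200, size=(150, 2))
    b_nuclei = rng.uniform(0, 200, size=(60, 2))

    # Shortest distance from each T-cell nucleus to any B-cell nucleus.
    dist, _ = cKDTree(b_nuclei).query(t_nuclei, k=1)

    # Fraction of shortest T-B distances of 3 um or less (the summary compared above).
    print(f"{(dist <= 3.0).mean() * 100:.1f}% of distances <= 3 um")
    ```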

  15. Longitudinal study of radiation exposure in computed tomography with an in-house developed dose monitoring system

    NASA Astrophysics Data System (ADS)

    Renger, Bernhard; Rummeny, Ernst J.; Noël, Peter B.

    2013-03-01

    Over the last decades, the reduction of radiation exposure, especially in diagnostic computed tomography, has been one of the most explored topics. At the same time, it remains challenging to quantify the long-term clinical dose reduction attributable to new hardware and software solutions. To overcome this challenge, we developed a Dose Monitoring System (DMS), which collects information from PACS, RIS, MPPS and structured reports. The integration of all sources overcomes the weaknesses of any single system. To gather all possible information, we integrated an optical character recognition system to extract, for example, information from the CT dose report. All collected data are transferred to a database for further evaluation, e.g., for calculations of effective as well as organ doses. The DMS provides a single database for tracking all essential study- and patient-specific information across different modalities as well as different vendors. As an initial study, we longitudinally investigated the dose reduction in CT examinations when employing a noise-suppressing reconstruction algorithm. For this examination type, a significant long-term reduction in radiation exposure is reported compared with a CT system using standard reconstruction. In summary, our DMS tool not only enables us to track radiation exposure on a daily basis but also enables analysis of the long-term effect of new dose-saving strategies. In the future, the statistical analysis of all retrospective data available in a modern imaging department will provide a unique overview of advances in the reduction of radiation exposure.

  16. A portable MPI-based parallel vector template library

    NASA Technical Reports Server (NTRS)

    Sheffler, Thomas J.

    1995-01-01

    This paper discusses the design and implementation of a polymorphic collection library for distributed address-space parallel computers. The library provides a data-parallel programming model for C++ by providing three main components: a single generic collection class, generic algorithms over collections, and generic algebraic combining functions. Collection elements are the fourth component of a program written using the library and may be either of the built-in types of C or of user-defined types. Many ideas are borrowed from the Standard Template Library (STL) of C++, although a restricted programming model is proposed because of the distributed address-space memory model assumed. Whereas the STL provides standard collections and implementations of algorithms for uniprocessors, this paper advocates standardizing interfaces that may be customized for different parallel computers. Just as the STL attempts to increase programmer productivity through code reuse, a similar standard for parallel computers could provide programmers with a standard set of algorithms portable across many different architectures. The efficacy of this approach is verified by examining performance data collected from an initial implementation of the library running on an IBM SP-2 and an Intel Paragon.

  18. Technology in practice – GP computer use by age.

    PubMed

    Henderson, Joan; Pollack, Allan; Gordon, Julie; Miller, Graeme

    2014-12-01

    Since 2005, more than 95% of general practitioners (GPs) have had access to computers in their clinical work. We have analysed the most recent 2 years of BEACH data (April 2012-March 2014) to determine whether GP age affects clinical computer use.

  19. Visualising biological data: a semantic approach to tool and database integration

    PubMed Central

    Pettifer, Steve; Thorne, David; McDermott, Philip; Marsh, James; Villéger, Alice; Kell, Douglas B; Attwood, Teresa K

    2009-01-01

    Motivation: In the biological sciences, the need to analyse vast amounts of information has become commonplace. Such large-scale analyses often involve drawing together data from a variety of different databases, held remotely on the internet or locally on in-house servers. Supporting these tasks are ad hoc collections of data-manipulation tools, scripting languages and visualisation software, which are often combined in arcane ways to create cumbersome systems that have been customised for a particular purpose, and are consequently not readily adaptable to other uses. For many day-to-day bioinformatics tasks, the sizes of current databases, and the scale of the analyses necessary, now demand increasing levels of automation; nevertheless, the unique experience and intuition of human researchers is still required to interpret the end results in any meaningful biological way. Putting humans in the loop requires tools to support real-time interaction with these vast and complex data-sets. Numerous tools do exist for this purpose, but many do not have optimal interfaces, most are effectively isolated from other tools and databases owing to incompatible data formats, and many have limited real-time performance when applied to realistically large data-sets: much of the user's cognitive capacity is therefore focused on controlling the software and manipulating esoteric file formats rather than on performing the research. Methods: To confront these issues, harnessing expertise in human-computer interaction (HCI), high-performance rendering and distributed systems, and guided by bioinformaticians and end-user biologists, we are building reusable software components that, together, create a toolkit that is both architecturally sound from a computing point of view, and addresses both user and developer requirements. Key to the system's usability is its direct exploitation of semantics, which, crucially, gives individual components knowledge of their own functionality and allows them to interoperate seamlessly, removing many of the existing barriers and bottlenecks from standard bioinformatics tasks. Results: The toolkit, named Utopia, is freely available from http://utopia.cs.man.ac.uk/. PMID:19534744

  20. Visualising biological data: a semantic approach to tool and database integration.

    PubMed

    Pettifer, Steve; Thorne, David; McDermott, Philip; Marsh, James; Villéger, Alice; Kell, Douglas B; Attwood, Teresa K

    2009-06-16

    In the biological sciences, the need to analyse vast amounts of information has become commonplace. Such large-scale analyses often involve drawing together data from a variety of different databases, held remotely on the internet or locally on in-house servers. Supporting these tasks are ad hoc collections of data-manipulation tools, scripting languages and visualisation software, which are often combined in arcane ways to create cumbersome systems that have been customized for a particular purpose, and are consequently not readily adaptable to other uses. For many day-to-day bioinformatics tasks, the sizes of current databases, and the scale of the analyses necessary, now demand increasing levels of automation; nevertheless, the unique experience and intuition of human researchers is still required to interpret the end results in any meaningful biological way. Putting humans in the loop requires tools to support real-time interaction with these vast and complex data-sets. Numerous tools do exist for this purpose, but many do not have optimal interfaces, most are effectively isolated from other tools and databases owing to incompatible data formats, and many have limited real-time performance when applied to realistically large data-sets: much of the user's cognitive capacity is therefore focused on controlling the software and manipulating esoteric file formats rather than on performing the research. To confront these issues, harnessing expertise in human-computer interaction (HCI), high-performance rendering and distributed systems, and guided by bioinformaticians and end-user biologists, we are building reusable software components that, together, create a toolkit that is both architecturally sound from a computing point of view, and addresses both user and developer requirements. Key to the system's usability is its direct exploitation of semantics, which, crucially, gives individual components knowledge of their own functionality and allows them to interoperate seamlessly, removing many of the existing barriers and bottlenecks from standard bioinformatics tasks. The toolkit, named Utopia, is freely available from http://utopia.cs.man.ac.uk/.
