Using R-Project for Free Statistical Analysis in Extension Research
ERIC Educational Resources Information Center
Mangiafico, Salvatore S.
2013-01-01
One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…
Nair, Pradeep S; John, Eugene B
2007-01-01
Aligning specific sequences against a very large number of other sequences is a central aspect of bioinformatics. With the widespread availability of personal computers in biology laboratories, sequence alignment is now often performed locally. This makes it necessary to analyse the performance of personal computers on sequence-alignment bioinformatics benchmarks. In this paper, we analyse the performance of a personal computer for the popular BLAST and FASTA sequence alignment suites. Results indicate that these benchmarks have a large number of recurring operations and use memory operations extensively. It seems that performance can be improved with a larger L1 cache.
NASA Technical Reports Server (NTRS)
Ramsey, J. W., Jr.; Taylor, J. T.; Wilson, J. F.; Gray, C. E., Jr.; Leatherman, A. D.; Rooker, J. R.; Allred, J. W.
1976-01-01
The results of extensive computer (finite element, finite difference, and numerical integration), thermal, fatigue, and special analyses of critical portions of a large pressurized, cryogenic wind tunnel (National Transonic Facility) are presented. The computer models, loading and boundary conditions are described. Graphic capability was used to display model geometry, section properties, and stress results. A stress criterion is presented for evaluation of the results of the analyses. Thermal analyses were performed for major critical and typical areas. Fatigue analyses of the entire tunnel circuit are presented.
DAKOTA Design Analysis Kit for Optimization and Terascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.
2010-02-24
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.
An Analysis of the Use of Cloud Computing among University Lecturers: A Case Study in Zimbabwe
ERIC Educational Resources Information Center
Musungwini, Samuel; Mugoniwa, Beauty; Furusa, Samuel Simbarashe; Rebanowako, Taurai George
2016-01-01
Cloud computing is a novel model of computing that may bring extensive benefits to users, institutions, businesses and academics, while at the same time also giving rise to new risks and challenges. This study looked at the benefits of using Google Docs by researchers and academics and analysed the factors affecting the adoption and use of the…
Study to document low thrust trajectory optimization programs HILTOP and ASTOP
NASA Technical Reports Server (NTRS)
Horsewood, J. L.; Mann, F. I.; Pines, S.
1974-01-01
Detailed documentation of the HILTOP and ASTOP computer programs is presented along with results of the analyses of the possible extension of the HILTOP program and results of an extra-ecliptic mission study performed with HILTOP.
permGPU: Using graphics processing units in RNA microarray association studies.
Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros
2010-06-16
Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, which are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
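As a concrete illustration of the embarrassingly parallel pattern that permGPU accelerates, the following is a minimal, CPU-only Python/NumPy sketch of permutation resampling for a per-gene two-sample t-statistic; the function and its arguments are illustrative assumptions, not permGPU's actual interface.

```python
import numpy as np

def perm_pvalues(expr, labels, n_perm=1000, rng=None):
    """Permutation p-values for per-gene two-sample t-statistics.

    expr   : (n_genes, n_samples) expression matrix
    labels : (n_samples,) binary group indicator (0/1)
    """
    rng = np.random.default_rng(rng)

    def t_stats(y):
        g1, g0 = expr[:, y == 1], expr[:, y == 0]
        num = g1.mean(axis=1) - g0.mean(axis=1)
        den = np.sqrt(g1.var(axis=1, ddof=1) / g1.shape[1] +
                      g0.var(axis=1, ddof=1) / g0.shape[1])
        return num / den

    observed = np.abs(t_stats(labels))
    exceed = np.zeros_like(observed)
    for _ in range(n_perm):                 # each iteration is independent,
        perm = rng.permutation(labels)      # hence trivially parallelizable
        exceed += np.abs(t_stats(perm)) >= observed
    return (exceed + 1) / (n_perm + 1)
```

Each permutation is independent of the others, which is exactly why offloading this loop to a GPU, as permGPU does, scales so well.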
NASA Astrophysics Data System (ADS)
Grujicic, M.; Arakere, G.; Hariharan, A.; Pandurangan, B.
2012-06-01
The introduction of newer joining technologies such as friction-stir welding (FSW) into automotive engineering requires knowledge of the joint-material microstructure and properties. Since the development of vehicles (including military vehicles capable of surviving blast and ballistic impacts) nowadays involves extensive use of computational engineering analyses (CEA), robust high-fidelity material models are needed for the FSW joints. A two-level material-homogenization procedure is proposed and utilized in this study to help manage computational cost and computer storage requirements for such CEAs. The method utilizes experimental (microstructure, microhardness, tensile testing, and x-ray diffraction) data to construct: (a) the material model for each weld zone and (b) the material model for the entire weld. The procedure is validated by comparing its predictions with the predictions of more detailed but more costly computational analyses.
Social correlates of leisure-time sedentary behaviours in Canadian adults.
Huffman, S; Szafron, M
2017-03-01
Research on the correlates of sedentary behaviour among adults is needed to design health interventions to modify this behaviour. This study explored the associations of social correlates with leisure-time sedentary behaviour of Canadian adults, and whether these associations differ between different types of sedentary behaviour. A sample of 12,021 Canadian adults was drawn from the 2012 Canadian Community Health Survey, and analyzed using binary logistic regression to model the relationships that marital status, the presence of children in the household, and social support have with overall time spent sitting, using a computer, playing video games, watching television, and reading during leisure time. Covariates included gender, age, education, income, employment status, perceived health, physical activity level, body mass index (BMI), and province or territory of residence. Extensive computer time was primarily negatively related to being in a common law relationship, and primarily positively related to being single/never married. Being single/never married was positively associated with extensive sitting time in men only. Having children under 12 in the household was protective against extensive video game and reading times. Increasing social support was negatively associated with extensive computer time in men and women, while among men increasing social support was positively associated with extensive sitting time. Computer, video game, television, and reading time have unique correlates among Canadian adults. Marital status, the presence of children in the household, and social support should be considered in future analyses of sedentary activities in adults.
Chen, Feng; Wang, Shuang; Jiang, Xiaoqian; Ding, Sijie; Lu, Yao; Kim, Jihoon; Sahinalp, S. Cenk; Shimizu, Chisato; Burns, Jane C.; Wright, Victoria J.; Png, Eileen; Hibberd, Martin L.; Lloyd, David D.; Yang, Hai; Telenti, Amalio; Bloss, Cinnamon S.; Fox, Dov; Lauter, Kristin; Ohno-Machado, Lucila
2017-01-01
Motivation: We introduce PRINCESS, a privacy-preserving international collaboration framework for analyzing rare disease genetic data that are distributed across different continents. PRINCESS leverages Software Guard Extensions (SGX) and hardware for trustworthy computation. Unlike a traditional international collaboration model, where individual-level patient DNA are physically centralized at a single site, PRINCESS performs a secure and distributed computation over encrypted data, fulfilling institutional policies and regulations for protected health information. Results: To demonstrate PRINCESS’ performance and feasibility, we conducted a family-based allelic association study for Kawasaki Disease, with data hosted in three different continents. The experimental results show that PRINCESS provides secure and accurate analyses much faster than alternative solutions, such as homomorphic encryption and garbled circuits (over 40 000× faster). Availability and Implementation: https://github.com/achenfengb/PRINCESS_opensource Contact: shw070@ucsd.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28065902
Numerical, mathematical models of water and chemical movement in soils are used as decision aids for determining soil screening levels (SSLs) of radionuclides in the unsaturated zone. Many models require extensive input parameters which include uncertainty due to soil variabil...
Silberstein, M.; Tzemach, A.; Dovgolevsky, N.; Fishelson, M.; Schuster, A.; Geiger, D.
2006-01-01
Computation of LOD scores is a valuable tool for mapping disease-susceptibility genes in the study of Mendelian and complex diseases. However, computation of exact multipoint likelihoods of large inbred pedigrees with extensive missing data is often beyond the capabilities of a single computer. We present a distributed system called “SUPERLINK-ONLINE,” for the computation of multipoint LOD scores of large inbred pedigrees. It achieves high performance via the efficient parallelization of the algorithms in SUPERLINK, a state-of-the-art serial program for these tasks, and through the use of the idle cycles of thousands of personal computers. The main algorithmic challenge has been to efficiently split a large task for distributed execution in a highly dynamic, nondedicated running environment. Notably, the system is available online, which allows computationally intensive analyses to be performed with no need for either the installation of software or the maintenance of a complicated distributed environment. As the system was being developed, it was extensively tested by collaborating medical centers worldwide on a variety of real data sets, some of which are presented in this article. PMID:16685644
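For reference, the quantity whose computation is being distributed here is the LOD score; in its simplest two-point form, for pedigree data D and recombination fraction θ, it is the base-10 log-likelihood ratio of linkage against free recombination:

$$\mathrm{LOD}(\theta) = \log_{10}\frac{L(D \mid \theta)}{L(D \mid \theta = 1/2)}.$$

Multipoint LOD scores generalize this by evaluating the pedigree likelihood jointly over many linked markers, which is the computation that becomes intractable for large inbred pedigrees with extensive missing data and that SUPERLINK-ONLINE parallelizes.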
Petascale supercomputing to accelerate the design of high-temperature alloys
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...
2017-10-25
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
Petascale supercomputing to accelerate the design of high-temperature alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
Petascale supercomputing to accelerate the design of high-temperature alloys
NASA Astrophysics Data System (ADS)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen
2017-12-01
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
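A minimal sketch of the last step described above (learning segregation energies from tabulated descriptors instead of running new DFT supercell calculations) might look like the following scikit-learn snippet; the CSV file, descriptor columns and model choice are illustrative assumptions, not the feature set or learner used in the paper.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical table: one row per solute/interface combination, with
# elemental descriptors as features and a DFT segregation energy as target.
data = np.genfromtxt("segregation_descriptors.csv", delimiter=",", names=True)
feature_names = [n for n in data.dtype.names if n != "E_seg"]
X = np.column_stack([data[n] for n in feature_names])
y = data["E_seg"]

model = GradientBoostingRegressor(random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))

model.fit(X, y)  # the fitted surrogate can then screen new solutes cheaply
```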
Reanalysis, compatibility and correlation in analysis of modified antenna structures
NASA Technical Reports Server (NTRS)
Levy, R.
1989-01-01
A simple computational procedure is synthesized to process changes in the microwave-antenna pathlength-error measure when there are changes in the antenna structure model. The procedure employs structural modification reanalysis methods combined with new extensions of correlation analysis to provide the revised rms pathlength error. Mainframe finite-element-method processing of the structure model is required only for the initial unmodified structure, and elementary postprocessor computations develop and deal with the effects of the changes. Several illustrative computational examples are included. The procedure adapts readily to processing spectra of changes for parameter studies or sensitivity analyses.
Does H → γγ taste like vanilla new physics?
NASA Astrophysics Data System (ADS)
Almeida, L. G.; Bertuzzo, E.; Machado, P. A. N.; Funchal, R. Zukanovich
2012-11-01
We analyse the interplay between the Higgs to diphoton rate and electroweak precision measurement constraints in extensions of the Standard Model with new uncolored charged fermions that do not mix with the ordinary ones. We also compute the pair production cross sections for the lightest fermion and compare them with current bounds.
An object oriented Python interface for atomistic simulations
NASA Astrophysics Data System (ADS)
Hynninen, T.; Himanen, L.; Parkkinen, V.; Musso, T.; Corander, J.; Foster, A. S.
2016-01-01
Programmable simulation environments allow one to monitor and control calculations efficiently and automatically before, during, and after runtime. Environments directly accessible in a programming environment can be interfaced with powerful external analysis tools and extensions to enhance the functionality of the core program, and by incorporating a flexible object based structure, the environments make building and analysing computational setups intuitive. In this work, we present a classical atomistic force field with an interface written in Python language. The program is an extension for an existing object based atomistic simulation environment.
Chen, Feng; Wang, Shuang; Jiang, Xiaoqian; Ding, Sijie; Lu, Yao; Kim, Jihoon; Sahinalp, S Cenk; Shimizu, Chisato; Burns, Jane C; Wright, Victoria J; Png, Eileen; Hibberd, Martin L; Lloyd, David D; Yang, Hai; Telenti, Amalio; Bloss, Cinnamon S; Fox, Dov; Lauter, Kristin; Ohno-Machado, Lucila
2017-03-15
We introduce PRINCESS, a privacy-preserving international collaboration framework for analyzing rare disease genetic data that are distributed across different continents. PRINCESS leverages Software Guard Extensions (SGX) and hardware for trustworthy computation. Unlike a traditional international collaboration model, where individual-level patient DNA are physically centralized at a single site, PRINCESS performs a secure and distributed computation over encrypted data, fulfilling institutional policies and regulations for protected health information. To demonstrate PRINCESS' performance and feasibility, we conducted a family-based allelic association study for Kawasaki Disease, with data hosted in three different continents. The experimental results show that PRINCESS provides secure and accurate analyses much faster than alternative solutions, such as homomorphic encryption and garbled circuits (over 40 000× faster). https://github.com/achenfengb/PRINCESS_opensource. shw070@ucsd.edu. Supplementary data are available at Bioinformatics online.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sasser, D.W.
1978-03-01
EASI (Estimate of Adversary Sequence Interruption) is an analytical technique for measuring the effectiveness of physical protection systems. EASI Graphics is a computer graphics extension of EASI which provides a capability for performing sensitivity and trade-off analyses of the parameters of a physical protection system. This document reports on the implementation of EASI Graphics and illustrates its application with some examples.
Extending Landauer's bound from bit erasure to arbitrary computation
NASA Astrophysics Data System (ADS)
Wolpert, David
The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical ``general purpose computer'' considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the ``algorithmic probability'' of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can either be infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No. TWCF0079/AB47 from the Templeton World Charity Foundation, Grant No. FQXi-RHl3-1349 from the FQXi foundation, and Grant No. CHE-1648973 from the U.S. National Science Foundation.
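For readers unfamiliar with the two quantities invoked in result (i), the standard definitions for a prefix universal Turing machine U are given below, together with Levin's coding theorem mentioned at the end of the abstract; these are stated only to clarify the terms, not to restate the paper's exact bounds.

$$K_U(Y) = \min\{\,|p| : U(p) = Y\,\}, \qquad P_U(Y) = \sum_{p\,:\,U(p) = Y} 2^{-|p|},$$

$$K_U(Y) = -\log_2 P_U(Y) + O(1) \quad \text{(Levin's coding theorem)},$$

where |p| denotes the length of program p in bits.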
3-D modeling of ductile tearing using finite elements: Computational aspects and techniques
NASA Astrophysics Data System (ADS)
Gullerud, Arne Stewart
This research focuses on the development and application of computational tools to perform large-scale, 3-D modeling of ductile tearing in engineering components under quasi-static to mild loading rates. Two standard models for ductile tearing---the computational cell methodology and crack growth controlled by the crack tip opening angle (CTOA)---are described and their 3-D implementations are explored. For the computational cell methodology, quantification of the effects of several numerical issues---computational load step size, procedures for force release after cell deletion, and the porosity for cell deletion---enables construction of computational algorithms to remove the dependence of predicted crack growth on these issues. This work also describes two extensions of the CTOA approach into 3-D: a general 3-D method and a constant front technique. Analyses compare the characteristics of the extensions, and a validation study explores the ability of the constant front extension to predict crack growth in thin aluminum test specimens over a range of specimen geometries, absolute sizes, and levels of out-of-plane constraint. To provide a computational framework suitable for the solution of these problems, this work also describes the parallel implementation of a nonlinear, implicit finite element code. The implementation employs an explicit message-passing approach using the MPI standard to maintain portability, a domain decomposition of element data to provide parallel execution, and a master-worker organization of the computational processes to enhance future extensibility. A linear preconditioned conjugate gradient (LPCG) solver serves as the core of the solution process. The parallel LPCG solver utilizes an element-by-element (EBE) structure of the computations to permit a dual-level decomposition of the element data: domain decomposition of the mesh provides efficient coarse-grain parallel execution, while decomposition of the domains into blocks of similar elements (same type, constitutive model, etc.) provides fine-grain parallel computation on each processor. A major focus of the LPCG solver is a new implementation of the Hughes-Winget element-by-element (HW) preconditioner. The implementation employs a weighted dependency graph combined with a new coloring algorithm to provide load-balanced scheduling for the preconditioner and overlapped communication/computation. This approach enables efficient parallel application of the HW preconditioner for arbitrary unstructured meshes.
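To fix ideas, the core iteration of a linear preconditioned conjugate gradient solver is sketched below in generic serial Python/NumPy form with a simple Jacobi preconditioner; this is only a schematic of the LPCG kernel, and it does not reproduce the element-by-element Hughes-Winget preconditioner or the parallel graph-coloring scheduling developed in the thesis.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-8, max_iter=1000):
    """Preconditioned conjugate gradients for a symmetric positive definite A.

    M_inv: callable applying the inverse of the preconditioner to a vector.
    """
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small example with a Jacobi (diagonal) preconditioner
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(pcg(A, b, M_inv=lambda r: r / np.diag(A)))
```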
Wheeze sound analysis using computer-based techniques: a systematic review.
Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian
2017-10-31
Wheezes are high-pitched continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and diseases or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques on the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that 1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, 2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction with normal breathing, 3) analysis using combinations of features and on subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathology that stem from airway obstruction.
Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA
2008-01-01
Background: Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results: EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion: EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776
NASA Technical Reports Server (NTRS)
Halford, Gary R.; Shah, Ashwin; Arya, Vinod K.; Krause, David L.; Bartolotta, Paul A.
2002-01-01
Deep-space missions require onboard electric power systems with reliable design lifetimes of up to 10 yr and beyond. A high-efficiency Stirling radioisotope power system is a likely candidate for future deep-space missions and Mars rover applications. To ensure ample durability, the structurally critical heater head of the Stirling power convertor has undergone extensive computational analyses of operating temperatures (up to 650 C), stresses, and creep resistance of the thin-walled Inconel 718 bill of material. Durability predictions are presented in terms of the probability of survival. A benchmark structural testing program has commenced to support the analyses. This report presents the current status of durability assessments.
Campos-Filho, N; Franco, E L
1989-02-01
A frequent procedure in matched case-control studies is to report results from the multivariate unmatched analyses if they do not differ substantially from the ones obtained after conditioning on the matching variables. Although conceptually simple, this rule requires that an extensive series of logistic regression models be evaluated by both the conditional and unconditional maximum likelihood methods. Most computer programs for logistic regression employ only one maximum likelihood method, which requires that the analyses be performed in separate steps. This paper describes a Pascal microcomputer (IBM PC) program that performs multiple logistic regression by both maximum likelihood estimation methods, which obviates the need for switching between programs to obtain relative risk estimates from both matched and unmatched analyses. The program calculates most standard statistics and allows factoring of categorical or continuous variables by two distinct methods of contrast. A built-in, descriptive statistics option allows the user to inspect the distribution of cases and controls across categories of any given variable.
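In a present-day environment the same two-pronged analysis can be sketched in Python with statsmodels, assuming a recent release that provides ConditionalLogit; the data file and variable names are placeholders, and this is of course not the Pascal program described above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("matched_case_control.csv")      # hypothetical data set
covars = ["exposure", "age", "smoking"]

# Unconditional (ordinary) logistic regression, ignoring the matching
unmatched = sm.Logit(df["case"], sm.add_constant(df[covars])).fit()

# Conditional logistic regression, stratified on the matched sets
matched = ConditionalLogit(df["case"], df[covars],
                           groups=df["match_set"]).fit()

# Relative risk (odds ratio) estimates from the two analyses, side by side
print(np.exp(unmatched.params))
print(np.exp(matched.params))
```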
An interactive environment for agile analysis and visualization of ChIP-sequencing data.
Lerdrup, Mads; Johansen, Jens Vilstrup; Agrawal-Singh, Shuchi; Hansen, Klaus
2016-04-01
To empower experimentalists with a means for fast and comprehensive chromatin immunoprecipitation sequencing (ChIP-seq) data analyses, we introduce an integrated computational environment, EaSeq. The software combines the exploratory power of genome browsers with an extensive set of interactive and user-friendly tools for genome-wide abstraction and visualization. It enables experimentalists to easily extract information and generate hypotheses from their own data and public genome-wide datasets. For demonstration purposes, we performed meta-analyses of public Polycomb ChIP-seq data and established a new screening approach to analyze more than 900 datasets from mouse embryonic stem cells for factors potentially associated with Polycomb recruitment. EaSeq, which is freely available and works on a standard personal computer, can substantially increase the throughput of many analysis workflows, facilitate transparency and reproducibility by automatically documenting and organizing analyses, and enable a broader group of scientists to gain insights from ChIP-seq data.
Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano
2013-01-01
The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.
2013-03-01
Multi-Length Scale-Enriched Continuum-Level Material Model for Kevlar-Fiber-Reinforced Polymer-Matrix Composites, by M. Grujicic, B. Pandurangan, J.S… (Journal of Materials…). The fragmented record refers to an extensive set of molecular-level computational analyses regarding the role of various microstructural/morphological defects on the Kevlar fiber, and to coarser-scale materials and structures containing Kevlar fibers (e.g., yarns, fabrics, plies, lamina, and laminates).
Alternate concepts study extension. Volume 2: Part 4: Avionics
NASA Technical Reports Server (NTRS)
1971-01-01
A recommended baseline system is presented along with alternate avionics systems, Mark 2 avionics, booster avionics, and a cost summary. Analyses and discussions are included on the Mark 1 orbiter avionics subsystems, electrical ground support equipment, and the computer programs. Results indicate a need to define all subsystems of the baseline system, an installation study to determine the impact on the crew station, and a study on access for maintenance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
liu, feng
This theoretical project has been carried out in close interaction with the experimental project at UW-Madison under the same title led by PI Max Lagally and co-PI Mark Eriksson. Extensive computational studies have been performed to address a broad range of topics from atomic structure, stability, mechanical property, to electronic structure, optoelectronic and transport properties of various nanoarchitectures in the context of Si and other solid nanomembranes. These have been done by using combinations of different theoretical and computational approaches, ranging from first-principles calculations and molecular dynamics (MD) simulations to finite-element (FE) analyses and continuum modeling.
Ultrasonic Doppler measurement of renal artery blood flow
NASA Technical Reports Server (NTRS)
Freund, W. R.; Meindl, J. D.
1975-01-01
An extensive evaluation of the practical and theoretical limitations encountered in the use of totally implantable CW Doppler flowmeters is provided. Theoretical analyses, computer models, in-vitro and in-vivo calibration studies describe the sources and magnitudes of potential errors in the measurement of blood flow through the renal artery, as well as larger vessels in the circulatory system. The evaluation of new flowmeter/transducer systems and their use in physiological investigations is reported.
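For orientation, these flowmeters rely on the classical continuous-wave Doppler relation: with ultrasound carrier frequency f0, blood velocity v, beam-to-flow angle θ, and speed of sound c in tissue,

$$\Delta f = \frac{2 f_0 v \cos\theta}{c},$$

so uncertainties in the assumed beam angle or sound speed propagate directly into the computed velocity and flow, which is the kind of error source the analyses above quantify.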
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.
The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
ELAV Links Paused Pol II to Alternative Polyadenylation in the Drosophila Nervous System
Oktaba, Katarzyna; Zhang, Wei; Lotz, Thea Sabrina; Jun, David Jayhyun; Lemke, Sandra Beatrice; Ng, Samuel Pak; Esposito, Emilia; Levine, Michael; Hilgers, Valérie
2014-01-01
Alternative polyadenylation (APA) has been implicated in a variety of developmental and disease processes. A particularly dramatic form of APA occurs in the developing nervous system of flies and mammals, whereby various developmental genes undergo coordinate 3′ UTR extension. In Drosophila, the RNA-binding protein ELAV inhibits RNA processing at proximal polyadenylation sites, thereby fostering the formation of exceptionally long 3′ UTRs. Here, we present evidence that paused Pol II promotes recruitment of ELAV to extended genes. Replacing promoters of extended genes with heterologous promoters blocks normal 3′ extension in the nervous system, while extension-associated promoters can induce 3′ extension in ectopic tissues expressing ELAV. Computational analyses suggest that promoter regions of extended genes tend to contain paused Pol II and associated cis-regulatory elements such as GAGA. ChIP-Seq assays identify ELAV in the promoter regions of extended genes. Our study provides evidence for a regulatory link between promoter-proximal pausing and APA. PMID:25544561
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Reaves, Mercedes C.
2008-01-01
As NASA moves towards developing technologies needed to implement its new Exploration program, studies conducted for Apollo in the 1960s to understand the rollover stability of capsules at landing are being revisited. Although rigid body kinematics analyses of the roll-over behavior of capsules on impact provided critical insight into the Apollo problem, extensive ground test programs were also used. For the new Orion spacecraft being developed to implement today's Exploration program, new air-bag designs have improved sufficiently for NASA to consider their use to mitigate landing loads to ensure crew safety and to enable re-usability of the capsule. Simple kinematics models provide only limited understanding of the behavior of these air bag systems, and more sophisticated tools must be used. In particular, NASA and its contractors are using the LS-Dyna nonlinear simulation code for impact response predictions of the full Orion vehicle with air bags by leveraging the extensive air bag prediction work previously done by the automotive industry. However, even in today's computational environment, these analyses are still high-dimensional, time consuming, and computationally intensive. To alleviate the computational burden, this paper presents an approach that uses deterministic sampling techniques and an adaptive response surface method to not only use existing LS-Dyna solutions but also to interpolate from LS-Dyna solutions to predict the stability boundaries for a capsule on airbags. Results for the stability boundary in terms of impact velocities, capsule attitude, impact plane orientation, and impact surface friction are discussed.
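A greatly simplified sketch of the surrogate idea (fit a smooth response surface to a handful of precomputed impact cases, then query it densely to trace the stability boundary) is shown below; the input variables, the toy stability margin values, and the use of SciPy's RBFInterpolator are illustrative assumptions and not the NASA analysis itself.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical precomputed cases: (horizontal velocity, pitch attitude) with a
# scalar stability margin from a high-fidelity run (> 0 means stays upright).
samples = np.array([[0.0, -5.0], [0.0, 10.0], [5.0, -5.0],
                    [5.0, 10.0], [10.0, -5.0], [10.0, 10.0]])
margin = np.array([0.9, 0.7, 0.5, 0.2, 0.1, -0.3])

surrogate = RBFInterpolator(samples, margin)

# Query the surrogate on a fine grid; the zero level set of the interpolated
# margin approximates the stability boundary.
vel, pitch = np.meshgrid(np.linspace(0, 10, 101), np.linspace(-5, 10, 101))
grid = np.column_stack([vel.ravel(), pitch.ravel()])
stable = (surrogate(grid) > 0).reshape(vel.shape)
print("fraction of grid predicted stable:", stable.mean())
```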
10 CFR 76.74 - Computation and extension of time.
Code of Federal Regulations, 2011 CFR
2011-01-01
10 CFR 76.74, Energy; NUCLEAR REGULATORY COMMISSION (CONTINUED); CERTIFICATION OF GASEOUS DIFFUSION PLANTS; Certification. § 76.74 Computation and extension of time. (a) In computing any period of time, the day of the act...
10 CFR 76.74 - Computation and extension of time.
Code of Federal Regulations, 2012 CFR
2012-01-01
10 CFR 76.74, Energy; NUCLEAR REGULATORY COMMISSION (CONTINUED); CERTIFICATION OF GASEOUS DIFFUSION PLANTS; Certification. § 76.74 Computation and extension of time. (a) In computing any period of time, the day of the act...
10 CFR 76.74 - Computation and extension of time.
Code of Federal Regulations, 2013 CFR
2013-01-01
10 CFR 76.74, Energy; NUCLEAR REGULATORY COMMISSION (CONTINUED); CERTIFICATION OF GASEOUS DIFFUSION PLANTS; Certification. § 76.74 Computation and extension of time. (a) In computing any period of time, the day of the act...
10 CFR 76.74 - Computation and extension of time.
Code of Federal Regulations, 2010 CFR
2010-01-01
10 CFR 76.74, Energy; NUCLEAR REGULATORY COMMISSION (CONTINUED); CERTIFICATION OF GASEOUS DIFFUSION PLANTS; Certification. § 76.74 Computation and extension of time. (a) In computing any period of time, the day of the act...
10 CFR 76.74 - Computation and extension of time.
Code of Federal Regulations, 2014 CFR
2014-01-01
10 CFR 76.74, Energy; NUCLEAR REGULATORY COMMISSION (CONTINUED); CERTIFICATION OF GASEOUS DIFFUSION PLANTS; Certification. § 76.74 Computation and extension of time. (a) In computing any period of time, the day of the act...
Domain decomposition for aerodynamic and aeroacoustic analyses, and optimization
NASA Technical Reports Server (NTRS)
Baysal, Oktay
1995-01-01
The overarching theme was the domain decomposition, which was intended to improve the numerical solution technique for the partial differential equations at hand; in the present study, those that governed either the fluid flow, or the aeroacoustic wave propagation, or the sensitivity analysis for a gradient-based optimization. The role of the domain decomposition extended beyond the original impetus of discretizing geometrically complex regions or writing modular software for distributed-hardware computers. It induced function-space decompositions and operator decompositions that offered the valuable property of near independence of operator evaluation tasks. The objectives revolved around the extensions and implementations of either the previously developed or concurrently being developed methodologies: (1) aerodynamic sensitivity analysis with domain decomposition (SADD); (2) computational aeroacoustics of cavities; and (3) dynamic, multibody computational fluid dynamics using unstructured meshes.
Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano
2013-01-01
The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard “condition-based” designs, as well as “computational” methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli. PMID:24194828
Miller, Mark P.; Knaus, Brian J.; Mullins, Thomas D.; Haig, Susan M.
2013-01-01
SSR_pipeline is a flexible set of programs designed to efficiently identify simple sequence repeats (SSRs; for example, microsatellites) from paired-end high-throughput Illumina DNA sequencing data. The program suite contains three analysis modules along with a fourth control module that can be used to automate analyses of large volumes of data. The modules are used to (1) identify the subset of paired-end sequences that pass quality standards, (2) align paired-end reads into a single composite DNA sequence, and (3) identify sequences that possess microsatellites conforming to user specified parameters. Each of the three separate analysis modules also can be used independently to provide greater flexibility or to work with FASTQ or FASTA files generated from other sequencing platforms (Roche 454, Ion Torrent, etc). All modules are implemented in the Python programming language and can therefore be used from nearly any computer operating system (Linux, Macintosh, Windows). The program suite relies on a compiled Python extension module to perform paired-end alignments. Instructions for compiling the extension from source code are provided in the documentation. Users who do not have Python installed on their computers or who do not have the ability to compile software also may choose to download packaged executable files. These files include all Python scripts, a copy of the compiled extension module, and a minimal installation of Python in a single binary executable. See program documentation for more information.
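To make the third module's task concrete, here is a small standalone Python sketch that scans a DNA sequence for perfect microsatellites matching user-specified motif lengths and repeat counts; it illustrates the idea only and is not SSR_pipeline's code or command-line interface.

```python
import re

def find_ssrs(seq, motif_lengths=(2, 3, 4), min_repeats=5):
    """Yield (start, motif, n_repeats) for perfect SSRs in a DNA sequence."""
    seq = seq.upper()
    for k in motif_lengths:
        # a k-mer motif repeated at least min_repeats times in tandem
        pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (k, min_repeats - 1))
        for m in pattern.finditer(seq):
            motif = m.group(1)
            if len(set(motif)) > 1:        # skip homopolymer runs
                yield m.start(), motif, len(m.group(0)) // k

for hit in find_ssrs("ATATATATATATGGCACACACACACACATT"):
    print(hit)
```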
NASA Technical Reports Server (NTRS)
Dreisbach, R. L. (Editor)
1979-01-01
The input data and execution control statements for the ATLAS integrated structural analysis and design system are described. It is operational on the Control Data Corporation (CDC) 6600/CYBER computers in a batch mode or in a time-shared mode via interactive graphic or text terminals. ATLAS is a modular system of computer codes with common executive and data base management components. The system provides an extensive set of general-purpose technical programs with analytical capabilities including stiffness, stress, loads, mass, substructuring, strength design, unsteady aerodynamics, vibration, and flutter analyses. The sequence and mode of execution of selected program modules are controlled via a common user-oriented language.
A sediment graph model based on SCS-CN method
NASA Astrophysics Data System (ADS)
Singh, P. K.; Bhunya, P. K.; Mishra, S. K.; Chaube, U. C.
2008-01-01
This paper proposes new conceptual sediment graph models based on coupling of popular and extensively used methods, viz., Nash model based instantaneous unit sediment graph (IUSG), soil conservation service curve number (SCS-CN) method, and Power law. These models vary in their complexity and this paper tests their performance using data of the Nagwan watershed (area = 92.46 km²) (India). The sensitivity of total sediment yield and peak sediment flow rate computations to model parameterisation is analysed. The exponent of the Power law, β, is more sensitive than other model parameters. The models are found to have substantial potential for computing sediment graphs (temporal sediment flow rate distribution) as well as total sediment yield.
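The SCS-CN component of such models rests on the standard curve-number runoff relation; a small Python helper illustrating that relation in millimetres, with the usual initial-abstraction ratio of 0.2, is given below (the sediment-graph coupling itself, i.e. the IUSG and the Power law, is not reproduced here).

```python
def scs_cn_runoff(rainfall_mm, cn, ia_ratio=0.2):
    """Direct runoff Q (mm) from the SCS curve number (SCS-CN) method.

    S  = 25400 / CN - 254            potential maximum retention (mm)
    Ia = ia_ratio * S                initial abstraction (mm)
    Q  = (P - Ia)^2 / (P - Ia + S)   for P > Ia, otherwise 0
    """
    s = 25400.0 / cn - 254.0
    ia = ia_ratio * s
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

# Example: a 75 mm storm on a catchment with curve number 80
print(scs_cn_runoff(75.0, cn=80))
```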
Access to pedestrian roads, daily activities, and physical performance of adolescents.
Sjolie, A N
2000-08-01
A cross-sectional study using a questionnaire and physical tests was performed. To study how access to pedestrian roads and daily activities are related to low back strength, low back mobility, and hip mobility in adolescents. Although many authorities express concern about the passive lifestyle of adolescents, little is known about associations between daily activities and physical performance. This study compared 38 youths in a community lacking access to pedestrian roads with 50 youths in a nearby area providing excellent access to pedestrian roads. A standardized questionnaire was used to obtain data about pedestrian roads, school journeys, and activities from the local authorities and the pupils. Low back strength was tested as static endurance strength, low back mobility by modified Schober techniques, and hip mobility by goniometer. For statistical analyses, a P value of 0.05 or less determined significance. In the area using school buses, the pupils had less low back extension, less hamstring flexibility, and less hip abduction, flexion, and extension than pupils in the area with pedestrian roads. Multivariate analyses showed no associations between walking or bicycling to school and anatomic function, but regular walking or bicycling to leisure-time activities associated positively with low back strength, low back extension, hip flexion, and extension. Distance by school bus associated negatively with hip abduction, hip flexion, hip extension, and hamstring flexibility (P<0.001). Time spent on television or computer associated negatively but insignificantly with low back strength, hamstring flexibility, hip abduction, and flexion (P<0.1). The results indicate that access to pedestrian roads and other lifestyle factors are associated with physical performance.
A Fine-Grained and Privacy-Preserving Query Scheme for Fog Computing-Enhanced Location-Based Service
Yin, Fan; Tang, Xiaohu
2017-01-01
Location-based services (LBS), as one of the most popular location-awareness applications, has been further developed to achieve low-latency with the assistance of fog computing. However, privacy issues remain a research challenge in the context of fog computing. Therefore, in this paper, we present a fine-grained and privacy-preserving query scheme for fog computing-enhanced location-based services, hereafter referred to as FGPQ. In particular, mobile users can obtain the fine-grained searching result satisfying not only the given spatial range but also the searching content. Detailed privacy analysis shows that our proposed scheme indeed achieves the privacy preservation for the LBS provider and mobile users. In addition, extensive performance analyses and experiments demonstrate that the FGPQ scheme can significantly reduce computational and communication overheads and ensure the low-latency, which outperforms existing state-of-the-art schemes. Hence, our proposed scheme is more suitable for real-time LBS searching. PMID:28696395
Equipment for linking the AutoAnalyzer on-line to a computer
Simpson, D.; Sims, G. E.; Harrison, M. I.; Whitby, L. G.
1971-01-01
An Elliott 903 computer with 8K central core store and magnetic tape backing store has been operated for approximately 20 months in a clinical chemistry laboratory. Details of the equipment designed for linking AutoAnalyzers on-line to the computer are described, and data presented concerning the time required by the computer for different processes. The reliability of the various components in daily operation is discussed. Limitations in the system's capabilities have been defined, and ways of overcoming these are delineated. At present, routine operations include the preparation of worksheets for a limited range of tests (five channels), monitoring of up to 11 AutoAnalyzer channels at a time on a seven-day week basis (with process control and automatic calculation of results), and the provision of quality control data. Cumulative reports can be printed out on those analyses for which computer-prepared worksheets are provided but the system will require extension before these can be issued sufficiently rapidly for routine use. PMID:5551384
Yang, Xue; Yin, Fan; Tang, Xiaohu
2017-07-11
Location-based services (LBS), as one of the most popular location-awareness applications, has been further developed to achieve low-latency with the assistance of fog computing. However, privacy issues remain a research challenge in the context of fog computing. Therefore, in this paper, we present a fine-grained and privacy-preserving query scheme for fog computing-enhanced location-based services, hereafter referred to as FGPQ. In particular, mobile users can obtain the fine-grained searching result satisfying not only the given spatial range but also the searching content. Detailed privacy analysis shows that our proposed scheme indeed achieves the privacy preservation for the LBS provider and mobile users. In addition, extensive performance analyses and experiments demonstrate that the FGPQ scheme can significantly reduce computational and communication overheads and ensure the low-latency, which outperforms existing state-of-the-art schemes. Hence, our proposed scheme is more suitable for real-time LBS searching.
CSB: a Python framework for structural bioinformatics.
Kalev, Ivan; Mechelke, Martin; Kopec, Klaus O; Holder, Thomas; Carstens, Simeon; Habeck, Michael
2012-11-15
Computational Structural Biology Toolbox (CSB) is a cross-platform Python class library for reading, storing and analyzing biomolecular structures with rich support for statistical analyses. CSB is designed for reusability and extensibility and comes with a clean, well-documented API following good object-oriented engineering practice. Stable release packages are available for download from the Python Package Index (PyPI) as well as from the project's website http://csb.codeplex.com. ivan.kalev@gmail.com or michael.habeck@tuebingen.mpg.de
Analytical and experimental vibration studies of a 1/8-scale shuttle orbiter
NASA Technical Reports Server (NTRS)
Pinson, L. D.
1975-01-01
Natural frequencies and mode shapes for four symmetric vibration modes and four antisymmetric modes are compared with predictions based on NASTRAN finite-element analyses. Initial predictions gave poor agreement with test data; an extensive investigation revealed that the major factors influencing agreement were out-of-plane imperfections in fuselage panels and a soft fin-fuselage connection. Computations with a more refined analysis indicated satisfactory frequency predictions for all modes studied, within 11 percent of experimental values.
Computer aided analysis, simulation and optimisation of thermal sterilisation processes.
Narayanan, C M; Banerjee, Arindam
2013-04-01
Although thermal sterilisation is a widely employed industrial process, little work is reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels and systems that employ external heat exchangers (double pipe, shell and tube and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of del factor on system/operating parameters such as mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate to steam ratio, rate of substrate circulation through heat exchanger and that through holding tube have been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on the system performance has also been analysed. The multiparameter computer aided design (CAD) software packages prepared are thus highly versatile in nature and they permit the optimum choice of operating variables to be made for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants and satisfactory agreement has been observed between the two, thereby ascertaining the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of associated heating/cooling equipment has been performed utilising the most up-to-date design correlations and computer software.
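The central design quantity in such analyses is the del factor, del = ln(N0/N), which for first-order thermal death kinetics is the time integral of an Arrhenius rate constant over the batch temperature history. A minimal sketch of that calculation is given below; the Arrhenius constants and the temperature profile are illustrative placeholders, not values taken from the paper.

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

def del_factor(times_s, temps_K, A=1e36, Ea=2.83e5):
    """Sterilisation del factor: del = ln(N0/N) = integral of A*exp(-Ea/RT) dt.

    A (1/s) and Ea (J/mol) are illustrative spore death-kinetics constants.
    """
    t = np.asarray(times_s, dtype=float)
    k = A * np.exp(-Ea / (R * np.asarray(temps_K, dtype=float)))
    return float(np.sum(0.5 * (k[1:] + k[:-1]) * np.diff(t)))  # trapezoid rule

# Hypothetical heat-hold-cool profile: one hour sampled every 10 s
t = np.linspace(0.0, 3600.0, 361)
T = 300.0 + 95.0 * np.exp(-((t - 1800.0) / 900.0) ** 2)   # peak near 395 K
print("del factor:", del_factor(t, T))
```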
LOD score exclusion analyses for candidate genes using random population samples.
Deng, H W; Li, J; Recker, R R
2001-05-01
While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.
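A minimal sketch of the exclusion idea, under simplifying assumptions (normal phenotypes, an additive candidate-gene effect, and a plain likelihood-ratio LOD rather than the authors' exact formulation): the specified effect is excluded when the LOD score falls to -2.0 or below.

    # Sketch of the exclusion logic on a random population sample; the normal
    # likelihood and the additive model are simplifications for illustration.
    import numpy as np
    from scipy.stats import norm

    def exclusion_lod(phenotype, genotype, effect, sd):
        """LOD = log10 L(specified additive effect) - log10 L(no effect)."""
        # genotype coded 0/1/2 copies of the candidate allele
        mu_alt = phenotype.mean() + effect * (genotype - genotype.mean())
        ll_alt = norm.logpdf(phenotype, loc=mu_alt, scale=sd).sum()
        ll_null = norm.logpdf(phenotype, loc=phenotype.mean(), scale=sd).sum()
        return (ll_alt - ll_null) / np.log(10)

    rng = np.random.default_rng(1)
    geno = rng.binomial(2, 0.3, size=500)       # random population sample
    pheno = rng.normal(0.0, 1.0, size=500)      # phenotype unrelated to the locus
    lod = exclusion_lod(pheno, geno, effect=0.5, sd=1.0)
    print(f"LOD = {lod:.2f}; exclude specified effect: {lod <= -2.0}")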
VALIDATION OF ANSYS FINITE ELEMENT ANALYSIS SOFTWARE
DOE Office of Scientific and Technical Information (OSTI.GOV)
HAMM, E.R.
2003-06-27
This document provides a record of the verification and validation of the ANSYS Version 7.0 software that is installed on selected CH2M HILL computers. The issues addressed include: software verification, installation, validation, configuration management and error reporting. The ANSYS® computer program is a large-scale, multi-purpose finite element program which may be used for solving several classes of engineering analysis problems. The analysis capabilities of ANSYS Full Mechanical Version 7.0 installed on selected CH2M Hill Hanford Group (CH2M HILL) Intel processor based computers include the ability to solve static and dynamic structural analyses, steady-state and transient heat transfer problems, mode-frequency and buckling eigenvalue problems, static or time-varying magnetic analyses and various types of field and coupled-field applications. The program contains many special features which allow nonlinearities or secondary effects to be included in the solution, such as plasticity, large strain, hyperelasticity, creep, swelling, large deflections, contact, stress stiffening, temperature dependency, material anisotropy, and thermal radiation. The ANSYS program has been in commercial use since 1970, and has been used extensively in the aerospace, automotive, construction, electronic, energy services, manufacturing, nuclear, plastics, oil and steel industries.
NASA Astrophysics Data System (ADS)
Eom, Hyun-Jeong; Liu, Yuedan; Kwak, Gyu-Suk; Heo, Muyoung; Song, Kyung Seuk; Chung, Yun Doo; Chon, Tae-Soo; Choi, Jinhee
2017-06-01
We conducted an inhalation toxicity test on the alternative animal model, Drosophila melanogaster, to investigate potential hazards of indoor air pollution. The inhalation toxicity of toluene and formaldehyde was investigated using comprehensive transcriptomics and computational behavior analyses. The ingenuity pathway analysis (IPA) based on microarray data suggests the involvement of pathways related to immune response, stress response, and metabolism in formaldehyde and toluene exposure based on hub molecules. We conducted a toxicity test using mutants of the representative genes in these pathways to explore the toxicological consequences of alterations of these pathways. Furthermore, extensive computational behavior analysis showed that exposure to either toluene or formaldehyde reduced most of the behavioral parameters of both wild-type and mutants. Interestingly, behavioral alteration caused by toluene or formaldehyde exposure was most severe in the p38b mutant, suggesting that the defects in the p38 pathway underlie behavioral alteration. Overall, the results indicate that exposure to toluene and formaldehyde via inhalation causes severe toxicity in Drosophila, by inducing significant alterations in gene expression and behavior, suggesting that Drosophila can be used as a potential alternative model in inhalation toxicity screening.
DDGui, a new and fast way to analyse DRAGON and DONJON code results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chambon, R.; Marleau, G.
2012-07-01
With the greatly increased performance of computers, the results from DRAGON and DONJON have increased in size and complexity. The scroll, copy and paste technique to extract results is no longer appropriate. Many in-house scripts, software tools and macros have been developed to make data gathering easier. However, the limitation of these solutions is their specificity and the difficulty of exporting them from one place to another. A general tool usable and accessible by everyone was needed. The first bricks for a very fast and intuitive way to analyse DRAGON and DONJON results have been put together in the graphical user interface DDGUI. Based on the extensive ROOT C++ package, the possible features are numerous. For this first version of the software, we have programmed the fundamental tools likely to be the most useful on an everyday basis: viewing the content of data structures, drawing the geometry, and drawing the flux or power from a DONJON computation. The tests show how quickly the user can get the information needed for a general overview or for more precise analyses. Several other features will be implemented in the near future. (authors)
Deciphering functional glycosaminoglycan motifs in development.
Townley, Robert A; Bülow, Hannes E
2018-03-23
Glycosaminoglycans (GAGs) such as heparan sulfate, chondroitin/dermatan sulfate, and keratan sulfate are linear glycans, which when attached to protein backbones form proteoglycans. GAGs are essential components of the extracellular space in metazoans. Extensive modifications of the glycans such as sulfation, deacetylation and epimerization create structural GAG motifs. These motifs regulate protein-protein interactions and are thereby responsible for many of the essential functions of GAGs. This review focuses on recent genetic approaches to characterize GAG motifs and their function in defined signaling pathways during development. We discuss a coding approach for GAGs that would enable computational analyses of GAG sequences such as alignments and the computation of position weight matrices to describe GAG motifs. Copyright © 2018 Elsevier Ltd. All rights reserved.
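As an illustration of the kind of computation such a GAG code would enable, the sketch below builds a log-odds position weight matrix from aligned, symbol-encoded GAG chains. The single-letter disaccharide alphabet and the toy motifs are invented for the example and are not a code proposed in the review.

    # Sketch: position weight matrix over a hypothetical GAG disaccharide alphabet.
    import numpy as np

    ALPHABET = "ABCD"   # hypothetical codes for differently modified disaccharide units

    def position_weight_matrix(motifs, pseudocount=0.5):
        """Log-odds PWM (positions x symbols) against a uniform background."""
        L = len(motifs[0])
        counts = np.full((L, len(ALPHABET)), pseudocount)
        for m in motifs:
            for pos, sym in enumerate(m):
                counts[pos, ALPHABET.index(sym)] += 1
        freqs = counts / counts.sum(axis=1, keepdims=True)
        return np.log2(freqs * len(ALPHABET))   # uniform background = 1/len(ALPHABET)

    motifs = ["ABBA", "ABCA", "ABBA", "DBBA"]   # toy aligned GAG motifs
    print(np.round(position_weight_matrix(motifs), 2))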
A method for interactive satellite failure diagnosis: Towards a connectionist solution
NASA Technical Reports Server (NTRS)
Bourret, P.; Reggia, James A.
1989-01-01
Various kinds of processes which allow one to make a diagnosis are analyzed. The analysis then focuses on one of these processes used for satellite failure diagnosis. This process consists of sending the satellite instructions about system status alterations: to mask the effects of one possible component failure or to look for additional abnormal measurements. A formal model of this process is given. This model is an extension of a previously defined connectionist model which allows computation of ratios between the likelihoods of observed manifestations according to various diagnostic hypotheses. The expected mean value of these likelihood measures for each possible status of the satellite can be computed in a similar way. Therefore, it is possible to select the most appropriate status according to three different purposes: to confirm a hypothesis, to eliminate a hypothesis, or to choose between two hypotheses. Finally, a first connectionist schema of computation of these expected mean values is given.
Images as drivers of progress in cardiac computational modelling
Lamata, Pablo; Casero, Ramón; Carapella, Valentina; Niederer, Steve A.; Bishop, Martin J.; Schneider, Jürgen E.; Kohl, Peter; Grau, Vicente
2014-01-01
Computational models have become a fundamental tool in cardiac research. Models are evolving to cover multiple scales and physical mechanisms. They are moving towards mechanistic descriptions of personalised structure and function, including effects of natural variability. These developments are underpinned to a large extent by advances in imaging technologies. This article reviews how novel imaging technologies, or the innovative use and extension of established ones, integrate with computational models and drive novel insights into cardiac biophysics. In terms of structural characterization, we discuss how imaging is allowing a wide range of scales to be considered, from cellular levels to whole organs. We analyse how the evolution from structural to functional imaging is opening new avenues for computational models, and in this respect we review methods for measurement of electrical activity, mechanics and flow. Finally, we consider ways in which combined imaging and modelling research is likely to continue advancing cardiac research, and identify some of the main challenges that remain to be solved. PMID:25117497
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasenkamp, Daren; Sim, Alexander; Wehner, Michael
Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many of the data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of the cloud computing systems and has revealed a number of weaknesses in the current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup that is comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. The cloud-based approach is more flexible because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. The cloud-based approach is also more resilient to failure; as long as a single VM is running, the analysis can make progress, whereas an MPI job fails as soon as one node fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.
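The VM provisioning and data-staging services are the substance of the work and are not reproduced here; the sketch below only illustrates the underlying scatter/gather pattern of farming an unmodified sequential analysis out over many workers, with a local process pool standing in for the cloud VMs and a hypothetical trend_analysis command standing in for the cyclone-trend program.

    # Sketch: scatter/gather parallelization of an unmodified sequential analysis.
    # The "trend_analysis" command and the input file names are hypothetical.
    from concurrent.futures import ProcessPoolExecutor
    import subprocess

    INPUT_FILES = [f"cyclone_tracks_{year}.nc" for year in range(1980, 2000)]

    def analyze_one(path):
        """Run the unmodified sequential analysis on a single input file."""
        out = path.replace(".nc", ".summary")
        subprocess.run(["trend_analysis", path, "-o", out], check=True)
        return out

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=8) as pool:         # scatter
            results = list(pool.map(analyze_one, INPUT_FILES))   # gather
        print(f"{len(results)} partial results ready for merging")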
Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine
NASA Astrophysics Data System (ADS)
Sharma, Gulshan B.; Robertson, Douglas D.
2013-07-01
Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than that of the actual specimen. Low predicted bone density was lower than that of the actual specimen. Differences were probably due to applied muscle and joint reaction loads, boundary conditions, and values of constants used. Work is underway to study this. Nonetheless, the results demonstrate three-dimensional bone remodeling simulation validity and potential. Such adaptive predictions take physiological bone remodeling simulations one step closer to reality. Computational analyses are needed that integrate biological remodeling rules and predict how bone will respond over time. We expect the combination of computational static stress analyses together with adaptive bone remodeling simulations to become effective tools for regenerative medicine research.
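A minimal sketch of the remodeling loop described above, assuming a toy stand-in for the finite element solve and an illustrative density-modulus law: each element's strain-energy-density stimulus is compared with a reference value and its density updated until the two balance.

    # Sketch: strain-energy-density driven internal remodeling; the FE solve is a toy
    # surrogate and all constants are illustrative, not the values used in the study.
    import numpy as np

    def fe_strain_energy_density(density, load=1.0):
        """Toy stand-in for the scapula FE solve: SED drops as local stiffness rises."""
        stiffness = density ** 3                      # illustrative density-modulus law
        return load / (2.0 * stiffness)

    def remodel(density, reference_stimulus, n_iter=10, rate=0.1,
                rho_min=0.05, rho_max=1.8):
        for _ in range(n_iter):
            sed = fe_strain_energy_density(density)
            stimulus = sed / density                  # strain energy per unit mass
            density = np.clip(density + rate * (stimulus - reference_stimulus),
                              rho_min, rho_max)       # physiological density bounds
        return density

    rho = remodel(np.full(5, 0.8), reference_stimulus=1.0, n_iter=50)
    print(np.round(rho, 3))                           # elements converge toward equilibrium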
Soto, Fabian A; Zheng, Emily; Fonseca, Johnny; Ashby, F Gregory
2017-01-01
Determining whether perceptual properties are processed independently is an important goal in perceptual science, and tools to test independence should be widely available to experimental researchers. The best analytical tools to test for perceptual independence are provided by General Recognition Theory (GRT), a multidimensional extension of signal detection theory. Unfortunately, there is currently a lack of software implementing GRT analyses that is ready to use by experimental psychologists and neuroscientists with little training in computational modeling. This paper presents grtools, an R package developed with the explicit aim of providing experimentalists with the ability to perform full GRT analyses using only a couple of command lines. We describe the software and provide a practical tutorial on how to perform each of the analyses available in grtools. We also provide advice to researchers on best practices for experimental design and interpretation of results when applying GRT and grtools.
Multidisciplinary Design Optimization of a Full Vehicle with High Performance Computing
NASA Technical Reports Server (NTRS)
Yang, R. J.; Gu, L.; Tho, C. H.; Sobieszczanski-Sobieski, Jaroslaw
2001-01-01
Multidisciplinary design optimization (MDO) of a full vehicle under the constraints of crashworthiness, NVH (Noise, Vibration and Harshness), durability, and other performance attributes is one of the imperative goals for the automotive industry. However, it is often infeasible due to the lack of computational resources, robust simulation capabilities, and efficient optimization methodologies. This paper intends to move closer towards that goal by using parallel computers for the intensive computation and by combining different approximations for dissimilar analyses in the MDO process. The MDO process presented in this paper is an extension of the previous work reported by Sobieski et al. In addition to the roof crush, two full vehicle crash modes are added: full frontal impact and 50% frontal offset crash. Instead of using an adaptive polynomial response surface method, this paper employs a DOE/RSM method for exploring the design space and constructing highly nonlinear crash functions. Two MDO strategies are used and the results are compared. This paper demonstrates that with high performance computing, a conventionally intractable real-world full vehicle multidisciplinary optimization problem, considering all performance attributes and a large number of design variables, becomes feasible.
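A small sketch of the DOE/RSM idea referenced above, with a cheap analytic function standing in for a crash simulation: sample the design space, fit a quadratic response surface by least squares, then optimize on the surrogate. Variable ranges and the response are illustrative only.

    # Sketch: DOE sampling, quadratic response-surface fit, and surrogate optimization.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    def expensive_response(x):            # stand-in for a full crash simulation
        return (x[..., 0] - 1.2) ** 2 + 2.0 * (x[..., 1] - 0.5) ** 2 + 0.3 * x[..., 0] * x[..., 1]

    def quad_features(x):
        x1, x2 = x[..., 0], x[..., 1]
        return np.stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2], axis=-1)

    X = rng.uniform([0.0, 0.0], [2.0, 2.0], size=(30, 2))        # DOE sample, 2 design variables
    y = expensive_response(X)
    beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)  # response-surface coefficients

    surrogate = lambda x: quad_features(np.asarray(x)) @ beta
    opt = minimize(surrogate, x0=[1.0, 1.0], bounds=[(0.0, 2.0), (0.0, 2.0)])
    print("surrogate optimum near", np.round(opt.x, 3))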
Mechanistic design concepts for conventional flexible pavements
NASA Astrophysics Data System (ADS)
Elliott, R. P.; Thompson, M. R.
1985-02-01
Mechanistic design concepts for conventional flexible pavements (asphalt concrete (AC) surface plus granular base/subbase) for highways are proposed and validated. The procedure is based on ILLI-PAVE, a stress-dependent finite element computer program, coupled with appropriate transfer functions. Two design criteria are considered: AC flexural fatigue cracking and subgrade rutting. Algorithms were developed relating pavement response parameters (stresses, strains, deflections) to AC thickness, AC moduli, granular layer thickness, and subgrade moduli. Extensive analyses of the AASHO Road Test flexible pavement data are presented supporting the validity of the proposed concepts.
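The transfer functions themselves are the contribution of the report and are not reproduced here; the sketch below only shows the generic form such functions take, with textbook-style placeholder coefficients: an allowable-repetition fatigue relation driven by the AC tensile strain, and a limiting-strain check for subgrade rutting.

    # Sketch: generic transfer-function forms; coefficients are illustrative, not the
    # algorithms developed in the report.
    def ac_fatigue_life(eps_t, k1=5e-6, k2=3.0):
        """Allowable repetitions before AC fatigue cracking, eps_t in strain (e.g. 2e-4)."""
        return k1 * (1.0 / eps_t) ** k2

    def subgrade_rutting_ok(eps_v, limit=4.5e-4):
        """Simple pass/fail check on vertical compressive subgrade strain."""
        return eps_v <= limit

    eps_t, eps_v = 2.0e-4, 3.8e-4      # responses from a structural (e.g. FE) analysis
    print(f"fatigue life ~ {ac_fatigue_life(eps_t):.2e} repetitions,"
          f" rutting check passed: {subgrade_rutting_ok(eps_v)}")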
Perri, Romina; Huta, Veronika; Pinchuk, Leonard; Pinchuk, Cindy; Ostry, David J; Lund, James P
2008-09-01
To determine if temporomandibular joint disorders (TMDs) are associated with extended computer use. People with chronic pain and extensive computer use were recruited by means of a newspaper advertisement. Those who responded to the ad were asked to complete an online survey, which included questions on computer use, medical history, pain symptoms, lifestyle and mood. Ninety-two people completed the online survey, but none of them responded to all questions in the survey. Of the 88 respondents who reported their sex, 49 (56%) were female. Most of the respondents had used computers for more than 5 hours per day for more than 5 years, and most believed that their pain was linked to computer use. The great majority had pain in the neck (73/89 [82%]) or shoulder (67/89 [75%]), but many (40/91 [44%]) also had symptoms of TMD. About half of the participants reported poor sleep and fatigue, and many linked their pain to negative effects on lifestyle and poor quality of life. Two multiple regressions, with duration of pain as the dependent variable, were carried out, one using the entire sample of respondents who had completed the necessary sections of the survey (n = 91) and the other using the subset of people with symptoms suggestive of TMD (n = 40). Duration of computer use was associated with duration of pain in both analyses, but 6 other independent variables (injury or arthritis, hours of daily computer use, stress, position of computer screen relative to the eyes, sex, and age) were without effect. In these regression analyses, the intercept was close to 0 years, which suggests that the pain began at about the same time as computer use. This web-based survey provides the first evidence that chronic pain in jaw muscles and other symptoms of TMD are associated with long-term, heavy use of computers. However, the great majority of people with these symptoms probably also suffer from pain in the shoulder and neck.
48 CFR 6302.6 - Computation and extension of time limits (Rule 6).
Code of Federal Regulations, 2010 CFR
2010-10-01
... of time limits (Rule 6). 6302.6 Section 6302.6 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION BOARD OF CONTRACT APPEALS RULES OF PROCEDURE 6302.6 Computation and extension of time limits (Rule 6). (a) Computation. Except as otherwise provided by law, in computing any period of time prescribed...
The Silicon Trypanosome: a test case of iterative model extension in systems biology
Achcar, Fiona; Fadda, Abeer; Haanstra, Jurgen R.; Kerkhoven, Eduard J.; Kim, Dong-Hyun; Leroux, Alejandro E.; Papamarkou, Theodore; Rojas, Federico; Bakker, Barbara M.; Barrett, Michael P.; Clayton, Christine; Girolami, Mark; Luise Krauth-Siegel, R.; Matthews, Keith R.; Breitling, Rainer
2016-01-01
The African trypanosome, Trypanosoma brucei, is a unicellular parasite causing African Trypanosomiasis (sleeping sickness in humans and nagana in animals). Due to some of its unique properties, it has emerged as a popular model organism in systems biology. A predictive quantitative model of glycolysis in the bloodstream form of the parasite has been constructed and updated several times. The Silicon Trypanosome (SilicoTryp) is a project that brings together modellers and experimentalists to improve and extend this core model with new pathways and additional levels of regulation. These new extensions and analyses use computational methods that explicitly take different levels of uncertainty into account. During this project, numerous tools and techniques have been developed for this purpose, which can now be used for a wide range of different studies in systems biology. PMID:24797926
A State of the Art Survey of Fraud Detection Technology
NASA Astrophysics Data System (ADS)
Flegel, Ulrich; Vayssière, Julien; Bitz, Gunter
With the introduction of IT to conduct business, we accepted the loss of a human control step. For this reason, the introduction of new IT systems was accompanied by the development of the authorization concept. But since, in reality, there is no such thing as 100 per cent security, auditors are commissioned to examine all transactions for misconduct. Since the data already exists in digital form, it makes sense to use computer-based processes to analyse it. Such processes allow the auditor to carry out extensive checks within an acceptable timeframe and with reasonable effort. Once the algorithm has been defined, it only takes sufficient computing power to evaluate larger quantities of data. This contribution presents the state of the art for IT-based data analysis processes that can be used to identify fraudulent activities.
Fischer, Curt R.; Ruebel, Oliver; Bowen, Benjamin P.
2015-09-11
Mass spectrometry imaging (MSI) is used in an increasing number of biological applications. Typical MSI datasets contain unique, high-resolution mass spectra from tens of thousands of spatial locations, resulting in raw data sizes of tens of gigabytes per sample. In this paper, we review technical progress that is enabling new biological applications and that is driving an increase in the complexity and size of MSI data. Handling such data often requires specialized computational infrastructure, software, and expertise. OpenMSI, our recently described platform, makes it easy to explore and share MSI datasets via the web – even when larger than 50 GB. Here we describe the integration of OpenMSI with IPython notebooks for transparent, sharable, and replicable MSI research. An advantage of this approach is that users do not have to share raw data along with analyses; instead, data is retrieved via OpenMSI's web API. The IPython notebook interface provides a low-barrier entry point for data manipulation that is accessible for scientists without extensive computational training. Via these notebooks, analyses can be easily shared without requiring any data movement. We provide example notebooks for several common MSI analysis types including data normalization, plotting, clustering, and classification, and image registration.
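A minimal sketch of the notebook workflow described above: fetch a spectrum over a web API rather than moving raw files, then run a simple normalization locally. The endpoint URL and query parameters are placeholders and do not reflect the documented OpenMSI API.

    # Sketch: pulling a spectrum through a (placeholder) web API and normalizing it.
    import requests
    import numpy as np

    BASE_URL = "https://example.org/msi/api/qcube"   # placeholder service endpoint

    def fetch_spectrum(dataset, x, y):
        resp = requests.get(BASE_URL, params={"file": dataset, "x": x, "y": y,
                                              "format": "JSON"}, timeout=30)
        resp.raise_for_status()
        return np.asarray(resp.json()["data"], dtype=float)

    def tic_normalize(spectrum):
        """Total-ion-current normalization, a common first analysis step."""
        total = spectrum.sum()
        return spectrum / total if total > 0 else spectrum

    # usage inside a notebook cell (assumes the placeholder service is reachable):
    # spec = tic_normalize(fetch_spectrum("sample_42.h5", x=10, y=20))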
The Use of Object-Oriented Analysis Methods in Surety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.
1999-05-01
Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.
Evaluation of a computational model to predict elbow range of motion
Nishiwaki, Masao; Johnson, James A.; King, Graham J. W.; Athwal, George S.
2014-01-01
Computer models capable of predicting elbow flexion and extension range of motion (ROM) limits would be useful for assisting surgeons in improving the outcomes of surgical treatment of patients with elbow contractures. A simple and robust computer-based model was developed that predicts elbow joint ROM using bone geometries calculated from computed tomography image data. The model assumes a hinge-like flexion-extension axis, and that elbow passive ROM limits can be based on terminal bony impingement. The model was validated against experimental results with a cadaveric specimen, and was able to predict the flexion and extension limits of the intact joint to 0° and 3°, respectively. The model was also able to predict the flexion and extension limits to 1° and 2°, respectively, when simulated osteophytes were inserted into the joint. Future studies based on this approach will be used for the prediction of elbow flexion-extension ROM in patients with primary osteoarthritis to help identify motion-limiting hypertrophic osteophytes, and will eventually permit real-time computer-assisted navigated excisions. PMID:24841799
Hoehenwarter, Wolfgang; Larhlimi, Abdelhalim; Hummel, Jan; Egelhofer, Volker; Selbig, Joachim; van Dongen, Joost T; Wienkoop, Stefanie; Weckwerth, Wolfram
2011-07-01
Mass Accuracy Precursor Alignment is a fast and flexible method for comparative proteome analysis that allows the comparison of unprecedented numbers of shotgun proteomics analyses on a personal computer in a matter of hours. We compared 183 LC-MS analyses and more than 2 million MS/MS spectra and could define and separate the proteomic phenotypes of field grown tubers of 12 tetraploid cultivars of the crop plant Solanum tuberosum. Protein isoforms of patatin as well as other major gene families such as lipoxygenase and cysteine protease inhibitor that regulate tuber development were found to be the primary source of variability between the cultivars. This suggests that differentially expressed protein isoforms modulate genotype specific tuber development and the plant phenotype. We properly assigned the measured abundance of tryptic peptides to different protein isoforms that share extensive stretches of primary structure and thus inferred their abundance. Peptides unique to different protein isoforms were used to classify the remaining peptides assigned to the entire subset of isoforms based on a common abundance profile using multivariate statistical procedures. We identified nearly 4000 proteins which we used for quantitative functional annotation making this the most extensive study of the tuber proteome to date.
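A small sketch of the profile-based assignment step described above, using invented numbers: peptides unique to an isoform define a reference abundance profile across the runs, and shared peptides are assigned to the isoform whose reference profile they correlate with most strongly.

    # Sketch: assign shared peptides to isoforms by abundance-profile correlation.
    # The isoform names and profiles below are toy values, not data from the study.
    import numpy as np

    def assign_shared_peptides(unique_profiles, shared_profiles):
        """unique_profiles: {isoform: (n_runs,) profile}; returns one isoform per shared peptide."""
        assignments = []
        for prof in shared_profiles:
            best = max(unique_profiles,
                       key=lambda iso: np.corrcoef(prof, unique_profiles[iso])[0, 1])
            assignments.append(best)
        return assignments

    unique = {"patatin_iso1": np.array([1, 2, 4, 8, 9, 9.5]),
              "patatin_iso2": np.array([9, 8, 6, 3, 2, 1.5])}
    shared = [np.array([1.2, 2.1, 3.8, 7.5, 9.2, 9.0]),
              np.array([8.5, 8.0, 5.5, 3.2, 2.1, 1.0])]
    print(assign_shared_peptides(unique, shared))   # -> ['patatin_iso1', 'patatin_iso2']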
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
Nguyen-Kim, Thi Dan Linh; Maurer, Britta; Suliman, Yossra A; Morsbach, Fabian; Distler, Oliver; Frauenfelder, Thomas
2018-04-01
To evaluate the usability of slice-reduced sequential computed tomography (CT) compared to standard high-resolution CT (HRCT) in patients with systemic sclerosis (SSc) for qualitative and quantitative assessment of interstitial lung disease (ILD) with respect to (I) detection of lung parenchymal abnormalities, (II) qualitative and semiquantitative visual assessment, (III) quantification of ILD by histograms and (IV) accuracy at the 20% cut-off discrimination. From the standard chest HRCT of 60 SSc patients, sequential 9-slice CT (reduced HRCT) was retrospectively reconstructed. ILD was assessed by visual scoring and quantitative histogram parameters. Results from standard and reduced HRCT were compared using non-parametric tests and analysed by univariate linear regression analyses. With respect to the detection of parenchymal abnormalities, only the detection of intrapulmonary bronchiectasis was significantly lower in reduced HRCT compared to standard HRCT (P=0.039). No differences were found when comparing visual scores for fibrosis severity and extension from standard and reduced HRCT (P=0.051-0.073). All scores correlated significantly (P<0.001) with histogram parameters derived from both standard and reduced HRCT. Significantly higher values of kurtosis and skewness were found for reduced HRCT (both P<0.001). In contrast to standard HRCT, histogram parameters from reduced HRCT showed significant discrimination at the 20% fibrosis cut-off (sensitivity 88% for kurtosis and skewness; specificity 81% for kurtosis and 86% for skewness; cut-off kurtosis ≤26, cut-off skewness ≤4; both P<0.001). Reduced HRCT is a robust method to assess lung fibrosis in SSc with minimal radiation dose, with no difference in the scoring assessment of lung fibrosis severity and extension in comparison to standard HRCT. In contrast to standard HRCT, histogram parameters derived from the reduced HRCT approach could discriminate at a threshold of 20% lung fibrosis with high sensitivity and specificity. Hence it might be used to detect early disease progression of lung fibrosis in the context of monitoring and treatment of SSc patients.
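A brief sketch of the histogram-parameter step on simulated voxel values: compute skewness and kurtosis of the segmented-lung attenuation histogram and apply the reported cut-offs (kurtosis ≤26, skewness ≤4) to flag ≥20% fibrosis. SciPy's default moment definitions are used here; the paper's exact kurtosis convention is not stated in the abstract.

    # Sketch: histogram parameters of a simulated lung-attenuation distribution.
    import numpy as np
    from scipy.stats import skew, kurtosis

    def classify_fibrosis(lung_hu, kurt_cut=26.0, skew_cut=4.0):
        k = kurtosis(lung_hu, fisher=False)   # Pearson kurtosis (normal distribution = 3)
        s = skew(lung_hu)
        return {"kurtosis": k, "skewness": s,
                "fibrosis_ge_20pct": (k <= kurt_cut) and (s <= skew_cut)}

    rng = np.random.default_rng(0)
    # simulated segmented-lung voxels: mostly aerated (~ -850 HU) plus a denser fibrotic tail
    lung_hu = np.concatenate([rng.normal(-850, 60, 40000), rng.normal(-300, 120, 6000)])
    print(classify_fibrosis(lung_hu))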
Saline water in southeastern New Mexico
Hiss, W.L.; Peterson, J.B.; Ramsey, T.R.
1969-01-01
Saline waters from formations of several geologic ages are being studied in a seven-county area in southeastern New Mexico and western Texas, where more than 30,000 oil and gas tests have been drilled in the past 40 years. This area of 7,500 sq. miles, which is stratigraphically complex, includes the northern and eastern margins of the Delaware Basin between the Guadalupe and Glass Mountains. Chloride-ion concentrations in water produced from rocks of various ages and depths have been mapped in Lea County, New Mexico, using machine map-plotting techniques and trend analyses. Anomalously low chloride concentrations (1,000-3,000 mg/l) were found along the western margin of the Central Basin platform in the San Andres and Capitan Limestone Formations of Permian age. These low chloride-ion concentrations may be due to preferential circulation of ground water through the more porous and permeable rocks. Data being used in the study were obtained principally from oil companies and from related service companies. The P.B.W.D.S. (Permian Basin Well Data System) scout-record magnetic-tape file was used as a framework in all computer operations. Shallow or non-oil-field water analyses acquired from state, municipal, or federal agencies were added to these data utilizing P.B.W.D.S.-compatible reference numbers and decimal latitude-longitude coordinates. Approximately 20,000 water analyses collected from over 65 sources were coded, recorded on punch cards and stored on magnetic tape for computer operations. Extensive manual and computer error checks for duplication and accuracy were made to eliminate data errors resulting from poorly located or identified samples; non-representative or contaminated samples; mistakes in coding, reproducing or key-punching; laboratory errors; and inconsistent reporting. The original 20,000 analyses considered were reduced to 6,000 representative analyses which are being used in the saline water studies. © 1969.
Navier-Stokes Computations of Longitudinal Forces and Moments for a Blended Wing Body
NASA Technical Reports Server (NTRS)
Pao, S. Paul; Biedron, Robert T.; Park, Michael A.; Fremaux, C. Michael; Vicroy, Dan D.
2005-01-01
The object of this paper is to investigate the feasibility of applying CFD methods to aerodynamic analyses for aircraft stability and control. The integrated aerodynamic parameters used in stability and control, however, are not necessarily those extensively validated in the state of the art CFD technology. Hence, an exploratory study of such applications and the comparison of the solutions to available experimental data will help to assess the validity of the current computation methods. In addition, this study will also examine issues related to wind tunnel measurements such as measurement uncertainty and support interference effects. Several sets of experimental data from the NASA Langley 14x22-Foot Subsonic Tunnel and the National Transonic Facility are presented. Two Navier-Stokes flow solvers, one using structured meshes and the other unstructured meshes, were used to compute longitudinal static stability derivatives for an advanced Blended Wing Body configuration over a wide range of angles of attack. The computations were performed for two different Reynolds numbers and the resulting forces and moments are compared with the above mentioned wind tunnel data.
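A short sketch of the post-processing implied above: convert an alpha sweep of computed lift and pitching-moment coefficients into longitudinal static stability derivatives by a linear fit. The coefficient values are invented for illustration and are not Blended Wing Body results.

    # Sketch: static stability derivatives from a CFD angle-of-attack sweep.
    import numpy as np

    alpha_deg = np.array([-2.0, 0.0, 2.0, 4.0, 6.0])
    CL = np.array([0.05, 0.21, 0.37, 0.52, 0.66])            # illustrative lift coefficients
    Cm = np.array([0.010, 0.002, -0.007, -0.017, -0.028])    # illustrative pitching moments

    def derivative_per_rad(alpha_deg, coeff):
        """Slope of a linear fit over the sweep, converted from per-degree to per-radian."""
        slope_per_deg = np.polyfit(alpha_deg, coeff, 1)[0]
        return slope_per_deg * 180.0 / np.pi

    print(f"CL_alpha = {derivative_per_rad(alpha_deg, CL):.2f} /rad")
    print(f"Cm_alpha = {derivative_per_rad(alpha_deg, Cm):.3f} /rad  (negative => statically stable)")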
Model-Based and Model-Free Pavlovian Reward Learning: Revaluation, Revision and Revelation
Dayan, Peter; Berridge, Kent C.
2014-01-01
Evidence supports at least two methods for learning about reward and punishment and making predictions for guiding actions. One method, called model-free, progressively acquires cached estimates of the long-run values of circumstances and actions from retrospective experience. The other method, called model-based, uses representations of the environment, expectations and prospective calculations to make cognitive predictions of future value. Extensive attention has been paid to both methods in computational analyses of instrumental learning. By contrast, although a full computational analysis has been lacking, Pavlovian learning and prediction has typically been presumed to be solely model-free. Here, we revise that presumption and review compelling evidence from Pavlovian revaluation experiments showing that Pavlovian predictions can involve their own form of model-based evaluation. In model-based Pavlovian evaluation, prevailing states of the body and brain influence value computations, and thereby produce powerful incentive motivations that can sometimes be quite new. We consider the consequences of this revised Pavlovian view for the computational landscape of prediction, response and choice. We also revisit differences between Pavlovian and instrumental learning in the control of incentive motivation. PMID:24647659
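A toy sketch of the contrast drawn above: the model-free learner carries a cached value acquired from past pairings, while the model-based learner recomputes the cue's value from its outcome representation and the current motivational state, so only the latter shifts immediately after revaluation. The numbers and the salt-appetite scenario are illustrative.

    # Sketch: cached (model-free) vs. recomputed (model-based) value under revaluation.
    def model_free_update(cached_value, reward, alpha=0.2):
        """Cached value moves toward experienced reward (temporal-difference style)."""
        return cached_value + alpha * (reward - cached_value)

    def model_based_value(outcome, current_utility):
        """Value recomputed from the cue->outcome model and the current motivational state."""
        return current_utility[outcome]

    # training: a cue predicts intense salt, which is aversive in a normal state
    utility = {"salt": -1.0}
    v_mf = 0.0
    for _ in range(20):
        v_mf = model_free_update(v_mf, utility["salt"])

    # revaluation: a novel salt appetite makes salt desirable without new cue-salt pairings
    utility["salt"] = +1.0
    print(f"model-free cached value (stale): {v_mf:.2f}")
    print(f"model-based value (revalued):   {model_based_value('salt', utility):.2f}")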
Evaluation of CFD to Determine Two-Dimensional Airfoil Characteristics for Rotorcraft Applications
NASA Technical Reports Server (NTRS)
Smith, Marilyn J.; Wong, Tin-Chee; Potsdam, Mark; Baeder, James; Phanse, Sujeet
2004-01-01
The efficient prediction of helicopter rotor performance, vibratory loads, and aeroelastic properties still relies heavily on the use of comprehensive analysis codes by the rotorcraft industry. These comprehensive codes utilize look-up tables to provide two-dimensional aerodynamic characteristics. Typically these tables are comprised of a combination of wind tunnel data, empirical data and numerical analyses. The potential to rely more heavily on numerical computations based on Computational Fluid Dynamics (CFD) simulations has become more of a reality with the advent of faster computers and more sophisticated physical models. The ability of five different CFD codes applied independently to predict the lift, drag and pitching moments of rotor airfoils is examined for the SC1095 airfoil, which is utilized in the UH-60A main rotor. Extensive comparisons with the results of ten wind tunnel tests are performed. These CFD computations are found to be as good as experimental data in predicting many of the aerodynamic performance characteristics. Four turbulence models were examined (Baldwin-Lomax, Spalart-Allmaras, Menter SST, and k-omega).
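For context on the look-up tables mentioned above, the sketch below shows how a comprehensive analysis typically consumes them: a lift coefficient stored on an (angle-of-attack, Mach) grid and interpolated at each blade section's conditions. The table entries are invented placeholders, not SC1095 data.

    # Sketch: 2-D airfoil table look-up by interpolation; table values are placeholders.
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    alpha = np.array([-4.0, 0.0, 4.0, 8.0, 12.0])          # deg
    mach = np.array([0.3, 0.5, 0.7])
    cl_table = np.array([[-0.30, -0.32, -0.35],
                         [ 0.10,  0.11,  0.12],
                         [ 0.55,  0.58,  0.60],
                         [ 0.95,  1.00,  0.90],
                         [ 1.20,  1.10,  0.80]])           # rows: alpha, cols: Mach

    cl_lookup = RegularGridInterpolator((alpha, mach), cl_table)
    print(f"CL(alpha=6.2 deg, M=0.42) ~ {cl_lookup([[6.2, 0.42]])[0]:.3f}")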
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
From sound to syntax: phonological constraints on children's lexical categorization of new words.
Fitneva, Stanka A; Christiansen, Morten H; Monaghan, Padraic
2009-11-01
Two studies examined the role of phonological cues in the lexical categorization of new words when children could also rely on learning by exclusion and whether the role of phonology depends on extensive experience with a language. Phonological cues were assessed via phonological typicality - an aggregate measure of the relationship between the phonology of a word and the phonology of words in the same lexical class. Experiment 1 showed that when monolingual English-speaking seven-year-olds could rely on learning by exclusion, phonological typicality only affected their initial inferences about the words. Consistent with recent computational analyses, phonological cues had stronger impact on the processing of verb-like than noun-like items. Experiment 2 revealed an impact of French on the performance of seven-year-olds in French immersion when tested in a French language environment. Thus, phonological knowledge may affect lexical categorization even in the absence of extensive experience.
EggLib: processing, analysis and simulation tools for population genetics and genomics.
De Mita, Stéphane; Siol, Mathieu
2012-04-11
With the considerable growth of available nucleotide sequence data over the last decade, integrated and flexible analytical tools have become a necessity. In particular, in the field of population genetics, there is a strong need for automated and reliable procedures to conduct repeatable and rapid polymorphism analyses, coalescent simulations, data manipulation and estimation of demographic parameters under a variety of scenarios. In this context, we present EggLib (Evolutionary Genetics and Genomics Library), a flexible and powerful C++/Python software package providing efficient and easy to use computational tools for sequence data management and extensive population genetic analyses on nucleotide sequence data. EggLib is a multifaceted project involving several integrated modules: an underlying computationally efficient C++ library (which can be used independently in pure C++ applications); two C++ programs; a Python package providing, among other features, a high level Python interface to the C++ library; and the egglib script which provides direct access to pre-programmed Python applications. EggLib has been designed aiming to be both efficient and easy to use. A wide array of methods are implemented, including file format conversion, sequence alignment editing, coalescent simulations, neutrality tests and estimation of demographic parameters by Approximate Bayesian Computation (ABC). Classes implementing different demographic scenarios for ABC analyses can easily be developed by the user and included in the package. EggLib source code is distributed freely under the GNU General Public License (GPL) from its website http://egglib.sourceforge.net/ where a full documentation and a manual can also be found and downloaded.
Liu, Danping; Yeung, Edwina H; McLain, Alexander C; Xie, Yunlong; Buck Louis, Germaine M; Sundaram, Rajeshwari
2017-09-01
Imperfect follow-up in longitudinal studies commonly leads to missing outcome data that can potentially bias the inference when the missingness is nonignorable; that is, the propensity of missingness depends on missing values in the data. In the Upstate KIDS Study, we seek to determine if the missingness of child development outcomes is nonignorable, and how a simple model assuming ignorable missingness would compare with more complicated models for a nonignorable mechanism. To correct for nonignorable missingness, the shared random effects model (SREM) jointly models the outcome and the missing mechanism. However, the computational complexity and lack of software packages have limited its practical applications. This paper proposes a novel two-step approach to handle nonignorable missing outcomes in generalized linear mixed models. We first analyse the missing mechanism with a generalized linear mixed model and predict values of the random effects; then, the outcome model is fitted adjusting for the predicted random effects to account for heterogeneity in the missingness propensity. Extensive simulation studies suggest that the proposed method is a reliable approximation to SREM, with much faster computation. The nonignorability of missing data in the Upstate KIDS Study is estimated to be mild to moderate, and the analyses using the two-step approach or SREM are similar to the model assuming ignorable missingness. The two-step approach is a computationally straightforward method that can be conducted as sensitivity analyses in longitudinal studies to examine violations of the ignorable missingness assumption and the implications relative to health outcomes. © 2017 John Wiley & Sons Ltd.
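A minimal sketch of the two-step idea on simulated data, using linear mixed models from statsmodels as a simplification of the generalized linear mixed models described above (the variable names and the linear-probability model for missingness are illustrative assumptions, not the paper's specification):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_visits = 200, 4

# Longitudinal data with a shared random intercept that drives both the
# outcome and the propensity of missingness (a nonignorable mechanism).
subj = np.repeat(np.arange(n_subj), n_visits)
visit = np.tile(np.arange(n_visits), n_subj)
b = rng.normal(0, 1, n_subj)[subj]                    # shared random effect
y = 1.0 + 0.5 * visit + b + rng.normal(0, 1, subj.size)
miss = rng.random(subj.size) < 1 / (1 + np.exp(-(-1.0 + 1.5 * b)))
df = pd.DataFrame({"subj": subj, "visit": visit, "y": y, "miss": miss.astype(float)})

# Step 1: mixed model for the missingness indicator (linear-probability
# simplification); extract each subject's predicted random intercept.
m_miss = smf.mixedlm("miss ~ visit", df, groups=df["subj"]).fit()
re_hat = pd.Series({g: float(v.iloc[0]) for g, v in m_miss.random_effects.items()})
df["re_hat"] = df["subj"].map(re_hat)

# Step 2: outcome model on observed records, adjusted for the predicted
# random effect to account for heterogeneity in the missingness propensity.
obs = df[~miss].copy()
m_y = smf.mixedlm("y ~ visit + re_hat", obs, groups=obs["subj"]).fit()
print(m_y.params)
```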
Ball, James W.; Nordstrom, D. Kirk; Jenne, Everett A.
1980-01-01
A computerized chemical model, WATEQ2, has resulted from extensive additions to and revision of the WATEQ model of Truesdell and Jones (Truesdell, A. H., and Jones, B. F., 1974, WATEQ, a computer program for calculating chemical equilibria of natural waters: J. Res. U.S. Geol. Survey, v. 2, p. 233-274). The model building effort has necessitated searching the literature and selecting thermochemical data pertinent to the reactions added to the model. This supplementary report makes available the details of the reactions added to the model, together with the selected thermochemical data and their sources. Also listed are details of program operation and a brief description of the output of the model. Appendices contain a glossary of identifiers used in the PL/1 computer code, the complete PL/1 listing, and sample output from three water analyses used as test cases.
NASA Astrophysics Data System (ADS)
Labbé, D. F. L.; Wilson, P. A.
2007-11-01
The numerical prediction of vortex-induced vibrations has been the focus of numerous investigations to date using tools such as computational fluid dynamics. In particular, the flow around a circular cylinder has raised much attention as it is present in critical engineering problems such as marine cables or risers. Limitations due to the computational cost imposed by the solution of a large number of equations have resulted in the study of mostly 2-D flows with only a few exceptions. The discrepancies found between experimental data and 2-D numerical simulations suggested that 3-D instabilities occurred in the wake of the cylinder that affect substantially the characteristics of the flow. The few 3-D numerical solutions available in the literature confirmed such a hypothesis. In the present investigation the effect of the spanwise extension of the solution domain on the 3-D wake of a circular cylinder is investigated for various Reynolds numbers between 40 and 1000. By assessing the minimum spanwise extension required to predict accurately the flow around a circular cylinder, the infinitely long cylinder is reduced to a finite length cylinder, thus making numerical solution an effective way of investigating flows around circular cylinders. Results are presented for three different spanwise extensions, namely πD/2, πD and 2πD. The analysis of the force coefficients obtained for the various Reynolds numbers together with a visualization of the three-dimensionalities in the wake of the cylinder allowed for a comparison between the effects of the three spanwise extensions. Furthermore, by showing the different modes of vortex shedding present in the wake and by analysing the streamwise components of the vorticity, it was possible to estimate the spanwise wavelengths at the various Reynolds numbers and to demonstrate that a finite spanwise extension is sufficient to accurately predict the flow past an infinitely long circular cylinder.
Large Advanced Space Systems (LASS) computer-aided design program additions
NASA Technical Reports Server (NTRS)
Farrell, C. E.
1982-01-01
The LSS preliminary and conceptual design requires extensive interactive analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that will permit integrating and interfacing of required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, antenna primary beam, and attitude control requirements.
A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Piascik, Robert S.; Newman, James C., Jr.
1999-01-01
An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.
A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints
NASA Technical Reports Server (NTRS)
Harris, C. E.; Piascik, R. S.; Newman, J. C., Jr.
2000-01-01
An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.
EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.
Redundancy checking algorithms based on parallel novel extension rule
NASA Astrophysics Data System (ADS)
Liu, Lei; Yang, Yang; Li, Guangli; Wang, Qi; Lü, Shuai
2017-05-01
Redundancy checking (RC) is a key knowledge reduction technology. Extension rule (ER) is a reasoning method first presented in 2003 that has been well received by the research community. Novel extension rule (NER) is an improved ER-based reasoning method, presented in 2009. In this paper, we first analyse the characteristics of the extension rule, and then present a simple algorithm for redundancy checking based on the extension rule (RCER). In addition, we introduce MIMF, a type of heuristic strategy. Using the aforementioned rule and strategy, we design and implement the RCHER algorithm, which relies on MIMF. Next, we design and implement an RCNER (redundancy checking based on NER) algorithm. Parallel computing greatly accelerates the NER algorithm, which exhibits weak dependence among tasks during execution. Considering this, we present PNER (parallel NER) and apply it to redundancy checking and necessity checking. Furthermore, we design and implement the RCPNER (redundancy checking based on PNER) and NCPPNER (necessary clause partition based on PNER) algorithms. The experimental results show that MIMF significantly accelerates the RCER algorithm on large-scale, highly redundant formulae. Comparing PNER with NER and RCPNER with RCNER, the average speedup can approach the number of task decompositions. Comparing NCPPNER with the RCNER-based algorithm for separating redundant formulae, the speedup increases steadily as the scale of the formulae grows. Finally, we describe the challenges facing the extension rule and suggest possible solutions.
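The notion of redundancy being checked here has a simple semantic reading: a clause C is redundant in a CNF formula F exactly when F without C still entails C, i.e. when F \ {C} plus the negation of every literal of C is unsatisfiable. The brute-force sketch below makes that definition concrete for small formulas; it is only the specification being accelerated, not the RCER/RCNER/PNER algorithms themselves.

```python
from itertools import product

def satisfiable(clauses, variables):
    """Brute-force SAT check; clauses are sets of signed literals such as {1, -2}."""
    for bits in product([False, True], repeat=len(variables)):
        model = dict(zip(variables, bits))
        if all(any(model[abs(lit)] == (lit > 0) for lit in c) for c in clauses):
            return True
    return False

def is_redundant(clause, formula):
    """clause is redundant in formula iff (formula - {clause}) entails clause."""
    rest = [c for c in formula if c != clause]
    negated_clause = [{-lit} for lit in clause]          # unit clauses forcing NOT clause
    variables = sorted({abs(lit) for c in rest + [clause] for lit in c})
    return not satisfiable(rest + negated_clause, variables)

formula = [{1, 2}, {-1, 2}, {2}]
print(is_redundant({1, 2}, formula))   # True: the unit clause {2} already entails it
print(is_redundant({-1, 2}, formula))  # True, for the same reason
print(is_redundant({2}, formula))      # True: resolving {1,2} and {-1,2} yields {2}
```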
Serial sectioning of grain microstructures under junction control: An old problem in a new guise
NASA Astrophysics Data System (ADS)
Zöllner, D.; Streitenberger, P.
2015-04-01
In the present work the importance of 3D and 4D microstructure analyses is shown. To that end, we study polycrystalline grain microstructures obtained by grain growth under grain boundary, triple line and quadruple point control. The microstructures themselves are obtained by mesoscopic computer simulations, which allow far greater control over the kinetic and thermodynamic parameters affecting grain growth than can be realized experimentally. In extensive simulation studies, 3D and 4D microstructure analyses show that metrical and topological properties of the microstructures depend strongly on the microstructural feature controlling the growth kinetics. However, the differences between the growth kinetics vanish when we look at classical 2D sections of the 3D ensembles, making it nearly impossible to differentiate the controlling grain feature.
Mullaji, Arun; Sharma, Amit; Marawar, Satyajit; Kanna, Raj
2009-08-01
A novel sequence of posteromedial release consistent with surgical technique of total knee arthroplasty was performed in 15 cadaveric knees. Medial and lateral flexion and extension gaps were measured after each step of the release using a computed tomography-free computer navigation system. A spring-loaded distractor and a manual distractor were used to distract the joint. Posterior cruciate ligament release increased flexion more than extension gap; deep medial collateral ligament release had a negligible effect; semimembranosus release increased the flexion gap medially; reduction osteotomy increased medial flexion and extension gaps; superficial medial collateral ligament release increased medial joint gap more in flexion and caused severe instability. This sequence of release led to incremental and differential effects on flexion-extension gaps and has implications in correcting varus deformity.
A Fast Algorithm for the Convolution of Functions with Compact Support Using Fourier Extensions
Xu, Kuan; Austin, Anthony P.; Wei, Ke
2017-12-21
In this paper, we present a new algorithm for computing the convolution of two compactly supported functions. The algorithm approximates the functions to be convolved using Fourier extensions and then uses the fast Fourier transform to efficiently compute Fourier extension approximations to the pieces of the result. Finally, the complexity of the algorithm is O(N(log N)^2), where N is the number of degrees of freedom used in each of the Fourier extensions.
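The speed of the method comes from doing the convolution in Fourier space. A rough numerical illustration of that underlying step, using plain uniform sampling and zero-padded FFTs rather than the Fourier-extension approximations the paper constructs:

```python
import numpy as np

# f and g supported on [0, 1]; their continuous convolution lives on [0, 2].
n = 512
h = 1.0 / n
x = np.arange(n) * h
f = np.exp(-50.0 * (x - 0.5) ** 2)                   # smooth bump
g = np.where(np.abs(x - 0.5) < 0.25, 1.0, 0.0)       # box function

# Zero-pad to length >= 2n-1 so the circular FFT convolution equals the linear
# one, then scale by the grid spacing to approximate the convolution integral.
m = 2 * n
conv = np.fft.irfft(np.fft.rfft(f, m) * np.fft.rfft(g, m), m)[: 2 * n - 1] * h

# Cross-check one point (x = 1.0) against direct quadrature.
direct = h * sum(f[j] * g[n - j] for j in range(1, n))
print(conv[n], direct)   # the two estimates agree to rounding error
```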
A Fast Algorithm for the Convolution of Functions with Compact Support Using Fourier Extensions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Kuan; Austin, Anthony P.; Wei, Ke
In this paper, we present a new algorithm for computing the convolution of two compactly supported functions. The algorithm approximates the functions to be convolved using Fourier extensions and then uses the fast Fourier transform to efficiently compute Fourier extension approximations to the pieces of the result. Finally, the complexity of the algorithm is O(N(log N)^2), where N is the number of degrees of freedom used in each of the Fourier extensions.
Bayet, Laurie; Pascalis, Olivier; Quinn, Paul C.; Lee, Kang; Gentaz, Édouard; Tanaka, James W.
2015-01-01
Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up stimulus driven or top-down belief driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization toward “male” responding in children as young as 5–6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1–2). The developmental course of the angry-male bias, along with its extension to other-race faces, combines to suggest that it is not rooted in extensive experience, e.g., observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with stimulus-driven and stereotyped-belief-driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling. PMID:25859238
Potential evapotranspiration and continental drying
Milly, Paul C.D.; Dunne, Krista A.
2016-01-01
By various measures (drought area and intensity, climatic aridity index, and climatic water deficits), some observational analyses have suggested that much of the Earth’s land has been drying during recent decades, but such drying seems inconsistent with observations of dryland greening and decreasing pan evaporation. ‘Offline’ analyses of climate-model outputs from anthropogenic climate change (ACC) experiments portend continuation of putative drying through the twenty-first century, despite an expected increase in global land precipitation. A ubiquitous increase in estimates of potential evapotranspiration (PET), driven by atmospheric warming, underlies the drying trends, but may be a methodological artefact. Here we show that the PET estimator commonly used (the Penman–Monteith PET for either an open-water surface or a reference crop) severely overpredicts the changes in non-water-stressed evapotranspiration computed in the climate models themselves in ACC experiments. This overprediction is partially due to neglect of stomatal conductance reductions commonly induced by increasing atmospheric CO2 concentrations in climate models. Our findings imply that historical and future tendencies towards continental drying, as characterized by offline-computed runoff, as well as other PET-dependent metrics, may be considerably weaker and less extensive than previously thought.
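For reference, the Penman–Monteith estimator at issue is usually written in the following standard form (the open-water and reference-crop variants differ mainly in the surface resistance r_s and in the assumed parameter values):

```latex
\lambda E \;=\; \frac{\Delta\,(R_n - G) \;+\; \rho_a c_p \,\dfrac{e_s - e_a}{r_a}}
                     {\Delta \;+\; \gamma\left(1 + \dfrac{r_s}{r_a}\right)}
```

where \lambda E is the latent heat flux of evapotranspiration, \Delta the slope of the saturation vapour-pressure curve, R_n net radiation, G the ground heat flux, \rho_a and c_p the density and specific heat of air, e_s - e_a the vapour-pressure deficit, \gamma the psychrometric constant, and r_a and r_s the aerodynamic and surface resistances. The stomatal-conductance effect noted above enters through r_s: if rising CO2 increases the effective r_s in the climate models, a PET estimator that holds r_s fixed will overstate the growth in evaporative demand.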
NASA Astrophysics Data System (ADS)
Howett, Carly; Irwin, P. G.; Teanby, N.; Calcutt, S. B.; Lolachi, R.; Bowles, N.; Schofield, J. T.; McCleese, D. J.
2007-10-01
Mars Climate Sounder data from September to November 2006 are analysed to determine the effect of scattering upon the retrieved dust opacity in the atmosphere of Mars. The inclusion of scattering in dust retrievals makes them significantly more computationally expensive. Thus, understanding the regimes in which scattering plays a less significant role could considerably decrease the computational time of analysing the extensive MCS dataset. Temperature profiles were initially retrieved using Nemesis, Oxford University's multivariate retrieval algorithm, at each location using MCS' A1, A2 and A3 channels (595 to 665 cm-1). Using these temperature profiles, and by assuming the characteristics of the dust particles to be comparable to those of Wolff and Clancy (2003), the dust opacity was retrieved using the B1 channel of MCS (290 to 340 cm-1) with and without scattering. The effect of scattering on the fit to the MCS data and on the derived vertical dust profile at various locations across the planet is presented. Particular emphasis is placed upon understanding the spatial and temporal variations of atmospheric regimes in which scattering plays a significant role.
OpenFlow Extensions for Programmable Quantum Networks
2017-06-19
Extensions for Programmable Quantum Networks, by Venkat Dasari, Nikolai Snow, and Billy Geerhart, Computational and Information Sciences Directorate ... distribution is unlimited. Introduction (excerpt): Quantum networks and quantum computing have been receiving a surge of interest recently. However, there has ... communicate using entangled particles and perform calculations using quantum logic gates. Additionally, quantum computing uses a quantum bit (qubit ...
Designs and performance of three new microprocessor-controlled knee joints.
Thiele, Julius; Schöllig, Christina; Bellmann, Malte; Kraft, Marc
2018-02-09
A crossover design study with a small group of subjects was used to evaluate the performance of three microprocessor-controlled exoprosthetic knee joints (MPKs): C-Leg 4, Plié 3 and Rheo Knee 3. Given that the mechanical designs and control algorithms of the joints determine the user outcome, the influence of these inherent differences on the functional characteristics was investigated in this study. The knee joints were evaluated during level-ground walking at different velocities in a motion analysis laboratory. Additionally, technical analyses using patents, technical documentations and X-ray computed tomography (CT) for each knee joint were performed. The technical analyses showed that only C-Leg 4 and Rheo Knee 3 allow microprocessor-controlled adaptation of the joint resistances for different gait velocities. Furthermore, Plié 3 is not able to provide stance extension damping. The biomechanical results showed that only if a knee joint adapts flexion and extension resistances by the microprocessor all known advantages of MPKs can become apparent. But not all users may benefit from the examined functions: e.g. a good accommodation to fast walking speeds or comfortable stance phase flexion. Hence, a detailed comparison of user demands and performance of the designated knee joint is mandatory to ensure a maximum in user outcome.
SECOND GENERATION MODEL
One of the environmental and economic models that the U.S. EPA uses to assess climate change policies is the Second Generation Model (SGM). SGM is a 13 region, 24 sector computable general equilibrium (CGE) model of the world that can be used to estimate the domestic and international economic impacts of policies designed to reduce greenhouse gas emissions. SGM was developed by Jae Edmonds and others at the Joint Global Change Research Institute (JGCRI) of Pacific Northwest National Laboratory (PNNL) and the University of Maryland. One of SGM's primary purposes is to provide an integrated assessment of a portfolio of greenhouse gas mitigation strategies. The SGM projects economic activity, energy transformation and consumption, and greenhouse gas emissions for each region of the globe in five-year time steps from 1990 through 2050. The model has been used extensively over the last decade to assess U.S. policy options to achieve greenhouse gas mitigation goals. The SGM is one of EPA's primary tools for analyses of climate change policies. It was used extensively by the U.S. government to analyze the impact of the Kyoto Protocol. Moreover, the SGM has been used by EPA during the current Administration for analyses of the climate components of various multi-emissions bills.
NASA Astrophysics Data System (ADS)
Altenkamp, Lukas; Boggia, Michele; Dittmaier, Stefan
2018-04-01
We consider an extension of the Standard Model by a real singlet scalar field with a ℤ2-symmetric Lagrangian and spontaneous symmetry breaking with vacuum expectation value for the singlet. Considering the lighter of the two scalars of the theory to be the 125 GeV Higgs particle, we parametrize the scalar sector by the mass of the heavy Higgs boson, a mixing angle α, and a scalar Higgs self-coupling λ12. Taking into account theoretical constraints from perturbativity and vacuum stability, we compute next-to-leading-order electroweak and QCD corrections to the decays h → WW/ZZ → 4 fermions of the light Higgs boson for some scenarios proposed in the literature. We formulate two renormalization schemes and investigate the conversion of the input parameters between the schemes, finding sizeable effects. Solving the renormalization-group equations for the \overline{MS} parameters α and λ12, we observe a significantly reduced scale and scheme dependence in the next-to-leading-order results. For some scenarios suggested in the literature, the total decay width for the process h → 4f is computed as a function of the mixing angle and compared to the width of a corresponding Standard Model Higgs boson, revealing deviations below 10%. Differential distributions do not show significant distortions by effects beyond the Standard Model. The calculations are implemented in the Monte Carlo generator Prophecy4f, which is ready for applications in data analyses in the framework of the singlet extension.
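For orientation only, in one common convention the ℤ2-symmetric scalar potential of such a singlet extension can be written as below; the paper's own parametrization in terms of the heavy Higgs mass, the mixing angle α and the coupling λ12 need not map one-to-one onto these symbols.

```latex
V(\Phi, S) \;=\; \mu_1^2\,\Phi^\dagger\Phi \;+\; \lambda_1\,(\Phi^\dagger\Phi)^2
\;+\; \mu_2^2\,S^2 \;+\; \lambda_2\,S^4 \;+\; \lambda_3\,\Phi^\dagger\Phi\,S^2
```

Both the doublet Φ and the real singlet S acquire vacuum expectation values, and diagonalizing the resulting scalar mass matrix defines the mixing angle α between the light 125 GeV state and the heavy scalar.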
CUGatesDensity—Quantum circuit analyser extended to density matrices
NASA Astrophysics Data System (ADS)
Loke, T.; Wang, J. B.
2013-12-01
CUGatesDensity is an extension of the original quantum circuit analyser CUGates (Loke and Wang, 2011) [7] to provide explicit support for the use of density matrices. The new package enables simulation of quantum circuits involving statistical ensemble of mixed quantum states. Such analysis is of vital importance in dealing with quantum decoherence, measurements, noise and error correction, and fault tolerant computation. Several examples involving mixed state quantum computation are presented to illustrate the use of this package. Catalogue identifier: AEPY_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEPY_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 5368 No. of bytes in distributed program, including test data, etc.: 143994 Distribution format: tar.gz Programming language: Mathematica. Computer: Any computer installed with a copy of Mathematica 6.0 or higher. Operating system: Any system with a copy of Mathematica 6.0 or higher installed. Classification: 4.15. Nature of problem: To simulate arbitrarily complex quantum circuits comprised of single/multiple qubit and qudit quantum gates with mixed state registers. Solution method: A density matrix representation for mixed states and a state vector representation for pure states are used. The construct is based on an irreducible form of matrix decomposition, which allows a highly efficient implementation of general controlled gates with multiple conditionals. Running time: The examples provided in the notebook CUGatesDensity.nb take approximately 30 s to run on a laptop PC.
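The mixed-state simulation this package supports amounts to propagating density matrices through the circuit as ρ → U ρ U†. A minimal numpy illustration of that rule (not the package's Mathematica notation or interface), applying a CNOT to an equal mixture of |00⟩ and |10⟩:

```python
import numpy as np

# Statistical mixture: 50% |00> and 50% |10> as a two-qubit density matrix.
ket00 = np.array([1, 0, 0, 0], dtype=complex)
ket10 = np.array([0, 0, 1, 0], dtype=complex)
rho = 0.5 * np.outer(ket00, ket00.conj()) + 0.5 * np.outer(ket10, ket10.conj())

# CNOT with qubit 0 as control and qubit 1 as target (basis order |q0 q1>).
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

rho_out = cnot @ rho @ cnot.conj().T          # unitary evolution of a mixed state

print(np.real(np.diag(rho_out)))              # [0.5, 0.0, 0.0, 0.5]: |10> mapped to |11>
print(np.real(np.trace(rho_out @ rho_out)))   # purity Tr(rho^2) stays at 0.5
```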
A simple Lagrangian forecast system with aviation forecast potential
NASA Technical Reports Server (NTRS)
Petersen, R. A.; Homan, J. H.
1983-01-01
A trajectory forecast procedure is developed which uses geopotential tendency fields obtained from a simple, multiple layer, potential vorticity conservative isentropic model. This model can objectively account for short-term advective changes in the mass field when combined with fine-scale initial analyses. This procedure for producing short-term, upper-tropospheric trajectory forecasts employs a combination of a detailed objective analysis technique, an efficient mass advection model, and a diagnostically proven trajectory algorithm, none of which require extensive computer resources. Results of initial tests are presented, which indicate an exceptionally good agreement for trajectory paths entering the jet stream and passing through an intensifying trough. It is concluded that this technique not only has potential for aiding in route determination, fuel use estimation, and clear air turbulence detection, but also provides an example of the types of short range forecasting procedures which can be applied at local forecast centers using simple algorithms and a minimum of computer resources.
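The trajectory step of such a system is computationally trivial once wind analyses are in hand: a parcel position is simply advanced through the wind field with a low-order integrator. A toy stand-in for that step (an analytic wind field and a midpoint rule, not the paper's isentropic model or analysis scheme):

```python
import numpy as np

def wind(x, y, t):
    """Toy horizontal wind field in m/s: a westerly jet plus a weak travelling wave."""
    u = 30.0 + 10.0 * np.sin(2.0 * np.pi * y / 2.0e6)
    v = 5.0 * np.cos(2.0 * np.pi * (x / 3.0e6 - t / 86400.0))
    return u, v

def advect(x0, y0, hours=24.0, dt=600.0):
    """Midpoint (RK2) integration of a single parcel trajectory."""
    x, y, t = x0, y0, 0.0
    path = [(x, y)]
    for _ in range(int(hours * 3600.0 / dt)):
        u1, v1 = wind(x, y, t)
        u2, v2 = wind(x + 0.5 * dt * u1, y + 0.5 * dt * v1, t + 0.5 * dt)
        x, y, t = x + dt * u2, y + dt * v2, t + dt
        path.append((x, y))
    return path

end_x, end_y = advect(0.0, 0.0)[-1]
print("24 h displacement: %.0f km east, %.0f km north" % (end_x / 1e3, end_y / 1e3))
```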
An automated data management/analysis system for space shuttle orbiter tiles. [stress analysis
NASA Technical Reports Server (NTRS)
Giles, G. L.; Ballas, M.
1982-01-01
An engineering data management system was combined with a nonlinear stress analysis program to provide a capability for analyzing a large number of tiles on the space shuttle orbiter. Tile geometry data and all data necessary to define the tile loads environment are accessed automatically as needed for the analysis of a particular tile or a set of tiles. User documentation provided includes: (1) description of computer programs and data files contained in the system; (2) definitions of all engineering data stored in the data base; (3) characteristics of the tile analytical model; (4) instructions for preparation of user input; and (5) a sample problem to illustrate use of the system. Descriptions of data, computer programs, and analytical models of the tile are sufficiently detailed to guide extension of the system to include additional zones of tiles and/or additional types of analyses.
NASA Technical Reports Server (NTRS)
Krist, Steven E.; Ghaffari, Farhad
2015-01-01
Computational simulations for a Space Launch System configuration at liftoff conditions for incidence angles from 0 to 90 degrees were conducted in order to generate integrated force and moment data and longitudinal lineloads. While the integrated force and moment coefficients can be obtained from wind tunnel testing, computational analyses are indispensable in obtaining the extensive amount of surface information required to generate proper lineloads. However, beyond an incidence angle of about 15 degrees, the effects of massive flow separation on the leeward pressure field are not well captured with state-of-the-art Reynolds-Averaged Navier-Stokes methods, necessitating the employment of a Detached Eddy Simulation method. Results from these simulations are compared to the liftoff force and moment database and surface pressure data derived from a test in the NASA Langley 14- by 22-Foot Subsonic Wind Tunnel.
EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590
RE-PLAN: An Extensible Software Architecture to Facilitate Disaster Response Planning
O’Neill, Martin; Mikler, Armin R.; Indrakanti, Saratchandra; Tiwari, Chetan; Jimenez, Tamara
2014-01-01
Computational tools are needed to make data-driven disaster mitigation planning accessible to planners and policymakers without the need for programming or GIS expertise. To address this problem, we have created modules to facilitate quantitative analyses pertinent to a variety of different disaster scenarios. These modules, which comprise the REsponse PLan ANalyzer (RE-PLAN) framework, may be used to create tools for specific disaster scenarios that allow planners to harness large amounts of disparate data and execute computational models through a point-and-click interface. Bio-E, a user-friendly tool built using this framework, was designed to develop and analyze the feasibility of ad hoc clinics for treating populations following a biological emergency event. In this article, the design and implementation of the RE-PLAN framework are described, and the functionality of the modules used in the Bio-E biological emergency mitigation tool are demonstrated. PMID:25419503
T7 lytic phage-displayed peptide libraries: construction and diversity characterization.
Krumpe, Lauren R H; Mori, Toshiyuki
2014-01-01
In this chapter, we describe the construction of T7 bacteriophage (phage)-displayed peptide libraries and the diversity analyses of random amino acid sequences obtained from the libraries. We used commercially available reagents, Novagen's T7Select system, to construct the libraries. Using a combination of biotinylated extension primer and streptavidin-coupled magnetic beads, we were able to prepare library DNA without applying gel purification, resulting in extremely high ligation efficiencies. Further, we describe the use of bioinformatics tools to characterize library diversity. Amino acid frequency and positional amino acid diversity and hydropathy are estimated using the REceptor LIgand Contacts website http://relic.bio.anl.gov. Peptide net charge analysis and peptide hydropathy analysis are conducted using the Genetics Computer Group Wisconsin Package computational tools. A comprehensive collection of the estimated number of recombinants and titers of T7 phage-displayed peptide libraries constructed in our lab is included.
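The diversity characterization sketched above (positional amino acid frequencies and diversity) reduces to simple counting; the snippet below computes positional frequencies and Shannon diversity for a toy set of 7-mer peptides, as a stand-in for, not a reimplementation of, the RELIC and GCG Wisconsin Package tools cited here.

```python
from collections import Counter
from math import log2

# Toy 7-mer peptides as they might be read out of library sequencing.
peptides = ["ACDEFGH", "ACDWFGH", "MCDEFGY", "ACDEFGH", "TCDEFKH"]

for pos in range(len(peptides[0])):
    counts = Counter(p[pos] for p in peptides)
    total = sum(counts.values())
    freqs = [c / total for c in counts.values()]
    shannon = -sum(f * log2(f) for f in freqs)        # positional diversity in bits
    print(f"position {pos + 1}: {dict(counts)}  H = {shannon:.2f} bits")
```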
Wang, Anliang; Yan, Xiaolong; Wei, Zhijun
2018-04-27
This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution is concentrated on facilitating extensibility and interoperability of the software through decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to be applied in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.
Designs and performance of microprocessor-controlled knee joints.
Thiele, Julius; Westebbe, Bettina; Bellmann, Malte; Kraft, Marc
2014-02-01
In this comparative study, three transfemoral amputee subjects were fitted with four different microprocessor-controlled exoprosthetic knee joints (MPK): C-Leg, Orion, Plié2.0, and Rel-K. In a motion analysis laboratory, objective gait measures were acquired during level walking at different velocities. Subsequent technical analyses, which involved X-ray computed tomography, identified the functional mechanisms of each device and enabled corroboration of the performance in the gait laboratory by the engineering design of the MPK. Gait measures showed that the mean increase of the maximum knee flexion angle at different walking velocities was closest in value to the unaffected contralateral knee (6.2°/m/s) with C-Leg (3.5°/m/s; Rel-K 17.0°/m/s, Orion 18.3°/m/s, and Plié2.0 28.1°/m/s). Technical analyses corroborated that only with Plié2.0 the flexion resistances were not regulated by microprocessor control at different walking velocities. The muscular effort for the initiation of the swing phase, measured by the minimum hip moment, was found to be lowest with C-Leg (-82.1±14.1 Nm; Rel-K -83.59±17.8 Nm, Orion -88.0±16.3 Nm, and Plié2.0 -91.6±16.5 Nm). Reaching the extension stop at the end of swing phase was reliably executed with both Plié2.0 and C-Leg. Abrupt terminal stance phase extension observed with Plié2.0 and Rel-K could be attributed to the absence of microprocessor control of extension resistance.
AstroCloud: An Agile platform for data visualization and specific analyses in 2D and 3D
NASA Astrophysics Data System (ADS)
Molina, F. Z.; Salgado, R.; Bergel, A.; Infante, A.
2017-07-01
Nowadays, astronomers commonly run their own tools, or distributed computational packages, for data analysis and then visualize the results with generic applications. This chain of processes comes at a high cost: (a) analyses are applied manually and are therefore difficult to automate, and (b) data have to be serialized, increasing the cost of parsing and saving intermediary data. We are developing AstroCloud, an agile, multipurpose visualization platform intended for specific analyses of astronomical images (https://astrocloudy.wordpress.com). The platform incorporates domain-specific languages, which make it easily extensible. AstroCloud supports customized plug-ins, which reduce the time spent on data analysis. It also supports 2D and 3D rendering, including interactive features in real time. AstroCloud is under development; we are currently implementing different options for data reduction and physical analyses.
Visualising Conversation Structure across Time: Insights into Effective Doctor-Patient Consultations
Angus, Daniel; Watson, Bernadette; Smith, Andrew; Gallois, Cindy; Wiles, Janet
2012-01-01
Effective communication between healthcare professionals and patients is critical to patients’ health outcomes. The doctor/patient dialogue has been extensively researched from different perspectives, with findings emphasising a range of behaviours that lead to effective communication. Much research involves self-reports, however, so that behavioural engagement cannot be disentangled from patients’ ratings of effectiveness. In this study we used a highly efficient and time-economical automated computer visualisation and measurement technique called Discursis to analyse conversational behaviour in consultations. Discursis automatically builds an internal language model from a transcript, mines the transcript for its conceptual content, and generates an interactive visual account of the discourse. The resultant visual account of the whole consultation can be analysed for patterns of engagement between interactants. The findings from this study show that Discursis is effective at highlighting a range of consultation techniques, including communication accommodation, engagement and repetition. PMID:22693629
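Discursis' visual account is, at heart, a recurrence-style matrix of conceptual similarity between successive turns of talk. A much-simplified stand-in (plain word overlap rather than Discursis' internal language model) shows the shape of that computation:

```python
# Each turn is (speaker, text); similarity here is Jaccard overlap of word sets.
turns = [
    ("Dr", "how has the pain in your knee been this week"),
    ("Pt", "the knee pain is worse when I climb stairs"),
    ("Dr", "does resting the knee make the pain settle"),
    ("Pt", "resting helps but stairs are still a problem"),
]

def words(text):
    return set(text.lower().split())

n = len(turns)
recurrence = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(n):
        a, b = words(turns[i][1]), words(turns[j][1])
        recurrence[i][j] = len(a & b) / len(a | b)

# Rows and columns follow conversation order, so off-diagonal cells show when a
# later turn re-engages the concepts of an earlier one.
for (speaker, _), row in zip(turns, recurrence):
    print(speaker, ["%.2f" % v for v in row])
```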
Sierra/Solid Mechanics 4.48 User's Guide.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merewether, Mark Thomas; Crane, Nathan K; de Frias, Gabriel Jose
Sierra/SolidMechanics (Sierra/SM) is a Lagrangian, three-dimensional code for finite element analysis of solids and structures. It provides capabilities for explicit dynamic, implicit quasistatic and dynamic analyses. The explicit dynamics capabilities allow for the efficient and robust solution of models with extensive contact subjected to large, suddenly applied loads. For implicit problems, Sierra/SM uses a multi-level iterative solver, which enables it to effectively solve problems with large deformations, nonlinear material behavior, and contact. Sierra/SM has a versatile library of continuum and structural elements, and a large library of material models. The code is written for parallel computing environments, enabling scalable solutions of extremely large problems for both implicit and explicit analyses. It is built on the SIERRA Framework, which facilitates coupling with other SIERRA mechanics codes. This document describes the functionality and input syntax for Sierra/SM.
40 CFR 305.6 - Computation and extension of time.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 29 2012-07-01 2012-07-01 false Computation and extension of time. 305.6 Section 305.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND..., AND LIABILITY ACT (CERCLA) ADMINISTRATIVE HEARING PROCEDURES FOR CLAIMS AGAINST THE SUPERFUND General...
40 CFR 305.6 - Computation and extension of time.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 29 2013-07-01 2013-07-01 false Computation and extension of time. 305.6 Section 305.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND..., AND LIABILITY ACT (CERCLA) ADMINISTRATIVE HEARING PROCEDURES FOR CLAIMS AGAINST THE SUPERFUND General...
40 CFR 305.6 - Computation and extension of time.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 28 2011-07-01 2011-07-01 false Computation and extension of time. 305.6 Section 305.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND..., AND LIABILITY ACT (CERCLA) ADMINISTRATIVE HEARING PROCEDURES FOR CLAIMS AGAINST THE SUPERFUND General...
Neck postures in air traffic controllers with and without neck/shoulder disorders.
Arvidsson, Inger; Hansson, Gert-Ake; Mathiassen, Svend Erik; Skerfving, Staffan
2008-03-01
Prolonged computer work with an extended neck is commonly believed to be associated with an increased risk of neck-shoulder disorders. The aim of this study was to compare neck postures during computer work between female cases with neck-shoulder disorders and healthy referents. Based on physical examinations, 13 cases and 11 referents were selected among 70 female air traffic controllers with the same computer-based work tasks and identical workstations. Postures and movements were measured by inclinometers, placed on the forehead and upper back (C7/Th1), during authentic air traffic control. A recently developed method was applied to assess flexion/extension in the neck, calculated as the difference between head and upper back flexion/extension. Cases and referents did not differ significantly in neck posture (median neck flexion/extension: -10 degrees vs. -9 degrees; p=0.9). Hence, the belief that neck extension posture is associated with neck-shoulder disorders in computer work is not supported by the present data.
Tetley, Robert J; Blanchard, Guy B; Fletcher, Alexander G; Adams, Richard J; Sanson, Bénédicte
2016-01-01
Convergence and extension movements elongate tissues during development. Drosophila germ-band extension (GBE) is one example, which requires active cell rearrangements driven by Myosin II planar polarisation. Here, we develop novel computational methods to analyse the spatiotemporal dynamics of Myosin II during GBE, at the scale of the tissue. We show that initial Myosin II bipolar cell polarization gives way to unipolar enrichment at parasegmental boundaries and two further boundaries within each parasegment, concomitant with a doubling of cell number as the tissue elongates. These boundaries are the primary sites of cell intercalation, behaving as mechanical barriers and providing a mechanism for how cells remain ordered during GBE. Enrichment at parasegment boundaries during GBE is independent of Wingless signaling, suggesting pair-rule gene control. Our results are consistent with recent work showing that a combinatorial code of Toll-like receptors downstream of pair-rule genes contributes to Myosin II polarization via local cell-cell interactions. We propose an updated cell-cell interaction model for Myosin II polarization that we tested in a vertex-based simulation. DOI: http://dx.doi.org/10.7554/eLife.12094.001 PMID:27183005
Hutton, Brian; Salanti, Georgia; Caldwell, Deborah M; Chaimani, Anna; Schmid, Christopher H; Cameron, Chris; Ioannidis, John P A; Straus, Sharon; Thorlund, Kristian; Jansen, Jeroen P; Mulrow, Cynthia; Catalá-López, Ferrán; Gøtzsche, Peter C; Dickersin, Kay; Boutron, Isabelle; Altman, Douglas G; Moher, David
2015-06-02
The PRISMA statement is a reporting guideline designed to improve the completeness of reporting of systematic reviews and meta-analyses. Authors have used this guideline worldwide to prepare their reviews for publication. In the past, these reports typically compared 2 treatment alternatives. With the evolution of systematic reviews that compare multiple treatments, some of them only indirectly, authors face novel challenges for conducting and reporting their reviews. This extension of the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-analyses) statement was developed specifically to improve the reporting of systematic reviews incorporating network meta-analyses. A group of experts participated in a systematic review, Delphi survey, and face-to-face discussion and consensus meeting to establish new checklist items for this extension statement. Current PRISMA items were also clarified. A modified, 32-item PRISMA extension checklist was developed to address what the group considered to be immediately relevant to the reporting of network meta-analyses. This document presents the extension and provides examples of good reporting, as well as elaborations regarding the rationale for new checklist items and the modification of previously existing items from the PRISMA statement. It also highlights educational information related to key considerations in the practice of network meta-analysis. The target audience includes authors and readers of network meta-analyses, as well as journal editors and peer reviewers.
Relationship between coracoacromial arch and rotator cuff analysed by a computer-assisted method.
Casino, Daniela; Bruni, Danilo; Zaffagnini, Stefano; Martelli, Sandra; Visani, Andrea; Alvarez, Pau Golanò; Marcacci, Maurilio
2008-06-01
In this paper we describe and assess the feasibility of a computer-assisted method which could be useful to investigate the mechanism of subacromial impingement of the shoulder. The relationship between the infraspinatus and supraspinatus and the coracoacromial (CA) arch during passive elevation and abduction is described. The methodology is based on the use of a tracker for recording surfaces and passive movements and data elaboration using dedicated software. In four cadavers, we observed that the minimal distances between the rotator cuff insertions and the CA arch were realized at 45 degrees abduction between the acromion and infraspinatus, at 50-90 degrees elevation between the acromion and supraspinatus, and also at 45-70 degrees abduction between the CA ligament and supraspinatus. This study showed that the proposed method is able to provide repeatable kinematic data (ICC > or = 0.90), numerical anatomical data comparable with the literature and, moreover, individual measurements on the shoulder joint. These preliminary results support the extension of the methodology to an in vivo protocol to be used during computer-assisted arthroscopic surgery. (c) 2008 John Wiley & Sons, Ltd.
Research Area 3: Mathematical Sciences: 3.4, Discrete Mathematics and Computer Science
2015-06-10
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 3 2014-04-01 2014-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 201.14 Section 201.14 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF GENERAL APPLICATION Initiation and Conduct of Investigations...
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 3 2013-04-01 2013-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 201.14 Section 201.14 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF GENERAL APPLICATION Initiation and Conduct of Investigations...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 3 2010-04-01 2010-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 210.6 Section 210.6 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION INVESTIGATIONS OF UNFAIR PRACTICES IN IMPORT TRADE ADJUDICATION AND ENFORCEMENT...
45 CFR 150.429 - Computation of time and extensions of time.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 1 2010-10-01 2010-10-01 false Computation of time and extensions of time. 150.429 Section 150.429 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings...
45 CFR 150.429 - Computation of time and extensions of time.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Computation of time and extensions of time. 150.429 Section 150.429 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-11
... Rule 23 Post-Hearing Briefs; Rule 24 Transcript of Proceedings; Rule 25 Withdrawal of Exhibits ... from Court. TIME, COMPUTATION, AND EXTENSIONS: Rule 33 Time, Computation and Extensions. EX PARTE COMMUNICATIONS: Rule 34 Ex Parte Communications. SANCTIONS: Rule 35 Sanctions. EFFECTIVE DATE AND APPLICABILITY: Rule ...
Inertial subsystem functional and design requirements for the orbiter (Phase B extension baseline)
NASA Technical Reports Server (NTRS)
Flanders, J. H.; Green, J. P., Jr.
1972-01-01
The design requirements use the Phase B extension baseline system definition. This means that a GNC computer is specified for all command control functions instead of a central computer communicating with the ISS through a databus. Forced air cooling is used instead of cold plate cooling.
ATTDES: An Expert System for Satellite Attitude Determination and Control. 2
NASA Technical Reports Server (NTRS)
Mackison, Donald L.; Gifford, Kevin
1996-01-01
The design, analysis, and flight operations of satellite attitude determination and attitude control systems require extensive mathematical formulations, optimization studies, and computer simulation. This is best done by an analyst with extensive education and experience. The development of programs such as ATTDES permits the use of advanced techniques by those with less experience. Typical tasks include the mission analysis to select stabilization and damping schemes, attitude determination sensors and algorithms, and control system designs to meet program requirements. ATTDES is a system that includes all of these activities, including high fidelity orbit environment models that can be used for preliminary analysis, parameter selection, stabilization schemes, the development of estimators, covariance analyses, and optimization, and can support ongoing orbit activities. The modification of existing simulations to model new configurations for these purposes can be an expensive, time consuming activity that becomes a pacing item in the development and operation of such new systems. The use of an integrated tool such as ATTDES significantly reduces the effort and time required for these tasks.
Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools
NASA Astrophysics Data System (ADS)
Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.
2015-12-01
Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software used to execute analyses and models, and derived data and products that arise from these computations. This provenance is vital to interpretation and understanding of science, and provides an audit trail that researchers can use to understand and replicate computational workflows in the geosciences.
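The capture pattern described for 'recordr' and `matlab-dataone` can be shown generically: wrap each analysis step so that its identity, inputs, outputs and timing are appended to a provenance trace with no change to how the step is called. The sketch below is a plain-Python analogue of that idea, not the actual recordr or matlab-dataone interfaces, and the JSON dump merely stands in for a ProvONE/W3C PROV serialization.

```python
import functools, hashlib, json, time

TRACE = []   # in-memory provenance log; a real tool would persist and serialize this

def provenance(step):
    """Record the inputs, output and timing of a decorated analysis step."""
    @functools.wraps(step)
    def wrapper(*args, **kwargs):
        started = time.time()
        result = step(*args, **kwargs)
        TRACE.append({
            "step": step.__name__,
            "inputs": hashlib.sha256(repr((args, kwargs)).encode()).hexdigest()[:12],
            "output": hashlib.sha256(repr(result).encode()).hexdigest()[:12],
            "started": started,
            "seconds": round(time.time() - started, 4),
        })
        return result
    return wrapper

@provenance
def detrend(series):
    mean = sum(series) / len(series)
    return [x - mean for x in series]

@provenance
def annual_max(series):
    return max(series)

print(annual_max(detrend([3.1, 2.8, 3.9, 4.2])))
print(json.dumps(TRACE, indent=2))   # stand-in for exporting the provenance trace
```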
NMRbox: A Resource for Biomolecular NMR Computation.
Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C
2017-04-25
Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.
Maurer, Britta; Suliman, Yossra A.; Morsbach, Fabian; Distler, Oliver; Frauenfelder, Thomas
2018-01-01
Background To evaluate usability of slice-reduced sequential computed tomography (CT) compared to standard high-resolution CT (HRCT) in patients with systemic sclerosis (SSc) for qualitative and quantitative assessment of interstitial lung disease (ILD) with respect to (I) detection of lung parenchymal abnormalities, (II) qualitative and semiquantitative visual assessment, (III) quantification of ILD by histograms and (IV) accuracy for the 20%-cut off discrimination. Methods From standard chest HRCT of 60 SSc patients sequential 9-slice-computed tomography (reduced HRCT) was retrospectively reconstructed. ILD was assessed by visual scoring and quantitative histogram parameters. Results from standard and reduced HRCT were compared using non-parametric tests and analysed by univariate linear regression analyses. Results With respect to the detection of parenchymal abnormalities, only the detection of intrapulmonary bronchiectasis was significantly lower in reduced HRCT compared to standard HRCT (P=0.039). No differences were found comparing visual scores for fibrosis severity and extension from standard and reduced HRCT (P=0.051–0.073). All scores correlated significantly (P<0.001) to histogram parameters derived from both, standard and reduced HRCT. Significant higher values of kurtosis and skewness for reduced HRCT were found (both P<0.001). In contrast to standard HRCT histogram parameters from reduced HRCT showed significant discrimination at cut-off 20% fibrosis (sensitivity 88% kurtosis and skewness; specificity 81% kurtosis and 86% skewness; cut-off kurtosis ≤26, cut-off skewness ≤4; both P<0.001). Conclusions Reduced HRCT is a robust method to assess lung fibrosis in SSc with minimal radiation dose with no difference in scoring assessment of lung fibrosis severity and extension in comparison to standard HRCT. In contrast to standard HRCT histogram parameters derived from the approach of reduced HRCT could discriminate at a threshold of 20% lung fibrosis with high sensitivity and specificity. Hence it might be used to detect early disease progression of lung fibrosis in context of monitoring and treatment of SSc patients. PMID:29850118
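The histogram step described above can be sketched numerically; the example below uses synthetic attenuation values in Hounsfield units and scipy's definitions of skewness and (Pearson) kurtosis, which may not match the conventions behind the paper's cut-off values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic lung attenuation histograms (HU). "Normal": a narrow aerated peak
# with a small dense tail (vessels); "fibrotic": a larger dense component that
# shifts mass toward higher HU and flattens the histogram.
normal = np.concatenate([rng.normal(-870, 40, 48500), rng.normal(-300, 150, 1500)])
fibrotic = np.concatenate([rng.normal(-870, 40, 37500), rng.normal(-500, 150, 12500)])

for label, hu in (("normal", normal), ("fibrotic", fibrotic)):
    print(f"{label:9s} skewness {stats.skew(hu):5.2f}  "
          f"kurtosis {stats.kurtosis(hu, fisher=False):6.2f}")
# Both statistics drop as the dense, fibrosis-like fraction grows, which is the
# direction of change the visual fibrosis scores are being compared against.
```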
Evaluation of SNS Beamline Shielding Configurations using MCNPX Accelerated by ADVANTG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Risner, Joel M; Johnson, Seth R.; Remec, Igor
2015-01-01
Shielding analyses for the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory pose significant computational challenges, including highly anisotropic high-energy sources, a combination of deep penetration shielding and an unshielded beamline, and a desire to obtain well-converged nearly global solutions for mapping of predicted radiation fields. The majority of these analyses have been performed using MCNPX with manually generated variance reduction parameters (source biasing and cell-based splitting and Russian roulette) that were largely based on the analyst's insight into the problem specifics. Development of the variance reduction parameters required extensive analyst time, and was often tailored to specific portions of the model phase space. We previously applied a developmental version of the ADVANTG code to an SNS beamline study to perform a hybrid deterministic/Monte Carlo analysis and showed that we could obtain nearly global Monte Carlo solutions with essentially uniform relative errors for mesh tallies that cover extensive portions of the model with typical voxel spacing of a few centimeters. The use of weight window maps and consistent biased sources produced using the FW-CADIS methodology in ADVANTG allowed us to obtain these solutions using substantially less computer time than the previous cell-based splitting approach. While those results were promising, the process of using the developmental version of ADVANTG was somewhat laborious, requiring user-developed Python scripts to drive much of the analysis sequence. In addition, limitations imposed by the size of weight-window files in MCNPX necessitated the use of relatively coarse spatial and energy discretization for the deterministic Denovo calculations that we used to generate the variance reduction parameters. We recently applied the production version of ADVANTG to this beamline analysis, which substantially streamlined the analysis process. We also tested importance function collapsing (in space and energy) capabilities in ADVANTG. These changes, along with the support for parallel Denovo calculations using the current version of ADVANTG, give us the capability to improve the fidelity of the deterministic portion of the hybrid analysis sequence, obtain improved weight-window maps, and reduce both the analyst and computational time required for the analysis process.
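For orientation, the CADIS/FW-CADIS methodology mentioned above derives consistent source biasing and weight-window parameters from a deterministic adjoint solution; the relations below are the standard textbook forms of CADIS and are included only as background, not as a transcription of the specific ADVANTG implementation used in this study.

\[
R = \int\!\!\int \phi^{\dagger}(\vec r,E)\,q(\vec r,E)\,dE\,d\vec r,
\qquad
\hat q(\vec r,E) = \frac{\phi^{\dagger}(\vec r,E)\,q(\vec r,E)}{R},
\qquad
\bar w(\vec r,E) = \frac{R}{\phi^{\dagger}(\vec r,E)},
\]

where \(\phi^{\dagger}\) is the adjoint flux computed for the tally response of interest, \(q\) the physical source, \(\hat q\) the consistently biased source, and \(\bar w\) the weight-window target. FW-CADIS additionally weights the adjoint source using an approximate forward solution so that multiple tallies, or a global mesh tally, converge with roughly uniform relative error.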
Performance Improvement Through Indexing of Turbine Airfoils. Part 2; Numerical Simulation
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Huber, Frank W.; Sharma, Om P.
1996-01-01
An experimental/analytical study has been conducted to determine the performance improvements achievable by circumferentially indexing succeeding rows of turbine stator airfoils. A series of tests was conducted to experimentally investigate stator wake clocking effects on the performance of the space shuttle main engine (SSME) alternate turbopump development (ATD) fuel turbine test article (TTA). The results from this study indicate that significant increases in stage efficiency can be attained through application of this airfoil clocking concept. Details of the experiment and its results are documented in part 1 of this paper. In order to gain insight into the mechanisms of the performance improvement, extensive computational fluid dynamics (CFD) simulations were executed. The subject of the present paper is the initial results from the CFD investigation of the configurations and conditions detailed in part 1 of the paper. To characterize the aerodynamic environments in the experimental test series, two-dimensional (2D), time accurate, multistage, viscous analyses were performed at the TTA midspan. Computational analyses for five different circumferential positions of the first stage stator have been completed. Details of the computational procedure and the results are presented. The analytical results verify the experimentally demonstrated performance improvement and are compared with data whenever possible. Predictions of time-averaged turbine efficiencies as well as gas conditions throughout the flow field are presented. An initial understanding of the turbine performance improvement mechanism based on the results from this investigation is described.
Impact of implementation choices on quantitative predictions of cell-based computational models
NASA Astrophysics Data System (ADS)
Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.
2017-09-01
'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
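To make the implementation sensitivity discussed above concrete, the sketch below shows the kind of explicit time-stepping loop typically used in overdamped vertex-model codes; the force function, drag coefficient, and time step are illustrative placeholders (assumptions for this sketch), not the specific scheme used by the authors.

    import numpy as np

    def euler_step(vertices, force_fn, dt, drag=1.0):
        """One forward Euler update of overdamped vertex dynamics.

        vertices : (N, 2) array of vertex positions
        force_fn : callable returning an (N, 2) array of mechanical forces
        dt       : time step -- the abstract notes results can depend on it
        drag     : friction coefficient relating force to velocity
        """
        forces = force_fn(vertices)
        return vertices + (dt / drag) * forces

    # Illustrative force only: pull every vertex toward the tissue centroid.
    # A real vertex model derives forces from area and perimeter elasticity.
    def toy_force(vertices):
        return -(vertices - vertices.mean(axis=0))

    verts = np.random.rand(12, 2)
    for _ in range(1000):
        # In a well-behaved scheme, halving dt should not change the converged statistics.
        verts = euler_step(verts, toy_force, dt=0.01)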
Halogen Bonding versus Hydrogen Bonding: A Molecular Orbital Perspective
Wolters, Lando P; Bickelhaupt, F Matthias
2012-01-01
We have carried out extensive computational analyses of the structure and bonding mechanism in trihalides DX⋅⋅⋅A− and the analogous hydrogen-bonded complexes DH⋅⋅⋅A− (D, X, A=F, Cl, Br, I) using relativistic density functional theory (DFT) at zeroth-order regular approximation ZORA-BP86/TZ2P. One purpose was to obtain a set of consistent data from which reliable trends in structure and stability can be inferred over a large range of systems. The main objective was to achieve a detailed understanding of the nature of halogen bonds, how they resemble, and also how they differ from, the better understood hydrogen bonds. Thus, we present an accurate physical model of the halogen bond based on quantitative Kohn–Sham molecular orbital (MO) theory, energy decomposition analyses (EDA) and Voronoi deformation density (VDD) analyses of the charge distribution. It appears that the halogen bond in DX⋅⋅⋅A− arises not only from classical electrostatic attraction but also receives substantial stabilization from HOMO–LUMO interactions between the lone pair of A− and the σ* orbital of D–X. PMID:24551497
Integrating Xgrid into the HENP distributed computing model
NASA Astrophysics Data System (ADS)
Hajdu, L.; Kocoloski, A.; Lauret, J.; Miller, M.
2008-07-01
Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix-based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide users with a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making task and job submission effortless for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.
A novel nonsense mutation in the NDP gene in a Chinese family with Norrie disease.
Liu, Deyuan; Hu, Zhengmao; Peng, Yu; Yu, Changhong; Liu, Yalan; Mo, Xiaoyun; Li, Xiaoping; Lu, Lina; Xu, Xiaojuan; Su, Wei; Pan, Qian; Xia, Kun
2010-12-08
Norrie disease (ND), a rare X-linked recessive disorder, is characterized by congenital blindness and, occasionally, mental retardation and hearing loss. ND is caused by the Norrie Disease Protein gene (NDP), which codes for norrin, a cysteine-rich protein involved in ocular vascular development. Here, we report a novel mutation of NDP that was identified in a Chinese family in which three members displayed typical ND symptoms and other complex phenotypes, such as cerebellar atrophy, motor disorders, and mental disorders. We conducted an extensive clinical examination of the proband and performed a computed tomography (CT) scan of his brain. Additionally, we performed ophthalmic examinations, haplotype analyses, and NDP DNA sequencing for 26 individuals from the proband's extended family. The proband's computed tomography scan, in which the fifth ventricle could be observed, indicated cerebellar atrophy. Genome scans and haplotype analyses traced the disease to chromosome Xp21.1-p11.22. Mutation screening of the NDP gene identified a novel nonsense mutation, c.343C>T, in this region. Although recent research has shown that multiple different mutations can be responsible for the ND phenotype, additional research is needed to understand the mechanism responsible for the diverse phenotypes caused by mutations in the NDP gene.
A novel nonsense mutation in the NDP gene in a Chinese family with Norrie disease
Liu, Deyuan; Hu, Zhengmao; Peng, Yu; Yu, Changhong; Liu, Yalan; Mo, Xiaoyun; Li, Xiaoping; Lu, Lina; Xu, Xiaojuan; Su, Wei; Pan, Qian
2010-01-01
Purpose Norrie disease (ND), a rare X-linked recessive disorder, is characterized by congenital blindness and, occasionally, mental retardation and hearing loss. ND is caused by the Norrie Disease Protein gene (NDP), which codes for norrin, a cysteine-rich protein involved in ocular vascular development. Here, we report a novel mutation of NDP that was identified in a Chinese family in which three members displayed typical ND symptoms and other complex phenotypes, such as cerebellar atrophy, motor disorders, and mental disorders. Methods We conducted an extensive clinical examination of the proband and performed a computed tomography (CT) scan of his brain. Additionally, we performed ophthalmic examinations, haplotype analyses, and NDP DNA sequencing for 26 individuals from the proband’s extended family. Results The proband’s computed tomography scan, in which the fifth ventricle could be observed, indicated cerebellar atrophy. Genome scans and haplotype analyses traced the disease to chromosome Xp21.1-p11.22. Mutation screening of the NDP gene identified a novel nonsense mutation, c.343C>T, in this region. Conclusions Although recent research has shown that multiple different mutations can be responsible for the ND phenotype, additional research is needed to understand the mechanism responsible for the diverse phenotypes caused by mutations in the NDP gene. PMID:21179243
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S
The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.
How Extension Can Help Communities Conduct Impact Analyses.
ERIC Educational Resources Information Center
Wisconsin Univ., Madison. Dept. of Agricultural Journalism.
Intended to provide guidance to Extension specialists and agents faced with requests for impact analyses from communities experiencing economic development, this report also summarizes issues that need to be considered. The first section, on private sector impacts, addresses questions on predicting changes in production, employment, and housing…
MetalPDB in 2018: a database of metal sites in biological macromolecular structures.
Putignano, Valeria; Rosato, Antonio; Banci, Lucia; Andreini, Claudia
2018-01-04
MetalPDB (http://metalweb.cerm.unifi.it/) is a database providing information on metal-binding sites detected in the three-dimensional (3D) structures of biological macromolecules. MetalPDB represents such sites as 3D templates, called Minimal Functional Sites (MFSs), which describe the local environment around the metal(s) independently of the larger context of the macromolecular structure. The 2018 update of MetalPDB includes new contents and tools. A major extension is the inclusion of proteins whose structures do not contain metal ions although their sequences potentially contain a known MFS. In addition, MetalPDB now provides extensive statistical analyses addressing several aspects of general metal usage within the PDB, across protein families and in catalysis. Users can also query MetalPDB to extract statistical information on structural aspects associated with individual metals, such as preferred coordination geometries or aminoacidic environment. A further major improvement is the functional annotation of MFSs; the annotation is manually performed via a password-protected annotator interface. At present, ∼50% of all MFSs have such a functional annotation. Other noteworthy improvements are bulk query functionality, through the upload of a list of PDB identifiers, and ftp access to MetalPDB contents, allowing users to carry out in-depth analyses on their own computational infrastructure. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Zakerian, SA; Subramaniam, ID
2011-01-01
Background: With computers rapidly carving a niche in virtually every nook and crevice of today’s fast-paced society, musculoskeletal disorders are becoming more prevalent among computer users, which comprise a wide spectrum of the Malaysian population, including office workers. While extant literature depicts extensive research on musculoskeletal disorders in general, the five dimensions of psychosocial work factors (job demands, job contentment, job control, computer-related problems and social interaction) attributed to work-related musculoskeletal disorders have been neglected. This study examines the aforementioned elements in detail, pertaining to their relationship with musculoskeletal disorders, focusing, in particular, on 120 office workers at Malaysian public sector organizations whose jobs require intensive computer usage. Methods: Research was conducted between March and July 2009 in public service organizations in Malaysia. The study was conducted via a survey utilizing self-complete questionnaires and a diary. The relationship between psychosocial work factors and musculoskeletal discomfort was ascertained through regression analyses, which revealed that some factors were more important than others. Results: The results indicate a significant relationship between psychosocial work factors and musculoskeletal discomfort among computer users. Several of these factors, such as job control, computer-related problems, and social interaction, were found to be more important than others for musculoskeletal discomfort. Conclusion: With computer usage on the rise, the prevalence of musculoskeletal discomfort could lead to unnecessary disabilities; hence the vital need for greater attention to this aspect in the workplace, to alleviate, to some extent, potential problems in the future. PMID:23113058
NASA Astrophysics Data System (ADS)
Magnin, H.; Coulomb, J. L.
1993-03-01
Electromagnetic field computation with the Finite Element (FE) method implies solving large linear systems of equations. The performance and memory capacity of today's computers make three-dimensional FE discretizations of electromagnetic problems feasible, but the number of unknowns becomes very large. Thus, to reduce the time to the numerical solution of the linear system(s) that arise, the use of parallel and/or vector computers has to be envisaged. In this paper, the main constitutive steps of the Preconditioned Conjugate Gradient (PCG) algorithm are analysed. After a brief review of our previous work on improving them through vector and parallel computation, we show some speedup limitations due to the sparse row-wise matrix storage scheme employed. We then propose an extension of this matrix representation that introduces redundant storage of the non-zero coefficients. In spite of the memory overhead thus implied, we show how this extension can be successfully employed to increase the speedup due to parallelism and vectorization over the whole algorithm, and in particular to derive a parallel preconditioner.
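As a point of reference for the algorithm discussed above, a serial Jacobi-preconditioned conjugate gradient over a compressed sparse row matrix can be written in a few lines; this is a generic sketch of PCG, not the parallel/vector implementation or the redundant storage scheme proposed by the authors.

    import numpy as np
    import scipy.sparse as sp

    def pcg(A, b, tol=1e-8, maxit=500):
        """Jacobi-preconditioned CG for a symmetric positive-definite CSR matrix A."""
        M_inv = 1.0 / A.diagonal()            # Jacobi (diagonal) preconditioner
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv * r
        p = z.copy()
        rz = r @ z
        for _ in range(maxit):
            Ap = A @ p                        # sparse matrix-vector product: the dominant cost
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol * np.linalg.norm(b):
                break
            z = M_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # Small 1D Laplacian as a stand-in for an FE system matrix.
    A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(100, 100), format="csr")
    x = pcg(A, np.ones(100))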
Kimbrow, Dustin R.
2014-01-01
Topographic survey data of areas on Dauphin Island on the Alabama coast were collected using a truck-mounted mobile terrestrial light detection and ranging system. This system is composed of a high frequency laser scanner in conjunction with an inertial measurement unit and a position and orientation computer to produce highly accurate topographic datasets. A global positioning system base station was set up on a nearby benchmark and logged vertical and horizontal position information during the survey for post-processing. Survey control points were also collected throughout the study area to determine residual errors. Data were collected 5 days after Hurricane Isaac made landfall in early September 2012 to document sediment deposits prior to clean-up efforts. Three data files in ASCII text format with the extension .xyz are included in this report, and each file is named according to both the acquisition date and the relative geographic location on Dauphin Island (for example, 20120903_Central.xyz). Metadata are also included for each of the files in both Extensible Markup Language with the extension .xml and ASCII text formats. These topographic data can be used to analyze the effects of storm surge on barrier island environments and also serve as a baseline dataset for future change detection analyses.
Experiment Software and Projects on the Web with VISPA
NASA Astrophysics Data System (ADS)
Erdmann, M.; Fischer, B.; Fischer, R.; Geiser, E.; Glaser, C.; Müller, G.; Rieger, M.; Urban, M.; von Cube, R. F.; Welling, C.
2017-10-01
The Visual Physics Analysis (VISPA) project defines a toolbox for accessing software via the web. It is based on the latest web technologies and provides a powerful extension mechanism that makes it possible to interface a wide range of applications. Beyond basic applications such as a code editor, a file browser, or a terminal, it meets the demands of sophisticated experiment-specific use cases that focus on physics data analyses and typically require a high degree of interactivity. As an example, we developed a data inspector that is capable of browsing interactively through event content of several data formats, e.g., MiniAOD, which is utilized by the CMS collaboration. The VISPA extension mechanism can also be used to embed external web-based applications that benefit from dynamic allocation of user-defined computing resources via SSH. For example, by wrapping the JSROOT project, ROOT files located on any remote machine can be inspected directly through a VISPA server instance. We introduced domains that combine groups of users and role-based permissions. Thereby, tailored projects are enabled, e.g., for teaching, where access to students' homework is restricted to a team of tutors, or for experiment-specific data that may only be accessible for members of the collaboration. We present the extension mechanism including corresponding applications and give an outlook on the new permission system.
Computational Models of Rock Failure
NASA Astrophysics Data System (ADS)
May, Dave A.; Spiegelman, Marc
2017-04-01
Practitioners in computational geodynamics, as in many other branches of applied science, typically do not analyse the underlying PDEs being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or simply have not been proven. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDEs, we attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure-dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers and risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of rock failure suitable for geodynamic studies.
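For context, a form of the pressure-dependent flow law commonly adopted in such incompressible viscous models is sketched below; the notation is generic and may differ in detail from the authors' exact formulation.

\[
\sigma_Y = C\cos\varphi + p\,\sin\varphi,
\qquad
\eta_{\mathrm{eff}} = \frac{\sigma_Y}{2\,\dot\varepsilon_{II}},
\qquad
\nabla\cdot\vec u = 0,
\]

where \(C\) is the cohesion, \(\varphi\) the friction angle, \(p\) the pressure, and \(\dot\varepsilon_{II}\) the second invariant of the strain-rate tensor. The yielded viscosity \(\eta_{\mathrm{eff}}\) depends on \(p\), which is itself part of the incompressible Stokes solution; the non-existence result described in the abstract concerns exactly this coupling.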
InSAR Scientific Computing Environment - The Home Stretch
NASA Astrophysics Data System (ADS)
Rosen, P. A.; Gurrola, E. M.; Sacco, G.; Zebker, H. A.
2011-12-01
The Interferometric Synthetic Aperture Radar (InSAR) Scientific Computing Environment (ISCE) is a software development effort in its third and final year within the NASA Advanced Information Systems and Technology program. The ISCE is a new computing environment for geodetic image processing for InSAR sensors enabling scientists to reduce measurements directly from radar satellites to new geophysical products with relative ease. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. Upcoming international SAR missions will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment has the functionality to become a key element in processing data from NASA's proposed DESDynI mission into higher level data products, supporting a new class of analyses that take advantage of the long time and large spatial scales of these new data. At the core of ISCE is a new set of efficient and accurate InSAR algorithms. These algorithms are placed into an object-oriented, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. ISCE supports data from nearly all of the available satellite platforms, including ERS, EnviSAT, Radarsat-1, Radarsat-2, ALOS, TerraSAR-X, and Cosmo-SkyMed. The code applies a number of parallelization techniques and sensible approximations for speed. It is configured to work on modern linux-based computers with gcc compilers and python. ISCE is now a complete, functional package, under configuration management, and with extensive documentation and tested use cases appropriate to geodetic imaging applications. The software has been tested with canonical simulated radar data ("point targets") as well as with a variety of existing satellite data, cross-compared with other software packages. Its extensibility has already been proven by the straightforward addition of polarimetric processing and calibration, and derived filtering and estimation routines associated with polarimetry that supplement the original InSAR geodetic functionality. As of October 2011, the software is available for non-commercial use through UNAVCO's WinSAR consortium.
An interactive data management and analysis system for clinical investigators.
Groner, G F; Hopwood, M D; Palley, N A; Sibley, W L; Baker, W R; Christopher, T G; Thompson, H K
1978-09-01
An interactive minicomputer-based system has been developed that enables the clinical research investigator to personally explore and analyze his research data and, as a consequence of these explorations, to acquire more information. This system, which does not require extensive training or computer programming, enables the investigator to describe his data interactively in his own terms, enter data values while having them checked for validity, store time-oriented patient data in a carefully controlled on-line data base, retrieve data by patient, variable, and time, create subsets of patients with common characteristics, perform statistical analyses, and produce tables and graphs. It also permits data to be transferred to and from other computers. The system is well accepted and is being used by a variety of medical specialists at the three clinical research centers where it is operational. Reported benefits include less elapsed and nonproductive time, more thorough analysis of more data, greater and earlier insight into the meaning of research data, and increased publishable results.
Fick, Lambert H.; Merzari, Elia; Hassan, Yassin A.
2017-02-20
Computational analyses of fluid flow through packed pebble bed domains using the Reynolds-averaged Navier-Stokes framework have had limited success in the past. Because of a lack of high-fidelity experimental or computational data, optimization of Reynolds-averaged closure models for these geometries has not been extensively developed. In the present study, direct numerical simulation was employed to develop a high-fidelity database that can be used for optimizing Reynolds-averaged closure models for pebble bed flows. A face-centered cubic domain with periodic boundaries was used. Flow was simulated at a Reynolds number of 9308 and cross-verified by using available quasi-DNS data. During the simulations, low-frequency instability modes were observed that affected the stationary solution. Furthermore, these instabilities were investigated by using the method of proper orthogonal decomposition, and a correlation was found between the time-dependent asymmetry of the averaged velocity profile data and the behavior of the highest energy eigenmodes.
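For orientation, snapshot proper orthogonal decomposition of the kind mentioned above is usually computed from a singular value decomposition of a mean-subtracted snapshot matrix; the sketch below is a generic illustration with made-up array sizes, not the authors' post-processing code.

    import numpy as np

    def pod_modes(snapshots):
        """Snapshot POD via SVD.

        snapshots : (n_dof, n_snap) array, one flow-field snapshot per column.
        Returns the spatial modes (columns of U) and the energy fraction of each mode.
        """
        mean = snapshots.mean(axis=1, keepdims=True)
        X = snapshots - mean                      # subtract the temporal mean
        U, s, _ = np.linalg.svd(X, full_matrices=False)
        energy = s**2 / np.sum(s**2)              # relative energy per mode
        return U, energy

    X = np.random.rand(500, 64)                   # placeholder snapshot matrix
    modes, energy = pod_modes(X)
    print(energy[:5])                             # leading (highest-energy) modes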
NASA Astrophysics Data System (ADS)
Reis, D. S.; Stedinger, J. R.; Martins, E. S.
2005-10-01
This paper develops a Bayesian approach to analysis of a generalized least squares (GLS) regression model for regional analyses of hydrologic data. The new approach allows computation of the posterior distributions of the parameters and the model error variance using a quasi-analytic approach. Two regional skew estimation studies illustrate the value of the Bayesian GLS approach for regional statistical analysis of a shape parameter and demonstrate that regional skew models can be relatively precise with effective record lengths in excess of 60 years. With Bayesian GLS the marginal posterior distribution of the model error variance and the corresponding mean and variance of the parameters can be computed directly, thereby providing a simple but important extension of the regional GLS regression procedures popularized by Tasker and Stedinger (1989), which is sensitive to the likely values of the model error variance when it is small relative to the sampling error in the at-site estimator.
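As a reminder of the underlying regression structure, the GLS regional model referred to above is typically written as follows; the Bayesian treatment in the paper places a prior on the model error variance and computes its marginal posterior quasi-analytically, and the generic equations below are included only for orientation, not as the paper's exact notation.

\[
\hat{\boldsymbol\gamma} = \mathbf X\boldsymbol\beta + \boldsymbol\delta + \boldsymbol\varepsilon,
\qquad
\boldsymbol\Lambda(\sigma_\delta^2) = \sigma_\delta^2\,\mathbf I + \boldsymbol\Sigma,
\qquad
\hat{\boldsymbol\beta}(\sigma_\delta^2) = \bigl(\mathbf X^{\mathsf T}\boldsymbol\Lambda^{-1}\mathbf X\bigr)^{-1}\mathbf X^{\mathsf T}\boldsymbol\Lambda^{-1}\hat{\boldsymbol\gamma},
\]

where \(\hat{\boldsymbol\gamma}\) holds the at-site statistics (e.g., sample skews), \(\mathbf X\) the basin characteristics, \(\sigma_\delta^2\) the model error variance, and \(\boldsymbol\Sigma\) the sampling covariance of the at-site estimators.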
A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines
2011-01-01
Background Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. Results To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). Conclusions PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation and annotated usage examples. PMID:21352538
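To illustrate the dataflow idea described above in plain Python, the sketch below chains two data-coupled worker functions over a process pool and processes items in adjustable batches; it deliberately uses only the standard library and is not PaPy's actual API (see http://muralab.org/PaPy for that).

    from multiprocessing import Pool

    # Two re-usable, data-coupled components (toy stand-ins for parsing and
    # transforming biomolecular records).
    def parse(record):
        return record.strip().upper()

    def gc_content(seq):
        return seq, (seq.count("G") + seq.count("C")) / max(len(seq), 1)

    def pipeline(item):
        # A linear two-node "workflow": parse -> gc_content
        return gc_content(parse(item))

    if __name__ == "__main__":
        records = ["acgtacgt\n", "ggccggcc\n", "attattat\n"]
        with Pool(processes=2) as pool:
            # chunksize plays the role of the adjustable batch size mentioned above
            for seq, gc in pool.imap(pipeline, records, chunksize=1):
                print(seq, gc)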
A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines.
Cieślik, Marcin; Mura, Cameron
2011-02-25
Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Thus, bioinformatic workflows can be naturally mapped onto DFP concepts. To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy-evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain specific data-containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats). PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy also can be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data-processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing. PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation and annotated usage examples.
Computational analyses in cognitive neuroscience: in defense of biological implausibility.
Dror, I E; Gallogly, D P
1999-06-01
Because cognitive neuroscience researchers attempt to understand the human mind by bridging behavior and brain, they expect computational analyses to be biologically plausible. In this paper, biologically implausible computational analyses are shown to have critical and essential roles in the various stages and domains of cognitive neuroscience research. Specifically, biologically implausible computational analyses can contribute to (1) understanding and characterizing the problem that is being studied, (2) examining the availability of information and its representation, and (3) evaluating and understanding the neuronal solution. In the context of the distinct types of contributions made by certain computational analyses, the biological plausibility of those analyses is altogether irrelevant. These biologically implausible models are nevertheless relevant and important for biologically driven research.
ERIC Educational Resources Information Center
Mayora, Carlos A.; Nieves, Idami; Ojeda, Victor
2014-01-01
A variety of computer-based models of Extensive Reading have emerged in the last decade. Different Information and Communication Technologies online usually support these models. However, such innovations are not feasible in contexts where the digital breach limits the access to Internet. The purpose of this paper is to report a project in which…
Concurrent design of composite materials and structures considering thermal conductivity constraints
NASA Astrophysics Data System (ADS)
Jia, J.; Cheng, W.; Long, K.
2017-08-01
This article introduces thermal conductivity constraints into concurrent design. The influence of thermal conductivity on macrostructure and orthotropic composite material is extensively investigated using the minimum mean compliance as the objective function. To simultaneously control the amounts of different phase materials, a given mass fraction is applied in the optimization algorithm. Two phase materials are assumed to compete with each other to be distributed during the process of maximizing stiffness and thermal conductivity when the mass fraction constraint is small, where phase 1 has superior stiffness and thermal conductivity whereas phase 2 has a superior ratio of stiffness to density. The effective properties of the material microstructure are computed by a numerical homogenization technique, in which the effective elasticity matrix is applied to macrostructural analyses and the effective thermal conductivity matrix is applied to the thermal conductivity constraint. To validate the effectiveness of the proposed optimization algorithm, several three-dimensional illustrative examples are provided and the features under different boundary conditions are analysed.
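Stated schematically, the concurrent optimization problem described above takes roughly the following form; the symbols are generic (not the article's exact notation), and the homogenized matrices are obtained from unit-cell analyses as mentioned in the abstract.

\[
\min_{\boldsymbol\rho^{\mathrm{mac}},\,\boldsymbol\rho^{\mathrm{mic}}}\; c = \mathbf F^{\mathsf T}\mathbf U
\quad\text{s.t.}\quad
\mathbf K\!\left(\boldsymbol\rho^{\mathrm{mac}},\mathbf E^{H}(\boldsymbol\rho^{\mathrm{mic}})\right)\mathbf U=\mathbf F,
\qquad
\kappa^{H}(\boldsymbol\rho^{\mathrm{mic}}) \ge \kappa_{\min},
\qquad
\sum_e \rho_e\, m_e \le M^{*},
\]

where \(c\) is the mean compliance, \(\mathbf E^{H}\) and \(\kappa^{H}\) are the homogenized elasticity and thermal conductivity of the microstructure, and \(M^{*}\) is the prescribed mass fraction bound.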
Taimoory, S Maryamdokht; Sadraei, S Iraj; Fayoumi, Rose Anne; Nasri, Sarah; Revington, Matthew; Trant, John F
2018-04-20
The reaction between furans and maleimides has increasingly become a method of interest as its reversibility makes it a useful tool for applications ranging from self-healing materials, to self-immolative polymers, to hydrogels for cell culture and for the preparation of bone repair. However, most of these applications have relied on simple monosubstituted furans and simple maleimides and have not extensively evaluated the potential thermal variability inherent in the process that is achievable through simple substrate modification. A small library of cycloadducts suitable for the above applications was prepared, and the temperature dependence of the retro-Diels-Alder processes was determined through in situ 1H NMR analyses complemented by computational calculations. The practical temperature range of the reported systems extends from 40 to >110 °C. The cycloreversion reactions are more complex than would be expected from simple trends based on frontier molecular orbital analyses of the materials.
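For readers who want the quantitative link between variable-temperature NMR data and the thermal behaviour discussed above, the standard van 't Hoff and Eyring relations are commonly employed in such analyses; they are quoted here generically and may not match the exact computational treatment in the paper.

\[
\ln K(T) = -\frac{\Delta H^{\circ}}{R}\,\frac{1}{T} + \frac{\Delta S^{\circ}}{R},
\qquad
k(T) = \frac{k_B T}{h}\,\exp\!\left(-\frac{\Delta G^{\ddagger}}{R T}\right),
\]

where \(K(T)\) is the cycloadduct/starting-material equilibrium constant extracted from the 1H NMR integrals at each temperature and \(k(T)\) is the rate constant of the retro-Diels-Alder step.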
BacillOndex: an integrated data resource for systems and synthetic biology.
Misirli, Goksel; Wipat, Anil; Mullen, Joseph; James, Katherine; Pocock, Matthew; Smith, Wendy; Allenby, Nick; Hallinan, Jennifer S
2013-04-10
BacillOndex is an extension of the Ondex data integration system, providing a semantically annotated, integrated knowledge base for the model Gram-positive bacterium Bacillus subtilis. This application allows a user to mine a variety of B. subtilis data sources, and analyse the resulting integrated dataset, which contains data about genes, gene products and their interactions. The data can be analysed either manually, by browsing using Ondex, or computationally via a Web services interface. We describe the process of creating a BacillOndex instance, and describe the use of the system for the analysis of single nucleotide polymorphisms in B. subtilis Marburg. The Marburg strain is the progenitor of the widely-used laboratory strain B. subtilis 168. We identified 27 SNPs with predictable phenotypic effects, including genetic traits for known phenotypes. We conclude that BacillOndex is a valuable tool for the systems-level investigation of, and hypothesis generation about, this important biotechnology workhorse. Such understanding contributes to our ability to construct synthetic genetic circuits in this organism.
BacillOndex: An Integrated Data Resource for Systems and Synthetic Biology.
Misirli, Goksel; Wipat, Anil; Mullen, Joseph; James, Katherine; Pocock, Matthew; Smith, Wendy; Allenby, Nick; Hallinan, Jennifer S
2013-06-01
BacillOndex is an extension of the Ondex data integration system, providing a semantically annotated, integrated knowledge base for the model Gram-positive bacterium Bacillus subtilis. This application allows a user to mine a variety of B. subtilis data sources, and analyse the resulting integrated dataset, which contains data about genes, gene products and their interactions. The data can be analysed either manually, by browsing using Ondex, or computationally via a Web services interface. We describe the process of creating a BacillOndex instance, and describe the use of the system for the analysis of single nucleotide polymorphisms in B. subtilis Marburg. The Marburg strain is the progenitor of the widely-used laboratory strain B. subtilis 168. We identified 27 SNPs with predictable phenotypic effects, including genetic traits for known phenotypes. We conclude that BacillOndex is a valuable tool for the systems-level investigation of, and hypothesis generation about, this important biotechnology workhorse. Such understanding contributes to our ability to construct synthetic genetic circuits in this organism.
NASA Astrophysics Data System (ADS)
Kumar, K. Ravi; Cheepu, Muralimohan; Srinivas, B.; Venkateswarlu, D.; Pramod Kumar, G.; Shiva, Apireddi
2018-03-01
In solar air heaters, artificial roughness on the absorber plate has become a prominent technique for improving the heat transfer rate in the air flow passage by breaking up the laminar sublayer. The selection of rib geometry plays an important role in the friction characteristics and heat transfer rate. Many researchers have studied roughness shapes over the years to investigate the effect of geometry on the friction factor and heat transfer performance of solar air heaters. The present study models the different rib shapes used to create artificial rib roughness and compares them to identify the best-performing geometry. Computational fluid dynamics software was used to derive correlations for the friction factor and heat transfer rate. The simulation studies were performed on a 2D computational fluid dynamics model and analysed to identify the most effective relative roughness height, width and pitch with respect to the friction factor and heat transfer. In the current study the Reynolds number is varied from 3000 to 20000, and heat transfer and turbulence phenomena are modelled over this range. The modelling results showed that the right-angle-triangle roughness forms a stronger vortex in the main stream flow than the square, rectangle, improved rectangle and equilateral triangle geometries, enhancing heat transfer in the solar air heater. The simulated turbulence kinetic energy suggests that the local turbulence kinetic energy is strongly influenced by the alignment of the right-angle-triangle ribs.
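The performance quantities implied in the abstract are conventionally defined as follows; these are the usual definitions in the roughened-duct solar air heater literature and are given here for orientation rather than as the authors' exact expressions.

\[
Nu = \frac{h\,D_h}{k},
\qquad
f = \frac{(\Delta p/L)\,D_h}{\rho\,u^{2}/2},
\qquad
\eta = \frac{Nu/Nu_s}{\left(f/f_s\right)^{1/3}},
\]

where \(D_h\) is the hydraulic diameter of the duct, \(u\) the mean air velocity, and \(Nu_s\), \(f_s\) the smooth-duct values; \(\eta>1\) indicates a net thermo-hydraulic benefit from the ribs.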
Zaka, Mehreen; Sehgal, Sheikh Arslan; Shafique, Shagufta; Abbasi, Bilal Haider
2017-06-01
Over the last decade, there has been progressive improvement in computational drug design. Several diseases are being cured with different plant extracts and products. Rheumatoid arthritis (RA) is the most common of the auto-inflammatory diseases. Tumour necrosis factor (TNF)-α is associated with the RA pathway and has adverse effects. An extensive literature review showed that the plant species under study (Cannabis sativa, Prunella vulgaris and Withania somnifera) possess anti-inflammatory, anti-arthritic and anti-rheumatic properties. Thirteen anti-inflammatory compounds were characterised and filtered from these medicinal plant species and analysed against RA by targeting TNF-α through in silico analyses. Using a ligand-based pharmacophore generation approach and virtual screening against natural product libraries, we retrieved twenty unique molecules that displayed the highest binding affinity, the lowest binding energies and favourable drug properties. The docking analyses revealed that Ala-22, Glu-23, Ser-65, Gln-67, Tyr-141, Leu-142, Asp-143, Phe-144 and Ala-145 were the critical interacting residues for receptor-ligand interactions. It is proposed that the reported compounds targeting TNF-α be considered for the treatment of RA. This report opens new dimensions for designing innovative therapeutic targets to cure RA. Copyright © 2017 Elsevier Inc. All rights reserved.
Delivering integrated HAZUS-MH flood loss analyses and flood inundation maps over the Web.
Hearn, Paul P; Longenecker, Herbert E; Aguinaldo, John J; Rahav, Ami N
2013-01-01
Catastrophic flooding is responsible for more loss of life and damages to property than any other natural hazard. Recently developed flood inundation mapping technologies make it possible to view the extent and depth of flooding on the land surface over the Internet; however, by themselves these technologies are unable to provide estimates of losses to property and infrastructure. The Federal Emergency Management Agency's (FEMA's) HAZUS-MH software is extensively used to conduct flood loss analyses in the United States, providing a nationwide database of population and infrastructure at risk. Unfortunately, HAZUS-MH requires a dedicated Geographic Information System (GIS) workstation and a trained operator, and analyses are not adapted for convenient delivery over the Web. This article describes a cooperative effort by the US Geological Survey (USGS) and FEMA to make HAZUS-MH output GIS and Web compatible and to integrate these data with digital flood inundation maps in USGS's newly developed Inundation Mapping Web Portal. By running the computationally intensive HAZUS-MH flood analyses offline and converting the output to a Web-GIS compatible format, detailed estimates of flood losses can now be delivered to anyone with Internet access, thus dramatically increasing the availability of these forecasts to local emergency planners and first responders.
Delivering integrated HAZUS-MH flood loss analyses and flood inundation maps over the Web
Hearn,, Paul P.; Longenecker, Herbert E.; Aguinaldo, John J.; Rahav, Ami N.
2013-01-01
Catastrophic flooding is responsible for more loss of life and damages to property than any other natural hazard. Recently developed flood inundation mapping technologies make it possible to view the extent and depth of flooding on the land surface over the Internet; however, by themselves these technologies are unable to provide estimates of losses to property and infrastructure. The Federal Emergency Management Agency’s (FEMA's) HAZUS-MH software is extensively used to conduct flood loss analyses in the United States, providing a nationwide database of population and infrastructure at risk. Unfortunately, HAZUS-MH requires a dedicated Geographic Information System (GIS) workstation and a trained operator, and analyses are not adapted for convenient delivery over the Web. This article describes a cooperative effort by the US Geological Survey (USGS) and FEMA to make HAZUS-MH output GIS and Web compatible and to integrate these data with digital flood inundation maps in USGS’s newly developed Inundation Mapping Web Portal. By running the computationally intensive HAZUS-MH flood analyses offline and converting the output to a Web-GIS compatible format, detailed estimates of flood losses can now be delivered to anyone with Internet access, thus dramatically increasing the availability of these forecasts to local emergency planners and first responders.
Ensembl comparative genomics resources.
Herrero, Javier; Muffato, Matthieu; Beal, Kathryn; Fitzgerald, Stephen; Gordon, Leo; Pignatelli, Miguel; Vilella, Albert J; Searle, Stephen M J; Amode, Ridwan; Brent, Simon; Spooner, William; Kulesha, Eugene; Yates, Andrew; Flicek, Paul
2016-01-01
Evolution provides the unifying framework with which to understand biology. The coherent investigation of genic and genomic data often requires comparative genomics analyses based on whole-genome alignments, sets of homologous genes and other relevant datasets in order to evaluate and answer evolutionary-related questions. However, the complexity and computational requirements of producing such data are substantial: this has led to only a small number of reference resources that are used for most comparative analyses. The Ensembl comparative genomics resources are one such reference set that facilitates comprehensive and reproducible analysis of chordate genome data. Ensembl computes pairwise and multiple whole-genome alignments from which large-scale synteny, per-base conservation scores and constrained elements are obtained. Gene alignments are used to define Ensembl Protein Families, GeneTrees and homologies for both protein-coding and non-coding RNA genes. These resources are updated frequently and have a consistent informatics infrastructure and data presentation across all supported species. Specialized web-based visualizations are also available including synteny displays, collapsible gene tree plots, a gene family locator and different alignment views. The Ensembl comparative genomics infrastructure is extensively reused for the analysis of non-vertebrate species by other projects including Ensembl Genomes and Gramene and much of the information here is relevant to these projects. The consistency of the annotation across species and the focus on vertebrates makes Ensembl an ideal system to perform and support vertebrate comparative genomic analyses. We use robust software and pipelines to produce reference comparative data and make it freely available. Database URL: http://www.ensembl.org. © The Author(s) 2016. Published by Oxford University Press.
Ensembl comparative genomics resources
Muffato, Matthieu; Beal, Kathryn; Fitzgerald, Stephen; Gordon, Leo; Pignatelli, Miguel; Vilella, Albert J.; Searle, Stephen M. J.; Amode, Ridwan; Brent, Simon; Spooner, William; Kulesha, Eugene; Yates, Andrew; Flicek, Paul
2016-01-01
Evolution provides the unifying framework with which to understand biology. The coherent investigation of genic and genomic data often requires comparative genomics analyses based on whole-genome alignments, sets of homologous genes and other relevant datasets in order to evaluate and answer evolutionary-related questions. However, the complexity and computational requirements of producing such data are substantial: this has led to only a small number of reference resources that are used for most comparative analyses. The Ensembl comparative genomics resources are one such reference set that facilitates comprehensive and reproducible analysis of chordate genome data. Ensembl computes pairwise and multiple whole-genome alignments from which large-scale synteny, per-base conservation scores and constrained elements are obtained. Gene alignments are used to define Ensembl Protein Families, GeneTrees and homologies for both protein-coding and non-coding RNA genes. These resources are updated frequently and have a consistent informatics infrastructure and data presentation across all supported species. Specialized web-based visualizations are also available including synteny displays, collapsible gene tree plots, a gene family locator and different alignment views. The Ensembl comparative genomics infrastructure is extensively reused for the analysis of non-vertebrate species by other projects including Ensembl Genomes and Gramene and much of the information here is relevant to these projects. The consistency of the annotation across species and the focus on vertebrates makes Ensembl an ideal system to perform and support vertebrate comparative genomic analyses. We use robust software and pipelines to produce reference comparative data and make it freely available. Database URL: http://www.ensembl.org. PMID:26896847
Coyle, Kathryn; Carrier, Marc; Lazo-Langner, Alejandro; Shivakumar, Sudeep; Zarychanski, Ryan; Tagalakis, Vicky; Solymoss, Susan; Routhier, Nathalie; Douketis, James; Coyle, Douglas
2017-03-01
Unprovoked venous thromboembolism (VTE) can be the first manifestation of cancer. It is unclear if extensive screening for occult cancer, including a comprehensive computed tomography (CT) scan of the abdomen/pelvis, is cost-effective in this patient population. To assess the health care-related costs, the number of missed cancer cases and the health-related utility values of a limited screening strategy with and without the addition of a comprehensive CT scan of the abdomen/pelvis, and to identify to what extent testing should be done in these circumstances to allow early detection of occult cancers. A cost-effectiveness analysis was conducted using data collected alongside the SOME randomized controlled trial, which compared an extensive occult cancer screening strategy including a CT of the abdomen/pelvis with a more limited screening strategy in patients with a first unprovoked VTE. Analyses were conducted with a one-year time horizon from a Canadian health care perspective. The primary analysis was based on complete cases, with sensitivity analysis using appropriate multiple imputation methods to account for missing data. Data from a total of 854 patients with a first unprovoked VTE were included in these analyses. The addition of a comprehensive CT scan was associated with higher costs ($551 CDN) with no improvement in utility values or in the number of missed cancers. Results were consistent when adopting multiple imputation methods. The addition of a comprehensive CT scan of the abdomen/pelvis for the screening of occult cancer in patients with unprovoked VTE is not cost-effective, as it is both more costly and not more effective in detecting occult cancer. Copyright © 2017 Elsevier Ltd. All rights reserved.
A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals
NASA Technical Reports Server (NTRS)
Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.
1994-01-01
Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids and to control the rotodynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven year effort was established in 1990 by NASA's Office of Aeronautics Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes and analyses for seals. The effort will provide NASA and the U.S. Aerospace Industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered and stepped seals. Relevant computational features of the code include: stationary/rotating coordinates, cylindrical and general Body Fitted Coordinates (BFC) systems, high order differencing schemes, colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.
ANTLR Tree Grammar Generator and Extensions
NASA Technical Reports Server (NTRS)
Craymer, Loring
2005-01-01
A computer program implements two extensions of ANTLR (Another Tool for Language Recognition), which is a set of software tools for translating source codes between different computing languages. ANTLR supports predicated- LL(k) lexer and parser grammars, a notation for annotating parser grammars to direct tree construction, and predicated tree grammars. [ LL(k) signifies left-right, leftmost derivation with k tokens of look-ahead, referring to certain characteristics of a grammar.] One of the extensions is a syntax for tree transformations. The other extension is the generation of tree grammars from annotated parser or input tree grammars. These extensions can simplify the process of generating source-to-source language translators and they make possible an approach, called "polyphase parsing," to translation between computing languages. The typical approach to translator development is to identify high-level semantic constructs such as "expressions," "declarations," and "definitions" as fundamental building blocks in the grammar specification used for language recognition. The polyphase approach is to lump ambiguous syntactic constructs during parsing and then disambiguate the alternatives in subsequent tree transformation passes. Polyphase parsing is believed to be useful for generating efficient recognizers for C++ and other languages that, like C++, have significant ambiguities.
Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Matzen, M. Keith
2014-09-16
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.
ERIC Educational Resources Information Center
Rothberg, S. J.; Lamb, F. M.; Willis, L.
2006-01-01
This paper gives a synopsis of an extensive programme of case studies on real uses of computer-assisted learning (CAL) materials within UK engineering degree programmes. The programme was conducted between 2000 and 2003 and followed a questionnaire-based survey looking at CAL use in the UK and in Australia. The synopsis reveals a number of key…
From Greeks to Today: Cipher Trees and Computer Cryptography.
ERIC Educational Resources Information Center
Grady, M. Tim; Brumbaugh, Doug
1988-01-01
Explores the use of computers for teaching mathematical models of transposition ciphers. Illustrates the ideas, includes activities and extensions, provides a mathematical model and includes computer programs to implement these topics. (MVL)
Cloud Computing. Technology Briefing. Number 1
ERIC Educational Resources Information Center
Alberta Education, 2013
2013-01-01
Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…
Modified RS2101 rocket engine study program
NASA Technical Reports Server (NTRS)
1971-01-01
The purpose of the program is to perform design studies and analyses to determine the effects of incorporating a 60:1 expansion area ratio nozzle extension, extended firing time, and modified operating conditions and environments on the MM'71 rocket engine assembly. An injector-to-thrust chamber seal study was conducted to define potential solutions for leakage past this joint. The results and recommendations evolving from the engine thermal analyses, the injector-to-thrust chamber seal studies, and the nozzle extension joint stress analyses are presented.
Extending fields in a level set method by solving a biharmonic equation
NASA Astrophysics Data System (ADS)
Moroney, Timothy J.; Lusmore, Dylan R.; McCue, Scott W.; McElwain, D. L. Sean
2017-08-01
We present an approach for computing extensions of velocities or other fields in level set methods by solving a biharmonic equation. The approach differs from other commonly used approaches to velocity extension because it deals with the interface fully implicitly through the level set function. No explicit properties of the interface, such as its location or the velocity on the interface, are required in computing the extension. These features lead to a particularly simple implementation using either a sparse direct solver or a matrix-free conjugate gradient solver. Furthermore, we propose a fast Poisson preconditioner that can be used to accelerate the convergence of the latter. We demonstrate the biharmonic extension on a number of test problems that serve to illustrate its effectiveness at producing smooth and accurate extensions near interfaces. A further feature of the method is the natural way in which it deals with symmetry and periodicity, ensuring through its construction that the extension field also respects these symmetries.
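The following is a minimal sketch of the idea described above, assuming a uniform grid, a circular interface defined by a level set function, and an illustrative field trusted only in a narrow band near the interface; the grid size, band width and prescribed field are illustrative choices, not taken from the paper:

# Minimal sketch of a biharmonic field extension on a uniform grid.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n, h = 64, 1.0 / 63
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sqrt((X - 0.5) ** 2 + (Y - 0.5) ** 2) - 0.25   # level set (circle)
F = np.cos(4.0 * np.arctan2(Y - 0.5, X - 0.5))           # field known near the interface

# 2D 5-point Laplacian (truncated at the box boundary) and its square.
I = sp.identity(n)
D = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / h ** 2
L = sp.kron(D, I) + sp.kron(I, D)
B = (L @ L).tocsr()                                      # discrete biharmonic operator

known = (np.abs(phi) < 2.0 * h).ravel()                  # narrow band where F is trusted
unknown = ~known
f = F.ravel().copy()

# Solve B_uu f_u = -B_uk f_k so the extension is biharmonic away from the band.
A = B[unknown][:, unknown]
rhs = -B[unknown][:, known] @ f[known]
f[unknown] = spla.spsolve(A.tocsc(), rhs)
F_ext = f.reshape(n, n)                                  # smooth extension everywhere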
Vector wind profile gust model
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1979-01-01
Work towards establishing a vector wind profile gust model for the Space Transportation System flight operations and trade studies is reported. To date, all the statistical and computational techniques required were established and partially implemented. An analysis of wind profile gust at Cape Kennedy within the theoretical framework is presented. The variability of theoretical and observed gust magnitude with filter type, altitude, and season is described. Various examples are presented which illustrate agreement between theoretical and observed gust percentiles. The preliminary analysis of the gust data indicates a strong variability with altitude, season, and wavelength regime. An extension of the analyses to include conditional distributions of gust magnitude given gust length, distributions of gust modulus, and phase differences between gust components has begun.
Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation
Biggs, Matthew B.; Papin, Jason A.
2013-01-01
Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool. PMID:24147108
Shieh, Gwowen
2010-05-28
Due to its extensive applicability and computational ease, moderated multiple regression (MMR) has been widely employed to analyze interaction effects between 2 continuous predictor variables. Accordingly, considerable attention has been drawn toward the supposed multicollinearity problem between predictor variables and their cross-product term. This article attempts to clarify the misconception of multicollinearity in MMR studies. The counterintuitive yet beneficial effects of multicollinearity on the ability to detect moderator relationships are explored. Comprehensive treatments and numerical investigations are presented for the simplest interaction model and more complex three-predictor setting. The results provide critical insight that both helps avoid misleading interpretations and yields better understanding for the impact of intercorrelation among predictor variables in MMR analyses.
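A small illustrative sketch (not from the article) of the multicollinearity issue discussed above: it shows how strongly a predictor correlates with its cross-product term when the predictors have nonzero means, how mean-centering shrinks that correlation, and that the fitted interaction coefficient is unchanged:

# Illustrative simulation; the data-generating values are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(5.0, 1.0, n)          # nonzero means inflate the correlation
x2 = rng.normal(3.0, 1.0, n)
y = 1.0 + 0.5 * x1 + 0.3 * x2 + 0.4 * x1 * x2 + rng.normal(0.0, 1.0, n)

def fit(a, b, y):
    """Ordinary least squares for y = b0 + b1*a + b2*b + b3*a*b."""
    X = np.column_stack([np.ones_like(a), a, b, a * b])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

print("corr(x1, x1*x2), raw     :", np.corrcoef(x1, x1 * x2)[0, 1])
c1, c2 = x1 - x1.mean(), x2 - x2.mean()
print("corr(x1, x1*x2), centered:", np.corrcoef(c1, c1 * c2)[0, 1])
print("interaction coefficient, raw vs centered:",
      fit(x1, x2, y)[3], fit(c1, c2, y - y.mean())[3])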
Exactly solvable random graph ensemble with extensively many short cycles
NASA Astrophysics Data System (ADS)
Aguirre López, Fabián; Barucca, Paolo; Fekom, Mathilde; Coolen, Anthony C. C.
2018-02-01
We introduce and analyse ensembles of 2-regular random graphs with a tuneable distribution of short cycles. The phenomenology of these graphs depends critically on the scaling of the ensembles’ control parameters relative to the number of nodes. A phase diagram is presented, showing a second order phase transition from a connected to a disconnected phase. We study both the canonical formulation, where the size is large but fixed, and the grand canonical formulation, where the size is sampled from a discrete distribution, and show their equivalence in the thermodynamical limit. We also compute analytically the spectral density, which consists of a discrete set of isolated eigenvalues, representing short cycles, and a continuous part, representing cycles of diverging size.
Saeed, Isaam; Wong, Stephen Q.; Mar, Victoria; Goode, David L.; Caramia, Franco; Doig, Ken; Ryland, Georgina L.; Thompson, Ella R.; Hunter, Sally M.; Halgamuge, Saman K.; Ellul, Jason; Dobrovic, Alexander; Campbell, Ian G.; Papenfuss, Anthony T.; McArthur, Grant A.; Tothill, Richard W.
2014-01-01
Targeted resequencing by massively parallel sequencing has become an effective and affordable way to survey small to large portions of the genome for genetic variation. Despite the rapid development in open source software for analysis of such data, the practical implementation of these tools through construction of sequencing analysis pipelines still remains a challenging and laborious activity, and a major hurdle for many small research and clinical laboratories. We developed TREVA (Targeted REsequencing Virtual Appliance), making pre-built pipelines immediately available as a virtual appliance. Based on virtual machine technologies, TREVA is a solution for rapid and efficient deployment of complex bioinformatics pipelines to laboratories of all sizes, enabling reproducible results. The analyses that are supported in TREVA include: somatic and germline single-nucleotide and insertion/deletion variant calling, copy number analysis, and cohort-based analyses such as pathway and significantly mutated genes analyses. TREVA is flexible and easy to use, and can be customised by Linux-based extensions if required. TREVA can also be deployed on the cloud (cloud computing), enabling instant access without investment overheads for additional hardware. TREVA is available at http://bioinformatics.petermac.org/treva/. PMID:24752294
Computational fluid dynamics analysis in support of the simplex turbopump design
NASA Technical Reports Server (NTRS)
Garcia, Roberto; Griffin, Lisa W.; Benjamin, Theodore G.; Cornelison, Joni W.; Ruf, Joseph H.; Williams, Robert W.
1994-01-01
Simplex is a turbopump that is being developed at NASA/Marshall Space Flight Center (MSFC) by an in-house team. The turbopump consists of a single-stage centrifugal impeller, vaned-diffuser pump powered by a single-stage, axial, supersonic, partial admission turbine. The turbine is driven by warm gaseous oxygen tapped off of the hybrid motor to which it will be coupled. Rolling element bearings are cooled by the pumping fluid. Details of the configuration and operating conditions are given by Marsh. CFD has been used extensively to verify one-dimensional (1D) predictions, assess aerodynamic and hydrodynamic designs, and to provide flow environments. The complete primary flow path of the pump-end and the hot gas path of the turbine, excluding the inlet torus, have been analyzed. All CFD analyses conducted for the Simplex turbopump employed the pressure-based Finite Difference Navier-Stokes (FDNS) code using a standard kappa-epsilon turbulence model with wall functions. More detailed results are presented by Garcia et al. To support the team, loading and temperature results for the turbine rotor were provided as inputs to structural and thermal analyses, and blade loadings from the inducer were provided for structural analyses.
Understanding the Flow Physics of Shock Boundary-Layer Interactions Using CFD and Numerical Analyses
NASA Technical Reports Server (NTRS)
Friedlander, David J.
2013-01-01
Computational fluid dynamic (CFD) analyses of the University of Michigan (UM) Shock/Boundary-Layer Interaction (SBLI) experiments were performed as an extension of the CFD SBLI Workshop held at the 48th AIAA Aerospace Sciences Meeting in 2010. In particular, the UM Mach 2.75 Glass Tunnel with a semi-spanning 7.75deg wedge was analyzed in attempts to explore key physics pertinent to SBLI's, including thermodynamic and viscous boundary conditions as well as turbulence modeling. Most of the analyses were 3D CFD simulations using the OVERFLOW flow solver, with additional quasi-1D simulations performed with an in-house MATLAB code interfacing with the NIST REFPROP code to explore perfect versus non-ideal air. A fundamental exploration pertaining to the effects of particle image velocimetry (PIV) on post-processing data is also shown. Results from the CFD simulations showed an improvement in agreement with experimental data with key contributions including adding a laminar zone upstream of the wedge and the necessity of mimicking PIV particle lag for comparisons. Results from the quasi-1D simulation showed that there was little difference between perfect and non-ideal air for the configuration presented.
Working Towards New Transformative Geoscience Analytics Enabled by Petascale Computing
NASA Astrophysics Data System (ADS)
Woodcock, R.; Wyborn, L.
2012-04-01
Currently the top 10 supercomputers in the world are petascale and already exascale computers are being planned. Cloud computing facilities are becoming mainstream either as private or commercial investments. These computational developments will provide abundant opportunities for the earth science community to tackle the data deluge which has resulted from new instrumentation enabling data to be gathered at a greater rate and at higher resolution. Combined, the new computational environments should enable the earth sciences to be transformed. However, experience in Australia and elsewhere has shown that it is not easy to scale existing earth science methods, software and analytics to take advantage of the increased computational capacity that is now available. It is not simply a matter of 'transferring' current work practices to the new facilities: they have to be extensively 'transformed'. In particular, new geoscientific methods will need to be developed using advanced data mining, assimilation, machine learning and integration algorithms. Software will have to be capable of operating in highly parallelised environments, and will also need to be able to scale as the compute systems grow. Data access will have to improve and the earth science community needs to move from the file discovery, display and local download paradigm to self-describing data cubes and data arrays that are available as online resources from either major data repositories or in the cloud. In the new transformed world, rather than analysing satellite data scene by scene, sensor-agnostic data cubes of calibrated earth observation data will enable researchers to move across data from multiple sensors at varying spatial data resolutions. In using geophysics to characterise basement and cover, rather than analysing individual gridded airborne geophysical data sets, and then combining the results, petascale computing will enable analysis of multiple data types, collected at varying resolutions with integration and validation across data type boundaries. Increased capacity of storage and compute will mean that uncertainty and reliability of individual observations will consistently be taken into account and propagated throughout the processing chain. If these data access difficulties can be overcome, the increased compute capacity will also mean that larger scale, more complex models can be run at higher resolution. Instead of single-pass modelling runs, ensembles of models will be able to be run to simultaneously test multiple hypotheses. Petascale computing and high performance data offer more than "bigger, faster": they present an opportunity for a transformative change in the way in which geoscience research is routinely conducted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Itagaki, Masafumi; Miyoshi, Yoshinori; Hirose, Hideyuki
A procedure is presented for the determination of geometric buckling for regular polygons. A new computation technique, the multiple reciprocity boundary element method (MRBEM), has been applied to solve the one-group neutron diffusion equation. The main difficulty in applying the ordinary boundary element method (BEM) to neutron diffusion problems has been the need to compute a domain integral, resulting from the fission source. The MRBEM has been developed for transforming this type of domain integral into an equivalent boundary integral. The basic idea of the MRBEM is to apply repeatedly the reciprocity theorem (Green's second formula) using a sequence of higher order fundamental solutions. The MRBEM requires discretization of the boundary only rather than of the domain. This advantage is useful for extensive survey analyses of buckling for complex geometries. The results of survey analyses have indicated that the general form of geometric buckling is B_g^2 = (a_n/R_c)^2, where R_c represents the radius of the circumscribed circle of the regular polygon under consideration. The geometric constant a_n depends on the type of regular polygon and takes the value of π for a square and 2.405 for a circle, an extreme case that has an infinite number of sides. Values of a_n for a triangle, pentagon, hexagon, and octagon have been calculated as 4.190, 2.281, 2.675, and 2.547, respectively.
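As a quick consistency check of the quoted general form (my own, not from the report), the circumscribed-circle formula with a_n = π reproduces the textbook buckling of a bare square, and a_n = 2.405 recovers the classic result for a circle:

# Numerical check that B_g^2 = (a_n/R_c)^2 matches known special cases.
import math

a = 1.0                                   # square side (arbitrary units)
R_c = a / math.sqrt(2.0)                  # circumscribed-circle radius of the square
from_formula = (math.pi / R_c) ** 2       # a_n = pi for the square
textbook = 2.0 * (math.pi / a) ** 2       # (pi/a)^2 + (pi/a)^2 for a bare square
print(from_formula, textbook)             # both 19.739...

R = 1.0                                   # circle of radius R: R_c = R, a_n = 2.405
print((2.405 / R) ** 2)                   # classic (2.405/R)^2 buckling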
NASA Technical Reports Server (NTRS)
Jordan, Harry F.; Benten, Muhammad S.; Arenstorf, Norbert S.; Ramanan, Aruna V.
1987-01-01
A methodology for writing parallel programs for shared memory multiprocessors has been formalized as an extension to the Fortran language and implemented as a macro preprocessor. The extended language is known as the Force, and this manual describes how to write Force programs and execute them on the Flexible Computer Corporation Flex/32, the Encore Multimax and the Sequent Balance computers. The parallel extension macros are described in detail, but knowledge of Fortran is assumed.
Cloud Based Educational Systems and Its Challenges and Opportunities and Issues
ERIC Educational Resources Information Center
Paul, Prantosh Kr.; Lata Dangwal, Kiran
2014-01-01
Cloud Computing (CC) is actually a set of hardware, software, networks, storage, services and interfaces combined to deliver aspects of computing as a service. Cloud Computing (CC) actually uses the central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of Grid computing with independency and…
Individual-based analyses reveal limited functional overlap in a coral reef fish community.
Brandl, Simon J; Bellwood, David R
2014-05-01
Detailed knowledge of a species' functional niche is crucial for the study of ecological communities and processes. The extent of niche overlap, functional redundancy and functional complementarity is of particular importance if we are to understand ecosystem processes and their vulnerability to disturbances. Coral reefs are among the most threatened marine systems, and anthropogenic activity is changing the functional composition of reefs. The loss of herbivorous fishes is particularly concerning as the removal of algae is crucial for the growth and survival of corals. Yet, the foraging patterns of the various herbivorous fish species are poorly understood. Using a multidimensional framework, we present novel individual-based analyses of species' realized functional niches, which we apply to a herbivorous coral reef fish community. In calculating niche volumes for 21 species, based on their microhabitat utilization patterns during foraging, and computing functional overlaps, we provide a measurement of functional redundancy or complementarity. Complementarity is the inverse of redundancy and is defined as less than 50% overlap in niche volumes. The analyses reveal extensive complementarity with an average functional overlap of just 15.2%. Furthermore, the analyses divide herbivorous reef fishes into two broad groups. The first group (predominantly surgeonfishes and parrotfishes) comprises species feeding on exposed surfaces and predominantly open reef matrix or sandy substrata, resulting in small niche volumes and extensive complementarity. In contrast, the second group consists of species (predominantly rabbitfishes) that feed over a wider range of microhabitats, penetrating the reef matrix to exploit concealed surfaces of various substratum types. These species show high variation among individuals, leading to large niche volumes, more overlap and less complementarity. These results may have crucial consequences for our understanding of herbivorous processes on coral reefs, as algal removal appears to depend strongly on species-specific microhabitat utilization patterns of herbivores. Furthermore, the results emphasize the capacity of the individual-based analyses to reveal variation in the functional niches of species, even in high-diversity systems such as coral reefs, demonstrating its potential applicability to other high-diversity ecosystems. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
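A toy, set-based sketch of the overlap idea discussed above (the study's multidimensional niche-volume framework is more involved); the species names and microhabitat records are hypothetical, and only the <50% overlap criterion for complementarity is taken from the text:

# Toy pairwise functional overlap from individual foraging records.
obs = {
    "surgeonfish_A": ["open_matrix", "open_matrix", "sand", "exposed_rock"],
    "rabbitfish_B":  ["crevice", "overhang", "open_matrix", "dead_coral_base"],
}

def overlap(a, b):
    """Fraction of the smaller microhabitat set shared with the other species."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / min(len(sa), len(sb))

o = overlap(obs["surgeonfish_A"], obs["rabbitfish_B"])
print(f"overlap = {o:.2f} ->", "redundant" if o >= 0.5 else "complementary")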
Notebook computer use on a desk, lap and lap support: effects on posture, performance and comfort.
Asundi, Krishna; Odell, Dan; Luce, Adam; Dennerlein, Jack T
2010-01-01
This study quantified postures of users working on a notebook computer situated in their lap and tested the effect of using a device designed to increase the height of the notebook when placed on the lap. A motion analysis system measured head, neck and upper extremity postures of 15 adults as they worked on a notebook computer placed on a desk (DESK), the lap (LAP) and a commercially available lapdesk (LAPDESK). Compared with the DESK, the LAP increased downwards head tilt 6 degrees and wrist extension 8 degrees. Shoulder flexion and ulnar deviation decreased 13 degrees and 9 degrees, respectively. Compared with the LAP, the LAPDESK decreased downwards head tilt 4 degrees, neck flexion 2 degrees, and wrist extension 9 degrees. Users reported less discomfort and difficulty in the DESK configuration. Use of the lapdesk improved postures compared with the lap; however, all configurations resulted in high values of wrist extension, wrist deviation and downwards head tilt. STATEMENT OF RELEVANCE: This study quantifies postures of users working with a notebook computer in typical portable configurations. A better understanding of the postures assumed during notebook computer use can improve usage guidelines to reduce the risk of musculoskeletal injuries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonachea, D.; Dickens, P.; Thakur, R.
There is a growing interest in using Java as the language for developing high-performance computing applications. To be successful in the high-performance computing domain, however, Java must not only be able to provide high computational performance, but also high-performance I/O. In this paper, we first examine several approaches that attempt to provide high-performance I/O in Java - many of which are not obvious at first glance - and evaluate their performance on two parallel machines, the IBM SP and the SGI Origin2000. We then propose extensions to the Java I/O library that address the deficiencies in the Java I/O API and improve performance dramatically. The extensions add bulk (array) I/O operations to Java, thereby removing much of the overhead currently associated with array I/O in Java. We have implemented the extensions in two ways: in a standard JVM using the Java Native Interface (JNI) and in a high-performance parallel dialect of Java called Titanium. We describe the two implementations and present performance results that demonstrate the benefits of the proposed extensions.
Sausedo, R A; Schoenwolf, G C
1994-05-01
Formation and extension of the notochord (i.e., notogenesis) is one of the earliest and most obvious events of axis development in vertebrate embryos. In birds and mammals, prospective notochord cells arise from Hensen's node and come to lie beneath the midline of the neural plate. Throughout the period of neurulation, the notochord retains its close spatial relationship with the developing neural tube and undergoes rapid extension in concert with the overlying neuroepithelium. In the present study, we examined notochord development quantitatively in mouse embryos. C57BL/6 mouse embryos were collected at 8, 8.5, 9, 9.5, and 10 days of gestation. They were then embedded in paraffin and sectioned transversely. Serial sections from 21 embryos were stained with Schiff's reagent according to the Feulgen-Rossenbeck procedure and used for quantitative analyses of notochord extension. Quantitative analyses revealed that extension of the notochord involves cell division within the notochord proper and cell rearrangement within the notochordal plate (the immediate precursor of the notochord). In addition, extension of the notochord involves cell accretion, that is, the addition of cells to the notochord's caudal end, a process that involves considerable cell rearrangement at the notochordal plate-node interface. Extension of the mouse notochord occurs similarly to that described previously for birds (Sausedo and Schoenwolf, 1993 Anat. Rec. 237:58-70). That is, in both birds (i.e., quail and chick) and mouse embryos, notochord extension involves cell division, cell rearrangement, and cell accretion. Thus higher vertebrates utilize similar morphogenetic movements to effect notogenesis.
Transient Side Load Analysis of Out-of-Round Film-Cooled Nozzle Extensions
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike
2012-01-01
There was interest in understanding the impact of out-of-round nozzle extension on the nozzle side load during transient startup operations. The out-of-round nozzle extension could be the result of asymmetric internal stresses, deformation induced by previous tests, and asymmetric loads induced by hardware attached to the nozzle. The objective of this study was therefore to computationally investigate the effect of out-of-round nozzle extension on the nozzle side loads during an engine startup transient. The rocket engine studied encompasses a regeneratively cooled chamber and nozzle, along with a film cooled nozzle extension. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, and transient inlet boundary flow properties derived from an engine system simulation. Six three-dimensional cases were performed with the out-of-roundness achieved by three different degrees of ovalization, elongated on lateral y and z axes: one slightly out-of-round, one more out-of-round, and one significantly out-of-round. The results show that the separation line jump was the primary source of the peak side loads. Compared with the peak side load of the perfectly round nozzle, the peak side loads increased for the slightly and more ovalized nozzle extensions, and either increased or decreased for the two significantly ovalized nozzle extensions. A theory based on the counteraction between the flow-destabilizing effect of an exacerbated asymmetrical flow caused by a lower degree of ovalization and the flow-stabilizing effect of a more symmetrical flow, also created by ovalization, is presented to explain the observations obtained in this effort.
An Extensive X-ray Computed Tomography Evaluation of a Fully Penetrated Encapsulated SiC MMC Ballistic Panel
Green, William H.; Carter, Robert H.
Weapons and Materials Research Directorate, ARL
2009-04-01
Vectorization with SIMD extensions speeds up reconstruction in electron tomography.
Agulleiro, J I; Garzón, E M; García, I; Fernández, J J
2010-06-01
Electron tomography allows structural studies of cellular structures at molecular detail. Large 3D reconstructions are needed to meet the resolution requirements. The processing time to compute these large volumes may be considerable and so, high performance computing techniques have been used traditionally. This work presents a vector approach to tomographic reconstruction that relies on the exploitation of the SIMD extensions available in modern processors in combination to other single processor optimization techniques. This approach succeeds in producing full resolution tomograms with an important reduction in processing time, as evaluated with the most common reconstruction algorithms, namely WBP and SIRT. The main advantage stems from the fact that this approach is to be run on standard computers without the need of specialized hardware, which facilitates the development, use and management of programs. Future trends in processor design open excellent opportunities for vector processing with processor's SIMD extensions in the field of 3D electron microscopy.
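The following NumPy sketch illustrates the array-at-a-time style of computation that SIMD vectorization exploits, applied to a generic unweighted backprojection; it is an analogy for the approach described above, not the authors' implementation:

# Vectorized (array-at-a-time) backprojection: one update per projection,
# every pixel handled at once instead of in a scalar loop.
import numpy as np

def backproject(sinogram, angles_deg, size):
    """sinogram: (n_angles, n_detectors); returns a size x size image."""
    n_det = sinogram.shape[1]
    c = (size - 1) / 2.0
    ys, xs = np.mgrid[0:size, 0:size] - c              # pixel coordinates
    recon = np.zeros((size, size))
    for proj, ang in zip(sinogram, np.deg2rad(angles_deg)):
        t = xs * np.cos(ang) + ys * np.sin(ang) + (n_det - 1) / 2.0
        i = np.clip(t, 0, n_det - 2)
        i0 = np.floor(i).astype(int)
        w = i - i0
        recon += (1.0 - w) * proj[i0] + w * proj[i0 + 1]   # linear interpolation
    return recon * np.pi / len(angles_deg)

# Tiny smoke test: a synthetic sinogram of a centered point source.
angles = np.arange(0, 180, 2)
sino = np.zeros((len(angles), 65)); sino[:, 32] = 1.0
img = backproject(sino, angles, 65)
print(img.shape, img.max())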
Technical Development and Application of Soft Computing in Agricultural and Biological Engineering
USDA-ARS?s Scientific Manuscript database
Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and...
Development of Soft Computing and Applications in Agricultural and Biological Engineering
USDA-ARS?s Scientific Manuscript database
Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and...
DOT National Transportation Integrated Search
1975-02-01
A methodology and a computer program, DYNALIST II, have been developed for computing the response of rail vehicle systems to sinusoidal or stationary random rail irregularities. The computer program represents an extension of the earlier DYNALIST pro...
The Computational Infrastructure for Geodynamics as a Community of Practice
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2016-12-01
Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individual or small groups of researchers to develop scientifically-sound software is impossible to sustain, duplicates effort, and makes it difficult for scientists to adopt state-of-the art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.
NASA Astrophysics Data System (ADS)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.
2017-06-01
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
Design and Construction of a Thermal Contact Resistance and Thermal Conductivity Measurement System
2015-09-01
Numerical heat transfer and uncertainty analyses with applied engineering judgement were extensively used to come up with an optimized design and construction. The measurement system addresses heat transfer issues facing the Department of Defense. Subject terms: thermal contact resistance, thermal conductivity, measurement system.
Memória, Cláudia M; Yassuda, Mônica S; Nakano, Eduardo Y; Forlenza, Orestes V
2014-05-07
Background: The Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) is a computer-based cognitive screening instrument that involves automated administration and scoring and immediate analyses of test sessions. The objective of this study was to translate and culturally adapt the Brazilian Portuguese version of the CANS-MCI (CANS-MCI-BR) and to evaluate its reliability and validity for the diagnostic screening of MCI and dementia due to Alzheimer's disease. Methods: The test was administered to 97 older adults (mean age 73.41 ± 5.27 years) with at least four years of formal education (mean education 12.23 ± 4.48 years). Participants were classified into three diagnostic groups according to global cognitive status (normal controls, n = 41; MCI, n = 35; AD, n = 21) based on clinical data and formal neuropsychological assessments. Results: The results indicated high internal consistency (Cronbach's α = 0.77) in the total sample. Three-month test-retest reliability correlations were significant and robust (0.875; p < 0.001). A moderate level of concurrent validity was attained relative to the screening test for MCI (MoCA test, r = 0.76, p < 0.001). Confirmatory factor analysis supported the three-factor model of the original test, i.e., memory, language/spatial fluency, and executive function/mental control. Goodness of fit indicators were strong (Bentler Comparative Fit Index = 0.96, Root Mean Square Error of Approximation = 0.09). Receiver operating characteristic curve analyses suggested high sensitivity and specificity (81% and 73% respectively) to screen for possible MCI cases. Conclusions: The CANS-MCI-BR maintains adequate psychometric characteristics that render it suitable to identify elderly adults with probable cognitive impairment for whom a more extensive evaluation by formal neuropsychological tests may be required.
WEBnm@ v2.0: Web server and services for comparing protein flexibility.
Tiwari, Sandhya P; Fuglebakk, Edvin; Hollup, Siv M; Skjærven, Lars; Cragnolini, Tristan; Grindhaug, Svenn H; Tekle, Kidane M; Reuter, Nathalie
2014-12-30
Normal mode analysis (NMA) using elastic network models is a reliable and cost-effective computational method to characterise protein flexibility and by extension, their dynamics. Further insight into the dynamics-function relationship can be gained by comparing protein motions between protein homologs and functional classifications. This can be achieved by comparing normal modes obtained from sets of evolutionary related proteins. We have developed an automated tool for comparative NMA of a set of pre-aligned protein structures. The user can submit a sequence alignment in the FASTA format and the corresponding coordinate files in the Protein Data Bank (PDB) format. The computed normalised squared atomic fluctuations and atomic deformation energies of the submitted structures can be easily compared on graphs provided by the web user interface. The web server provides pairwise comparison of the dynamics of all proteins included in the submitted set using two measures: the Root Mean Squared Inner Product and the Bhattacharyya Coefficient. The Comparative Analysis has been implemented on our web server for NMA, WEBnm@, which also provides recently upgraded functionality for NMA of single protein structures. This includes new visualisations of protein motion, visualisation of inter-residue correlations and the analysis of conformational change using the overlap analysis. In addition, programmatic access to WEBnm@ is now available through a SOAP-based web service. Webnm@ is available at http://apps.cbu.uib.no/webnma . WEBnm@ v2.0 is an online tool offering unique capability for comparative NMA on multiple protein structures. Along with a convenient web interface, powerful computing resources, and several methods for mode analyses, WEBnm@ facilitates the assessment of protein flexibility within protein families and superfamilies. These analyses can give a good view of how the structures move and how the flexibility is conserved over the different structures.
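As an illustration of one of the pairwise measures named above, the following is a small sketch of the Root Mean Squared Inner Product (RMSIP) between two sets of normal modes; the mode vectors here are random orthonormal placeholders rather than WEBnm@ output:

# RMSIP between two sets of mode vectors (columns assumed unit norm).
import numpy as np

def rmsip(U, V, k=10):
    """U, V: (3N, n_modes) arrays of mode vectors stored as columns."""
    S = U[:, :k].T @ V[:, :k]            # k x k matrix of inner products
    return np.sqrt(np.sum(S ** 2) / k)

rng = np.random.default_rng(1)
A = np.linalg.qr(rng.normal(size=(300, 20)))[0]   # orthonormal placeholder modes
B = np.linalg.qr(rng.normal(size=(300, 20)))[0]
print(rmsip(A, A))   # identical mode subspaces -> 1.0
print(rmsip(A, B))   # unrelated subspaces -> small value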
A glacier runoff extension to the Precipitation Runoff Modeling System
A. E. Van Beusekom; R. J. Viger
2016-01-01
A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balkey, K.; Witt, F.J.; Bishop, B.A.
1995-06-01
Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various different codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.
Lenis, Vasileios Panagiotis E; Swain, Martin; Larkin, Denis M
2018-05-01
Cross-species whole-genome sequence alignment is a critical first step for genome comparative analyses, ranging from the detection of sequence variants to studies of chromosome evolution. Animal genomes are large and complex, and whole-genome alignment is a computationally intense process, requiring expensive high-performance computing systems due to the need to explore extensive local alignments. With hundreds of sequenced animal genomes available from multiple projects, there is an increasing demand for genome comparative analyses. Here, we introduce G-Anchor, a new, fast, and efficient pipeline that uses a strictly limited but highly effective set of local sequence alignments to anchor (or map) an animal genome to another species' reference genome. G-Anchor makes novel use of a databank of highly conserved DNA sequence elements. We demonstrate how these elements may be aligned to a pair of genomes, creating anchors. These anchors enable the rapid mapping of scaffolds from a de novo assembled genome to chromosome assemblies of a reference species. Our results demonstrate that G-Anchor can successfully anchor a vertebrate genome onto a phylogenetically related reference species genome using a desktop or laptop computer within a few hours and with comparable accuracy to that achieved by a highly accurate whole-genome alignment tool such as LASTZ. G-Anchor thus makes whole-genome comparisons accessible to researchers with limited computational resources. G-Anchor is a ready-to-use tool for anchoring a pair of vertebrate genomes. It may be used with large genomes that contain a significant fraction of evolutionarily conserved DNA sequences and that are not highly repetitive, polyploid, or excessively fragmented. G-Anchor is not a substitute for whole-genome aligning software but can be used for fast and accurate initial genome comparisons. G-Anchor is freely available and a ready-to-use tool for the pairwise comparison of two genomes.
NASA Astrophysics Data System (ADS)
Casu, F.; de Luca, C.; Lanari, R.; Manunta, M.; Zinno, I.
2016-12-01
A methodology for computing surface deformation time series and mean velocity maps of large areas is presented. Our approach relies on the availability of a multi-temporal set of Synthetic Aperture Radar (SAR) data collected from ascending and descending orbits over an area of interest, and also permits estimation of the vertical and horizontal (East-West) displacement components of the Earth's surface. The adopted methodology is based on an advanced Cloud Computing implementation of the Differential SAR Interferometry (DInSAR) Parallel Small Baseline Subset (P-SBAS) processing chain which allows the unsupervised processing of large SAR data volumes, from the raw data (level-0) imagery up to the generation of DInSAR time series and maps. The presented solution, which is highly scalable, has been tested on the ascending and descending ENVISAT SAR archives, which have been acquired over a large area of Southern California (US) that extends for about 90,000 km2. Such an input dataset has been processed in parallel by exploiting 280 computing nodes of the Amazon Web Services Cloud environment. Moreover, to produce the final mean deformation velocity maps of the vertical and East-West displacement components of the whole investigated area, we also took advantage of the information available from external GPS measurements, which make it possible to account for possible regional trends not easily detectable by DInSAR and to refer the P-SBAS measurements to an external geodetic datum. The presented results clearly demonstrate the effectiveness of the proposed approach, which paves the way to the extensive use of the available ERS and ENVISAT SAR data archives. Furthermore, the proposed methodology is particularly suitable for dealing with the very large data flow provided by the Sentinel-1 constellation, thus permitting the extension of DInSAR analyses to a nearly global scale. This work is partially supported by: the DPC-CNR agreement, the EPOS-IP project and the ESA GEP project.
Errors in finite-difference computations on curvilinear coordinate systems
NASA Technical Reports Server (NTRS)
Mastin, C. W.; Thompson, J. F.
1980-01-01
Curvilinear coordinate systems were used extensively to solve partial differential equations on arbitrary regions. An analysis of truncation error in the computation of derivatives revealed why numerical results may be erroneous. A more accurate method of computing derivatives is presented.
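For reference, the standard chain-rule relations below (not reproduced from the paper) show where the metric terms, and hence their truncation errors, enter a first derivative computed on a curvilinear grid:

% Standard 2D chain-rule relations for a mapping x(\xi,\eta), y(\xi,\eta);
% shown only to illustrate where metric terms enter the derivative.
\[
  f_x \;=\; \frac{f_\xi\, y_\eta - f_\eta\, y_\xi}{J},
  \qquad
  f_y \;=\; \frac{f_\eta\, x_\xi - f_\xi\, x_\eta}{J},
  \qquad
  J \;=\; x_\xi y_\eta - x_\eta y_\xi .
\]

When the metric derivatives x_\xi, y_\eta, and so on are themselves approximated by finite differences, their truncation errors combine with those of f_\xi and f_\eta, which is one source of the erroneous results analysed above.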
Assessment of technical condition of concrete pavement by the example of district road
NASA Astrophysics Data System (ADS)
Linek, M.; Nita, P.; Żebrowski, W.; Wolka, P.
2018-05-01
The article presents a comprehensive assessment of concrete pavement condition. The analyses covered a district road located in the swietokrzyskie province that had been in use for 11 years. Comparative analyses were conducted twice. The first analysis was carried out after 9 years of pavement operation, in 2015. In order to assess the extent of pavement degradation, the tests were repeated in 2017. Within the scope of field research, the traffic intensity within the analysed road section was determined. Visual assessment of pavement condition was conducted according to the guidelines included in SOSN-B. Visual assessment can be extended by ground-penetrating radar measurements, which allow a comprehensive assessment of the structural changes that have occurred across the pavement's entire thickness and length. The assessment also included performance parameters, i.e. pavement regularity, surface roughness and texture. Extending the test results with an assessment of changes in the internal structure of the concrete composite, together with structure observations by means of a Scanning Electron Microscope, allows the parameters of the internal structure of hardened concrete to be assessed. Supplementing the observations of the internal structure with computed tomography scans provides comprehensive information on possible discontinuities and the composite structure. Based on the analysis of the obtained results, conclusions concerning the condition of the analysed pavement were reached. It was determined that the pavement is distinguished by high performance parameters, its condition is good, and it does not require any repairs. Maintenance treatment was suggested in order to extend the period of proper operation of the analysed pavement.
NASA Technical Reports Server (NTRS)
Fleming, David P.
2001-01-01
Personal computers (PCs) are now used extensively for engineering analysis. Their capability exceeds that of mainframe computers of only a few years ago. Programs originally written for mainframes have been ported to PCs to make their use easier. One of these programs is ARDS (Analysis of Rotor Dynamic Systems) which was developed at Arizona State University (ASU) by Nelson et al. to quickly and accurately analyze rotor steady state and transient response using the method of component mode synthesis. The original ARDS program was ported to the PC in 1995. Several extensions were made at ASU to increase the capability of mainframe ARDS. These extensions have also been incorporated into the PC version of ARDS. Each mainframe extension had its own user manual generally covering only that extension. Thus, exploiting the full capability of ARDS required a large set of user manuals. Moreover, necessary changes and enhancements for PC ARDS were undocumented. The present document is intended to remedy those problems by combining all pertinent information needed for the use of PC ARDS into one volume.
AIP1OGREN: Aerosol Observing Station Intensive Properties Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koontz, Annette; Flynn, Connor
The aip1ogren value-added product (VAP) computes several aerosol intensive properties. It requires as input calibrated, corrected aerosol extensive properties (scattering and absorption coefficients, primarily) from the Aerosol Observing Station (AOS). Aerosol extensive properties depend on both the nature of the aerosol and the amount of the aerosol. We compute several properties as relationships between the various extensive properties. These intensive properties are independent of aerosol amount and instead relate to intrinsic properties of the aerosol itself. Along with the original extensive properties we report aerosol single-scattering albedo, hemispheric backscatter fraction, asymmetry parameter, and Ångström exponent for scattering and absorption with one-minute averaging. An hourly averaged file is produced from the 1-minute files that includes all extensive and intensive properties as well as submicron scattering and submicron absorption fractions. Finally, in both the minutely and hourly files the aerosol radiative forcing efficiency is provided.
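A short sketch of the standard definitions behind several of the intensive properties named above (single-scattering albedo, hemispheric backscatter fraction, scattering Ångström exponent); the input values are placeholders and the variable names are mine, not the VAP's actual field names:

# Intensive properties as ratios of extensive properties (standard definitions).
import math

scat_450, scat_550, scat_700 = 30.0, 22.0, 14.0   # total scattering, Mm^-1
back_550 = 2.6                                    # hemispheric backscattering, Mm^-1
abs_550 = 2.0                                     # absorption, Mm^-1

ssa = scat_550 / (scat_550 + abs_550)             # single-scattering albedo
bfr = back_550 / scat_550                         # hemispheric backscatter fraction
angstrom = -math.log(scat_450 / scat_700) / math.log(450.0 / 700.0)

print(f"SSA={ssa:.3f}  backscatter fraction={bfr:.3f}  Angstrom={angstrom:.2f}")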
Numerical simulation of synthesis gas incineration
NASA Astrophysics Data System (ADS)
Kazakov, A. V.; Khaustov, S. A.; Tabakaev, R. B.; Belousova, Y. A.
2016-04-01
The authors analysed the expediency of the suggested method for utilizing low-grade fuels. Thermal processing of solid raw materials into a gaseous fuel, called synthesis gas, is investigated. The technical challenges concerning the applicability of existing gas equipment, developed and extensively tested exclusively for natural gas, were considered. For this purpose, a computer simulation of three-dimensional syngas-incinerating flame dynamics was performed by means of the ANSYS Multiphysics engineering software. The subjects of study were the three-dimensional aerodynamic flame structure, the heat-release and temperature fields, and a set of combustion properties: the flare range and the concentration distribution of burnout reagents. The obtained results were presented in the form of time-averaged pathlines with color indexing. The obtained results can be used for qualitative and quantitative evaluation of the singularities of complex multicomponent gas incineration.
The Use of Proxy Caches for File Access in a Multi-Tier Grid Environment
NASA Astrophysics Data System (ADS)
Brun, R.; Duellmann, D.; Ganis, G.; Hanushevsky, A.; Janyst, L.; Peters, A. J.; Rademakers, F.; Sindrilaru, E.
2011-12-01
The use of proxy caches has been extensively studied in the HEP environment for efficient access to database data and has shown significant performance gains with only very moderate operational effort at higher grid tiers (T2, T3). In this contribution we propose to apply the same concept to file access and analyse the possible performance gains, the operational impact on site services, and the applicability to different HEP use cases. Based on proof-of-concept studies with a modified XROOT proxy server, we review the cache efficiency and overheads for access patterns of typical ROOT-based analysis programs. We conclude with a discussion of the potential role of this new component at the different tiers of a distributed computing grid.
Computers and the Primary Curriculum 3-13.
ERIC Educational Resources Information Center
Crompton, Rob, Ed.
This book is a comprehensive and practical guide to the use of computers across a wide age range. Extensive use is made of photographs, illustrations, cartoons, and samples of children's work to demonstrate the versatility of computer use in schools. An introduction by Rob Crompton placing computer use within the educational context of the United…
Additional extensions to the NASCAP computer code, volume 1
NASA Technical Reports Server (NTRS)
Mandell, M. J.; Katz, I.; Stannard, P. R.
1981-01-01
Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability, and the ability to model anisotropic and time dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. The NASCAP/LEO, a three dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.
Extensions and improvements on XTRAN3S
NASA Technical Reports Server (NTRS)
Borland, C. J.
1989-01-01
Improvements to the XTRAN3S computer program are summarized. Work on this code, for steady and unsteady aerodynamic and aeroelastic analysis in the transonic flow regime has concentrated on the following areas: (1) Maintenance of the XTRAN3S code, including correction of errors, enhancement of operational capability, and installation on the Cray X-MP system; (2) Extension of the vectorization concepts in XTRAN3S to include additional areas of the code for improved execution speed; (3) Modification of the XTRAN3S algorithm for improved numerical stability for swept, tapered wing cases and improved computational efficiency; and (4) Extension of the wing-only version of XTRAN3S to include pylon and nacelle or external store capability.
ASPeak: an abundance sensitive peak detection algorithm for RIP-Seq.
Kucukural, Alper; Özadam, Hakan; Singh, Guramrit; Moore, Melissa J; Cenik, Can
2013-10-01
Unlike DNA, RNA abundances can vary over several orders of magnitude. Thus, identification of RNA-protein binding sites from high-throughput sequencing data presents unique challenges. Although peak identification in ChIP-Seq data has been extensively explored, there are few bioinformatics tools tailored for peak calling on analogous datasets for RNA-binding proteins. Here we describe ASPeak (abundance sensitive peak detection algorithm), an implementation of an algorithm that we previously applied to detect peaks in exon junction complex RNA immunoprecipitation in tandem experiments. Our peak detection algorithm yields stringent and robust target sets enabling sensitive motif finding and downstream functional analyses. ASPeak is implemented in Perl as a complete pipeline that takes bedGraph files as input. ASPeak implementation is freely available at https://sourceforge.net/projects/as-peak under the GNU General Public License. ASPeak can be run on a personal computer, yet is designed to be easily parallelizable. ASPeak can also run on high performance computing clusters providing efficient speedup. The documentation and user manual can be obtained from http://master.dl.sourceforge.net/project/as-peak/manual.pdf.
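As a loose sketch of the kind of input handling and abundance-aware thresholding described above (this is not ASPeak's actual algorithm; the file name, the per-chromosome background estimate, and the fold cutoff are illustrative assumptions):

```python
from collections import defaultdict

def read_bedgraph(path):
    """Parse a bedGraph file into (chrom, start, end, value) tuples."""
    records = []
    with open(path) as fh:
        for line in fh:
            if line.startswith(("track", "browser", "#")) or not line.strip():
                continue
            chrom, start, end, value = line.split()[:4]
            records.append((chrom, int(start), int(end), float(value)))
    return records

def call_peaks(records, fold=3.0):
    """Toy abundance-sensitive peak calling: a bin counts as a 'peak' only if
    its coverage exceeds `fold` times the mean coverage of its own chromosome,
    so highly abundant transcripts need proportionally stronger signal."""
    by_chrom = defaultdict(list)
    for chrom, start, end, value in records:
        by_chrom[chrom].append((start, end, value))
    peaks = []
    for chrom, bins in by_chrom.items():
        mean_cov = sum(v for _, _, v in bins) / len(bins)
        peaks += [(chrom, s, e, v) for s, e, v in bins if v > fold * mean_cov]
    return peaks

# peaks = call_peaks(read_bedgraph("signal.bedGraph"))
```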
Hucka, Michael; Bergmann, Frank T.; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M.; Le Novére, Nicolas; Myers, Chris J.; Olivier, Brett G.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Waltemath, Dagmar; Wilkinson, Darren J.
2017-01-01
Summary Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528569
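For orientation only, here is a bare-bones sketch of the XML structure such a model file takes, built with the Python standard library; the namespace string and the minimal element set follow my reading of the Level 2 specification and omit attributes a fully valid document would need:

```python
import xml.etree.ElementTree as ET

SBML_NS = "http://www.sbml.org/sbml/level2/version5"  # assumed namespace pattern

def minimal_sbml():
    """Build a bare-bones SBML Level 2 Version 5 skeleton (illustrative only;
    a valid model needs further elements/attributes required by the spec)."""
    sbml = ET.Element("sbml", {"xmlns": SBML_NS, "level": "2", "version": "5"})
    model = ET.SubElement(sbml, "model", {"id": "example_model"})
    compartments = ET.SubElement(model, "listOfCompartments")
    ET.SubElement(compartments, "compartment", {"id": "cell", "size": "1"})
    species_list = ET.SubElement(model, "listOfSpecies")
    ET.SubElement(species_list, "species",
                  {"id": "S1", "compartment": "cell", "initialAmount": "10"})
    return ET.tostring(sbml, encoding="unicode")

print(minimal_sbml())
```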
Modelling mid-course corrections for optimality conditions along interplanetary transfers
NASA Astrophysics Data System (ADS)
Iorfida, Elisabetta; Palmer, Phil; Roberts, Mark
2014-12-01
Within the field of trajectory optimisation, Lawden developed the primer vector theory, which defines a set of necessary conditions to characterise whether a transfer trajectory, in the two-body problem context, is optimum with respect to propellant usage. If the conditions are not satisfied, a region of the transfer trajectory is identified in which one or more potential intermediate impulses are performed in order to lower the overall cost. The method is computationally complex owing to having to solve a boundary value problem. In this paper is presented a new propagator that reduces the mathematical complexity and the computational cost of the problem, in particular it exploits a separation between the in-plane and out-of-plane components of the primer vector along the transfer trajectory. Using this propagator, the optimality of the transfer arc has been investigated, varying the departure and arrival orbits. In particular, keeping fixed the transfer trajectory, the optimality has been extensively analysed varying both the initial and final positions on the orbit, together with the directions of the initial and final thrust impulses.
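For reference, Lawden's conditions are usually stated in terms of the primer vector p(t); the following is the standard textbook form for two-body dynamics, not necessarily the exact formulation used in this paper:

```latex
% Primer vector dynamics (inverse-square gravity)
\ddot{\mathbf{p}} = G(\mathbf{r})\,\mathbf{p},
\qquad
G(\mathbf{r}) = \frac{\partial \mathbf{g}}{\partial \mathbf{r}}
             = \frac{\mu}{r^{5}}\left(3\,\mathbf{r}\mathbf{r}^{\mathsf{T}} - r^{2} I\right),
\qquad
\|\mathbf{p}(t)\| \le 1 \ \text{along the transfer.}
```

In this standard statement, impulses may occur only where the primer magnitude reaches unity, with p aligned with the thrust direction, and p and its derivative remain continuous; whenever the magnitude exceeds one along a candidate transfer, an intermediate impulse in that region can lower the overall cost.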
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core
Hucka, Michael; Bergmann, Frank T.; Hoops, Stefan; Keating, Sarah M.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Wilkinson, Darren J.
2017-01-01
Summary Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528564
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2015-09-04
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.
Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J
2015-09-04
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.
Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J
2015-06-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2015-06-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
20 CFR 704.103 - Removal of certain minimums when computing or paying compensation.
Code of Federal Regulations, 2010 CFR
2010-04-01
20 Employees' Benefits 3 (2010-04-01): Removal of certain minimums when computing or paying compensation... PROVISIONS FOR LHWCA EXTENSIONS, Defense Base Act, § 704.103... benefits are to be computed under section 9 of the LHWCA, 33 U.S.C. 909, shall not apply in computing...
ERIC Educational Resources Information Center
Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu
2017-01-01
In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…
Dancing Styles of Collective Cell Migration: Image-Based Computational Analysis of JRAB/MICAL-L2.
Sakane, Ayuko; Yoshizawa, Shin; Yokota, Hideo; Sasaki, Takuya
2018-01-01
Collective cell migration is observed during morphogenesis, angiogenesis, and wound healing, and this type of cell migration also contributes to efficient metastasis in some kinds of cancers. Because collectively migrating cells are much better organized than a random assemblage of individual cells, there seems to be a kind of order in migrating clusters. Extensive research has identified a large number of molecules involved in collective cell migration, and these factors have been analyzed using dramatic advances in imaging technology. To date, however, it remains unclear how myriad cells are integrated as a single unit. Recently, we observed unbalanced collective cell migrations that can be likened to either precision dancing or awa-odori, a traditional Japanese dance similar in style to that of the Rio Carnival, caused by the impairment of the conformational change of JRAB/MICAL-L2. This review begins with a brief history of image-based computational analyses of cell migration, explains why quantitative analysis of the stylization of collective cell behavior is difficult, and finally introduces our recent work on JRAB/MICAL-L2 as a successful example of the multidisciplinary approach combining cell biology, live imaging, and computational biology. In combination, these methods have enabled quantitative evaluations of the "dancing style" of collective cell migration.
VizieR Online Data Catalog: Historical and HST Astrometry of Sirius A,B (Bond+, 2017)
NASA Astrophysics Data System (ADS)
Bond, H. E.; Schaefer, G. H.; Gilliland, R. L.; Holberg, J. B.; Mason, B. D.; Lindenblad, I. W.; Seitz-McLeese, M.; Arnett, W. D.; Demarque, P.; Spada, F.; Young, P. A.; Barstow, M. A.; Burleigh, M. R.; Gudehus, D.
2017-05-01
We have assembled a compilation of published historical measurements of the position angle (PA) and the angular separation of Sirius B relative to Sirius A. Our tabulation is based on a critical review of measures contained in the Washington Double Star Catalog maintained at the USNO and from our additional literature searches. Notes included in the tabulation give extensive commentary on the historical observations. Many early publications provided measures averaged over multiple nights or even an entire observing season for the purpose of reducing computational labor in subsequent analyses. With modern computers, there is no need for such averaging, so we opted to present the individual measures whenever available. However, if an observer reported more than one measurement on a given night, we did compute the mean position for that night. If the original publication only reported a mean across several nights, we tabulated that mean as reported. The visual micrometer observations did not always include a contemporaneous measurement of both the PA and separation. These omissions are listed as -99.0 in the table. The measurement uncertainties were assigned through our orbital fitting method described in the paper. Measurements that were rejected from the orbital solution are identified in the Notes column and are listed with uncertainties of 0. (3 data files).
An Extension of the Mean Value Theorem for Integrals
ERIC Educational Resources Information Center
Khalili, Parviz; Vasiliu, Daniel
2010-01-01
In this note we present an extension of the mean value theorem for integrals. The extension we consider is motivated by an older result (here referred as Corollary 2), which is quite classical for the literature of Mathematical Analysis or Calculus. We also show an interesting application for computing the sum of a harmonic series.
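For context, the classical result being extended is the first mean value theorem for integrals (stated here in its standard form; the note's own "Corollary 2" and the extension itself are not reproduced):

```latex
% First mean value theorem for integrals (classical statement)
f \in C[a,b] \;\Longrightarrow\; \exists\, c \in [a,b] :\;
\int_{a}^{b} f(x)\,dx = f(c)\,(b-a).
% Weighted form: if additionally g is integrable and g \ge 0 on [a,b], then
\exists\, c \in [a,b] :\; \int_{a}^{b} f(x)\,g(x)\,dx = f(c)\int_{a}^{b} g(x)\,dx .
```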
Insights into the three-dimensional Lagrangian geometry of the Antarctic polar vortex
NASA Astrophysics Data System (ADS)
Curbelo, Jezabel; José García-Garrido, Víctor; Mechoso, Carlos Roberto; Mancho, Ana Maria; Wiggins, Stephen; Niang, Coumba
2017-07-01
In this paper we study the three-dimensional (3-D) Lagrangian structures in the stratospheric polar vortex (SPV) above Antarctica. We analyse and visualize these structures using Lagrangian descriptor function M. The procedure for calculation with reanalysis data is explained. Benchmarks are computed and analysed that allow us to compare 2-D and 3-D aspects of Lagrangian transport. Dynamical systems concepts appropriate to 3-D, such as normally hyperbolic invariant curves, are discussed and applied. In order to illustrate our approach we select an interval of time in which the SPV is relatively undisturbed (August 1979) and an interval of rapid SPV changes (October 1979). Our results provide new insights into the Lagrangian structure of the vertical extension of the stratospheric polar vortex and its evolution. Our results also show complex Lagrangian patterns indicative of strong mixing processes in the upper troposphere and lower stratosphere. Finally, during the transition to summer in the late spring, we illustrate the vertical structure of two counterrotating vortices, one the polar and the other an emerging one, and the invariant separatrix that divides them.
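As an informal sketch of how an arc-length Lagrangian descriptor of this kind can be computed for one initial condition (the fixed-step Euler integrator, the analytic velocity field, and the parameter choices are illustrative, not the paper's reanalysis-based procedure):

```python
import numpy as np

def lagrangian_descriptor_M(x0, velocity, t0, tau, dt=0.01):
    """Arc-length Lagrangian descriptor M for a single initial condition x0.

    velocity(x, t) -> np.ndarray gives the (possibly 3-D) velocity field;
    M accumulates |v| along the trajectory forward and backward over tau.
    Simple fixed-step Euler integration, purely illustrative.
    """
    total = 0.0
    for direction in (+1.0, -1.0):
        x, t = np.asarray(x0, dtype=float), t0
        for _ in range(int(tau / dt)):
            v = velocity(x, t)
            total += np.linalg.norm(v) * dt       # accumulate arc length
            x = x + direction * v * dt            # advect the particle
            t += direction * dt
    return total

# Example: steady solid-body rotation about the z-axis.
rotation = lambda x, t: np.array([-x[1], x[0], 0.0])
print(lagrangian_descriptor_M([1.0, 0.0, 0.0], rotation, t0=0.0, tau=2.0))
```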
Digital analytical data from mineral resource assessments of national forest lands in Washington
Boleneus, D.E.; Chase, D.W.
1999-01-01
Extensive reconnaissance assessments of the mineral resource potential of the Colville and Okanogan National Forests in northeastern Washington were conducted during 1979-1982 by a private consultant A.R. Grant, under contract with the U.S. Department of Agriculture, Forest Service. These forests occupy large parts of Pend Oreille, Stevens, Ferry, and Okanogan counties, and smaller parts of Whatcom, Skagit, and Chelan counties adjoining Okanogan County in the Cascades. Sampled terrain also included the Kaniksu National Forest in Pend Oreille County and one stream bed of the Kaniksu in adjacent Bonner County, Idaho. Two unpublished reports resulting from the assessments (Grant, 1982a,b) list a total of 3,927 analyses of gold, silver, copper, lead, zinc, molybdenum, tungsten, and uranium content of stream sediment and bedrock samples collected at widely dispersed sites in the three National Forests. This report makes this important body of work available in digital form on diskettes, to enhance manipulations with computer spreadsheets, geographic information systems (GIS), and digital spatial analyses. This will allow for utilization of data by modern day explorationists and by the general geodata user community.
Crosby, Richard A.; Mena, Leandro; Ricks, JaNelle
2018-01-01
This study applied an 8-item index of recent sexual risk behaviors to young Black men who have sex with men (YBMSM) and evaluated the distribution for normality. The distribution was tested for associations with possible antecedents of sexual risk. YBMSM (N=600), ages 16–29 years, were recruited from an STI clinic, located in the Southern United States. Men completed an extensive audio-computer assisted self-interview. Thirteen possible antecedents of sexual risk, as assessed by the index, were selected for analyses. The 8-item index formed a normal distribution with a mean of 4.77 (sd=1.77). In adjusted analyses, not having completed education beyond high school was associated with less risk, as was having sex with females. Conversely, meeting sex partners online was associated with greater risk, as was reporting that sex partners were drunk during sex. The obtained normal distribution of sexual risk behaviors suggests a corresponding need to “target and tailor” clinic-based counseling and prevention services for YBMSM. Avoiding sex when partners are intoxicated may be an especially valuable goal of counseling sessions. PMID:27875903
Crosby, Richard A; Mena, Leandro; Ricks, JaNelle M
2017-06-01
This study applied an 8-item index of recent sexual-risk behaviors to young Black men who have sex with men (YBMSM) and evaluated the distribution for normality. The distribution was tested for associations with possible antecedents of sexual risk. YBMSM (N = 600), aged 16-29 years, were recruited from a sexually transmitted infection clinic, located in the southern US. Men completed an extensive audio computer-assisted self-interview. Thirteen possible antecedents of sexual risk, as assessed by the index, were selected for analyses. The 8-item index formed a normal distribution with a mean of 4.77 (SD = 1.77). In adjusted analyses, not having completed education beyond high school was associated with less risk, as was having sex with females. Conversely, meeting sex partners online was associated with greater risk, as was reporting that sex partners were drunk during sex. The obtained normal distribution of sexual-risk behaviors suggests a corresponding need to "target and tailor" clinic-based counseling and prevention services for YBMSM. Avoiding sex when partners are intoxicated may be an especially valuable goal of counseling sessions.
NASA Astrophysics Data System (ADS)
Hubert, G.; Federico, C. A.; Pazianotto, M. T.; Gonzales, O. L.
2016-02-01
This paper describes the ACROPOL and OPD high-altitude stations devoted to characterizing atmospheric radiation fields. The ACROPOL platform, located at the summit of the Pic du Midi in the French Pyrenees at 2885 m above sea level, has operated scientific equipment since May 2011, including a BSS neutron spectrometer and detectors based on semiconductors and scintillators. In the framework of an IEAv and ONERA collaboration, a second neutron spectrometer has been operated since February 2015 at the summit of the Pico dos Dias in Brazil, at 1864 m above sea level. Both high-altitude platforms allow investigation of the long-period dynamics, to analyze the spectral variation of cosmic-ray-induced neutrons and the effects of local and seasonal changes, as well as the short-term dynamics during solar flare events. This paper presents long- and short-term analyses, including measurement and modeling investigations using data from both stations. The modeling approach, based on the ATMORAD computational platform, was used to link the measurements from the two stations.
Generation of an Aerothermal Data Base for the X33 Spacecraft
NASA Technical Reports Server (NTRS)
Roberts, Cathy; Huynh, Loc
1998-01-01
The X-33 experimental program is a cooperative program between industry and NASA, managed by Lockheed-Martin Skunk Works to develop an experimental vehicle to demonstrate new technologies for a single-stage-to-orbit, fully reusable launch vehicle (RLV). One of the new technologies to be demonstrated is an advanced Thermal Protection System (TPS) being designed by BF Goodrich (formerly Rohr, Inc.) with support from NASA. The calculation of an aerothermal database is crucial to identifying the critical design environment data for the TPS. The NASA Ames X-33 team has generated such a database using Computational Fluid Dynamics (CFD) analyses, engineering analysis methods and various programs to compare and interpolate the results from the CFD and the engineering analyses. This database, along with a program used to query the database, is used extensively by several X-33 team members to help them in designing the X-33. This paper will describe the methods used to generate this database, the program used to query the database, and will show some of the aerothermal analysis results for the X-33 aircraft.
NASA Technical Reports Server (NTRS)
Weeks, Cindy Lou
1986-01-01
Experiments were conducted at NASA Ames Research Center to define multi-tasking software requirements for multiple-instruction, multiple-data stream (MIMD) computer architectures. The focus was on specifying solutions for algorithms in the field of computational fluid dynamics (CFD). The program objectives were to allow researchers to produce usable parallel application software as soon as possible after acquiring MIMD computer equipment, to provide researchers with an easy-to-learn and easy-to-use parallel software language which could be implemented on several different MIMD machines, and to enable researchers to list preferred design specifications for future MIMD computer architectures. Analysis of CFD algorithms indicated that extensions of an existing programming language, adaptable to new computer architectures, provided the best solution to meeting program objectives. The CoFORTRAN Language was written in response to these objectives and to provide researchers a means to experiment with parallel software solutions to CFD algorithms on machines with parallel architectures.
No Evidence for Extensions to the Standard Cosmological Model.
Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna
2017-09-08
We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (lnB=-7.8), nonzero scalar-to-tensor ratio (lnB=-4.3), running of the spectral index (lnB=-4.7), curvature (lnB=-3.6), nonstandard numbers of neutrinos (lnB=-3.1), nonstandard neutrino masses (lnB=-3.2), nonstandard lensing potential (lnB=-4.6), evolving dark energy (lnB=-3.2), sterile neutrinos (lnB=-6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (lnB=-10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (lnB∼+2).
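For orientation, the reported lnB values follow the usual evidence-ratio convention, sketched below; priors π and likelihoods L are as in any Bayesian model comparison, with negative lnB meaning the extension is disfavored relative to flat ΛCDM:

```latex
% Bayes factor between an extended model M_1 and the base model M_0 (flat \Lambda CDM)
B = \frac{Z_1}{Z_0},
\qquad
Z_i = \int \mathcal{L}(\mathbf{d}\mid\boldsymbol{\theta}, M_i)\,
          \pi(\boldsymbol{\theta}\mid M_i)\, d\boldsymbol{\theta},
\qquad
\ln B < 0 \;\Rightarrow\; \text{the data favor } M_0 .
```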
No Evidence for Extensions to the Standard Cosmological Model
NASA Astrophysics Data System (ADS)
Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna
2017-09-01
We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (lnB=-7.8), nonzero scalar-to-tensor ratio (lnB=-4.3), running of the spectral index (lnB=-4.7), curvature (lnB=-3.6), nonstandard numbers of neutrinos (lnB=-3.1), nonstandard neutrino masses (lnB=-3.2), nonstandard lensing potential (lnB=-4.6), evolving dark energy (lnB=-3.2), sterile neutrinos (lnB=-6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (lnB=-10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (lnB∼+2).
The transformative potential of an integrative approach to pregnancy.
Eidem, Haley R; McGary, Kriston L; Capra, John A; Abbot, Patrick; Rokas, Antonis
2017-09-01
Complex traits typically involve diverse biological pathways and are shaped by numerous genetic and environmental factors. Pregnancy-associated traits and pathologies are further complicated by extensive communication across multiple tissues in two individuals, interactions between two genomes-maternal and fetal-that obscure causal variants and lead to genetic conflict, and rapid evolution of pregnancy-associated traits across mammals and in the human lineage. Given the multi-faceted complexity of human pregnancy, integrative approaches that synthesize diverse data types and analyses harbor tremendous promise to identify the genetic architecture and environmental influences underlying pregnancy-associated traits and pathologies. We review current research that addresses the extreme complexities of traits and pathologies associated with human pregnancy. We find that successful efforts to address the many complexities of pregnancy-associated traits and pathologies often harness the power of many and diverse types of data, including genome-wide association studies, evolutionary analyses, multi-tissue transcriptomic profiles, and environmental conditions. We propose that understanding of pregnancy and its pathologies will be accelerated by computational platforms that provide easy access to integrated data and analyses. By simplifying the integration of diverse data, such platforms will provide a comprehensive synthesis that transcends many of the inherent challenges present in studies of pregnancy. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David
2007-01-01
Most universities have invested in extensive infrastructure in the form of computer laboratories and computer kiosks. However, is this investment justified when it is suggested that students work predominantly from home using their own computers? This paper provides an empirical study investigating how students at a regional multi-campus…
Good Practices for Learning to Recognize Actions Using FV and VLAD.
Wu, Jianxin; Zhang, Yu; Lin, Weiyao
2016-12-01
High dimensional representations such as Fisher vectors (FV) and vectors of locally aggregated descriptors (VLAD) have shown state-of-the-art accuracy for action recognition in videos. The high dimensionality, on the other hand, also causes computational difficulties when scaling up to large-scale video data. This paper makes three lines of contributions to learning to recognize actions using high dimensional representations. First, we reviewed several existing techniques that improve upon FV or VLAD in image classification, and performed extensive empirical evaluations to assess their applicability for action recognition. Our analyses of these empirical results show that normality and bimodality are essential to achieve high accuracy. Second, we proposed a new pooling strategy for VLAD and three simple, efficient, and effective transformations for both FV and VLAD. Both proposed methods have shown higher accuracy than the original FV/VLAD method in extensive evaluations. Third, we proposed and evaluated new feature selection and compression methods for the FV and VLAD representations. This strategy uses only 4% of the storage of the original representation, but achieves comparable or even higher accuracy. Based on these contributions, we recommend a set of good practices for action recognition in videos for practitioners in this field.
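A minimal VLAD encoder in its textbook form (nearest-centroid assignment, residual accumulation, signed power and L2 normalization) is sketched below for concreteness; it does not include the specific pooling strategy, transformations, or feature selection proposed in the paper:

```python
import numpy as np

def vlad_encode(descriptors, centers, alpha=0.5):
    """Encode local descriptors (n, d) against k cluster centers (k, d).

    Returns a k*d VLAD vector: per-cluster sums of residuals, followed by
    signed power normalization and global L2 normalization.
    """
    d2 = ((descriptors[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    assign = d2.argmin(axis=1)                    # nearest centroid per descriptor
    k, dim = centers.shape
    vlad = np.zeros((k, dim))
    for j in range(k):
        members = descriptors[assign == j]
        if len(members):
            vlad[j] = (members - centers[j]).sum(axis=0)   # accumulate residuals
    vlad = vlad.ravel()
    vlad = np.sign(vlad) * np.abs(vlad) ** alpha  # signed power normalization
    norm = np.linalg.norm(vlad)
    return vlad / norm if norm > 0 else vlad

# Tiny usage example with random data standing in for local video descriptors.
rng = np.random.default_rng(0)
code = vlad_encode(rng.normal(size=(200, 64)), rng.normal(size=(16, 64)))
print(code.shape)   # (1024,)
```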
MannDB: A microbial annotation database for protein characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, C; Lam, M; Smith, J
2006-05-19
MannDB was created to meet a need for rapid, comprehensive automated protein sequence analyses to support selection of proteins suitable as targets for driving the development of reagents for pathogen or protein toxin detection. Because a large number of open-source tools were needed, it was necessary to produce a software system to scale the computations for whole-proteome analysis. Thus, we built a fully automated system for executing software tools and for storage, integration, and display of automated protein sequence analysis and annotation data. MannDB is a relational database that organizes data resulting from fully automated, high-throughput protein-sequence analyses using open-source tools. Types of analyses provided include predictions of cleavage, chemical properties, classification, features, functional assignment, post-translational modifications, motifs, antigenicity, and secondary structure. Proteomes (lists of hypothetical and known proteins) are downloaded and parsed from Genbank and then inserted into MannDB, and annotations from SwissProt are downloaded when identifiers are found in the Genbank entry or when identical sequences are identified. Currently 36 open-source tools are run against MannDB protein sequences either on local systems or by means of batch submission to external servers. In addition, BLAST against protein entries in MvirDB, our database of microbial virulence factors, is performed. A web client browser enables viewing of computational results and downloaded annotations, and a query tool enables structured and free-text search capabilities. When available, links to external databases, including MvirDB, are provided. MannDB contains whole-proteome analyses for at least one representative organism from each category of biological threat organism listed by APHIS, CDC, HHS, NIAID, USDA, USFDA, and WHO. MannDB comprises a large number of genomes and comprehensive protein sequence analyses representing organisms listed as high-priority agents on the websites of several governmental organizations concerned with bio-terrorism. MannDB provides the user with a BLAST interface for comparison of native and non-native sequences and a query tool for conveniently selecting proteins of interest. In addition, the user has access to a web-based browser that compiles comprehensive and extensive reports.
A Computer Interview for Multivariate Monitoring of Psychiatric Outcome.
ERIC Educational Resources Information Center
Stevenson, John F.; And Others
Application of computer technology to psychiatric outcome measurement offers the promise of coping with increasing demands for extensive patient interviews repeated longitudinally. Described is the development of a cost-effective multi-dimensional tracking device to monitor psychiatric functioning, building on a previous local computer interview…
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Hendrickson, Bruce
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.
Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meisner, Robert; McCoy, Michel; Archer, Bill
2013-09-11
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantifying critical margins and uncertainties, and resolving increasingly difficult analyses needed for the SSP. Moreover, ASC’s business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.
D'iachkova, G V; Mitina, Iu L
2007-01-01
Based on the data of computed tomography, radiography and densitometry in 39 patients the authors describe in detail the signs of osteonecrosis and sequestration of different localization and extension.
DInSAR time series generation within a cloud computing environment: from ERS to Sentinel-1 scenario
NASA Astrophysics Data System (ADS)
Casu, Francesco; Elefante, Stefano; Imperatore, Pasquale; Lanari, Riccardo; Manunta, Michele; Zinno, Ivana; Mathot, Emmanuel; Brito, Fabrice; Farres, Jordi; Lengert, Wolfgang
2013-04-01
One of the techniques that will strongly benefit from the advent of the Sentinel-1 system is Differential SAR Interferometry (DInSAR), which has been demonstrated to be an effective tool for detecting and monitoring ground displacements with centimetre accuracy. The geoscience communities (volcanology, seismicity, …), as well as those concerned with hazard monitoring and risk mitigation, make extensive use of the DInSAR technique and will take advantage of the huge amount of SAR data acquired by Sentinel-1. Indeed, such information will permit the generation of Earth-surface displacement maps and time series over both large areas and long time spans. However, the issue of managing, processing and analysing the large Sentinel data stream is envisaged by the scientific community to be a major bottleneck, particularly during crisis phases. The emerging need to create a common ecosystem in which data, results and processing tools are shared is seen as a successful way to address this problem and to contribute to the spreading of information and knowledge. The Supersites initiative, as well as the ESA SuperSites Exploitation Platform (SSEP) and the ESA Cloud Computing Operational Pilot (CIOP) projects, provide effective answers to this need and are pushing towards the development of such an ecosystem. It is clear that all current and existing tools for querying, processing and analysing SAR data must not only be updated to manage the large Sentinel-1 data stream, but also reorganized to respond quickly to simultaneous and highly demanding user requests, mainly during emergency situations. This translates into automatic and unsupervised processing of large amounts of data as well as the availability of scalable, widely accessible and high-performance computing capabilities. A cloud computing environment permits all of these objectives to be achieved, particularly in the case of spike and peak requests for processing resources linked to disaster events. This work presents a parallel computational model for the widely used DInSAR algorithm known as the Small BAseline Subset (SBAS) approach, implemented within the cloud computing environment provided by the ESA-CIOP platform. This activity has resulted in a scalable, unsupervised, portable, and widely accessible (through a web portal) parallel DInSAR computational tool. The SBAS algorithm was rewritten and developed for a parallel system environment, i.e., in a form that benefits from multiple processing units. This required devising a parallel version of the SBAS algorithm and its subsequent implementation, implying additional complexity in algorithm design and efficient multiprocessor programming, with the final aim of optimizing parallel performance. Although the presented algorithm has been designed to work with Sentinel-1 data, it can also process other satellite SAR data (ERS, ENVISAT, CSK, TSX, ALOS). The performance of the implemented parallel SBAS version has been tested on the full ASAR archive (64 acquisitions) acquired over the Napoli Bay, a volcanic and densely urbanized area in Southern Italy. The full processing - from the raw data download to the generation of DInSAR time series - was carried out by engaging 4 nodes, each with 2 cores and 16 GB of RAM, and took about 36 hours, compared with about 135 hours for the sequential version.
Extensive analyses of other test areas that are significant from DInSAR and geophysical viewpoints will be presented. Finally, a preliminary performance evaluation of the presented approach within the Sentinel-1 scenario will be provided.
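For what it is worth, the quoted runtimes imply roughly the following speedup and parallel efficiency (assuming the 4 nodes × 2 cores stated above are fully used; this is a back-of-the-envelope check, not a reported metric):

```python
sequential_hours = 135.0
parallel_hours = 36.0
cores = 4 * 2                     # 4 nodes, 2 cores each

speedup = sequential_hours / parallel_hours
efficiency = speedup / cores
print(f"speedup ~ {speedup:.2f}x, parallel efficiency ~ {efficiency:.0%}")
# speedup ~ 3.75x, parallel efficiency ~ 47%
```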
ERIC Educational Resources Information Center
Huang, Xi
2018-01-01
Computer-supported collaborative learning facilitates the extension of second language acquisition into social practice. Studies on its achievement effects speak directly to the pedagogical notion of treating communicative practice in synchronous computer-mediated communication (SCMC): real-time communication that takes place between human beings…
Using E-Learning and ICT Courses in Educational Environment: A Review
ERIC Educational Resources Information Center
Salehi, Hadi; Shojaee, Mohammad; Sattar, Susan
2015-01-01
With the quick emergence of computers and related technology, Electronic-learning (E-learning) and Information Communication and Technology (ICT) have been extensively utilized in the education and training field. Miscellaneous methods of integrating computer technology and the context in which computers are used have affected student learning in…
Computer Power: Part 1: Distribution of Power (and Communications).
ERIC Educational Resources Information Center
Price, Bennett J.
1988-01-01
Discussion of the distribution of power to personal computers and computer terminals addresses options such as extension cords, perimeter raceways, and interior raceways. Sidebars explain: (1) the National Electrical Code; (2) volts, amps, and watts; (3) transformers, circuit breakers, and circuits; and (4) power vs. data wiring. (MES)
Computational Participation: Understanding Coding as an Extension of Literacy Instruction
ERIC Educational Resources Information Center
Burke, Quinn; O'Byrne, W. Ian; Kafai, Yasmin B.
2016-01-01
Understanding the computational concepts on which countless digital applications run offers learners the opportunity to no longer simply read such media but also become more discerning end users and potentially innovative "writers" of new media themselves. To think computationally--to solve problems, to design systems, and to process and…
2015-04-27
We have synthesized carbon materials from waste biomass using these two high-temperature reactors, and we have extensively used a Raman spectrometer to analyse the as-synthesized carbon materials. These tools were fully installed and operational.
BCILAB: a platform for brain-computer interface development
NASA Astrophysics Data System (ADS)
Kothe, Christian Andreas; Makeig, Scott
2013-10-01
Objective. The past two decades have seen dramatic progress in our ability to model brain signals recorded by electroencephalography, functional near-infrared spectroscopy, etc., and to derive real-time estimates of user cognitive state, response, or intent for a variety of purposes: to restore communication by the severely disabled, to effect brain-actuated control and, more recently, to augment human-computer interaction. Continuing these advances, largely achieved through increases in computational power and methods, requires software tools to streamline the creation, testing, evaluation and deployment of new data analysis methods. Approach. Here we present BCILAB, an open-source MATLAB-based toolbox built to address the need for the development and testing of brain-computer interface (BCI) methods by providing an organized collection of over 100 pre-implemented methods and method variants, an easily extensible framework for the rapid prototyping of new methods, and a highly automated framework for systematic testing and evaluation of new implementations. Main results. To validate and illustrate the use of the framework, we present two sample analyses of publicly available data sets from recent BCI competitions and from a rapid serial visual presentation task. We demonstrate the straightforward use of BCILAB to obtain results compatible with the current BCI literature. Significance. The aim of the BCILAB toolbox is to provide the BCI community a powerful toolkit for methods research and evaluation, thereby helping to accelerate the pace of innovation in the field, while complementing the existing spectrum of tools for real-time BCI experimentation, deployment and use.
Mobile Applications for Extension
ERIC Educational Resources Information Center
Drill, Sabrina L.
2012-01-01
Mobile computing devices (smart phones, tablets, etc.) are rapidly becoming the dominant means of communication worldwide and are increasingly being used for scientific investigation. This technology can further our Extension mission by increasing our power for data collection, information dissemination, and informed decision-making. Mobile…
The Effect of Home Computer Use on Children's Cognitive and Non-Cognitive Skills
ERIC Educational Resources Information Center
Fiorini, M.
2010-01-01
In this paper we investigate the effect of using a home computer on children's development. In most OECD countries 70% or more of the households have a computer at home and children use computers quite extensively, even at very young ages. We use data from the Longitudinal Study of Australian Children (LSAC), which follows an Australian cohort…
NASA Astrophysics Data System (ADS)
Kennedy, J. H.; Bennett, A. R.; Evans, K. J.; Fyke, J. G.; Vargo, L.; Price, S. F.; Hoffman, M. J.
2016-12-01
Accurate representation of ice sheets and glaciers is essential for robust predictions of arctic climate within Earth System models. Verification and Validation (V&V) is a set of techniques used to quantify the correctness and accuracy of a model, which builds developer/modeler confidence, and can be used to enhance the credibility of the model. Fundamentally, V&V is a continuous process because each model change requires a new round of V&V testing. The Community Ice Sheet Model (CISM) development community is actively developing LIVVkit, the Land Ice Verification and Validation toolkit, which is designed to easily integrate into an ice-sheet model's development workflow (on both personal and high-performance computers) to provide continuous V&V testing. LIVVkit is a robust and extensible Python package for V&V, which has components for both software V&V (construction and use) and model V&V (mathematics and physics). The model verification component is used, for example, to verify model results against community intercomparisons such as ISMIP-HOM. The model validation component is used, for example, to generate a series of diagnostic plots showing the differences between model results and observations for variables such as thickness, surface elevation, basal topography, surface velocity, surface mass balance, etc. Because many different ice-sheet models are under active development, new validation datasets are becoming available, and new methods of analysing these models are actively being researched, LIVVkit includes a framework to easily extend the model V&V analyses by ice-sheet modelers. This allows modelers and developers to develop evaluations of parameters, implement changes, and quickly see how those changes affect the ice-sheet model and Earth system model (when coupled). Furthermore, LIVVkit outputs a portable hierarchical website allowing evaluations to be easily shared, published, and analysed throughout the arctic and Earth system communities.
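As a generic illustration of the kind of model-versus-observation difference underlying such validation plots (the array names, grid, and values below are synthetic stand-ins, and this is not LIVVkit's API):

```python
import numpy as np
import matplotlib.pyplot as plt

# Stand-in 2-D fields: modelled and observed ice thickness on a common grid.
rng = np.random.default_rng(1)
observed = 1000.0 + 50.0 * rng.standard_normal((64, 64))
modelled = observed + 20.0 * rng.standard_normal((64, 64))

difference = modelled - observed
print(f"bias = {difference.mean():.1f} m, "
      f"RMSE = {np.sqrt((difference**2).mean()):.1f} m")

plt.imshow(difference, cmap="RdBu_r")
plt.colorbar(label="thickness difference (m)")
plt.title("modelled - observed ice thickness")
plt.savefig("thickness_difference.png")
```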
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...
2017-03-23
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
Application of Computer-Assisted Learning Methods in the Teaching of Chemical Spectroscopy.
ERIC Educational Resources Information Center
Ayscough, P. B.; And Others
1979-01-01
Discusses the application of computer-assisted learning methods to the interpretation of infrared, nuclear magnetic resonance, and mass spectra; and outlines extensions into the area of integrated spectroscopy. (Author/CMV)
Platform-independent method for computer aided schematic drawings
Vell, Jeffrey L [Slingerlands, NY; Siganporia, Darius M [Clifton Park, NY; Levy, Arthur J [Fort Lauderdale, FL
2012-02-14
A CAD/CAM method is disclosed for a computer system to capture and interchange schematic drawing and associated design information. The schematic drawing and design information are stored in an extensible, platform-independent format.
ERIC Educational Resources Information Center
Sun, Zhong; Yang, Xian Min; He, Ke Kang
2016-01-01
The rapid development of the digital classroom has made it possible to combine extensive reading with online writing, yet research and development in this area are lacking. This study explores the impact of online writing after extensive reading in a classroom setting in China where there was one computer for each student (a 1:1 digital…
A review of recent wake vortex research for increasing airport capacity
NASA Astrophysics Data System (ADS)
Hallock, James N.; Holzäpfel, Frank
2018-04-01
This paper is a brief review of recent wake vortex research as it affects the operational problem of spacing aircraft to increase airport capacity and throughput. The paper addresses two questions: what do we know about wake vortices, and what do we not yet know? The introduction of Heavy jets in the late 1960s stimulated the study of wake vortices for safety reasons, and the use of pulsed lidars and the maturing of computational fluid dynamics over the last three decades have led to extensive data collection and analyses, which are now resulting in the development and implementation of systems to safely decrease separations in the terminal environment. Although much has been learned about wake vortices and their behavior, there is still more to be learned about the phenomena of aircraft wake vortices.
Accurate Phylogenetic Tree Reconstruction from Quartets: A Heuristic Approach
Reaz, Rezwana; Bayzid, Md. Shamsuzzoha; Rahman, M. Sohel
2014-01-01
Supertree methods construct trees on a set of taxa (species) by combining many smaller trees on overlapping subsets of the entire set of taxa. A ‘quartet’ is an unrooted tree over four taxa, so quartet-based supertree methods combine many four-taxon unrooted trees into a single, coherent tree over the complete set of taxa. Quartet-based phylogeny reconstruction methods have received considerable attention in recent years. An accurate and efficient quartet-based method might be competitive with the current best phylogenetic tree reconstruction methods (such as maximum likelihood or Bayesian MCMC analyses), without being as computationally intensive. In this paper, we present a novel and highly accurate quartet-based phylogenetic tree reconstruction method. We performed an extensive experimental study to evaluate the accuracy and scalability of our approach on both simulated and biological datasets. PMID:25117474
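To make the quartet idea concrete, the sketch below infers the topology of each four-taxon subset from tree path-length (additive) distances using the four-point condition and measures how many quartets two trees share. This is a toy illustration, not the authors' reconstruction algorithm; the distance dictionary and taxon names are made up.

    from itertools import combinations

    def quartet_topology(d, a, b, c, e):
        # Return the pairing with the smallest summed distance (four-point condition).
        sums = {
            frozenset([(a, b), (c, e)]): d[a][b] + d[c][e],
            frozenset([(a, c), (b, e)]): d[a][c] + d[b][e],
            frozenset([(a, e), (b, c)]): d[a][e] + d[b][c],
        }
        return min(sums, key=sums.get)

    def quartet_agreement(d1, d2, taxa):
        # Fraction of four-taxon subsets on which two distance matrices agree.
        agree = total = 0
        for a, b, c, e in combinations(taxa, 4):
            total += 1
            agree += quartet_topology(d1, a, b, c, e) == quartet_topology(d2, a, b, c, e)
        return agree / total if total else 1.0

    # Distances induced by the quartet AB|CD with unit branch lengths.
    d = {"A": {"B": 2.0, "C": 3.0, "D": 3.0}, "B": {"C": 3.0, "D": 3.0}, "C": {"D": 2.0}}
    print(quartet_agreement(d, d, ["A", "B", "C", "D"]))   # 1.0: a tree agrees with itself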
NASA Astrophysics Data System (ADS)
Bucher, François-Xavier; Cao, Frédéric; Viard, Clément; Guichard, Frédéric
2014-03-01
We present in this paper a novel capacitive device that stimulates the touchscreen interface of a smartphone (or of any imaging device equipped with a capacitive touchscreen) and synchronizes triggering with the DxO LED Universal Timer to measure shooting time lag and shutter lag according to ISO 15781:2013. The device and protocol extend the time lag measurement beyond the standard by including negative shutter lag, a phenomenon that is increasingly common in smartphones. The device is computer-controlled, and this feature, combined with measurement algorithms, makes it possible to automate a large series of captures so as to provide more refined statistical analyses when, for example, the shutter lag of "zero shutter lag" devices is limited by the frame time, as our measurements confirm.
Hofstadter-Duke, Kristi L; Daly, Edward J
2015-03-01
This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made. © The Author(s) 2014.
Multi-GPU and multi-CPU accelerated FDTD scheme for vibroacoustic applications
NASA Astrophysics Data System (ADS)
Francés, J.; Otero, B.; Bleda, S.; Gallego, S.; Neipp, C.; Márquez, A.; Beléndez, A.
2015-06-01
The Finite-Difference Time-Domain (FDTD) method is applied to the analysis of vibroacoustic problems and to the study of the propagation of longitudinal and transversal waves in stratified media. The potential of the scheme and the relevance of each acceleration strategy for massive FDTD computations are demonstrated in this work. In this paper, we propose two new specific implementations of the two-dimensional FDTD scheme using multi-CPU and multi-GPU, respectively. In the first implementation, an open-source message passing interface (OMPI) has been included in order to exploit the resources of a biprocessor station with two Intel Xeon processors. Moreover, in the CPU code version, the streaming SIMD extensions (SSE) and the advanced vector extensions (AVX) have been included, together with shared-memory approaches that take advantage of multi-core platforms. The second implementation, the multi-GPU code version, is based on the peer-to-peer communications available in CUDA on two GPUs (NVIDIA GTX 670). Subsequently, this paper presents an accurate analysis of the influence of the different code versions, including shared-memory approaches, vector instructions and multi-processors (both CPU and GPU), and compares them in order to delimit the degree of improvement obtained by distributed solutions based on multi-CPU and multi-GPU. The performance of both approaches was analysed, and it is demonstrated that adding shared-memory schemes to CPU computing substantially improves the performance of vector instructions, enlarging the simulation sizes that use the CPU cache memory efficiently. In that regime, GPU computing is roughly twice as fast as the finely tuned CPU version for both one and two nodes. However, for massive computations, explicit vector instructions are not worthwhile, since memory bandwidth is the limiting factor and the performance tends to match that of the sequential version with auto-vectorisation and a shared-memory approach. In this scenario GPU computing is the best option, since it provides homogeneous behaviour. More specifically, the speedup of GPU computing reaches an upper limit of 12 for both one and two GPUs, whereas the performance reaches peak values of 80 GFlops and 146 GFlops for one GPU and two GPUs, respectively. Finally, the method is applied to an earth crust profile in order to demonstrate the potential of our approach and the necessity of acceleration strategies in this type of application.
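For readers unfamiliar with the numerical kernel being accelerated, the sketch below is a minimal one-dimensional Yee-style FDTD update loop written with NumPy; the vectorised array updates are exactly the operations that SSE/AVX or CUDA kernels accelerate. It is a generic electromagnetic stand-in with made-up grid parameters, not the authors' elastodynamic multi-GPU code.

    import numpy as np

    c0, eps0 = 299_792_458.0, 8.8541878128e-12
    mu0 = 1.0 / (eps0 * c0 ** 2)

    nx, nt = 400, 800
    dx = 1e-3
    dt = 0.99 * dx / c0                      # Courant number S = 0.99

    Ez = np.zeros(nx)
    Hy = np.zeros(nx - 1)

    for n in range(nt):
        Hy += (dt / (mu0 * dx)) * (Ez[1:] - Ez[:-1])          # update H from curl E
        Ez[1:-1] += (dt / (eps0 * dx)) * (Hy[1:] - Hy[:-1])   # update E from curl H
        Ez[nx // 2] += np.exp(-((n - 60) / 15.0) ** 2)        # soft Gaussian source

    print("peak |Ez| =", float(np.max(np.abs(Ez))))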
Practical Algorithms for the Longest Common Extension Problem
NASA Astrophysics Data System (ADS)
Ilie, Lucian; Tinta, Liviu
The Longest Common Extension problem considers a string s and computes, for each of a number of pairs (i,j), the longest substring of s that starts at both i and j. It appears as a subproblem in many fundamental string problems and can be solved by linear-time preprocessing of the string that allows (worst-case) constant-time computation for each pair. The two known approaches use powerful algorithms: either constant-time computation of the Lowest Common Ancestor in trees or constant-time computation of Range Minimum Queries (RMQ) in arrays. We show here that, from a practical point of view, such complicated approaches are not needed. We give two very simple algorithms for this problem that require no preprocessing. The first needs only the string and is significantly faster than all previous algorithms on average. The second combines the first with a direct RMQ computation on the Longest Common Prefix array. It takes advantage of the superior speed of the cache memory and is the fastest on virtually all inputs.
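The first ("no preprocessing") approach lends itself to a very short illustration: answer each Longest Common Extension query by direct character comparison. The sketch below is our own minimal Python version of that idea, not the authors' implementation.

    def lce(s: str, i: int, j: int) -> int:
        # Length of the longest substring of s starting at both positions i and j.
        k, n = 0, len(s)
        while i + k < n and j + k < n and s[i + k] == s[j + k]:
            k += 1
        return k

    print(lce("abracadabra", 0, 7))   # "abra" -> 4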
NASA Astrophysics Data System (ADS)
Lindsey, Charles G.; Chen, Jun; Dye, Timothy S.; Willard Richards, L.; Blumenthal, Donald L.
1999-08-01
During the 1990 Navajo Generating Station (NGS) Winter Visibility Study, a network of surface and upper-air meteorological measurement systems was operated in and around Grand Canyon National Park to investigate atmospheric processes in complex terrain that affected the transport of emissions from the nearby NGS. This network included 15 surface monitoring stations, eight balloon sounding stations (equipped with a mix of rawinsonde, tethersonde, and Airsonde sounding systems), three Doppler radar wind profilers, and four Doppler sodars. Measurements were made from 10 January through 31 March 1990. Data from this network were used to prepare objectively analyzed wind fields, trajectories, and streak lines to represent transport of emissions from the NGS, and to prepare isentropic analyses of the data. The results of these meteorological analyses were merged in the form of a computer animation that depicted the streak line analyses along with measurements of perfluorocarbon tracer, SO2, and sulfate aerosol concentrations, as well as visibility measurements collected by an extensive surface monitoring network. These analyses revealed that synoptic-scale circulations associated with the passage of low pressure systems followed by the formation of high pressure ridges accompanied the majority of cases when NGS emittants appeared to be transported to the Grand Canyon. The authors' results also revealed terrain influences on transport within the topography of the study area, especially mesoscale flows inside the Lake Powell basin and along the plain above the Marble Canyon.
The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update
Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy
2016-01-01
High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889
2016-07-15
AFRL-AFOSR-JP-TR-2016-0068: Multi-scale Computational Electromagnetics for Phenomenology and Saliency Characterization in Remote Sensing (Hean-Teik ...). The report covers applying computational electromagnetics to microwave remote sensing, as well as extension of modelling capability with computational flexibility to study ...
NASA Technical Reports Server (NTRS)
Meston, R. D.; Schall, M. R., Jr.; Brockman, C. L.; Bender, W. H.
1972-01-01
All analyses and tradeoffs conducted to establish the MSS operations and crew activities are discussed. The missions and subsystem integrated analyses that were completed to assure compatibility of program elements and consistency with program objectives are presented.
Covering Resilience: A Recent Development for Binomial Checkpointing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walther, Andrea; Narayanan, Sri Hari Krishna
In terms of computing time, adjoint methods offer a very attractive alternative to compute gradient information, required, e.g., for optimization purposes. However, together with this very favorable temporal complexity result comes a memory requirement that is in essence proportional to the operation count of the underlying function, e.g., if algorithmic differentiation is used to provide the adjoints. For this reason, checkpointing approaches in many variants have become popular. This paper analyzes an extension of the so-called binomial approach to also cover possible failures of the computing systems. Such a measure of precaution is of special interest for massively parallel simulations and adjoint calculations where the mean time between failures of the large scale computing system is smaller than the time needed to complete the calculation of the adjoint information. We describe the extensions of standard checkpointing approaches required for such resilience, provide a corresponding implementation and discuss first numerical results.
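For orientation, the classical result behind the binomial approach is that s checkpoints and at most t forward sweeps suffice to reverse beta(s, t) = C(s + t, s) time steps. The small sketch below simply evaluates this bound; it is an illustration of the underlying combinatorics, not the resilient schedule developed in the paper.

    from math import comb

    def reachable_steps(num_checkpoints: int, max_sweeps: int) -> int:
        # Maximum number of time steps reversible with the given resources.
        return comb(num_checkpoints + max_sweeps, num_checkpoints)

    for s in (5, 10, 20):
        print(s, "checkpoints, 3 sweeps ->", reachable_steps(s, 3), "steps")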
NASA Technical Reports Server (NTRS)
Gupta, Kajal K.
1991-01-01
The details of an integrated general-purpose finite element structural analysis computer program, which is also capable of solving complex multidisciplinary problems, are presented. Thus, the SOLIDS module of the program possesses an extensive finite element library suitable for modeling most practical problems and is capable of solving statics, vibration, buckling, and dynamic response problems of complex structures, including spinning ones. The aerodynamic module, AERO, enables computation of unsteady aerodynamic forces for both subsonic and supersonic flow for subsequent flutter and divergence analysis of the structure. The associated aeroservoelastic analysis module, ASE, effects aero-structural-control stability analysis, yielding frequency responses as well as damping characteristics of the structure. The program is written in standard FORTRAN to run on a wide variety of computers. Extensive graphics, preprocessing, and postprocessing routines are also available pertaining to a number of terminals.
Using a software-defined computer in teaching the basics of computer architecture and operation
NASA Astrophysics Data System (ADS)
Kosowska, Julia; Mazur, Grzegorz
2017-08-01
The paper describes the concept and implementation of SDC_One software-defined computer designed for experimental and didactic purposes. Equipped with extensive hardware monitoring mechanisms, the device enables the students to monitor the computer's operation on bus transfer cycle or instruction cycle basis, providing the practical illustration of basic aspects of computer's operation. In the paper, we describe the hardware monitoring capabilities of SDC_One and some scenarios of using it in teaching the basics of computer architecture and microprocessor operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, C.H.; Ready, A.B.; Rea, J.
1995-06-01
Versions of the computer program PROATES (PROcess Analysis for Thermal Energy Systems) have been used since 1979 to analyse plant performance improvement proposals relating to existing plant and also to evaluate new plant designs. Several plant modifications have been made to improve performance based on the model predictions, and the predicted performance has been realised in practice. The program was born out of a need to model the overall steady state performance of complex plant to enable proposals to change plant component items or operating strategy to be evaluated. To do this with confidence it is necessary to model the multiple thermodynamic interactions between the plant components. The modelling system is modular in concept, allowing the configuration of individual plant components to represent any particular power plant design. A library exists of physics based modules which have been extensively validated and which provide representations of a wide range of boiler, turbine and CW system components. Changes to model data and construction are achieved via a user friendly graphical model editing/analysis front-end, with results being presented via the computer screen or hard copy. The paper describes briefly the modelling system but concentrates mainly on the application of the modelling system to assess design re-optimisation, firing with different fuels and the re-powering of an existing plant.
Unperturbed Schelling Segregation in Two or Three Dimensions
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2016-09-01
Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969) are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young in Individual strategy and social structure: an evolutionary theory of institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012); Barmpalias et al. in: 55th annual IEEE symposium on foundations of computer science, Philadelphia, 2014, J Stat Phys 158:806-852, 2015), has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012).
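A toy version of the unperturbed dynamics helps fix ideas. In the Python sketch below, agents of two types fully occupy a 2-D torus; an agent is content when at least a fraction tau of its eight neighbours share its type, and a discontent agent swaps places with a randomly chosen agent. The grid size, threshold, and swap rule are our own simplifications, not the exact model analysed in the papers cited above.

    import numpy as np

    rng = np.random.default_rng(1)
    n, tau, steps = 50, 0.5, 20000
    grid = rng.integers(0, 2, size=(n, n))          # two agent types, full occupancy

    def content(g, i, j):
        nbrs = [g[(i + di) % n, (j + dj) % n]
                for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
        return np.mean([v == g[i, j] for v in nbrs]) >= tau

    for _ in range(steps):
        i, j = rng.integers(0, n, 2)
        if not content(grid, i, j):
            p, q = rng.integers(0, n, 2)
            grid[i, j], grid[p, q] = grid[p, q], grid[i, j]   # swap the two agents

    print("type-0 fraction:", float((grid == 0).mean()))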
ArrayBridge: Interweaving declarative array processing with high-performance computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xing, Haoyuan; Floratos, Sofoklis; Blanas, Spyros
Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats, that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation in NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits statistically indistinguishable performance and I/O scalability to the native SciDB storage engine.
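The kind of in situ, slab-wise array access that ArrayBridge builds on can be illustrated with plain HDF5 calls. The sketch below (using the h5py binding; the file name, dataset name, and sizes are placeholders) writes a chunked array and reads back only a small rectangular slab, which is the access pattern that avoids bulk data loading.

    import numpy as np
    import h5py

    with h5py.File("example.h5", "w") as f:
        f.create_dataset("simulation/temperature",
                         data=np.arange(1_000_000.0).reshape(1000, 1000),
                         chunks=(100, 100))                    # chunked layout helps slab reads

    with h5py.File("example.h5", "r") as f:
        slab = f["simulation/temperature"][200:210, 500:510]   # reads only this slab
        print(slab.shape, float(slab.sum()))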
Strbac, V; Pierce, D M; Vander Sloten, J; Famaey, N
2017-12-01
Finite element (FE) simulations are increasingly valuable in assessing and improving the performance of biomedical devices and procedures. Due to high computational demands such simulations may become difficult or even infeasible, especially when considering nearly incompressible and anisotropic material models prevalent in analyses of soft tissues. Implementations of GPGPU-based explicit FEs predominantly cover isotropic materials, e.g. the neo-Hookean model. To elucidate the computational expense of anisotropic materials, we implement the Gasser-Ogden-Holzapfel dispersed, fiber-reinforced model and compare solution times against the neo-Hookean model. Implementations of GPGPU-based explicit FEs conventionally rely on single-point (under) integration. To elucidate the expense of full and selective-reduced integration (more reliable) we implement both and compare corresponding solution times against those generated using underintegration. To better understand the advancement of hardware, we compare results generated using representative Nvidia GPGPUs from three recent generations: Fermi (C2075), Kepler (K20c), and Maxwell (GTX980). We explore scaling by solving the same boundary value problem (an extension-inflation test on a segment of human aorta) with progressively larger FE meshes. Our results demonstrate substantial improvements in simulation speeds relative to two benchmark FE codes (up to 300× while maintaining accuracy), and thus open many avenues to novel applications in biomechanics and medicine.
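To indicate why the anisotropic model is costlier per integration point than neo-Hookean, the sketch below evaluates a Gasser-Ogden-Holzapfel-type strain energy for a single material point from a deformation gradient and two reference fibre directions. It is our simplified NumPy illustration with illustrative parameter values, not the authors' GPGPU kernel, and it omits the volumetric term.

    import numpy as np

    def goh_energy(F, a4, a6, mu=0.05, k1=1.0, k2=10.0, kappa=0.2):
        J = np.linalg.det(F)
        C_bar = J ** (-2.0 / 3.0) * (F.T @ F)          # isochoric right Cauchy-Green tensor
        I1_bar = np.trace(C_bar)
        psi = 0.5 * mu * (I1_bar - 3.0)                 # neo-Hookean ground matrix
        for a0 in (a4, a6):                             # two dispersed fibre families
            I4_bar = a0 @ C_bar @ a0
            E = kappa * (I1_bar - 3.0) + (1.0 - 3.0 * kappa) * (I4_bar - 1.0)
            E = max(E, 0.0)                             # fibres contribute only in extension
            psi += k1 / (2.0 * k2) * (np.exp(k2 * E * E) - 1.0)
        return psi

    F = np.diag([1.1, 1.0, 1.0 / 1.1])                  # isochoric uniaxial stretch
    print(goh_energy(F, np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])))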
Simulation of rotor blade element turbulence
NASA Technical Reports Server (NTRS)
Mcfarland, R. E.; Duisenberg, Ken
1995-01-01
A piloted, motion-based simulation of Sikorsky's Black Hawk helicopter was used as a platform for the investigation of rotorcraft responses to vertical turbulence. By using an innovative temporal and geometrical distribution algorithm that preserved the statistical characteristics of the turbulence over the rotor disc, stochastic velocity components were applied at each of twenty blade-element stations. This model was implemented on NASA Ames' Vertical Motion Simulator (VMS), and ten test pilots were used to establish that the model created realistic cues. The objectives of this research included the establishment of a simulation-technology basis for future investigation into real-time turbulence modeling. This goal was achieved; our extensive additions to the rotor model added less than a 10 percent computational overhead. Using a VAX 9000 computer the entire simulation required a cycle time of less than 12 msec. Pilot opinion during this simulation was generally quite favorable. For low speed flight the consensus was that SORBET (acronym for title) was better than the conventional body-fixed model, which was used for comparison purposes, and was determined to be too violent (like a washboard). For high speed flight the pilots could not identify differences between these models. These opinions were something of a surprise because only the vertical turbulence component on the rotor system was implemented in SORBET. Because of the finite-element distribution of the inputs, induced outputs were observed in all translational and rotational axes. Extensive post-simulation spectral analyses of the SORBET model suggest that proper rotorcraft turbulence modeling requires that vertical atmospheric disturbances not be superimposed at the vehicle center of gravity but, rather, be input into the rotor system, where the rotor-to-body transfer function severely attenuates high frequency rotorcraft responses.
Robotics-Centered Outreach Activities: An Integrated Approach
ERIC Educational Resources Information Center
Ruiz-del-Solar, Javier
2010-01-01
Nowadays, universities are making extensive efforts to attract prospective students to the fields of electrical, electronic, and computer engineering. Thus, outreach is becoming increasingly important, and activities with schoolchildren are being extensively carried out as part of this effort. In this context, robotics is a very attractive and…
Tools for Creating Mobile Applications for Extension
ERIC Educational Resources Information Center
Drill, Sabrina L.
2012-01-01
Considerations and tools for developing mobile applications for Extension include evaluating the topic, purpose, and audience. Different computing platforms may be used, and apps designed as modified Web pages or implicitly programmed for a particular platform. User privacy is another important consideration, especially for data collection apps.…
ERIC Educational Resources Information Center
Micro-Ideas, Glenview, IL.
Fifty-five papers focusing on the role of computer technology in education at all levels are included in the proceedings of this conference, which was designed to model effective and appropriate uses of the computer as an extension of the teacher-based instructional system. The use of the computer as a tool was emphasized, and the word processor…
2010-03-01
functionality and plausibility distinguishes this research from most research in computational linguistics and computational psycholinguistics. ... Psycholinguistic Theory ... There is extensive psycholinguistic evidence that human language processing is essentially incremental and interactive ... challenges of psycholinguistic research is to explain how humans can process language effortlessly and accurately given the complexity and ambiguity that is
ERIC Educational Resources Information Center
Wildmon, Mark E.; Skinner, Christopher H.; Watson, T. Steuart; Garrett, L. Shan
2004-01-01
Active student responding is often required to remedy computation skill deficits in students with learning disabilities. However, these students may find computation assignments unrewarding and frustrating, and be less likely to choose to engage in assigned computation tasks. In the current study, middle school students with learning disabilities…
Factors Affecting Career Choice: Comparison between Students from Computer and Other Disciplines
ERIC Educational Resources Information Center
Alexander, P. M.; Holmner, M.; Lotriet, H. H.; Matthee, M. C.; Pieterse, H. V.; Naidoo, S.; Twinomurinzi, H.; Jordaan, D.
2011-01-01
The number of student enrolments in computer-related courses remains a serious concern worldwide with far reaching consequences. This paper reports on an extensive survey about career choice and associated motivational factors amongst new students, only some of whom intend to major in computer-related courses, at two South African universities.…
NASA Technical Reports Server (NTRS)
Gerber, C. R.
1972-01-01
The development of uniform computer program standards and conventions for the modular space station is discussed. The accomplishments analyzed are: (1) development of computer program specification hierarchy, (2) definition of computer program development plan, and (3) recommendations for utilization of all operating on-board space station related data processing facilities.
ERIC Educational Resources Information Center
Demir, Seda; Basol, Gülsah
2014-01-01
The aim of the current study is to determine the overall effects of Computer-Assisted Mathematics Education (CAME) on academic achievement. After an extensive review of the literature, studies using Turkish samples and observing the effects of Computer-Assisted Education (CAE) on mathematics achievement were examined. As a result of this…
A toolbox for discrete modelling of cell signalling dynamics.
Paterson, Yasmin Z; Shorthouse, David; Pleijzier, Markus W; Piterman, Nir; Bendtsen, Claus; Hall, Benjamin A; Fisher, Jasmin
2018-06-18
In an age where the volume of data regarding biological systems exceeds our ability to analyse it, many researchers are looking towards systems biology and computational modelling to help unravel the complexities of gene and protein regulatory networks. In particular, the use of discrete modelling allows generation of signalling networks in the absence of full quantitative descriptions of systems, which are necessary for ordinary differential equation (ODE) models. In order to make such techniques more accessible to mainstream researchers, tools such as the BioModelAnalyzer (BMA) have been developed to provide a user-friendly graphical interface for discrete modelling of biological systems. Here we use the BMA to build a library of discrete target functions of known canonical molecular interactions, translated from ordinary differential equations (ODEs). We then show that these BMA target functions can be used to reconstruct complex networks, which can correctly predict many known genetic perturbations. This new library supports the accessibility ethos behind the creation of BMA, providing a toolbox for the construction of complex cell signalling models without the need for extensive experience in computer programming or mathematical modelling, and allows for construction and simulation of complex biological systems with only small amounts of quantitative data.
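The flavour of such discrete target functions can be conveyed with a few lines of code. The sketch below is a toy synchronous Boolean network in Python (the node names and update rules are invented, not taken from the BMA library described above): each node's next value is a simple logical function of the current state.

    def step(state):
        # state maps node name -> 0/1; return the next synchronous state.
        return {
            "ligand":      state["ligand"],                                       # external input
            "receptor":    state["ligand"],                                       # activated by ligand
            "kinase":      int(state["receptor"] and not state["phosphatase"]),
            "phosphatase": state["kinase"],                                       # negative feedback
            "gene":        state["kinase"],
        }

    s = {"ligand": 1, "receptor": 0, "kinase": 0, "phosphatase": 0, "gene": 0}
    for t in range(6):
        print(t, s)
        s = step(s)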
High-throughput bioinformatics with the Cyrille2 pipeline system
Fiers, Mark WEJ; van der Burgt, Ate; Datema, Erwin; de Groot, Joost CW; van Ham, Roeland CHJ
2008-01-01
Background: Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results: We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: 1) a web based, graphical user interface (GUI) that enables a pipeline operator to manage the system; 2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution, and; 3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion: The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high throughput, flexible bioinformatics pipelines. PMID:18269742
Bakas, Spyridon; Akbari, Hamed; Sotiras, Aristeidis; Bilello, Michel; Rozycki, Martin; Kirby, Justin S.; Freymann, John B.; Farahani, Keyvan; Davatzikos, Christos
2017-01-01
Gliomas belong to a group of central nervous system tumors, and consist of various sub-regions. Gold standard labeling of these sub-regions in radiographic imaging is essential for both clinical and computational studies, including radiomic and radiogenomic analyses. Towards this end, we release segmentation labels and radiomic features for all pre-operative multimodal magnetic resonance imaging (MRI) (n=243) of the multi-institutional glioma collections of The Cancer Genome Atlas (TCGA), publicly available in The Cancer Imaging Archive (TCIA). Pre-operative scans were identified in both glioblastoma (TCGA-GBM, n=135) and low-grade-glioma (TCGA-LGG, n=108) collections via radiological assessment. The glioma sub-region labels were produced by an automated state-of-the-art method and manually revised by an expert board-certified neuroradiologist. An extensive panel of radiomic features was extracted based on the manually-revised labels. This set of labels and features should enable i) direct utilization of the TCGA/TCIA glioma collections towards repeatable, reproducible and comparative quantitative studies leading to new predictive, prognostic, and diagnostic assessments, as well as ii) performance evaluation of computer-aided segmentation methods, and comparison to our state-of-the-art method. PMID:28872634
Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures
NASA Technical Reports Server (NTRS)
Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.
1994-01-01
A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.
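The maximum tangential stress criterion mentioned above has a well-known closed form for the local kink angle in terms of the mode I and mode II stress intensity factors. The sketch below evaluates that textbook expression; it is an illustration of the criterion only, not the authors' shell-based crack-growth code.

    import math

    def mts_kink_angle(KI: float, KII: float) -> float:
        # Kink angle (radians) satisfying KI*sin(t) + KII*(3*cos(t) - 1) = 0.
        if abs(KII) < 1e-12 * max(abs(KI), 1.0):
            return 0.0                                  # pure mode I: crack grows straight
        t = (KI - math.sqrt(KI * KI + 8.0 * KII * KII)) / (4.0 * KII)
        return 2.0 * math.atan(t)

    print(math.degrees(mts_kink_angle(1.0, 0.5)))       # kinks away from the shear direction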
Atmospheric energetics as related to cyclogenesis over the eastern United States. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
West, P. W.
1973-01-01
A method is presented to investigate the atmospheric energy budget as related to cyclogenesis. Energy budget equations are developed that are shown to be advantageous because the individual terms represent basic physical processes which produce changes in atmospheric energy, and the equations provide a means to study the interaction of the cyclone with the larger scales of motion. The work presented represents an extension of previous studies because all of the terms of the energy budget equations were evaluated throughout the development period of the cyclone. Computations are carried out over a limited atmospheric volume which encompasses the cyclone, and boundary fluxes of energy that were ignored in most previous studies are evaluated. Two examples of cyclogenesis over the eastern United States were chosen for study. One of the cases (1-4 November, 1966) represented an example of vigorous development, while the development in the other case (5-8 December, 1969) was more modest. Objectively analyzed data were used in the evaluation of the energy budget terms in order to minimize computational errors, and an objective analysis scheme is described that insures that all of the resolution contained in the rawinsonde observations is incorporated in the analyses.
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2018-03-09
Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.
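For readers who have not seen SBML, the sketch below constructs and parses a small hand-written SBML-style document in Python. The namespace URI and element layout reflect our reading of the Level 3 Version 2 core specification and should be treated as illustrative rather than normative; a real workflow would use a dedicated library such as libSBML for validation.

    import xml.etree.ElementTree as ET

    SBML = """<?xml version="1.0" encoding="UTF-8"?>
    <sbml xmlns="http://www.sbml.org/sbml/level3/version2/core" level="3" version="2">
      <model id="toy_decay">
        <listOfCompartments>
          <compartment id="cell" constant="true"/>
        </listOfCompartments>
        <listOfSpecies>
          <species id="X" compartment="cell" initialAmount="100"
                   hasOnlySubstanceUnits="true" boundaryCondition="false" constant="false"/>
        </listOfSpecies>
        <listOfReactions>
          <reaction id="decay" reversible="false">
            <listOfReactants>
              <speciesReference species="X" stoichiometry="1" constant="true"/>
            </listOfReactants>
          </reaction>
        </listOfReactions>
      </model>
    </sbml>"""

    root = ET.fromstring(SBML)
    ns = {"s": "http://www.sbml.org/sbml/level3/version2/core"}
    print([sp.get("id") for sp in root.findall(".//s:species", ns)])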
Isewon, Itunuoluwa; Aromolaran, Olufemi; Oladipupo, Olufunke
2018-01-01
Malaria is an infectious disease that affects close to half a million individuals every year, and Plasmodium falciparum is a major cause of malaria. The treatment of this disease could be done effectively if the essential enzymes of this parasite are specifically targeted. Nevertheless, the parasite's growing resistance to existing drugs now makes discovering new drugs a core responsibility. In this study, a novel computational model that makes the prediction of new and validated antimalarial drug targets cheaper, easier, and faster has been developed. We have identified new essential reactions as potential targets for drugs in the metabolic network of the parasite. Among the top seven (7) predicted essential reactions, four (4) have been previously identified in earlier studies with biological evidence and one (1) has been with computational evidence. The results from our study were compared with an extensive list of seventy-seven (77) essential reactions with biological evidence from a previous study. We present a list of thirty-one (31) potential candidates for drug targets in Plasmodium falciparum which includes twenty-four (24) new potential candidates for drug targets. PMID:29789805
Development of iterative techniques for the solution of unsteady compressible viscous flows
NASA Technical Reports Server (NTRS)
Hixon, Duane; Sankar, L. N.
1993-01-01
During the past two decades, there has been significant progress in the field of numerical simulation of unsteady compressible viscous flows. At present, a variety of solution techniques exist such as the transonic small disturbance analyses (TSD), transonic full potential equation-based methods, unsteady Euler solvers, and unsteady Navier-Stokes solvers. These advances have been made possible by developments in three areas: (1) improved numerical algorithms; (2) automation of body-fitted grid generation schemes; and (3) advanced computer architectures with vector processing and massively parallel processing features. In this work, the GMRES scheme has been considered as a candidate for acceleration of a Newton iteration time marching scheme for unsteady 2-D and 3-D compressible viscous flow calculation; from preliminary calculations, this will provide up to a 65 percent reduction in the computer time requirements over the existing class of explicit and implicit time marching schemes. The proposed method has been tested on structured grids, but is flexible enough for extension to unstructured grids. The described scheme has been tested only on the current generation of vector processor architecture of the Cray Y/MP class, but should be suitable for adaptation to massively parallel machines.
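As an illustration of how a Krylov solver can accelerate the Newton iteration described above, the sketch below implements a Jacobian-free Newton-GMRES loop on a toy two-equation system: GMRES solves each Newton correction using finite-difference Jacobian-vector products. The residual function and tolerances are placeholders, not the authors' compressible-flow solver.

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def residual(u):
        # Toy nonlinear system standing in for a discretised flow residual.
        return np.array([u[0] ** 2 + u[1] - 3.0, u[0] + u[1] ** 2 - 5.0])

    def newton_gmres(u, iters=20, eps=1e-7):
        for _ in range(iters):
            r = residual(u)
            if np.linalg.norm(r) < 1e-12:
                break

            def jv(v):                                   # J(u) @ v by finite differences
                return (residual(u + eps * v) - r) / eps

            J = LinearOperator((u.size, u.size), matvec=jv)
            du, _ = gmres(J, -r)
            u = u + du
        return u

    print(newton_gmres(np.array([1.0, 1.0])))            # converges to (1, 2)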
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph; Kopasakis, George
2016-01-01
An overview of recent applications of the FUN3D CFD code to computational aeroelastic, sonic boom, and aeropropulsoservoelasticity (APSE) analyses of a low-boom supersonic configuration is presented. The overview includes details of the computational models developed, including multiple unstructured CFD grids suitable for aeroelastic and sonic boom analyses. In addition, aeroelastic Reduced-Order Models (ROMs) are generated and used to rapidly compute the aeroelastic response and flutter boundaries at multiple flight conditions.
NASA Astrophysics Data System (ADS)
Fleury, Gérard; Mistrot, Pierre
2006-12-01
While driving off-road vehicles, operators are exposed to whole-body vibration acting in the fore-and-aft direction. Seat manufacturers supply products equipped with fore-and-aft suspension but only a few studies report on their performance. This work proposes a computational approach to design fore-and-aft suspensions for wheel loader seats. Field tests were conducted in a quarry to analyse the nature of vibration to which the driver was exposed. Typical input signals were recorded to be reproduced in the laboratory. Technical specifications are defined for the suspension. In order to evaluate the suspension vibration attenuation performance, a model of a sitting human body was developed and coupled to a seat model. The seat model combines the models of each suspension component. A linear two-degree-of-freedom model is used to describe the dynamic behaviour of the sitting driver. Model parameters are identified by fitting the computed apparent mass frequency response functions to the measured values. Model extensions are proposed to investigate postural effects involving variations in hands and feet positions and interaction of the driver's back with the backrest. Suspension design parameters are firstly optimized by computing the seat/man model response to sinusoidal acceleration. Four criteria including transmissibility, interaction force between the driver's back and the backrest and relative maximal displacement of the suspension are computed. A new suspension design with optimized features is proposed. Its performance is checked from calculations of the response of the seat/man model subjected to acceleration measured on the wheel loader during real work conditions. On the basis of the computed values of the SEAT factors, it is found possible to design a suspension that would increase the attenuation provided by the seat by a factor of two.
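The apparent-mass computation at the heart of the parameter identification step can be sketched compactly. Below, a linear two-degree-of-freedom mass-spring-damper model is driven by harmonic base motion, and the apparent mass is the complex force transmitted at the seat interface divided by the interface acceleration. The parameter values and the exact layout of the two masses are illustrative assumptions, not the identified values from the study.

    import numpy as np

    m1, m2 = 45.0, 30.0            # kg (illustrative)
    k1, k2 = 40e3, 70e3            # N/m
    c1, c2 = 1300.0, 900.0         # N s/m

    def apparent_mass(f_hz):
        w = 2.0 * np.pi * f_hz
        M = np.diag([m1, m2]).astype(complex)
        C = np.array([[c1 + c2, -c2], [-c2, c2]], dtype=complex)
        K = np.array([[k1 + k2, -k2], [-k2, k2]], dtype=complex)
        A = -w ** 2 * M + 1j * w * C + K
        b = np.array([1j * w * c1 + k1, 0.0])         # unit base motion enters at mass 1
        X = np.linalg.solve(A, b)
        force = (1j * w * c1 + k1) * (X[0] - 1.0)      # force transmitted to the seat
        return force / (-w ** 2)                       # divide by base acceleration

    for f in (1.0, 4.0, 8.0):
        print(f, "Hz ->", abs(apparent_mass(f)), "kg")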
Lessons learnt on the analysis of large sequence data in animal genomics.
Biscarini, F; Cozzi, P; Orozco-Ter Wengel, P
2018-04-06
The 'omics revolution has made a large amount of sequence data available to researchers and the industry. This has had a profound impact in the field of bioinformatics, stimulating unprecedented advancements in this discipline. Mostly, this is usually looked at from the perspective of human 'omics, in particular human genomics. Plant and animal genomics, however, have also been deeply influenced by next-generation sequencing technologies, with several genomics applications now popular among researchers and the breeding industry. Genomics tends to generate huge amounts of data, and genomic sequence data account for an increasing proportion of big data in biological sciences, due largely to decreasing sequencing and genotyping costs and to large-scale sequencing and resequencing projects. The analysis of big data poses a challenge to scientists, as data gathering currently takes place at a faster pace than does data processing and analysis, and the associated computational burden is increasingly taxing, making even simple manipulation, visualization and transferring of data a cumbersome operation. The time consumed by the processing and analysing of huge data sets may be at the expense of data quality assessment and critical interpretation. Additionally, when analysing lots of data, something is likely to go awry (the software may crash or stop), and it can be very frustrating to track the error. We herein review the most relevant issues related to tackling these challenges and problems, from the perspective of animal genomics, and provide researchers that lack extensive computing experience with guidelines that will help when processing large genomic data sets. © 2018 Stichting International Foundation for Animal Genetics.
“Guilt by Association” Is the Exception Rather Than the Rule in Gene Networks
Gillis, Jesse; Pavlidis, Paul
2012-01-01
Gene networks are commonly interpreted as encoding functional information in their connections. An extensively validated principle called guilt by association states that genes which are associated or interacting are more likely to share function. Guilt by association provides the central top-down principle for analyzing gene networks in functional terms or assessing their quality in encoding functional information. In this work, we show that functional information within gene networks is typically concentrated in only a very few interactions whose properties cannot be reliably related to the rest of the network. In effect, the apparent encoding of function within networks has been largely driven by outliers whose behaviour cannot even be generalized to individual genes, let alone to the network at large. While experimentalist-driven analysis of interactions may use prior expert knowledge to focus on the small fraction of critically important data, large-scale computational analyses have typically assumed that high-performance cross-validation in a network is due to a generalizable encoding of function. Because we find that gene function is not systemically encoded in networks, but dependent on specific and critical interactions, we conclude it is necessary to focus on the details of how networks encode function and what information computational analyses use to extract functional meaning. We explore a number of consequences of this and find that network structure itself provides clues as to which connections are critical and that systemic properties, such as scale-free-like behaviour, do not map onto the functional connectivity within networks. PMID:22479173
Minimizing Dispersion in FDTD Methods with CFL Limit Extension
NASA Astrophysics Data System (ADS)
Sun, Chen
The CFL extension in FDTD methods is receiving considerable attention in order to reduce the computational effort and save the simulation time. One of the major issues in the CFL extension methods is the increased dispersion. We formulate a decomposition of FDTD equations to study the behaviour of the dispersion. A compensation scheme to reduce the dispersion in CFL extension is constructed and proposed. We further study the CFL extension in a FDTD subgridding case, where we improve the accuracy by acting only on the FDTD equations of the fine grid. Numerical results confirm the efficiency of the proposed method for minimising dispersion.
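The dispersion being minimised can be quantified directly from the one-dimensional FDTD dispersion relation, sin(omega*dt/2) = S*sin(k*dx/2), where S = c*dt/dx is the Courant number. The sketch below evaluates the resulting numerical phase-velocity error; it illustrates the quantity under discussion, not the compensation scheme proposed in the work.

    import numpy as np

    def phase_velocity_error(points_per_wavelength, S):
        # Relative error of the numerical phase velocity on a 1-D Yee grid.
        k_dx = 2.0 * np.pi / points_per_wavelength
        w_dt = 2.0 * np.arcsin(np.clip(S * np.sin(k_dx / 2.0), -1.0, 1.0))
        return w_dt / (S * k_dx) - 1.0                 # (v_numerical / c) - 1

    for S in (0.5, 0.99, 1.0):
        print(S, phase_velocity_error(10, S))          # error vanishes at the magic step S = 1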
Weightman, Andrew; Preston, Nick; Levesley, Martin; Bhakta, Bipin; Holt, Raymond; Mon-Williams, Mark
2014-05-01
To compare upper limb kinematics of children with spastic cerebral palsy (CP) using a passive rehabilitation joystick with those of adults and able-bodied children, to better understand the design requirements of computer-based rehabilitation devices. A blocked comparative study involving seven children with spastic CP, nine able-bodied adults and nine able-bodied children, using a joystick system to play a computer game whilst the kinematics of their upper limb were recorded. The translational kinematics of the joystick's end point and the participant's shoulder movement (protraction/retraction) and elbow rotational kinematics (flexion/extension) were analysed for each group. Children with spastic CP matched their able-bodied peers in the time taken to complete the computer task, but this was due to a failure to adhere to the task instructions of travelling along a prescribed straight line when moving between targets. The spastic CP group took longer to initiate the first movement, which showed jerkier trajectories and demonstrated qualitatively different movement patterns when using the joystick, with shoulder movements that were significantly of greater magnitude than the able-bodied participants. Children with spastic CP generate large shoulder and hence trunk movements when using a joystick to undertake computer-generated arm exercises. This finding has implications for the development and use of assistive technologies to encourage exercise and the instructions given to users of such systems. A kinematic analysis of upper limb function of children with CP when using joystick devices is presented. Children with CP may use upper body movements to compensate for limitations in voluntary shoulder and elbow movements when undertaking computer games designed to encourage the practice of arm movement. The design of rehabilitative computer exercise systems should consider movement of the torso/shoulder as it may have implications for the quality of therapy in the rehabilitation of the upper limb in children with CP.
Extension, validation and application of the NASCAP code
NASA Technical Reports Server (NTRS)
Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.
1979-01-01
Numerous extensions were made in the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculation were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.
Lightweight computational steering of very large scale molecular dynamics simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beazley, D.M.; Lomdahl, P.S.
1996-09-01
We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-24
... innovation and/or be familiar with the education, training, employment and management of technological... Innovations/Bioengineering and Biomedical Technology; Technology Management/Computing/IT/Manufacturing...] Extension of Period for Nominations to the National Medal of Technology and Innovation Nomination Evaluation...
Teaching XBRL to Graduate Business Students: A Hands-On Approach
ERIC Educational Resources Information Center
Pinsker, Robert
2004-01-01
EXtensible Business Reporting Language (XBRL) is a non-proprietary, computer language that has many uses. Known primarily as the Extensible Markup Language (XML) for business reporting, XBRL allows entities to report their business information (i.e., financial statements, announcements, etc.) on the Internet and communicate with other entities'…
Confined Turbulent Swirling Recirculating Flow Predictions. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Abujelala, M. T.
1984-01-01
Turbulent swirling flow, the STARPIC computer code, turbulence modeling of turbulent flows, the k-ε turbulence model and extensions, turbulence parameters deduction from swirling confined flow measurements, extension of the k-ε model to confined swirling recirculating flows, and general predictions for confined turbulent swirling flow are discussed.
CBES--An Efficient Implementation of the Coursewriter Language.
ERIC Educational Resources Information Center
Franks, Edward W.
An extensive computer based education system (CBES) built around the IBM Coursewriter III program product at Ohio State University is described. In this system, numerous extensions have been added to the Coursewriter III language to provide capabilities needed to implement sophisticated instructional strategies. CBES design goals include lower CPU…
Human Spaceflight Architecture Model (HSFAM) Data Dictionary
NASA Technical Reports Server (NTRS)
Shishko, Robert
2016-01-01
HSFAM is a data model based on the DoDAF 2.02 data model with some purpose-built extensions. These extensions are designed to permit quantitative analyses regarding stakeholder concerns about technical feasibility, configuration and interface issues, and budgetary and/or economic viability.
Multi-Scale Modeling to Improve Single-Molecule, Single-Cell Experiments
NASA Astrophysics Data System (ADS)
Munsky, Brian; Shepherd, Douglas
2014-03-01
Single-cell, single-molecule experiments are producing an unprecedented amount of data to capture the dynamics of biological systems. When integrated with computational models, observations of spatial, temporal and stochastic fluctuations can yield powerful quantitative insight. We concentrate on experiments that localize and count individual molecules of mRNA. These high precision experiments have large imaging and computational processing costs, and we explore how improved computational analyses can dramatically reduce overall data requirements. In particular, we show how analyses of spatial, temporal and stochastic fluctuations can significantly enhance parameter estimation results for small, noisy data sets. We also show how full probability distribution analyses can constrain parameters with far less data than bulk analyses or statistical moment closures. Finally, we discuss how a systematic modeling progression from simple to more complex analyses can reduce total computational costs by orders of magnitude. We illustrate our approach using single-molecule, spatial mRNA measurements of Interleukin 1-alpha mRNA induction in human THP1 cells following stimulation. Our approach could improve the effectiveness of single-molecule gene regulation analyses for many other processes.
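A minimal example of the stochastic models such data are compared against is a two-state (bursting) transcription model simulated with the Gillespie algorithm. The sketch below is our own toy model with invented rate constants, not the analysis pipeline used in the study; it produces mRNA copy-number samples of the kind matched to single-molecule counts.

    import numpy as np

    def gillespie_mrna(t_end, k_on=0.05, k_off=0.2, k_tx=5.0, k_deg=0.5, seed=0):
        rng = np.random.default_rng(seed)
        t, gene_on, mrna = 0.0, 0, 0
        while t < t_end:
            rates = np.array([k_on * (1 - gene_on), k_off * gene_on,
                              k_tx * gene_on, k_deg * mrna])
            total = rates.sum()
            if total == 0.0:
                break
            t += rng.exponential(1.0 / total)
            event = rng.choice(4, p=rates / total)
            if event == 0:
                gene_on = 1            # promoter switches on
            elif event == 1:
                gene_on = 0            # promoter switches off
            elif event == 2:
                mrna += 1              # transcription
            else:
                mrna -= 1              # degradation
        return mrna

    samples = [gillespie_mrna(200.0, seed=s) for s in range(200)]
    print("mean mRNA:", float(np.mean(samples)), " variance:", float(np.var(samples)))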
Large eddy simulations and direct numerical simulations of high speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Givi, P.; Frankel, S. H.; Adumitroaie, V.; Sabini, G.; Madnia, C. K.
1993-01-01
The primary objective of this research is to extend current capabilities of Large Eddy Simulations (LES) and Direct Numerical Simulations (DNS) for the computational analyses of high speed reacting flows. Our efforts in the first two years of this research have been concentrated on a priori investigations of single-point Probability Density Function (PDF) methods for providing subgrid closures in reacting turbulent flows. In the efforts initiated in the third year, our primary focus has been on performing actual LES by means of PDF methods. The approach is based on assumed PDF methods and we have performed extensive analysis of turbulent reacting flows by means of LES. This includes simulations of both three-dimensional (3D) isotropic compressible flows and two-dimensional reacting planar mixing layers. In addition to these LES analyses, some work is in progress to assess the extent of validity of our assumed PDF methods. This assessment is done by making detailed comparisons with recent laboratory data in predicting the rate of reactant conversion in parallel reacting shear flows. This report provides a summary of our achievements for the first six months of the third year of this program.
Coal gasification systems engineering and analysis, volume 2
NASA Technical Reports Server (NTRS)
1980-01-01
The major design related features of each generic plant system were characterized in a catalog. Based on the catalog and requirements data, approximately 17 designs and cost estimates were developed for MBG and alternate products. A series of generic trade studies was conducted to support all of the design studies. A set of cost and programmatic analyses was conducted to supplement the designs. The cost methodology employed for the design and sensitivity studies was documented and implemented in a computer program. Plant design and construction schedules were developed for the K-T, Texaco, and B&W MBG plant designs. A generic work breakdown structure was prepared, based on the K-T design, to coincide with TVA's planned management approach. An extensive set of cost sensitivity analyses was completed for the K-T, Texaco, and B&W designs. Product price competitiveness was evaluated for MBG and the alternate products. A draft management policy and procedures manual was evaluated. A supporting technology development plan was developed to address high technology risk issues. The issues were identified and ranked in terms of importance and tractability, and a plan developed for obtaining data or developing technology required to mitigate the risk.
InSAR Scientific Computing Environment
NASA Technical Reports Server (NTRS)
Rosen, Paul A.; Sacco, Gian Franco; Gurrola, Eric M.; Zebker, Howard A.
2011-01-01
This computing environment is the next generation of geodetic image processing technology for repeat-pass Interferometric Synthetic Aperture Radar (InSAR) sensors, identified by the community as a needed capability to provide flexibility and extensibility in reducing measurements from radar satellites and aircraft to new geophysical products. This software allows users of interferometric radar data the flexibility to process from Level 0 to Level 4 products using a variety of algorithms and for a range of available sensors. There are many radar satellites in orbit today delivering to the science community data of unprecedented quantity and quality, making possible large-scale studies in climate research, natural hazards, and the Earth's ecosystem. The proposed DESDynI mission, now under consideration by NASA for launch later in this decade, would provide time series and multi-image measurements that permit 4D models of Earth surface processes so that, for example, climate-induced changes over time would become apparent and quantifiable. This advanced data processing technology, applied to a global data set such as from the proposed DESDynI mission, enables a new class of analyses at time and spatial scales unavailable using current approaches. This software implements an accurate, extensible, and modular processing system designed to realize the full potential of InSAR data from future missions such as the proposed DESDynI, existing radar satellite data, as well as data from the NASA UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar), and other airborne platforms. The processing approach has been re-thought in order to enable multi-scene analysis by adding new algorithms and data interfaces, to permit user-reconfigurable operation and extensibility, and to capitalize on codes already developed by NASA and the science community. The framework incorporates modern programming methods based on recent research, including object-oriented scripts controlling legacy and new codes, abstraction and generalization of the data model for efficient manipulation of objects among modules, and well-designed module interfaces suitable for command-line execution or GUI programming. The framework is designed to allow user contributions to promote maximum utility and sophistication of the code, creating an open-source community that could extend the framework into the indefinite future.
Screen time viewing behaviors and isometric trunk muscle strength in youth.
Grøntved, Anders; Ried-Larsen, Mathias; Froberg, Karsten; Wedderkopp, Niels; Brage, Søren; Kristensen, Peter Lund; Andersen, Lars Bo; Møller, Niels Christian
2013-10-01
The objective of this study was to examine the association of screen time viewing behavior with isometric trunk muscle strength in youth. A cross-sectional study was carried out including 606 adolescents (14-16 yr old) participating in the Danish European Youth Heart Study, a population-based study with assessments conducted in either 1997/1998 or 2003/2004. Maximal voluntary contractions during isometric back extension and abdominal flexion were determined using a strain gauge dynamometer, and cardiorespiratory fitness (CRF) was obtained using a maximal cycle ergometer test. TV viewing time, computer use, and other lifestyle behaviors were obtained by self-report. Analyses of association of screen use behaviors with isometric trunk muscle strength were carried out using multivariable adjusted linear regression. The mean (SD) isometric strength was 0.87 (0.16) N·kg-1. TV viewing, computer use, and total screen time use were inversely associated with isometric trunk muscle strength in analyses adjusted for lifestyle and sociodemographic factors. After further adjustment for CRF and waist circumference, associations remained significant for computer use and total screen time, but TV viewing was only marginally associated with muscle strength after these additional adjustments (-0.05 SD (95% confidence interval, -0.11 to 0.005) difference in strength per 1 h·d-1 difference in TV viewing time, P = 0.08). Each 1 h·d-1 difference in total screen time use was associated with -0.09 SD (95% confidence interval, -0.14 to -0.04) lower isometric trunk muscle strength in the fully adjusted model (P = 0.001). There were no indications that the association of screen time use with isometric trunk muscle strength was attenuated among highly fit individuals (P = 0.91 for CRF by screen time interaction). Screen time use was inversely associated with isometric trunk muscle strength independent of CRF and other confounding factors.
Transient Three-Dimensional Side Load Analysis of a Film Cooled Nozzle
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Guidos, Mike
2008-01-01
Transient three-dimensional numerical investigations of the side load physics for an engine encompassing a film-cooled nozzle extension and a regeneratively cooled thrust chamber were performed. The objectives of this study are to identify the three-dimensional side load physics and to compute the associated aerodynamic side load using an anchored computational methodology. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation and a transient inlet history based on an engine system simulation. Ultimately, the computational results will be provided to the nozzle designers for estimating the effect of the peak side load on the nozzle structure. Computations simulating engine startup at ambient pressures corresponding to sea level and three high altitudes were performed. In addition, computations for both engine startup and shutdown transients were performed for a stub nozzle operating at sea level. For the engine with the full nozzle extension, the computational results show that, during startup at sea level, the peak side load occurs when the lambda shock steps into the turbine exhaust flow, while the side load caused by the transition from free-shock separation to restricted-shock separation comes second; the side loads decrease rapidly and progressively as the ambient pressure decreases. For the stub nozzle operating at sea level, the computed side loads during both startup and shutdown become very small due to the much reduced flow area.
Identifying novel genes and chemicals related to nasopharyngeal cancer in a heterogeneous network.
Li, Zhandong; An, Lifeng; Li, Hao; Wang, ShaoPeng; Zhou, You; Yuan, Fei; Li, Lin
2016-05-05
Nasopharyngeal cancer or nasopharyngeal carcinoma (NPC) is the most common cancer originating in the nasopharynx. The factors that induce nasopharyngeal cancer are still not clear. Additional information about the chemicals or genes related to nasopharyngeal cancer will promote a better understanding of the pathogenesis of this cancer and the factors that induce it. Thus, a computational method NPC-RGCP was proposed in this study to identify the possible relevant chemicals and genes based on the presently known chemicals and genes related to nasopharyngeal cancer. To extensively utilize the functional associations between proteins and chemicals, a heterogeneous network was constructed based on interactions of proteins and chemicals. The NPC-RGCP included two stages: the searching stage and the screening stage. The former stage is for finding new possible genes and chemicals in the heterogeneous network, while the latter stage is for screening and removing false discoveries and selecting the core genes and chemicals. As a result, five putative genes, CXCR3, IRF1, CDK1, GSTP1, and CDH2, and seven putative chemicals, iron, propionic acid, dimethyl sulfoxide, isopropanol, erythrose 4-phosphate, β-D-Fructose 6-phosphate, and flavin adenine dinucleotide, were identified by NPC-RGCP. Extensive analyses provided confirmation that the putative genes and chemicals have significant associations with nasopharyngeal cancer.
Instant messaging at the hospital: supporting articulation work?
Iversen, Tobias Buschmann; Melby, Line; Toussaint, Pieter
2013-09-01
Clinical work is increasingly fragmented and requires extensive articulation and coordination. Computer systems may support such work. In this study, we investigate how instant messaging functions as a tool for supporting articulation work at the hospital. This paper aims to describe the characteristics of instant messaging communication in terms of number and length of messages, distribution over time, and the number of participants included in conversations. We also aim to determine what kind of articulation work is supported by analysing message content. Analysis of one month's worth of instant messages sent through the perioperative coordination and communication system at a Danish hospital. Instant messaging was found to be used extensively for articulation work, mostly through short, simple conversational exchanges. It is used particularly often for communication concerning the patient, specifically, the coordination and logistics of patient care. Instant messaging is used by all actors involved in the perioperative domain. Articulation work and clinical work are hard to separate in a real clinical setting. Predefined messages and strict workflow design do not suffice when supporting communication in the context of collaborative clinical work. Flexibility is of vital importance, and this needs to be reflected in the design of supportive communication systems. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Assessing natural direct and indirect effects through multiple pathways.
Lange, Theis; Rasmussen, Mette; Thygesen, Lau Caspar
2014-02-15
Within the fields of epidemiology, intervention research, and the social sciences, researchers are often faced with the challenge of decomposing the effect of an exposure into different causal pathways working through defined mediator variables. The goal of such analyses is often to understand the mechanisms of the system or to suggest possible interventions. The case of a single mediator, thus implying only two causal pathways (direct and indirect) from exposure to outcome, has been extensively studied. By using the framework of counterfactual variables, researchers have established theoretical properties and developed powerful tools. However, in practical problems, it is not uncommon to have several distinct causal pathways from exposure to outcome operating through different mediators. In this article, we suggest a widely applicable approach to quantifying and ranking different causal pathways. The approach is an extension of the natural effect models proposed by Lange et al. (Am J Epidemiol. 2012;176(3):190-195). By allowing the analysis of distinct multiple pathways, the suggested approach adds to the capabilities of modern mediation techniques. Furthermore, the approach can be implemented using standard software, and we have included with this article implementation examples using R (R Foundation for Statistical Computing, Vienna, Austria) and Stata software (StataCorp LP, College Station, Texas).
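For readers unfamiliar with mediation decompositions, the sketch below illustrates the simplest single-mediator case that the natural effect models generalize, using the product-of-coefficients decomposition under linear models with no exposure-mediator interaction. It is not the Lange et al. method itself, and the variables and data are simulated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated example: exposure A, mediator M, outcome Y (all hypothetical).
A = rng.binomial(1, 0.5, n)
M = 0.8 * A + rng.normal(size=n)             # mediator model: M = b0 + b1*A + error
Y = 1.2 * A + 0.5 * M + rng.normal(size=n)   # outcome model: Y = t0 + t1*A + t2*M + error

def ols(y, covariates):
    """Ordinary least squares with an intercept, via the normal equations."""
    X = np.column_stack([np.ones(len(y))] + list(covariates))
    return np.linalg.lstsq(X, y, rcond=None)[0]

b = ols(M, [A])      # b[1]: effect of exposure on the mediator
t = ols(Y, [A, M])   # t[1]: direct effect of exposure; t[2]: effect of mediator on outcome

nde = t[1]           # natural direct effect (linear, no-interaction case)
nie = b[1] * t[2]    # natural indirect effect through M
print(f"NDE ~ {nde:.3f}, NIE ~ {nie:.3f}, total ~ {nde + nie:.3f}")
```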
Ulusoy, Nuran
2017-01-01
The aim of this study was to evaluate the effects of two endocrown designs and computer aided design/manufacturing (CAD/CAM) materials on stress distribution and failure probability of restorations applied to a severely damaged endodontically treated maxillary first premolar tooth (MFP). Two types of designs, without and with 3 mm intraradicular extensions, endocrown (E) and modified endocrown (ME), were modeled on a 3D finite element (FE) model of the MFP. Vitablocks Mark II (VMII), Vita Enamic (VE), and Lava Ultimate (LU) CAD/CAM materials were used for each type of design. von Mises and maximum principal stress values were evaluated, and the Weibull function was incorporated with FE analysis to calculate the long-term failure probability. Regarding the stresses that occurred in enamel, for each group of material, the ME restoration design transmitted less stress than the endocrown. During normal occlusal function, the overall failure probability was minimum for ME with VMII. The ME restoration design with VE was the best restorative option for premolar teeth with extensive loss of coronal structure under high occlusal loads. Therefore, the ME design could be a favorable treatment option for MFPs with a missing palatal cusp. Among the CAD/CAM materials tested, VMII and VE were found to be more tooth-friendly than LU. PMID:29119108
Computer Model Helps Communities Gauge Effects of New Industry.
ERIC Educational Resources Information Center
Long, Celeste; And Others
1987-01-01
Describes computer Industrial Impact Model used by Texas Agricultural Extension Service rural planners to assess potential benefits and costs of new firms on community private and public sectors. Presents selected data/results for two communities assessing impact of the same plant. (NEC)
78 FR 57839 - Request for Information on Computer Security Incident Coordination (CSIC)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-20
... Institute of Standards and Technology (NIST), United States Department of Commerce. ACTION: Notice, extension of comment period. SUMMARY: NIST is extending the deadline for submitting comments relating to Computer Security Incident Coordination. NIST experienced technical difficulties with receiving email...
Ecological Footprint Analysis (EFA) for the Chicago Metropolitan Area: Initial Estimation - slides
Because of its computational simplicity, Ecological Footprint Analysis (EFA) has been extensively deployed for assessing the sustainability of various environmental systems. In general, EFA aims at capturing the impacts of human activity on the environment by computing the amount...
One of the strategic objectives of the Computational Toxicology Program is to develop approaches for prioritizing chemicals for subsequent screening and testing. Approaches currently available for this process require extensive resources. Therefore, less costly and time-extensi...
Barriers and Incentives to Computer Usage in Teaching
1988-09-29
classes with one or two computers. Research Methods The two major methods of data-gathering employed in this study were intensive and extensive classroom ... observation and repeated extended interviews with students and teachers. Administrators were also interviewed when appropriate. Classroom observers used
Vital nodes identification in complex networks
NASA Astrophysics Data System (ADS)
Lü, Linyuan; Chen, Duanbing; Ren, Xiao-Long; Zhang, Qian-Ming; Zhang, Yi-Cheng; Zhou, Tao
2016-09-01
Real networks exhibit a heterogeneous nature, with nodes playing far different roles in structure and function. To identify vital nodes is thus very significant, allowing us to control the outbreak of epidemics, to conduct advertisements for e-commercial products, to predict popular scientific publications, and so on. The vital nodes identification attracts increasing attention from both the computer science and physics communities, with algorithms ranging from simply counting the immediate neighbors to complicated machine learning and message passing approaches. In this review, we clarify the concepts and metrics, classify the problems and methods, as well as review the important progress and describe the state of the art. Furthermore, we provide extensive empirical analyses to compare well-known methods on disparate real networks, and highlight the future directions. In spite of the emphasis on physics-rooted approaches, the unification of the language and comparison with cross-domain methods would trigger interdisciplinary solutions in the near future.
RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis.
Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab
2012-01-01
RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or the user can upload sequences of interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, and sequence-specific information such as the total number of nucleotide bases and ATGC base contents along with their respective percentages, as well as a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available at http://www.cemb.edu.pk/sw.html. RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language.
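The abstract states that the tool uses and extends the Nussinov dynamic programming algorithm. As a rough reference, here is a minimal sketch of the classic Nussinov maximum-base-pairing recurrence; the Watson-Crick pairing rule and minimum loop length are illustrative assumptions, not RDNAnalyzer's actual implementation.

```python
# Minimal Nussinov maximum-base-pairing recurrence (illustrative, not RDNAnalyzer's code).
PAIRS = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
MIN_LOOP = 3  # assumed minimum hairpin loop length

def nussinov_pairs(seq):
    n = len(seq)
    dp = [[0] * n for _ in range(n)]   # dp[i][j] = max pairs in seq[i..j]
    for span in range(MIN_LOOP + 1, n):
        for i in range(n - span):
            j = i + span
            best = max(dp[i + 1][j], dp[i][j - 1])      # leave i or j unpaired
            if (seq[i], seq[j]) in PAIRS:
                best = max(best, dp[i + 1][j - 1] + 1)  # pair i with j
            for k in range(i + 1, j):                   # bifurcation into two subproblems
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov_pairs("GCATTAGCATCGGCTAAT"))  # maximum number of base pairs
```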
Receptor-based 3D-QSAR in Drug Design: Methods and Applications in Kinase Studies.
Fang, Cheng; Xiao, Zhiyan
2016-01-01
Receptor-based 3D-QSAR strategy represents a superior integration of structure-based drug design (SBDD) and three-dimensional quantitative structure-activity relationship (3D-QSAR) analysis. It combines the accurate prediction of ligand poses by the SBDD approach with the good predictability and interpretability of statistical models derived from the 3D-QSAR approach. Extensive efforts have been devoted to the development of receptor-based 3D-QSAR methods and two alternative approaches have been exploited. One involves computing the binding interactions between a receptor and a ligand to generate structure-based descriptors for QSAR analyses. The other concerns the application of various docking protocols to generate optimal ligand poses so as to provide reliable molecular alignments for the conventional 3D-QSAR operations. This review highlights new concepts and methodologies recently developed in the field of receptor-based 3D-QSAR, and in particular, covers its application in kinase studies.
Software design for analysis of multichannel intracardial and body surface electrocardiograms.
Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A
2002-11-01
Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.
Gómez-Sánchez, Elena; Soriano, Elena; Marco-Contelles, José
2008-09-05
The synthesis and reactivity of the 7-azabicyclo[2.2.1]hept-2-yl radical has been extensively investigated in inter- and intramolecular reaction processes for the first time. In this work we will present the preparation of the radical and its successful intermolecular reaction with radical acceptors such as tert-butylisocyanide and acrylonitrile. Computational analyses have been carried out to show and explain the mechanisms and stereochemical outcome of these transformations. Overall and from the chemical point of view, a new and convenient synthetic approach has been developed for the synthesis of exo-2-(cyano)alkyl substituted 7-azabicyclo[2.2.1]heptane derivatives, a series of compounds of wide interest for the synthesis of heterocyclic analogues of epibatidine. As a result, we describe here the synthesis of the tetrazoloepibatidines (8 and 15) and the oxadiazoloepibatidine (10).
MetaSort untangles metagenome assembly by reducing microbial community complexity
Ji, Peifeng; Zhang, Yanming; Wang, Jinfeng; Zhao, Fangqing
2017-01-01
Most current approaches to analyse metagenomic data rely on reference genomes. Novel microbial communities extend far beyond the coverage of reference databases and de novo metagenome assembly from complex microbial communities remains a great challenge. Here we present a novel experimental and bioinformatic framework, metaSort, for effective construction of bacterial genomes from metagenomic samples. MetaSort provides a sorted mini-metagenome approach based on flow cytometry and single-cell sequencing methodologies, and employs new computational algorithms to efficiently recover high-quality genomes from the sorted mini-metagenome, complemented by the original metagenome. Through extensive evaluations, we demonstrated that metaSort has an excellent and unbiased performance on genome recovery and assembly. Furthermore, we applied metaSort to an unexplored microflora colonized on the surface of marine kelp and successfully recovered 75 high-quality genomes at one time. This approach will greatly improve access to microbial genomes from complex or novel communities. PMID:28112173
Potential evapotranspiration and the likelihood of future drought
NASA Technical Reports Server (NTRS)
Rind, D.; Hansen, J.; Goldberg, R.; Rosenzweig, C.; Ruedy, R.
1990-01-01
The possibility that the greenhouse warming predicted by the GISS general-circulation model and other GCMs could lead to severe droughts is investigated by means of numerical simulations, with a focus on the role of potential evapotranspiration E(P). The relationships between precipitation (P), E(P), soil moisture, and vegetation changes in GCMs are discussed; the empirically derived Palmer drought-intensity index and a new supply-demand drought index (SDDI) based on changes in P - E(P) are described; and simulation results for the period 1960-2060 are presented in extensive tables, graphs, and computer-generated color maps. Simulations with both drought indices predict increasing drought frequency for the U.S., with effects already apparent in the 1990s and a 50-percent frequency of severe droughts by the 2050s. Analyses of arid periods during the Mesozoic and Cenozoic are shown to support the use of the SDDI in GCM drought prediction.
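The abstract describes the SDDI only as an index based on changes in P - E(P), so the sketch below should be read as one plausible interpretation: standardized monthly anomalies of P - E(P) relative to a reference climatology. All data, the reference period, and the severity threshold are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly precipitation P and potential evapotranspiration Ep (mm), 100 years of data.
years, months = 100, 12
P  = rng.gamma(shape=2.0, scale=30.0, size=(years, months))
Ep = 50.0 + 20.0 * np.sin(np.linspace(0, 2 * np.pi, months)) + rng.normal(0, 5, (years, months))

supply_demand = P - Ep                       # monthly moisture supply minus demand

# Standardize each calendar month against a reference period (here the first 30 years, an assumed choice).
ref = supply_demand[:30]
sddi_like = (supply_demand - ref.mean(axis=0)) / ref.std(axis=0)

# Strongly negative values flag months where demand increasingly exceeds supply (drought-like conditions).
print("fraction of severely dry months (index < -2):", (sddi_like < -2).mean())
```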
Semi-Empirical Prediction of Aircraft Low-Speed Aerodynamic Characteristics
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2015-01-01
This paper lays out a comprehensive methodology for computing a low-speed, high-lift polar, without requiring additional details about the aircraft design beyond what is typically available at the conceptual design stage. Introducing low-order, physics-based aerodynamic analyses allows the methodology to be more applicable to unconventional aircraft concepts than traditional, fully-empirical methods. The methodology uses empirical relationships for flap lift effectiveness, chord extension, drag-coefficient increment and maximum lift coefficient of various types of flap systems as a function of flap deflection, and combines these increments with the characteristics of the unflapped airfoils. Once the aerodynamic characteristics of the flapped sections are known, a vortex-lattice analysis calculates the three-dimensional lift, drag and moment coefficients of the whole aircraft configuration. This paper details the results of two validation cases: a supercritical airfoil model with several types of flaps; and a 12-foot, full-span aircraft model with slats and double-slotted flaps.
The value of item response theory in clinical assessment: a review.
Thomas, Michael L
2011-09-01
Item response theory (IRT) and related latent variable models represent modern psychometric theory, the successor to classical test theory in psychological assessment. Although IRT has become prevalent in the measurement of ability and achievement, its contributions to clinical domains have been less extensive. Applications of IRT to clinical assessment are reviewed to appraise its current and potential value. Benefits of IRT include comprehensive analyses and reduction of measurement error, creation of computer adaptive tests, meaningful scaling of latent variables, objective calibration and equating, evaluation of test and item bias, greater accuracy in the assessment of change due to therapeutic intervention, and evaluation of model and person fit. The theory may soon reinvent the manner in which tests are selected, developed, and scored. Although challenges remain to the widespread implementation of IRT, its application to clinical assessment holds great promise. Recommendations for research, test development, and clinical practice are provided.
Developing advanced X-ray scattering methods combined with crystallography and computation.
Perry, J Jefferson P; Tainer, John A
2013-03-01
The extensive use of small angle X-ray scattering (SAXS) over the last few years is rapidly providing new insights into protein interactions, complex formation and conformational states in solution. This SAXS methodology allows for detailed biophysical quantification of samples of interest. Initial analyses provide a judgment of sample quality, revealing the potential presence of aggregation, the overall extent of folding or disorder, the radius of gyration, maximum particle dimensions and oligomerization state. Structural characterizations include ab initio approaches from SAXS data alone, and when combined with previously determined crystal/NMR structures, atomistic modeling can further enhance structural solutions and assess validity. This combination can provide definitions of architectures, spatial organizations of protein domains within a complex, including those not determined by crystallography or NMR, as well as defining key conformational states of a protein interaction. SAXS is not generally constrained by macromolecule size, and the rapid collection of data in a 96-well plate format provides methods to screen sample conditions. This includes screening for co-factors, substrates, differing protein or nucleotide partners or small molecule inhibitors, to more fully characterize the variations within assembly states and key conformational changes. Such analyses may be useful for screening constructs and conditions to determine those most likely to promote crystal growth of a complex under study. Moreover, these high throughput structural determinations can be leveraged to define how polymorphisms affect assembly formations and activities. This is in addition to potentially providing architectural characterizations of complexes and interactions for systems biology-based research, and distinctions in assemblies and interactions in comparative genomics. Thus, SAXS combined with crystallography/NMR and computation provides a unique set of tools that should be considered as being part of one's repertoire of biophysical analyses, when conducting characterizations of protein and other macromolecular interactions. Copyright © 2013 Elsevier Inc. All rights reserved.
ResidPlots-2: Computer Software for IRT Graphical Residual Analyses
ERIC Educational Resources Information Center
Liang, Tie; Han, Kyung T.; Hambleton, Ronald K.
2009-01-01
This article discusses the ResidPlots-2, a computer software that provides a powerful tool for IRT graphical residual analyses. ResidPlots-2 consists of two components: a component for computing residual statistics and another component for communicating with users and for plotting the residual graphs. The features of the ResidPlots-2 software are…
Pearson, Adam M; Spratt, Kevin F; Genuario, James; McGough, William; Kosman, Katherine; Lurie, Jon; Sengupta, Dilip K
2011-04-01
Comparison of intra- and interobserver reliability of digitized manual and computer-assisted intervertebral motion measurements and classification of "instability." To determine if computer-assisted measurement of lumbar intervertebral motion on flexion-extension radiographs improves reliability compared with digitized manual measurements. Many studies have questioned the reliability of manual intervertebral measurements, although few have compared the reliability of computer-assisted and manual measurements on lumbar flexion-extension radiographs. Intervertebral rotation, anterior-posterior (AP) translation, and change in anterior and posterior disc height were measured with a digitized manual technique by three physicians and by three other observers using computer-assisted quantitative motion analysis (QMA) software. Each observer measured 30 sets of digital flexion-extension radiographs (L1-S1) twice. Shrout-Fleiss intraclass correlation coefficients for intra- and interobserver reliabilities were computed. The stability of each level was also classified (instability defined as >4 mm AP translation or 10° rotation), and the intra- and interobserver reliabilities of the two methods were compared using adjusted percent agreement (APA). Intraobserver reliability intraclass correlation coefficients were substantially higher for the QMA technique than the digitized manual technique across all measurements: rotation 0.997 versus 0.870, AP translation 0.959 versus 0.557, change in anterior disc height 0.962 versus 0.770, and change in posterior disc height 0.951 versus 0.283. The same pattern was observed for interobserver reliability (rotation 0.962 vs. 0.693, AP translation 0.862 vs. 0.151, change in anterior disc height 0.862 vs. 0.373, and change in posterior disc height 0.730 vs. 0.300). The QMA technique was also more reliable for the classification of "instability." Intraobserver APAs ranged from 87% to 97% for QMA versus 60% to 73% for digitized manual measurements, while interobserver APAs ranged from 91% to 96% for QMA versus 57% to 63% for digitized manual measurements. The use of QMA software substantially improved the reliability of lumbar intervertebral measurements and the classification of instability based on flexion-extension radiographs.
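As a small illustration of the classification rule quoted above (instability defined as more than 4 mm AP translation or 10 degrees of rotation), the sketch below labels hypothetical levels for two observers and computes raw percent agreement; it does not reproduce the Shrout-Fleiss ICC or the adjusted-percent-agreement statistic used in the study.

```python
def unstable(ap_translation_mm, rotation_deg):
    """Instability rule from the abstract: >4 mm AP translation or >10 degrees rotation."""
    return ap_translation_mm > 4.0 or rotation_deg > 10.0

# Hypothetical measurements (translation mm, rotation deg) for five lumbar levels from two observers.
obs1 = [(2.1, 6.0), (4.8, 9.5), (1.0, 3.2), (5.5, 12.0), (3.9, 10.5)]
obs2 = [(2.4, 5.1), (3.9, 9.0), (1.2, 2.8), (5.2, 11.4), (4.3, 9.8)]

labels1 = [unstable(t, r) for t, r in obs1]
labels2 = [unstable(t, r) for t, r in obs2]

# Raw percent agreement between the two observers' instability labels.
agreement = sum(a == b for a, b in zip(labels1, labels2)) / len(labels1)
print(f"observer labels: {labels1} vs {labels2}; percent agreement = {agreement:.0%}")
```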
Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria
ERIC Educational Resources Information Center
Ofemile, Abdulmalik Yusuf
2015-01-01
This paper reports part of a study that hoped to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT and by extension cloud computing has positive impacts on daily life and this informed the Nigerian government's policy to…
REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang
2013-04-30
Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can 'mix and match' mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
Robust efficient video fingerprinting
NASA Astrophysics Data System (ADS)
Puri, Manika; Lubin, Jeffrey
2009-02-01
We have developed a video fingerprinting system with robustness and efficiency as the primary and secondary design criteria. In extensive testing, the system has shown robustness to cropping, letter-boxing, sub-titling, blur, drastic compression, frame rate changes, size changes and color changes, as well as to the geometric distortions often associated with camcorder capture in cinema settings. Efficiency is afforded by a novel two-stage detection process in which a fast matching process first computes a number of likely candidates, which are then passed to a second slower process that computes the overall best match with minimal false alarm probability. One key component of the algorithm is a maximally stable volume computation - a three-dimensional generalization of maximally stable extremal regions - that provides a content-centric coordinate system for subsequent hash function computation, independent of any affine transformation or extensive cropping. Other key features include an efficient bin-based polling strategy for initial candidate selection, and a final SIFT feature-based computation for final verification. We describe the algorithm and its performance, and then discuss additional modifications that can provide further improvement to efficiency and accuracy.
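The two-stage detection process is described only at a high level; the toy sketch below conveys the general pattern of a fast bin-based polling stage followed by a slower verification stage, with made-up hash sets and a Jaccard score standing in for the maximally stable volume and SIFT-based components of the actual system.

```python
from collections import Counter
import numpy as np

rng = np.random.default_rng(2)

# Toy "fingerprints": each reference video is a set of integer hash codes (assumed, not the paper's hashes).
references = {f"video_{i}": set(rng.integers(0, 1000, size=200).tolist()) for i in range(50)}

def stage1_candidates(query_hashes, refs, top_k=5):
    """Fast polling stage: count hash collisions per reference and keep the top candidates."""
    votes = Counter({name: len(query_hashes & codes) for name, codes in refs.items()})
    return [name for name, _ in votes.most_common(top_k)]

def stage2_verify(query_hashes, refs, candidates):
    """Slower verification stage: exact Jaccard similarity computed only on the shortlist."""
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    return max(candidates, key=lambda name: jaccard(query_hashes, refs[name]))

# Query = a distorted copy of video_7 (drop some hashes, add spurious ones).
query = set(list(references["video_7"])[:150]) | set(rng.integers(0, 1000, size=30).tolist())
shortlist = stage1_candidates(query, references)
print("best match:", stage2_verify(query, references, shortlist))
```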
Shaded-Color Picture Generation of Computer-Defined Arbitrary Shapes
NASA Technical Reports Server (NTRS)
Cozzolongo, J. V.; Hermstad, D. L.; Mccoy, D. S.; Clark, J.
1986-01-01
SHADE computer program generates realistic color-shaded pictures from computer-defined arbitrary shapes. Objects defined for computer representation displayed as smooth, color-shaded surfaces, including varying degrees of transparency. Results also used for presentation of computational results. By performing color mapping, SHADE colors model surface to display analysis results such as pressures, stresses, and temperatures. NASA has used SHADE extensively in design and analysis of high-performance aircraft. Industry should find applications for SHADE in computer-aided design and computer-aided manufacturing. SHADE written in VAX FORTRAN and MACRO Assembler for either interactive or batch execution.
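As a minimal, hypothetical illustration of the color-mapping step (turning a scalar such as pressure or temperature into a surface color), the following sketch linearly interpolates between blue and red; it is not SHADE's algorithm, which is implemented in VAX FORTRAN and MACRO Assembler.

```python
def scalar_to_rgb(value, vmin, vmax):
    """Map a scalar (e.g., pressure or temperature) to an RGB triple on a blue-to-red ramp."""
    t = 0.0 if vmax == vmin else (value - vmin) / (vmax - vmin)
    t = min(max(t, 0.0), 1.0)                        # clamp out-of-range values
    return (int(255 * t), 0, int(255 * (1.0 - t)))   # red for high values, blue for low

# Color a few nodal temperatures between 250 K and 400 K (illustrative values).
for temp in (250.0, 300.0, 375.0, 400.0):
    print(temp, scalar_to_rgb(temp, 250.0, 400.0))
```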
A computer graphics program for general finite element analyses
NASA Technical Reports Server (NTRS)
Thornton, E. A.; Sawyer, L. M.
1978-01-01
Documentation for a computer graphics program for displays from general finite element analyses is presented. A general description of display options and detailed user instructions are given. Several plots made in structural, thermal and fluid finite element analyses are included to illustrate program options. Sample data files are given to illustrate use of the program.
Transportation elements assessment : Town of Milton, September 15, 2009.
DOT National Transportation Integrated Search
2009-09-15
During the summer of 2009, the Delaware T2 Center collected extensive data and completed analyses related to transportation infrastructure in the Town of Milton, Delaware. This report presents those data, the analyses, and resulting recommendations.
Transportation elements assessment : Town of Milton, November 2, 2009.
DOT National Transportation Integrated Search
2010-11-02
During the summer of 2009, the Delaware T2 Center collected extensive data and completed analyses related to transportation infrastructure in the Town of Milton, Delaware. This report presents those data, the analyses, and resulting recommendations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Natzkov, S.; Nikolov, M.
1995-12-01
The design life of the main power equipment (boilers and turbines) is about 10^5 working hours. Possibilities for life extension exist after normatively regulated control tests. The paper presents the diagnostics and methodology for remaining-life assessment of boiler and turbine elements, using up-to-date computer programs, destructive and nondestructive testing of the metal of key elements of unit equipment, and metal creep and low-cycle fatigue calculations. Data on the most common types of damage and some technical solutions for element life extension are also presented.
DOT National Transportation Integrated Search
1978-09-01
This report documents comparisons between extensive rail freight service measurements (previously presented in Volume II) and simulations of the same operations using a sophisticated train performance calculator computer program. The comparisons cove...
Assessment of (Computer-Supported) Collaborative Learning
ERIC Educational Resources Information Center
Strijbos, J. -W.
2011-01-01
Within the (Computer-Supported) Collaborative Learning (CS)CL research community, there has been an extensive dialogue on theories and perspectives on learning from collaboration, approaches to scaffold (script) the collaborative process, and most recently research methodology. In contrast, the issue of assessment of collaborative learning has…
48 CFR Appendix A to Chapter 2 - Armed Services Board of Contract Appeals
Code of Federal Regulations, 2012 CFR
2012-10-01
... EXTENSIONS Rule 33 Time, Computation and Extensions EX PARTE COMMUNICATIONS Rule 34 Ex Parte Communications..., taking into account such factors as the size and complexity of the claim, the contractor may file a... exhibits, post-hearing briefs, and documents which the Board has specifically designated to be made a part...
48 CFR Appendix A to Chapter 2 - Armed Services Board of Contract Appeals
Code of Federal Regulations, 2011 CFR
2011-10-01
... EXTENSIONS Rule 33 Time, Computation and Extensions EX PARTE COMMUNICATIONS Rule 34 Ex Parte Communications..., taking into account such factors as the size and complexity of the claim, the contractor may file a... exhibits, post-hearing briefs, and documents which the Board has specifically designated to be made a part...
48 CFR Appendix A to Chapter 2 - Armed Services Board of Contract Appeals
Code of Federal Regulations, 2013 CFR
2013-10-01
... EXTENSIONS Rule 33 Time, Computation and Extensions EX PARTE COMMUNICATIONS Rule 34 Ex Parte Communications..., taking into account such factors as the size and complexity of the claim, the contractor may file a... exhibits, post-hearing briefs, and documents which the Board has specifically designated to be made a part...
Extending the Binomial Checkpointing Technique for Resilience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walther, Andrea; Narayanan, Sri Hari Krishna
In terms of computing time, adjoint methods offer a very attractive alternative to compute gradient information, required, e.g., for optimization purposes. However, together with this very favorable temporal complexity result comes a memory requirement that is in essence proportional to the operation count of the underlying function, e.g., if algorithmic differentiation is used to provide the adjoints. For this reason, checkpointing approaches in many variants have become popular. This paper analyzes an extension of the so-called binomial approach to cover also possible failures of the computing systems. Such a measure of precaution is of special interest for massive parallel simulations and adjoint calculations where the mean time between failure of the large scale computing system is smaller than the time needed to complete the calculation of the adjoint information. We describe the extensions of standard checkpointing approaches required for such resilience, provide a corresponding implementation and discuss numerical results.
NASA Astrophysics Data System (ADS)
Fukushima, Toshio
2012-04-01
By extending the exponent of floating point numbers with an additional integer as the power index of a large radix, we compute fully normalized associated Legendre functions (ALF) by recursion without underflow problems. The new method enables us to evaluate ALFs of extremely high degree, such as 2^32 = 4,294,967,296, which corresponds to around 1 cm resolution on the Earth's surface. By limiting the application of exponent extension to a few working variables in the recursion, choosing a suitable large power of 2 as the radix, and embedding the contents of the basic arithmetic procedure of floating point numbers with the exponent extension directly in the program computing the recurrence formulas, we achieve the evaluation of ALFs in the double-precision environment at the cost of around 10% increase in computational time per single ALF. This formulation realizes meaningful execution of the spherical harmonic synthesis and/or analysis of arbitrary degree and order.
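A minimal sketch of the exponent-extension idea: carry each value as a double significand plus an integer power of a large radix, and renormalize whenever the significand drifts toward underflow or overflow. The radix 2^960 and the thresholds below are assumptions in the spirit of the paper; the real method embeds this arithmetic directly in the ALF recursion rather than in a generic wrapper like this one.

```python
# Extended floating point value: value = x * BIG**i, with BIG a large power of two.
BIG = 2.0 ** 960            # assumed radix; must stay well inside the double range
BIG_INV = 2.0 ** -960
THRESH = 2.0 ** 480         # renormalization threshold (illustrative choice)

def normalize(x, i):
    """Keep the significand x within [1/THRESH, THRESH) by shifting powers of BIG into i."""
    if x == 0.0:
        return 0.0, 0
    while abs(x) >= THRESH:
        x *= BIG_INV
        i += 1
    while abs(x) < 1.0 / THRESH:
        x *= BIG
        i -= 1
    return x, i

def xmul(a, b):
    """Multiply two extended values (xa, ia) and (xb, ib)."""
    (xa, ia), (xb, ib) = a, b
    return normalize(xa * xb, ia + ib)

def to_float(a):
    x, i = a
    return x * (BIG ** i)   # underflows to 0.0 for very small values; hence the separate integer exponent

# A product of many tiny factors that would underflow an ordinary double (1e-20**50 = 1e-1000):
acc = (1.0, 0)
for _ in range(50):
    acc = xmul(acc, (1e-20, 0))
print("extended representation:", acc)        # finite significand plus an integer exponent
print("collapsed to double:", to_float(acc))  # underflows to 0.0
```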
NASA Technical Reports Server (NTRS)
Jones, A. L.
1972-01-01
Requirements and concepts and the tradeoff analysis leading to the preferred concept are presented. Integrated analyses are given for subsystems and thermal control. Specific tradeoffs and analyses are also given for water management, atmosphere control, energy storage, radiators, navigation, control moment gyros, and system maintenance. The analyses of manipulator concepts and requirements, and supplemental analyses of information management issues are summarized. Subsystem reliability analyses include a detailed discussion of the critical failure analysis.
Asundi, Krishna; Odell, Dan; Luce, Adam; Dennerlein, Jack T
2012-03-01
This study evaluated the use of simple inclines as a portable peripheral for improving head and neck postures during notebook computer use on tables in portable environments such as hotel rooms, cafés, and airport lounges. A 3D motion analysis system measured head, neck and right upper extremity postures of 15 participants as they completed a 10 min computer task in six different configurations, all on a fixed height desk: no-incline, 12° incline, 25° incline, no-incline with external mouse, 25° incline with an external mouse, and a commercially available riser with external mouse and keyboard. After completion of the task, subjects rated the configuration for comfort and ease of use and indicated perceived discomfort in several body segments. Compared to the no-incline configuration, use of the 12° incline reduced forward head tilt and neck flexion while increasing wrist extension. The 25° incline further reduced head tilt and neck flexion while further increasing wrist extension. The 25° incline received the lowest comfort and ease of use ratings and the highest perceived discomfort score. For portable, temporary computing environments where internal input devices are used, users may find improved head and neck postures with acceptable wrist extension postures with the utilization of a 12° incline. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Tomo3D 2.0--exploitation of advanced vector extensions (AVX) for 3D reconstruction.
Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus
2015-02-01
Tomo3D is a program for fast tomographic reconstruction on multicore computers. Its high speed stems from code optimization, vectorization with Streaming SIMD Extensions (SSE), multithreading and optimization of disk access. Recently, Advanced Vector eXtensions (AVX) have been introduced in the x86 processor architecture. Compared to SSE, AVX double the number of simultaneous operations, thus pointing to a potential twofold gain in speed. However, in practice, achieving this potential is extremely difficult. Here, we provide a technical description and an assessment of the optimizations included in Tomo3D to take advantage of AVX instructions. Tomo3D 2.0 allows huge reconstructions to be calculated in standard computers in a matter of minutes. Thus, it will be a valuable tool for electron tomography studies with increasing resolution needs. Copyright © 2014 Elsevier Inc. All rights reserved.
Adaptive DIT-Based Fringe Tracking and Prediction at IOTA
NASA Technical Reports Server (NTRS)
Wilson, Edward; Pedretti, Ettore; Bregman, Jesse; Mah, Robert W.; Traub, Wesley A.
2004-01-01
An automatic fringe tracking system has been developed and implemented at the Infrared Optical Telescope Array (IOTA). In testing during May 2002, the system successfully minimized the optical path differences (OPDs) for all three baselines at IOTA. Based on sliding window discrete Fourier transform (DFT) calculations that were optimized for computational efficiency and robustness to atmospheric disturbances, the algorithm has also been tested extensively on off-line data. Implemented in ANSI C on the 266 MHz PowerPC processor running the VxWorks real-time operating system, the algorithm runs in approximately 2.0 milliseconds per scan (including all three interferograms), using the science camera and piezo scanners to measure and correct the OPDs. Preliminary analysis on an extension of this algorithm indicates a potential for predictive tracking, although at present, real-time implementation of this extension would require significantly more computational capacity.
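A toy illustration of the sliding-window DFT idea, not the IOTA code: a white-light fringe packet is synthesized at a known optical path difference, and the packet centre is located by tracking the DFT power at the fringe frequency as a short window slides along the scan. All scan parameters are invented.

```python
import numpy as np

# Synthetic interferogram scan: a fringe packet centred at a known OPD (all values invented).
n = 1024
x = np.linspace(-50.0, 50.0, n)        # scan position in microns
true_opd = 12.0                        # microns
wavelength = 1.65                      # microns (roughly H band)
envelope = np.exp(-((x - true_opd) / 6.0) ** 2)
scan = envelope * np.cos(2 * np.pi * (x - true_opd) / wavelength)
scan += 0.05 * np.random.default_rng(3).normal(size=n)

# Sliding-window DFT power at the fringe spatial frequency.
win = 64
freq_bin = np.exp(-2j * np.pi * (x[:win] - x[0]) / wavelength)
power = np.array([abs(np.dot(scan[i:i + win], freq_bin)) for i in range(n - win)])

# The window with maximum fringe power marks the packet centre, i.e. the OPD estimate.
est_opd = x[np.argmax(power) + win // 2]
print(f"true OPD = {true_opd:.2f} um, estimated OPD = {est_opd:.2f} um")
```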
Rosenberg, David M; Horn, Charles C
2016-08-01
Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software-an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
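The workflow described (spike detection and spike train analysis) is the kind of analysis typically placed in a Jupyter notebook; the snippet below is a generic threshold-crossing spike detector with a firing-rate summary on synthetic data. It is not the authors' code, and the signal parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic extracellular trace: noise plus a few injected spike-like deflections (purely illustrative).
fs = 20_000                              # sampling rate, Hz
trace = rng.normal(0.0, 1.0, fs * 2)     # 2 s of unit-variance noise
spike_starts = rng.choice(len(trace) - 40, size=30, replace=False)
for t in spike_starts:
    trace[t:t + 20] -= np.hanning(20) * 8.0   # negative-going spike waveforms

# Threshold-crossing detection with a robust noise estimate and a refractory period.
threshold = -4.0 * np.median(np.abs(trace)) / 0.6745
crossings = np.flatnonzero((trace[1:] < threshold) & (trace[:-1] >= threshold)) + 1
refractory = int(0.001 * fs)
detected, last = [], -refractory
for idx in crossings:
    if idx - last >= refractory:
        detected.append(idx)
        last = idx

rate = len(detected) / (len(trace) / fs)
isis = np.diff(detected) / fs                 # inter-spike intervals in seconds
print(f"detected {len(detected)} spikes, mean rate {rate:.1f} Hz, median ISI {np.median(isis) * 1000:.1f} ms")
```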
Wedge sampling for computing clustering coefficients and triangle counts on large graphs
Seshadhri, C.; Pinar, Ali; Kolda, Tamara G.
2014-05-08
Graphs are used to model interactions in a variety of contexts, and there is a growing need to quickly assess the structure of such graphs. Some of the most useful graph metrics are based on triangles, such as those measuring social cohesion. Despite the importance of these triadic measures, algorithms to compute them can be extremely expensive. We discuss the method of wedge sampling. This versatile technique allows for the fast and accurate approximation of various types of clustering coefficients and triangle counts. Furthermore, these techniques are extensible to counting directed triangles in digraphs. Our methods come with provable and practical time-approximation tradeoffs for all computations. We provide extensive results that show our methods are orders of magnitude faster than the state of the art, while providing nearly the accuracy of full enumeration.
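The wedge-sampling estimator itself is easy to state: sample wedges (paths of length two) uniformly and count how many are closed into triangles. The sketch below estimates the global clustering coefficient this way on a small random graph; it follows the general idea of the paper but is not the authors' implementation.

```python
import random
from collections import defaultdict

random.seed(5)

# Small Erdos-Renyi graph as a stand-in for a large real network.
n, p = 300, 0.05
adj = defaultdict(set)
for u in range(n):
    for v in range(u + 1, n):
        if random.random() < p:
            adj[u].add(v)
            adj[v].add(u)

# Number of wedges centred at each node: d * (d - 1) / 2.
wedge_counts = {u: len(nb) * (len(nb) - 1) // 2 for u, nb in adj.items()}
total_wedges = sum(wedge_counts.values())
nodes, weights = zip(*wedge_counts.items())

def sample_wedge():
    """Pick a wedge uniformly: choose a centre proportional to its wedge count, then two distinct neighbours."""
    centre = random.choices(nodes, weights=weights)[0]
    a, b = random.sample(sorted(adj[centre]), 2)
    return a, b

# Global clustering coefficient = fraction of sampled wedges that are closed (part of a triangle).
samples, closed = 20_000, 0
for _ in range(samples):
    a, b = sample_wedge()
    if b in adj[a]:
        closed += 1
print(f"estimated clustering coefficient: {closed / samples:.4f} (from {total_wedges} wedges)")
```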
Extense historical droughts in Spain derived from documentary sources
NASA Astrophysics Data System (ADS)
Dominguez-Castro, F.; García-Herrera, R.; Barriendos, M.
2009-09-01
Documentary records, especially those from rogation ceremonies, have been extensively used to build proxy series of droughts and floods in Spain. Most of the work done previously has focused on the abstraction of the documents and the building of the individual series, but less attention has been paid to the joint analysis of this type of record. This is problematic because, due to the diversity of Spanish climates, the climatological meaning of the rogation ceremonies changes depending on the considered region. This paper aims to analyse the spatial extent of drought events from the rogation records of Barcelona, Bilbao, Gerona, Murcia, Seville, Tarragona, Toledo, Tortosa and Zamora, which cover the 16th to 19th centuries. The representativeness of each of them is analysed taking into account the local climate and the series variability. Then the spatial scale of the recorded droughts is examined at seasonal scale. The results show high multidecadal variability, with the driest periods at national scale recorded during the 1680s, 1730s and 1780s. Finally, the dry years of 1680, 1683 and 1817 are analysed in detail.
InSAR Scientific Computing Environment
NASA Astrophysics Data System (ADS)
Gurrola, E. M.; Rosen, P. A.; Sacco, G.; Zebker, H. A.; Simons, M.; Sandwell, D. T.
2010-12-01
The InSAR Scientific Computing Environment (ISCE) is a software development effort in its second year within the NASA Advanced Information Systems and Technology program. The ISCE will provide a new computing environment for geodetic image processing for InSAR sensors that will enable scientists to reduce measurements directly from radar satellites and aircraft to new geophysical products without first requiring them to develop detailed expertise in radar processing methods. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. The NRC Decadal Survey-recommended DESDynI mission will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment is planned to become a key element in processing DESDynI data into higher level data products, and it is expected to enable a new class of analyses that take greater advantage of the long time and large spatial scales of these new data than current approaches allow. At the core of ISCE is both legacy processing software from the JPL/Caltech ROI_PAC repeat-pass interferometry package as well as a new InSAR processing package containing more efficient and more accurate processing algorithms being developed at Stanford for this project that is based on experience gained in developing processors for missions such as SRTM and UAVSAR. Around the core InSAR processing programs we are building object-oriented wrappers to enable their incorporation into a more modern, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models, and a robust, intuitive user interface with graduated exposure to the levels of sophistication, allowing novices to apply it readily for common tasks and experienced users to mine data with great facility and flexibility. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. In this paper we briefly describe both the legacy and the new core processing algorithms and their integration into the new computing environment. We describe the ISCE component and application architecture and the features that permit the desired flexibility, extensibility and ease-of-use. We summarize the state of progress of the environment and the plans for completion of the environment and for its future introduction into the radar processing community.
The Next Frontier in Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarrao, John
2016-11-16
Exascale computing refers to computing systems capable of at least one exaflop, or a billion billion (10^18) calculations per second. That is 50 times faster than the most powerful supercomputers being used today and represents a thousand-fold increase over the first petascale computer that came into operation in 2008. How we use these large-scale simulation resources is the key to solving some of today's most pressing problems, including clean energy production, nuclear reactor lifetime extension and nuclear stockpile aging.
Transportation elements assessment : Town of Milton, Delaware, September 29, 2009.
DOT National Transportation Integrated Search
2009-09-29
During the summer of 2009, the Delaware T2 Center collected extensive data and completed analyses related to transportation infrastructure in the Town of Milton, Delaware. This report presents those data, the analyses, and resulting recommendat...
A glacier runoff extension to the Precipitation Runoff Modeling System
Van Beusekom, Ashley E.; Viger, Roland
2016-01-01
A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while maintaining model usability. PRMSglacier is validated on two basins in Alaska, the Wolverine and Gulkana Glacier basins, which have been studied since 1966 and have a substantial amount of data with which to test model performance over a long period covering a wide range of climatic and hydrologic conditions. When error in field measurements is considered, the Nash-Sutcliffe efficiencies of streamflow are 0.87 and 0.86, the absolute bias fractions of the winter mass balance simulations are 0.10 and 0.08, and the absolute bias fractions of the summer mass balances are 0.01 and 0.03, all computed over 42 years for the Wolverine and Gulkana Glacier basins, respectively. Without taking into account measurement error, the values are still within the range achieved by more computationally expensive codes tested over shorter time periods.
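For reference, the skill metrics reported above are straightforward to compute; a minimal sketch with hypothetical series (not the PRMSglacier data):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect match, 0 means the
    simulation is no better than the mean of the observations."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def abs_bias_fraction(obs, sim):
    """|mean(sim) - mean(obs)| / |mean(obs)|, one common bias measure."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return abs(sim.mean() - obs.mean()) / abs(obs.mean())

# hypothetical daily streamflow series standing in for real data
rng = np.random.default_rng(0)
q_obs = rng.gamma(2.0, 3.0, size=365)
q_sim = q_obs + rng.normal(0.0, 1.0, size=365)
print(nash_sutcliffe(q_obs, q_sim), abs_bias_fraction(q_obs, q_sim))
```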
An Investigation of High-Order Shock-Capturing Methods for Computational Aeroacoustics
NASA Technical Reports Server (NTRS)
Casper, Jay; Baysal, Oktay
1997-01-01
Topics covered include: Low-dispersion scheme for nonlinear acoustic waves in nonuniform flow; Computation of acoustic scattering by a low-dispersion scheme; Algorithmic extension of low-dispersion scheme and modeling effects for acoustic wave simulation; The accuracy of shock capturing in two spatial dimensions; Using high-order methods on lower-order geometries; and Computational considerations for the simulation of discontinuous flows.
ERIC Educational Resources Information Center
Lavender, Julie
2013-01-01
Military health care facilities make extensive use of computer-based training (CBT) for both clinical and non-clinical staff. Despite evidence identifying various factors that may impact CBT, it remains unclear which factors specifically influence employee participation in computer-based training. The purpose of this mixed method case…
PETSc Users Manual Revision 3.7
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balay, Satish; Abhyankar, S.; Adams, M.
This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication.
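The manual documents the C API; as a hedged illustration of PETSc's building-block style, here is a minimal linear solve using petsc4py, the Python bindings (petsc4py is a real package, but this sketch and its defaults are illustrative, not taken from the manual):

```python
from petsc4py import PETSc

# assemble a tridiagonal (1-D Poisson) system as a sparse AIJ matrix
n = 100
A = PETSc.Mat().createAIJ([n, n])
A.setUp()
for i in range(n):
    A[i, i] = 2.0
    if i > 0:
        A[i, i - 1] = -1.0
    if i < n - 1:
        A[i, i + 1] = -1.0
A.assemble()

b = A.createVecRight()   # right-hand side
b.set(1.0)
x = A.createVecLeft()    # solution vector

# Krylov solver; options (e.g. -ksp_type cg) can be set on the command line
ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()
ksp.solve(b, x)
```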
PETSc Users Manual Revision 3.8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balay, S.; Abhyankar, S.; Adams, M.
This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication.
MIADS2 ... an alphanumeric map information assembly and display system for a large computer
Elliot L. Amidon
1966-01-01
A major improvement and extension of the Map Information Assembly and Display System (MIADS) developed in 1964 is described. Basic principles remain unchanged, but the computer programs have been expanded and rewritten for a large computer, in Fortran IV and MAP languages. The code system is extended from 99 integers to about 2,200 alphanumeric 2-character codes. Hand-...
NASTRAN computer system level 12.1
NASA Technical Reports Server (NTRS)
Butler, T. G.
1971-01-01
Program uses finite element displacement method for solving linear response of large, three-dimensional structures subject to static, dynamic, thermal, and random loadings. Program adapts to computers of different manufacture, permits updating and extension, allows interchange of output and input information between users, and is extensively documented.
ERIC Educational Resources Information Center
Campbell, Joseph K.
1979-01-01
Describes New York State's extension experience in using the programmable calculator, a portable pocket-size computer, to solve many of the problems that central computers now handle. Subscription services to programs written for the Texas Instruments TI-59 programmable calculator are provided by both Cornell and Iowa State Universities. (MF)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cowan, R. D.; Rajnak, K.; Renard, P.
This is a set of three Fortran IV programs, RCN29, HFMOD7, and RCN229, based on the Herman-Skillman and Charlotte Froese Fischer programs, with extensive modifications and additions. The programs compute self-consistent-field radial wave functions and the various radial integrals involved in the computation of atomic energy levels and spectra.
Determinants of Computer Utilization by Extension Personnel: A Structural Equations Approach
ERIC Educational Resources Information Center
Sivakumar, Paramasivan Sethuraman; Parasar, Bibudha; Das, Raghu Nath; Anantharaman, Mathevanpillai
2014-01-01
Purpose: Information technology (IT) has tremendous potential for fostering grassroots development and the Indian government has created various capital-intensive computer networks to promote agricultural development. However, research studies have shown that information technology investments are not always translated into productivity gains due…
Introduction to SmartBooks. Report 23-93.
ERIC Educational Resources Information Center
Kopec, Danny; Wood, Carol
Humankind has become accustomed to reading and learning from printed books. The computer offers us the possibility to exploit another medium whose key advantage is flexibility through extensive memory, computational speed, and versatile representational means. Specifically, we have the HyperCard application, an integrated piece of software, with…
Calendar Instruments in Retrospective Web Surveys
ERIC Educational Resources Information Center
Glasner, Tina; van der Vaart, Wander; Dijkstra, Wil
2015-01-01
Calendar instruments incorporate aided recall techniques such as temporal landmarks and visual time lines that aim to reduce response error in retrospective surveys. Such calendar instruments have been used extensively in off-line research (e.g., computer-assisted telephone interviews, computer-assisted personal interviewing, and paper and pen…
Testing Extension Services through AKAP Models
ERIC Educational Resources Information Center
De Rosa, Marcello; Bartoli, Luca; La Rocca, Giuseppe
2014-01-01
Purpose: The aim of the paper is to analyse the attitude of Italian farms in gaining access to agricultural extension services (AES). Design/methodology/approach: The ways Italian farms use AES are described through the AKAP (Awareness, Knowledge, Adoption, Product) sequence. This article investigated the AKAP sequence by submitting a…
Smisc - A collection of miscellaneous functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Landon Sego, PNNL
2015-08-31
A collection of functions for statistical computing and data manipulation. These include routines for rapidly aggregating heterogeneous matrices, manipulating file names, loading R objects, sourcing multiple R files, formatting datetimes, multi-core parallel computing, stream editing, specialized plotting, etc. The package contents include:
- Smisc-package: a collection of miscellaneous functions
- allMissing: identifies missing rows or columns in a data frame or matrix
- as.numericSilent: silent wrapper for coercing a vector to numeric
- comboList: produces all possible combinations of a set of linear model predictors
- cumMax: computes the maximum of a vector up to the current index
- cumsumNA: computes the cumulative sum of a vector without propagating NAs
- d2binom, p2binom: probability functions for the sum of two independent binomials
- dataIn: a flexible way to import data into R
- dbb, pbb, qbb, rbb: the Beta-Binomial distribution
- df2list: row-wise conversion of a data frame to a list
- dfplapply: parallelized single-row processing of a data frame
- dframeEquiv: examines the equivalence of two data frames or matrices
- dkbinom, pkbinom: probability functions for the sum of k independent binomials
- factor2character: converts all factor variables in a data frame to character variables
- findDepMat: identifies linearly dependent rows or columns in a matrix
- formatDT: converts date or datetime strings into alternate formats
- getExtension, getPath, grabLast: filename manipulations: remove the extension or path, extract the extension or path
- ifelse1: non-vectorized version of ifelse
- integ: simple numerical integration routine
- interactionPlot: two-way interaction plot with error bars
- linearMap: linear mapping of a numerical vector or scalar
- list2df: converts a list to a data frame
- loadObject: loads and returns the object(s) in an ".Rdata" file
- more: displays the contents of a file to the R terminal
- movAvg2: calculates the moving average using a 2-sided window
- openDevice: opens a graphics device based on the filename extension
- padZero: pads a vector of numbers with zeros
- parseJob: parses a collection of elements into (almost) equal-sized groups
- pcbinom: a continuous version of the binomial cdf
- plapply: simple parallelization of lapply
- plotFun: plots one or more functions on a single plot
- PowerData: an example of power data
- pvar: prints the name and value of one or more objects
and numerous others (space limits reporting).
NASA Technical Reports Server (NTRS)
Liechty, Derek S.; Lewis, Mark J.
2010-01-01
Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties (i.e., no macroscopic reaction rate information) are extended to include reactions involving charged particles and electronic energy levels. The proposed extensions include ionization reactions, exothermic associative ionization reactions, endothermic and exothermic charge exchange reactions, and other exchange reactions involving ionized species. The extensions are shown to agree favorably with the measured Arrhenius rates for near-equilibrium conditions.
Visual Environments for CFD Research
NASA Technical Reports Server (NTRS)
Watson, Val; George, Michael W. (Technical Monitor)
1994-01-01
This viewgraph presentation gives an overview of the visual environments for computational fluid dynamics (CFD) research. It includes details on critical needs from the future computer environment, features needed to attain this environment, prospects for changes in and the impact of the visualization revolution on the human-computer interface, human processing capabilities, limits of personal environment and the extension of that environment with computers. Information is given on the need for more 'visual' thinking (including instances of visual thinking), an evaluation of the alternate approaches for and levels of interactive computer graphics, a visual analysis of computational fluid dynamics, and an analysis of visualization software.
NASA Astrophysics Data System (ADS)
Selle, B.; Schwientek, M.
2012-04-01
Water quality of ground and surface waters in catchments is typically driven by many complex and interacting processes. While small-scale processes are often studied in great detail, their relevance and interplay at catchment scales often remain poorly understood. For many catchments, extensive monitoring data on water quality have been collected for different purposes. These heterogeneous data sets contain valuable information on catchment-scale processes but are rarely analysed using integrated methods. Principal component analysis (PCA) has previously been applied to this kind of data set. However, a detailed analysis of scores, which are an important result of a PCA, is often missing. Mathematically, PCA expresses measured water-quality variables, e.g. nitrate concentrations, as linear combinations of independent, not directly observable key processes. These computed key processes are represented by principal components. Their scores are interpretable as process intensities which vary in space and time. Subsequently, scores can be correlated with other key variables and catchment characteristics, such as water travel times and land use, that were not considered in the PCA. This detailed analysis of scores represents an extension of the commonly applied PCA which could considerably improve the understanding of processes governing water quality at catchment scales. In this study, we investigated the 170 km2 Ammer catchment in SW Germany, which is characterised by an above-average proportion of agricultural (71%) and urban (17%) areas. The Ammer River is mainly fed by karstic springs. For PCA, we separately analysed concentrations from (a) surface waters of the Ammer River and its tributaries, (b) spring waters from the main aquifers and (c) deep groundwater from production wells. This analysis was extended by a detailed analysis of scores. We analysed measured concentrations of major ions and selected organic micropollutants. Additionally, redox-sensitive variables and environmental tracers indicating groundwater age were analysed for deep groundwater from production wells. For deep groundwater, we found that microbial turnover was more strongly influenced by local availability of energy sources than by travel times of groundwater to the wells. Groundwater quality primarily reflected the input of pollutants determined by land use, e.g. agrochemicals. We concluded that for water quality in the Ammer catchment, conservative mixing of waters of different origins is more important than reactive transport processes along the flow path.
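As a concrete illustration of the scores-based extension described above, a minimal Python sketch (synthetic placeholder data, not the Ammer measurements; scikit-learn and SciPy assumed available):

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import pearsonr

# water-quality matrix: rows = samples, columns = measured variables
# (e.g. nitrate, chloride, sulfate, ...); placeholder values here
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))

# standardized PCA; the transformed coordinates are the scores
pca = PCA(n_components=3)
scores = pca.fit_transform((X - X.mean(0)) / X.std(0))

# interpret each component's scores as a process intensity and correlate
# with an external variable not used in the PCA, e.g. a hypothetical
# groundwater travel time per sample
travel_time = rng.normal(size=120)
r, p = pearsonr(scores[:, 0], travel_time)
print(f"score-travel-time correlation r={r:.2f}, p={p:.3f}")
```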
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Perry, Boyd, III; Florance, James R.; Sanetrik, Mark D.; Wieseman, Carol D.; Stevens, William L.; Funk, Christie J.; Hur, Jiyoung; Christhilf, David M.; Coulson, David A.
2011-01-01
A summary of computational and experimental aeroelastic and aeroservoelastic (ASE) results for the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model is presented. A broad range of analyses and multiple ASE wind-tunnel tests of the S4T have been performed in support of the ASE element in the Supersonics Program, part of NASA's Fundamental Aeronautics Program. The computational results to be presented include linear aeroelastic and ASE analyses, nonlinear aeroelastic analyses using an aeroelastic CFD code, and rapid aeroelastic analyses using CFD-based reduced-order models (ROMs). Experimental results from two closed-loop wind-tunnel tests performed at NASA Langley's Transonic Dynamics Tunnel (TDT) will be presented as well.
ERIC Educational Resources Information Center
Vangsnes, Vigdis; Gram Okland, Nils Tore; Krumsvik, Rune
2012-01-01
This article focuses on the didactical implications when commercial educational computer games are used in Norwegian kindergartens by analysing the dramaturgy and the didactics of one particular game and the game in use in a pedagogical context. Our justification for analysing the game by using dramaturgic theory is that we consider the game to be…
Computational Aeroelastic Analyses of a Low-Boom Supersonic Configuration
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Connolly, Joseph
2015-01-01
An overview of NASA's Commercial Supersonic Technology (CST) Aeroservoelasticity (ASE) element is provided with a focus on recent computational aeroelastic analyses of a low-boom supersonic configuration developed by Lockheed-Martin and referred to as the N+2 configuration. The overview includes details of the computational models developed to date including a linear finite element model (FEM), linear unsteady aerodynamic models, unstructured CFD grids, and CFD-based aeroelastic analyses. In addition, a summary of the work involving the development of aeroelastic reduced-order models (ROMs) and the development of an aero-propulso-servo-elastic (APSE) model is provided.
Analyses of ACPL thermal/fluid conditioning system
NASA Technical Reports Server (NTRS)
Stephen, L. A.; Usher, L. H.
1976-01-01
Results of engineering analyses are reported. Initial computations were made using a modified control transfer function where the systems performance was characterized parametrically using an analytical model. The analytical model was revised to represent the latest expansion chamber fluid manifold design, and systems performance predictions were made. Parameters which were independently varied in these computations are listed. Systems predictions which were used to characterize performance are primarily transient computer plots comparing the deviation between average chamber temperature and the chamber temperature requirement. Additional computer plots were prepared. Results of parametric computations with the latest fluid manifold design are included.
[Results of the marketing research study "Acceptance of physician's office computer systems"].
Steinhausen, D; Brinkmann, F; Engelhard, A
1998-01-01
We report on a market research study on the acceptance of computer systems in physicians' offices. 11,000 returned questionnaires from physicians, users and nonusers alike, were analysed. We found that most of the physicians used their computers in a limited way, i.e. as a device for accounting. Concerning the level of utilisation, there are differences between men and women, West and East, and young and old. In this study we also analysed the computer-use behaviour of gynaecologists. As a result, two-thirds of all nonusers are not intending to utilise a computer in the future.
Visual management support system
Lee Anderson; Jerry Mosier; Geoffrey Chandler
1979-01-01
The Visual Management Support System (VMSS) is an extension of an existing computer program called VIEWIT, which has been extensively used by the U.S. Forest Service. The capabilities of this program lie in the rapid manipulation of large amounts of data, specifically operating as a tool to overlay or merge one set of data with another. VMSS was conceived to...
ERIC Educational Resources Information Center
Prosser, Andrew
2014-01-01
Digital storytelling is already used extensively in language education. Web documentaries, particularly in terms of design and narrative structure, provide an extension of the digital storytelling concept, specifically in terms of increased interactivity. Using a model of interactive, non-linear storytelling, originally derived from computer game…
Costa, Michelle N; Radhakrishnan, Krishnan; Wilson, Bridget S; Vlachos, Dionisios G; Edwards, Jeremy S
2009-07-23
The ErbB family of receptors activates intracellular signaling pathways that control cellular proliferation, growth, differentiation and apoptosis. Given these central roles, it is not surprising that overexpression of the ErbB receptors is often associated with carcinogenesis. Therefore, extensive laboratory studies have been devoted to understanding the signaling events associated with ErbB activation. Systems biology has contributed significantly to our current understanding of ErbB signaling networks. However, although computational models have grown in complexity over the years, little work has been done to consider the spatial-temporal dynamics of receptor interactions and to evaluate how spatial organization of membrane receptors influences signaling transduction. Herein, we explore the impact of spatial organization of the epidermal growth factor receptor (ErbB1/EGFR) on the initiation of downstream signaling. We describe the development of an algorithm that couples a spatial stochastic model of membrane receptors with a nonspatial stochastic model of the reactions and interactions in the cytosol. This novel algorithm provides a computationally efficient method to evaluate the effects of spatial heterogeneity on the coupling of receptors to cytosolic signaling partners. Mathematical models of signal transduction rarely consider the contributions of spatial organization due to high computational costs. A hybrid stochastic approach simplifies analyses of the spatio-temporal aspects of cell signaling and, as an example, demonstrates that receptor clustering contributes significantly to the efficiency of signal propagation from ligand-engaged growth factor receptors.
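For orientation, the nonspatial half of such a hybrid scheme is typically a Gillespie-type stochastic simulation; a minimal sketch for a single reversible binding reaction (illustrative only, not the authors' algorithm):

```python
import numpy as np

def gillespie_binding(k_on, k_off, n_rec, n_lig, t_end, rng):
    """Minimal Gillespie SSA for R + L <-> C: the kind of nonspatial
    cytosolic step a hybrid scheme couples to a spatial receptor model."""
    t, c = 0.0, 0
    times, complexes = [0.0], [0]
    while t < t_end:
        a1 = k_on * (n_rec - c) * (n_lig - c)   # binding propensity
        a2 = k_off * c                          # unbinding propensity
        a0 = a1 + a2
        if a0 == 0:
            break
        t += rng.exponential(1.0 / a0)          # time to next event
        c += 1 if rng.random() < a1 / a0 else -1
        times.append(t)
        complexes.append(c)
    return np.array(times), np.array(complexes)

# hypothetical rate constants and copy numbers
t, c = gillespie_binding(1e-3, 0.1, 300, 500, 10.0, np.random.default_rng(0))
```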
Aerodynamic shape optimization directed toward a supersonic transport using sensitivity analysis
NASA Technical Reports Server (NTRS)
Baysal, Oktay
1995-01-01
This investigation was conducted from March 1994 to August 1995, primarily, to extend and implement the previously developed aerodynamic design optimization methodologies for problems related to a supersonic transport design. These methods had demonstrated promise to improve the designs (more specifically, the shape) of aerodynamic surfaces by coupling optimization algorithms (OA) with Computational Fluid Dynamics (CFD) algorithms via sensitivity analyses (SA), together with surface definition methods from Computer Aided Design (CAD). The present extensions of this method and their supersonic implementations have produced wing section designs, delta wing designs, cranked-delta wing designs, and nacelle designs, all of which have been reported in the open literature. Despite the fact that these configurations were too highly simplified to be of any practical or commercial use, they served the algorithmic and proof-of-concept objectives of the study very well. The primary cause for the configurational simplifications, other than the usual simplify-to-study-the-fundamentals reason, was the premature closing of the project. After only the first year of the originally intended three-year term, both the funds and the computer resources supporting the project were abruptly cut due to severe shortages at the funding agency. Nonetheless, it was shown that the extended methodologies could be viable options in optimizing the design of not only an isolated single-component configuration, but also a multiple-component configuration in supersonic and viscous flow. This allowed designing with the mutual interference of the components being one of the constraints all along the evolution of the shapes.
Answer Set Programming and Other Computing Paradigms
ERIC Educational Resources Information Center
Meng, Yunsong
2013-01-01
Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods originating from building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to…
Improved Adjoint-Operator Learning For A Neural Network
NASA Technical Reports Server (NTRS)
Toomarian, Nikzad; Barhen, Jacob
1995-01-01
Improved method of adjoint-operator learning reduces amount of computation and associated computational memory needed to make electronic neural network learn temporally varying pattern (e.g., to recognize moving object in image) in real time. Method is extension of method described in "Adjoint-Operator Learning for a Neural Network" (NPO-18352).
Elliptic Curve Cryptography with Java
ERIC Educational Resources Information Center
Klima, Richard E.; Sigmon, Neil P.
2005-01-01
The use of the computer, and specifically the mathematics software package Maple, has played a central role in the authors' abstract algebra course because it provides their students with a way to see realistic examples of the topics they discuss without having to struggle with extensive computations. However, Maple does not provide the computer…
Initiating a Programmatic Assessment Report
ERIC Educational Resources Information Center
Berkaliev, Zaur; Devi, Shavila; Fasshauer, Gregory E.; Hickernell, Fred J.; Kartal, Ozgul; Li, Xiaofan; McCray, Patrick; Whitney, Stephanie; Zawojewski, Judith S.
2014-01-01
In the context of a department of applied mathematics, a program assessment was conducted to assess the departmental goal of enabling undergraduate students to recognize, appreciate, and apply the power of computational tools in solving mathematical problems that cannot be solved by hand, or would require extensive and tedious hand computation. A…
Quantum Computer Games: Quantum Minesweeper
ERIC Educational Resources Information Center
Gordon, Michal; Gordon, Goren
2010-01-01
The computer game of quantum minesweeper is introduced as a quantum extension of the well-known classical minesweeper. Its main objective is to teach the unique concepts of quantum mechanics in a fun way. Quantum minesweeper demonstrates the effects of superposition, entanglement and their non-local characteristics. While in the classical…
A Flexible, Extensible Online Testing System for Mathematics
ERIC Educational Resources Information Center
Passmore, Tim; Brookshaw, Leigh; Butler, Harry
2011-01-01
An online testing system developed for entry-skills testing of first-year university students in algebra and calculus is described. The system combines the open-source computer algebra system "Maxima" with computer scripts to parse student answers, which are entered using standard mathematical notation and conventions. The answers can…
Educational Research and Theory Perspectives on Intelligent Computer-Assisted Instruction.
ERIC Educational Resources Information Center
Tennyson, Robert D.; Christensen, Dean L.
This paper defines the next generation of intelligent computer-assisted instructional systems (ICAI) by depicting the elaborations and extensions offered by educational research and theory perspectives to enhance the ICAI environment. The first section describes conventional ICAI systems, which use expert systems methods and have three modules: a…
Quantum Computer Games: Schrodinger Cat and Hounds
ERIC Educational Resources Information Center
Gordon, Michal; Gordon, Goren
2012-01-01
The quantum computer game "Schrodinger cat and hounds" is the quantum extension of the well-known classical game fox and hounds. Its main objective is to teach the unique concepts of quantum mechanics in a fun way. "Schrodinger cat and hounds" demonstrates the effects of superposition, destructive and constructive interference, measurements and…
Primary School Pupils' Attitudes toward Learning Programming through Visual Interactive Environments
ERIC Educational Resources Information Center
Asad, Khaled; Tibi, Moanis; Raiyn, Jamal
2016-01-01
New generations are using and playing with mobile and computer applications extensively. These applications are the outcomes of programming work that involves skills, such as computational and algorithmic thinking. Learning programming is not easy for young students. In recent years, academic institutions like the Massachusetts Institute of…
Empirical Validation and Application of the Computing Attitudes Survey
ERIC Educational Resources Information Center
Dorn, Brian; Elliott Tew, Allison
2015-01-01
Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schoeni, Anna, E-mail: anna.schoeni@unibas.ch
Background: We investigated whether health symptom reports of adolescents in Central Switzerland are associated with radiofrequency electromagnetic fields (RF-EMF) from mobile phones and other wireless devices, or with the wireless device use itself due to non-radiation-related factors. Methods: In a prospective cohort study, 439 study participants (participation rate: 36.8%) aged 12–17 years completed questionnaires about their mobile and cordless phone use, their self-reported symptoms and possible confounding factors at baseline (2012/2013) and one year later (2013/2014). Operator-recorded mobile phone data were obtained for a subgroup of 234 adolescents. RF-EMF dose measures considering various factors affecting RF-EMF exposure were computed for the brain and the whole body. Data were analysed using a mixed-logistic cross-sectional model and a cohort approach, where we investigated whether cumulative dose over one year was related to a new onset of a symptom between baseline and follow-up. All analyses were adjusted for relevant confounders. Results: Participation rate in the follow-up was 97% (425 participants). In both analyses, cross-sectional and cohort, various symptoms tended to be mostly associated with usage measures that are only marginally related to RF-EMF exposure, such as the number of text messages sent per day (e.g. tiredness: OR: 1.81; 95% CI: 1.20–2.74 for cross-sectional analyses and OR: 1.87; 95% CI: 1.04–3.38 for cohort analyses). Outcomes were generally less strongly or not associated with mobile phone call duration and RF-EMF dose measures. Conclusions: Stronger associations between symptoms of ill health and wireless communication device use than for RF-EMF dose measures were observed. Such a result pattern does not support a causal association between RF-EMF exposure and health symptoms of adolescents but rather suggests that other aspects of extensive media use are related to symptoms. Highlights: • This is a prospective cohort study with approximately one year of follow-up. • Self-reported and operator-recorded mobile phone use data were collected. • The cumulative RF-EMF dose for the brain and for the whole body was calculated. • Associations were stronger for the use of wireless devices than for RF-EMF dose. • This suggests that aspects of extensive media use rather than RF-EMF are related to symptoms.
ERIC Educational Resources Information Center
Oregon Univ., Eugene. Center for Advanced Technology in Education.
The 13 conference presentations in this proceedings are arranged by general and special interest sessions and listed within each session in the order in which they were presented. These papers are: (1) "Key Issues for the Near Future" (David Moursund); (2) "Educating with Computers: Insights from Cognitive Psychology (and Video Games)" (Morton Ann…
CLAST: CUDA implemented large-scale alignment search tool.
Yano, Masahiro; Mori, Hiroshi; Akiyama, Yutaka; Yamada, Takuji; Kurokawa, Ken
2014-12-11
Metagenomics is a powerful methodology to study microbial communities, but it is highly dependent on nucleotide sequence similarity searching against sequence databases. Metagenomic analyses with next-generation sequencing technologies produce enormous numbers of reads from microbial communities, and many reads are derived from microbes whose genomes have not yet been sequenced, limiting the usefulness of existing sequence similarity search tools. Therefore, there is a clear need for a sequence similarity search tool that can rapidly detect weak similarity in large datasets. We developed a tool, which we named CLAST (CUDA implemented large-scale alignment search tool), that enables analyses of millions of reads and thousands of reference genome sequences, and runs on NVIDIA Fermi architecture graphics processing units. CLAST has four main advantages over existing alignment tools. First, CLAST was capable of identifying sequence similarities ~80.8 times faster than BLAST and 9.6 times faster than BLAT. Second, CLAST executes global alignment as the default (local alignment is also an option), enabling CLAST to assign reads to taxonomic and functional groups based on evolutionarily distant nucleotide sequences with high accuracy. Third, CLAST does not need a preprocessed sequence database like Burrows-Wheeler Transform-based tools, and this enables CLAST to incorporate large, frequently updated sequence databases. Fourth, CLAST requires <2 GB of main memory, making it possible to run CLAST on a standard desktop computer or server node. CLAST achieved very high speed (similar to the Burrows-Wheeler Transform-based Bowtie 2 for long reads) and sensitivity (equal to BLAST, BLAT, and FR-HIT) without the need for extensive database preprocessing or a specialized computing platform. Our results demonstrate that CLAST has the potential to be one of the most powerful and realistic approaches to analyze the massive amount of sequence data from next-generation sequencing technologies.
A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.
Grando, M Adela; Glasspool, David; Fox, John
2012-01-01
To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages based on a standard set of workflow patterns expressed as Petri nets (PNs) and notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PN-based pattern P requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic, but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous similar but informal comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results from a comparable previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones. Copyright © 2011 Elsevier B.V. All rights reserved.
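To make the pattern-based comparison concrete, here is a minimal sketch of the standard Petri-net firing rule applied to the simple "sequence" workflow pattern (illustrative only; not the notation of the paper):

```python
# minimal Petri net: a marking maps places to token counts; a transition
# names its input and output places; fire() is the standard firing rule
def enabled(marking, t):
    return all(marking.get(p, 0) >= n for p, n in t["in"].items())

def fire(marking, t):
    m = dict(marking)
    for p, n in t["in"].items():
        m[p] -= n                      # consume input tokens
    for p, n in t["out"].items():
        m[p] = m.get(p, 0) + n         # produce output tokens
    return m

# the "sequence" workflow pattern: task A then task B
seq = [{"in": {"p0": 1}, "out": {"p1": 1}},   # task A
       {"in": {"p1": 1}, "out": {"p2": 1}}]   # task B
m = {"p0": 1}
for t in seq:
    assert enabled(m, t)
    m = fire(m, t)
print(m)  # {'p0': 0, 'p1': 0, 'p2': 1}
```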
A Fast Multiple-Kernel Method With Applications to Detect Gene-Environment Interaction.
Marceau, Rachel; Lu, Wenbin; Holloway, Shannon; Sale, Michèle M; Worrall, Bradford B; Williams, Stephen R; Hsu, Fang-Chi; Tzeng, Jung-Ying
2015-09-01
Kernel machine (KM) models are a powerful tool for exploring associations between sets of genetic variants and complex traits. Although most KM methods use a single kernel function to assess the marginal effect of a variable set, KM analyses involving multiple kernels have become increasingly popular. Multikernel analysis allows researchers to study more complex problems, such as assessing gene-gene or gene-environment interactions, incorporating variance-component based methods for population substructure into rare-variant association testing, and assessing the conditional effects of a variable set adjusting for other variable sets. The KM framework is robust, powerful, and provides efficient dimension reduction for multifactor analyses, but requires the estimation of high dimensional nuisance parameters. Traditional estimation techniques, including regularization and the "expectation-maximization (EM)" algorithm, have a large computational cost and are not scalable to large sample sizes needed for rare variant analysis. Therefore, under the context of gene-environment interaction, we propose a computationally efficient and statistically rigorous "fastKM" algorithm for multikernel analysis that is based on a low-rank approximation to the nuisance effect kernel matrices. Our algorithm is applicable to various trait types (e.g., continuous, binary, and survival traits) and can be implemented using any existing single-kernel analysis software. Through extensive simulation studies, we show that our algorithm has similar performance to an EM-based KM approach for quantitative traits while running much faster. We also apply our method to the Vitamin Intervention for Stroke Prevention (VISP) clinical trial, examining gene-by-vitamin effects on recurrent stroke risk and gene-by-age effects on change in homocysteine level. © 2015 WILEY PERIODICALS, INC.
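A hedged sketch of the core idea, a low-rank approximation to a nuisance-effect kernel matrix via truncated eigendecomposition (synthetic data; the actual fastKM algorithm involves considerably more than this):

```python
import numpy as np

def low_rank_kernel(K, rank):
    """Truncated eigendecomposition K ~= U_r diag(w_r) U_r^T, the kind
    of low-rank surrogate used to avoid fitting full nuisance kernels."""
    w, U = np.linalg.eigh(K)            # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:rank]    # keep the `rank` largest
    return U[:, idx], w[idx]

# Gaussian kernel on hypothetical genotype data (0/1/2 allele counts)
rng = np.random.default_rng(0)
G = rng.integers(0, 3, size=(500, 50)).astype(float)
sq = ((G[:, None, :] - G[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / G.shape[1])

U, w = low_rank_kernel(K, rank=20)
K_approx = (U * w) @ U.T               # work with this instead of K
print(np.linalg.norm(K - K_approx) / np.linalg.norm(K))
```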
Probabilistic Fracture Mechanics of Reactor Pressure Vessels with Populations of Flaws
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Backman, Marie; Williams, Paul
This report documents recent progress in developing a tool that uses the Grizzly and RAVEN codes to perform probabilistic fracture mechanics analyses of reactor pressure vessels in light water reactor nuclear power plants. The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. Because of the central role of the reactor pressure vessel (RPV) in a nuclear power plant, particular emphasis is being placed on developing capabilities to model fracture in embrittled RPVs to aid in the process surrounding decision making relating to life extension of existing plants. A typical RPV contains a large population of pre-existing flaws introduced during the manufacturing process. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation at one or more of these flaws during a transient event. This report documents development and initial testing of a capability to perform probabilistic fracture mechanics of large populations of flaws in RPVs using reduced order models to compute fracture parameters. The work documented here builds on prior efforts to perform probabilistic analyses of a single flaw with uncertain parameters, as well as earlier work to develop deterministic capabilities to model the thermo-mechanical response of the RPV under transient events, and compute fracture mechanics parameters at locations of pre-defined flaws. The capabilities developed as part of this work provide a foundation for future work, which will develop a platform that provides the flexibility needed to consider scenarios that cannot be addressed with the tools used in current practice.
Dickson, E D; Hamby, D M
2014-03-01
The human health and environmental effects following a postulated accidental release of radioactive material to the environment have been a public and regulatory concern since the early development of nuclear technology. These postulated releases have been researched extensively to better understand the potential risks for accident mitigation and emergency planning purposes. The objective of this investigation is to provide an updated technical basis for contemporary building shielding factors for the US housing stock. Building shielding factors quantify the protection from ionising radiation provided by a certain building type. Much of the current data used to determine the quality of shielding around nuclear facilities and urban environments is based on simplistic point-kernel calculations for 1950s era suburbia and is no longer applicable to the densely populated urban environments realised today. To analyse a building's radiation shielding properties, the ideal approach would be to subject a variety of building types to various radioactive sources and measure the radiation levels in and around the building. While this is not entirely practicable, this research analyses the shielding effectiveness of ten structurally significant US housing-stock models (walls and roofs) important for shielding against ionising radiation. The experimental data are used to benchmark computational models to calculate the shielding effectiveness of various building configurations under investigation from two types of realistic environmental source terms. Various combinations of these ten shielding models can be used to develop full-scale computational housing-unit models for building shielding factor calculations representing 69.6 million housing units (61.3%) in the United States. Results produced in this investigation provide a comparison between theory and experiment behind building shielding factor methodology.
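For context, a building shielding factor is conventionally defined as the ratio of the dose rate inside the structure to the unshielded free-field dose rate at the same location (the notation below is assumed, not taken from the paper):

```latex
\mathrm{SF} \;=\; \frac{\dot{D}_{\mathrm{inside}}}{\dot{D}_{\mathrm{free\ field}}}, \qquad 0 < \mathrm{SF} \le 1,
```

so a smaller factor indicates more effective shielding.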
A Bitslice Implementation of Anderson's Attack on A5/1
NASA Astrophysics Data System (ADS)
Bulavintsev, Vadim; Semenov, Alexander; Zaikin, Oleg; Kochemazov, Stepan
2018-03-01
The A5/1 keystream generator is a part of the Global System for Mobile Communications (GSM) protocol, employed in cellular networks all over the world. Its cryptographic resistance has been extensively analyzed in dozens of papers. However, almost all corresponding methods either employ specific hardware or require an extensive preprocessing stage and significant amounts of memory. In the present study, a bitslice variant of Anderson's attack on A5/1 is implemented. It requires very little computer memory and no preprocessing. Moreover, the attack can be made even more efficient by harnessing the computing power of modern Graphics Processing Units (GPUs). As a result, using commonly available GPUs this method can quite efficiently recover the secret key using only 64 bits of keystream. To test the performance of the implementation, a volunteer computing project was launched; 10 instances of A5/1 cryptanalysis were successfully solved in this project in a single week.
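To illustrate the bitslice idea (not the authors' implementation): store bit i of a register across 64 independent key candidates in one 64-bit machine word, so a single pass of bitwise operations clocks all 64 candidates at once. A minimal Python sketch for A5/1's 19-bit register R1, whose feedback taps are bits 13, 16, 17 and 18 (the cipher's irregular majority clocking is omitted for brevity):

```python
import random

MASK = (1 << 64) - 1  # one 64-bit word per register bit position

def step_r1(reg):
    """Clock 64 bitsliced copies of A5/1's R1 at once.
    reg[i] holds bit i of all 64 instances; the feedback is the XOR of
    taps 13, 16, 17, 18, computed for all instances in parallel."""
    fb = reg[13] ^ reg[16] ^ reg[17] ^ reg[18]
    return [fb & MASK] + reg[:-1]  # shift: bit i moves to bit i + 1

# 64 random initial states, clocked 100 times in lockstep
reg = [random.getrandbits(64) for _ in range(19)]
for _ in range(100):
    reg = step_r1(reg)
```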
PRESAGE: PRivacy-preserving gEnetic testing via SoftwAre Guard Extension.
Chen, Feng; Wang, Chenghong; Dai, Wenrui; Jiang, Xiaoqian; Mohammed, Noman; Al Aziz, Md Momin; Sadat, Md Nazmus; Sahinalp, Cenk; Lauter, Kristin; Wang, Shuang
2017-07-26
Advances in DNA sequencing technologies have prompted a wide range of genomic applications to improve healthcare and facilitate biomedical research. However, privacy and security concerns have emerged as a challenge for utilizing cloud computing to handle sensitive genomic data. We present one of the first implementations of Software Guard Extension (SGX) based securely outsourced genetic testing framework, which leverages multiple cryptographic protocols and minimal perfect hash scheme to enable efficient and secure data storage and computation outsourcing. We compared the performance of the proposed PRESAGE framework with the state-of-the-art homomorphic encryption scheme, as well as the plaintext implementation. The experimental results demonstrated significant performance over the homomorphic encryption methods and a small computational overhead in comparison to plaintext implementation. The proposed PRESAGE provides an alternative solution for secure and efficient genomic data outsourcing in an untrusted cloud by using a hybrid framework that combines secure hardware and multiple crypto protocols.
NASA Technical Reports Server (NTRS)
Kvaternik, Raymond G.; Silva, Walter A.
2008-01-01
A computational procedure for identifying the state-space matrices corresponding to discrete bilinear representations of nonlinear systems is presented. A key feature of the method is the use of first- and second-order Volterra kernels (first- and second-order pulse responses) to characterize the system. The present method is based on an extension of a continuous-time bilinear system identification procedure given in a 1971 paper by Bruni, di Pillo, and Koch. The analytical and computational considerations that underlie the original procedure and its extension to the title problem are presented and described, pertinent numerical considerations associated with the process are discussed, and results obtained from the application of the method to a variety of nonlinear problems from the literature are presented. The results of these exploratory numerical studies are decidedly promising and provide sufficient credibility for further examination of the applicability of the method.
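For orientation, the discrete-time bilinear state-space form underlying this kind of identification problem can be written as follows (standard textbook notation, assumed rather than taken from the paper; A, N, B, and C are the matrices to be identified):

```latex
\begin{aligned}
x_{k+1} &= A\,x_k + N\,x_k\,u_k + B\,u_k,\\
y_k &= C\,x_k,
\end{aligned}
```

where the bilinear term N x_k u_k is what distinguishes the model from a purely linear state-space representation.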
Modular space station phase B extension preliminary system design. Volume 5: configuration analyses
NASA Technical Reports Server (NTRS)
Stefan, A. J.; Goble, G. J.
1972-01-01
The initial and growth modular space station configurations are described, and the evolutionary steps arriving at the final configuration are outlined. Supporting tradeoff studies and analyses such as stress, radiation dosage, and micrometeoroid and thermal protection are included.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... scientific and technical analyses, OSHA requests that you disclose: (1) The nature of any financial... such as social security numbers and birthdates. If you submit scientific or technical studies or other... data and technical information submitted to the record. This request is consistent with Executive Order...
Factors of Role Conflict among Livestock Extension Professionals in Andhra Pradesh, India
ERIC Educational Resources Information Center
Sasidhar, P. V. K.; Rao, B. Sudhakar; Sreeramulu, Piedy
2008-01-01
To identify the factors of role conflict among livestock extension professionals in Andhra Pradesh, India, the study was conducted following an ex-post facto research design. Data were collected from 180 respondents through survey questionnaires. The data were subjected to multiple regression and path analyses to identify the factors of role conflict.…
ERIC Educational Resources Information Center
Moumouni, Ismail M.; Vodouhe, Simplice D.; Streiffeler, Friedhelm
2009-01-01
This paper analyses the organizational, financial and technological incentives that service organizations used to motivate farmers to finance agricultural research and extension in Benin. Understanding the foundations and implications of these motivation systems is important for improving farmer financial participation in agricultural research and…
Materials constitutive models for nonlinear analysis of thermally cycled structures
NASA Technical Reports Server (NTRS)
Kaufman, A.; Hunt, L. E.
1982-01-01
Effects of inelastic materials models on computed stress-strain solutions for thermally loaded structures were studied by performing nonlinear (elastoplastic creep) and elastic structural analyses on a prismatic, double edge wedge specimen of IN 100 alloy that was subjected to thermal cycling in fluidized beds. Four incremental plasticity creep models (isotropic, kinematic, combined isotropic kinematic, and combined plus transient creep) were exercised for the problem by using the MARC nonlinear, finite element computer program. Maximum total strain ranges computed from the elastic and nonlinear analyses agreed within 5 percent. Mean cyclic stresses, inelastic strain ranges, and inelastic work were significantly affected by the choice of inelastic constitutive model. The computing time per cycle for the nonlinear analyses was more than five times that required for the elastic analysis.
A combined computational-experimental analyses of selected metabolic enzymes in Pseudomonas species.
Perumal, Deepak; Lim, Chu Sing; Chow, Vincent T K; Sakharkar, Kishore R; Sakharkar, Meena K
2008-09-10
Comparative genomic analysis has revolutionized our ability to predict the metabolic subsystems that occur in newly sequenced genomes, and to explore the functional roles of the set of genes within each subsystem. These computational predictions can considerably reduce the volume of experimental studies required to assess basic metabolic properties of multiple bacterial species. However, experimental validations are still required to resolve the apparent inconsistencies in the predictions by multiple resources. Here, we present combined computational-experimental analyses on eight completely sequenced Pseudomonas species. Comparative pathway analyses reveal that several pathways within the Pseudomonas species show high plasticity and versatility. Potential bypasses in 11 metabolic pathways were identified. We further confirmed the presence of the enzyme O-acetyl homoserine (thiol) lyase (EC: 2.5.1.49) in P. syringae pv. tomato that revealed inconsistent annotations in KEGG and in the recently published SYSTOMONAS database. These analyses connect and integrate systematic data generation, computational data interpretation, and experimental validation and represent a synergistic and powerful means for conducting biological research.
Yu, Yun; Degnan, James H.; Nakhleh, Luay
2012-01-01
Gene tree topologies have proven a powerful data source for various tasks, including species tree inference and species delimitation. Consequently, methods for computing probabilities of gene trees within species trees have been developed and widely used in probabilistic inference frameworks. All these methods assume an underlying multispecies coalescent model. However, when reticulate evolutionary events such as hybridization occur, these methods are inadequate, as they do not account for such events. Methods that account for both hybridization and deep coalescence in computing the probability of a gene tree topology currently exist for very limited cases. However, no such methods exist for general cases, owing primarily to the fact that it is currently unknown how to compute the probability of a gene tree topology within the branches of a phylogenetic network. Here we present a novel method for computing the probability of gene tree topologies on phylogenetic networks and demonstrate its application to the inference of hybridization in the presence of incomplete lineage sorting. We reanalyze a Saccharomyces species data set for which multiple analyses had converged on a species tree candidate. Using our method, though, we show that an evolutionary hypothesis involving hybridization in this group has better support than one of strict divergence. A similar reanalysis on a group of three Drosophila species shows that the data is consistent with hybridization. Further, using extensive simulation studies, we demonstrate the power of gene tree topologies at obtaining accurate estimates of branch lengths and hybridization probabilities of a given phylogenetic network. Finally, we discuss identifiability issues with detecting hybridization, particularly in cases that involve extinction or incomplete sampling of taxa. PMID:22536161
NASA Astrophysics Data System (ADS)
Shi, X.; Zhang, G.
2013-12-01
Because of the extensive computational burden, parametric uncertainty analyses are rarely conducted for geological carbon sequestration (GCS) process-based multi-phase models. The difficulty of predictive uncertainty analysis for the CO2 plume migration in realistic GCS models is not only due to the spatial distribution of the caprock and reservoir (i.e. heterogeneous model parameters), but also because the GCS optimization estimation problem has multiple local minima due to the complex nonlinear multi-phase (gas and aqueous) and multi-component (water, CO2, salt) transport equations. The geological model built by Doughty and Pruess (2004) for the Frio pilot site (Texas) was selected and assumed to represent the 'true' system, which was composed of seven different facies (geological units) distributed among 10 layers. We chose to calibrate the permeabilities of these facies. Pressure and gas saturation values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. Each simulation of the model takes about 2 hours. In this study, we develop a new approach that improves the computational efficiency of Bayesian inference by constructing a surrogate system based on an adaptive sparse-grid stochastic collocation method. This surrogate-based global optimization algorithm is first used to calibrate the model parameters; then the prediction uncertainty of the CO2 plume position arising from the propagation of parametric uncertainty is quantified in numerical experiments and compared to the actual plume from the 'true' model. Results show that the approach is computationally efficient for multi-modal optimization and prediction uncertainty quantification for computationally expensive simulation models. Both our inverse methodology and findings can be broadly applicable to GCS in heterogeneous storage formations.
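As a hedged illustration of surrogate-based calibration (a radial-basis response surface stands in for the paper's adaptive sparse-grid collocation surrogate, and a trivial function stands in for the multi-phase simulator; all names are hypothetical):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

def expensive_model(theta):
    """Stand-in for the expensive simulator: maps facies permeabilities
    (here, 3 parameters) to a vector of predicted observables."""
    return np.array([np.sum(np.sin(theta) * theta**2),
                     np.prod(1.0 + theta**2)])

obs = expensive_model(np.array([0.3, -0.7, 1.1]))  # synthetic "truth"

# build a cheap surrogate from a small design of expensive runs
rng = np.random.default_rng(0)
design = rng.uniform(-2.0, 2.0, size=(200, 3))
runs = np.array([expensive_model(t) for t in design])
surrogate = RBFInterpolator(design, runs)  # radial-basis response surface

# calibrate on the surrogate only, with a global optimizer to handle
# the multiple local minima the abstract warns about
def misfit(theta):
    return np.sum((surrogate(theta[None, :])[0] - obs) ** 2)

result = differential_evolution(misfit, [(-2.0, 2.0)] * 3, seed=1)
print(result.x)
```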
Computational strategies for three-dimensional flow simulations on distributed computer systems
NASA Technical Reports Server (NTRS)
Sankar, Lakshmi N.; Weed, Richard A.
1995-01-01
This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
Computational strategies for three-dimensional flow simulations on distributed computer systems
NASA Astrophysics Data System (ADS)
Sankar, Lakshmi N.; Weed, Richard A.
1995-08-01
This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
BioQueue: a novel pipeline framework to accelerate bioinformatics analysis.
Yao, Li; Wang, Heming; Song, Yuanyuan; Sui, Guangchao
2017-10-15
With the rapid development of Next-Generation Sequencing, a large amount of data is now available for bioinformatics research. Meanwhile, the presence of many pipeline frameworks makes it possible to analyse these data. However, these tools concentrate mainly on their syntax and design paradigms, and dispatch jobs based on users' experience about the resources needed by the execution of a certain step in a protocol. As a result, it is difficult for these tools to maximize the potential of computing resources, and avoid errors caused by overload, such as memory overflow. Here, we have developed BioQueue, a web-based framework that contains a checkpoint before each step to automatically estimate the system resources (CPU, memory and disk) needed by the step and then dispatch jobs accordingly. BioQueue possesses a shell command-like syntax instead of implementing a new script language, which means most biologists without computer programming background can access the efficient queue system with ease. BioQueue is freely available at https://github.com/liyao001/BioQueue. The extensive documentation can be found at http://bioqueue.readthedocs.io. li_yao@outlook.com or gcsui@nefu.edu.cn. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
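As an illustration of the resource-checkpoint idea described above, the following Python sketch shows one way a dispatcher might test for CPU, memory and disk headroom before starting a step; the function names, resource fields and thresholds are hypothetical assumptions for this sketch and are not BioQueue's actual API.

import shutil
import psutil  # third-party library, assumed available, for CPU and memory queries

def has_headroom(est_cpu_cores, est_mem_gb, est_disk_gb, workdir="."):
    # Return True if the host can likely accommodate the step right now.
    free_cores = psutil.cpu_count() * (1 - psutil.cpu_percent(interval=0.5) / 100.0)
    free_mem_gb = psutil.virtual_memory().available / 1e9
    free_disk_gb = shutil.disk_usage(workdir).free / 1e9
    return (free_cores >= est_cpu_cores and free_mem_gb >= est_mem_gb
            and free_disk_gb >= est_disk_gb)

def dispatch(queue, run_step):
    # Run queued steps, deferring any step that would overload the host
    # (e.g. cause a memory overflow) until resources free up.
    deferred = []
    for step in queue:  # step: dict with 'cmd' plus hypothetical resource estimates
        if has_headroom(step["cpu"], step["mem_gb"], step["disk_gb"]):
            run_step(step["cmd"])
        else:
            deferred.append(step)
    return deferred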
Kuiper, L M; Thijs, A; Smulders, Y M
2012-01-01
The advent of beamer projection of radiological images raises the issue of whether such projection compromises diagnostic accuracy. The purpose of this study was to evaluate whether beamer projection of chest X-rays is inferior to monitor display. We selected 53 chest X-rays with subtle abnormalities and 15 normal X-rays. The images were independently judged by a senior radiologist and a senior pulmonologist with a state-of-the-art computer monitor. We used their unanimous or consensus judgment as the reference test. Subsequently, four observers (one senior pulmonologist, one senior radiologist and one resident from each speciality) judged these X-rays on a standard clinical computer monitor and with beamer projection. We compared the number of correct results for each method. Overall, the sensitivity and specificity did not differ between monitor and beamer projection. Separate analyses in senior and junior examiners suggested that senior examiners had a moderate loss of diagnostic accuracy (8% lower sensitivity, p<0.05, and 6% lower specificity, p=ns) associated with the use of beamer projection, whereas juniors showed similar performance on both imaging modalities. These initial data suggest that beamer projection may be associated with a small loss of diagnostic accuracy in specific subgroups of physicians. This finding illustrates the need for more extensive studies.
Nesbit, Steven M.; Elzinga, Michael; Herchenroder, Catherine; Serrano, Monika
2006-01-01
This paper discusses the inertia tensors of tennis rackets and their influence on the elbow swing torques in a forehand motion, the loadings transmitted to the elbow from central and eccentric impacts, and the racket acceleration responses from central and eccentric impacts. Inertia tensors of various rackets with similar mass and mass center location were determined by an inertia pendulum and were found to vary considerably in all three orthogonal directions. Tennis swing mechanics and impact analyses were performed using a computer model comprised of a full-body model of a human, a parametric model of the racket, and an impact function. The swing mechanics analysis of a forehand motion determined that inertia values had a moderate linear effect on the pronation-supination elbow torques required to twist the racket, and a minor effect on the flexion-extension and valgus-varus torques. The impact analysis found that mass center inertia values had a considerable effect on the transmitted torques for both longitudinal and latitudinal eccentric impacts and significantly affected all elbow torque components. Racket acceleration responses to central and eccentric impacts were measured experimentally and found to be notably sensitive to impact location and mass center inertia values. Key points: tennis biomechanics; racket inertia tensor; impact analysis; full-body computer model. PMID:24260004
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somayaji, Anil B.; Amai, Wendy A.; Walther, Eleanor A.
This report describes the successful extension of artificial immune systems from the domain of computer security to the domain of real time control systems for robotic vehicles. A biologically-inspired computer immune system was added to the control system of two different mobile robots. As an additional layer in a multi-layered approach, the immune system is complementary to traditional error detection and error handling techniques. This can be thought of as biologically-inspired defense in depth. We demonstrated an immune system can be added with very little application developer effort, resulting in little to no performance impact. The methods described here are extensible to any system that processes a sequence of data through a software interface.
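One widely used building block for such biologically inspired monitoring is sequence-based anomaly detection: record the short windows (n-grams) of interface calls seen during normal operation, then flag windows at run time that were never observed in training. The Python sketch below only illustrates that general idea with assumed call names; it is not the report's implementation.

from collections import deque

class SequenceMonitor:
    def __init__(self, window=3):
        self.window = window
        self.normal = set()              # call windows observed during normal runs
        self.recent = deque(maxlen=window)

    def train(self, call_sequence):
        for i in range(len(call_sequence) - self.window + 1):
            self.normal.add(tuple(call_sequence[i:i + self.window]))

    def observe(self, call):
        # Feed one interface call; return True if the current window was never seen in training.
        self.recent.append(call)
        return len(self.recent) == self.window and tuple(self.recent) not in self.normal

# Hypothetical call stream: train on normal behavior, then watch a live stream.
monitor = SequenceMonitor(window=3)
monitor.train(["read_sensors", "plan", "drive", "read_sensors", "plan", "drive"])
print(monitor.observe("read_sensors"), monitor.observe("plan"), monitor.observe("halt"))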
NASA Technical Reports Server (NTRS)
Thomson, F.
1972-01-01
The additional processing performed on data collected over the Rhode River Test Site and Forestry Site in November 1970 is reported. The techniques and procedures used to obtain the processed results are described. Thermal data collected over three approximately parallel lines of the site were contoured, and the results color coded, for the purpose of delineating important scene constituents and to identify trees attacked by pine bark beetles. Contouring work and histogram preparation are reviewed and the important conclusions from the spectral analysis and recognition computer (SPARC) signature extension work are summarized. The SPARC setup and processing records are presented and recommendations are made for future data collection over the site.
[Thin-section computed tomography of the bronchi; 2. Right upper lobe and left upper division].
Matsuoka, Y; Ookubo, T; Ohtomo, K; Nishikawa, J; Kojima, K; Oyama, K; Yoshikawa, K; Iio, M
1990-02-01
Thin (2mm) section contiguous computed tomographic (CT) scans were obtained through the bronchi of the right upper lobe and the left upper division in 30 patients. All segmental bronchi were identified. The right subsegmental bronchi were identified in 100%, and the left subsegmental bronchi in 97%. The type of the orifice of the right bronchus was trifurcated (53%), the extension of B1 was apicoanterior (50%), and the size of B2b was equal to B3a (63%). The extension of the left B3 was subapicoanterior (38%), and the size of B1+2c was equal to B3a (62%).
Zhang, Wen-Bo; Mao, Chi; Liu, Xiao-Jing; Guo, Chuan-Bin; Yu, Guang-Yan; Peng, Xin
2015-10-01
Orbital floor defects after extensive maxillectomy can cause severe esthetic and functional deformities. Orbital floor reconstruction using the computer-assisted fabricated individual titanium mesh technique is a promising method. This study evaluated the application and clinical outcomes of this technique. This retrospective study included 10 patients with orbital floor defects after maxillectomy performed from 2012 through 2014. A 3-dimensional individual stereo model based on mirror images of the unaffected orbit was obtained to fabricate an anatomically adapted titanium mesh using computer-assisted design and manufacturing. The titanium mesh was inserted into the defect using computer navigation. The postoperative globe projection and orbital volume were measured and the incidence of postoperative complications was evaluated. The average postoperative globe projection was 15.91 ± 1.80 mm on the affected side and 16.24 ± 2.24 mm on the unaffected side (P = .505), and the average postoperative orbital volume was 26.01 ± 1.28 and 25.57 ± 1.89 mL, respectively (P = .312). The mean mesh depth was 25.11 ± 2.13 mm. The mean follow-up period was 23.4 ± 7.7 months (12 to 34 months). Of the 10 patients, 9 did not develop diplopia or a decrease in visual acuity and ocular motility. Titanium mesh exposure was not observed in any patient. All patients were satisfied with their postoperative facial symmetry. Orbital floor reconstruction after extensive maxillectomy with an individual titanium mesh fabricated using computer-assisted techniques can preserve globe projection and orbital volume, resulting in successful clinical outcomes. Copyright © 2015 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
A Status Review of the Commercial Supersonic Technology (CST) Aeroservoelasticity (ASE) Project
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Sanetrik, Mark D.; Chwalowski, Pawel; Funk, Christy; Keller, Donald F.; Ringertz, Ulf
2016-01-01
An overview of recent progress regarding the computational aeroelastic and aeroservoelastic (ASE) analyses of a low-boom supersonic configuration is presented. The overview includes details of the computational models developed to date with a focus on unstructured CFD grids, computational aeroelastic analyses, sonic boom propagation studies that include static aeroelastic effects, and gust loads analyses. In addition, flutter boundaries using aeroelastic Reduced-Order Models (ROMs) are presented at various Mach numbers of interest. Details regarding a collaboration with the Royal Institute of Technology (KTH, Stockholm, Sweden) to design, fabricate, and test a full-span aeroelastic wind-tunnel model are also presented.
ERIC Educational Resources Information Center
Kline, Terence R.; Kneen, Harold; Barrett, Eric; Kleinschmidt, Andy; Doohan, Doug
2012-01-01
Differences in vegetable production methods utilized by American growers create distinct challenges for Extension personnel providing food safety training to producer groups. A program employing computers and projectors will not be accepted by an Amish group that does not accept modern technology. We have developed an outreach program that covers…
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
A Model Description Document for the Emulation Simulation Computer Model was already published. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from test. Slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system. The second consists of a potential air revitalization system.
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
A user's Manual for the Emulation Simulation Computer Model was published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware - SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from the test. In addition, slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system, the second a potential Space Station air revitalization system.
Guidelines for Computing Longitudinal Dynamic Stability Characteristics of a Subsonic Transport
NASA Technical Reports Server (NTRS)
Thompson, Joseph R.; Frank, Neal T.; Murphy, Patrick C.
2010-01-01
A systematic study is presented to guide the selection of a numerical solution strategy for URANS computation of a subsonic transport configuration undergoing simulated forced oscillation about its pitch axis. Forced oscillation is central to the prevalent wind tunnel methodology for quantifying aircraft dynamic stability derivatives from force and moment coefficients, which is the ultimate goal for the computational simulations. Extensive computations are performed that lead to key insights into the critical numerical parameters affecting solution convergence. A preliminary linear harmonic analysis is included to demonstrate the potential of extracting dynamic stability derivatives from computational solutions.
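For readers unfamiliar with the harmonic-analysis step, a common reduction (stated here under simplifying small-amplitude assumptions, with conventions that may differ from the paper's) fits the computed pitching-moment history to an offset plus in-phase and out-of-phase components of the forced motion; the out-of-phase part is associated with the damping-derivative combination. A minimal Python sketch with synthetic, hypothetical data:

import numpy as np

def harmonic_fit(t, cm, omega):
    # Least-squares fit of Cm(t) ~ c0 + a*sin(omega*t) + b*cos(omega*t).
    X = np.column_stack([np.ones_like(t), np.sin(omega * t), np.cos(omega * t)])
    coeffs, *_ = np.linalg.lstsq(X, cm, rcond=None)
    return coeffs

# Synthetic demonstration data (illustrative values, not the paper's results).
t = np.linspace(0.0, 2.0, 400)
omega, dalpha = 2 * np.pi, np.deg2rad(1.0)       # 1 Hz oscillation, 1 deg amplitude
cm = 0.02 - 0.8 * dalpha * np.sin(omega * t) - 0.05 * dalpha * np.cos(omega * t)
c0, a, b = harmonic_fit(t, cm, omega)
print("in-phase component / dalpha:", a / dalpha, " out-of-phase / dalpha:", b / dalpha)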
Interdependency of the maximum range of flexion-extension of hand metacarpophalangeal joints.
Gracia-Ibáñez, V; Vergara, M; Sancho-Bru, J-L
2016-12-01
Mobility of the fingers' metacarpophalangeal (MCP) joints depends on the posture of the adjacent ones. Current biomechanical hand models consider fixed ranges of movement at the joints, regardless of posture, thus allowing non-realistic postures and generating wrong results in reach studies and forward dynamic analyses. This study provides data for more realistic hand models. The maximum voluntary extension (MVE) and flexion (MVF) of different combinations of MCP joints were measured covering their range of motion. Dependency of the MVF and MVE on the posture of the adjacent MCP joints was confirmed, and mathematical models were obtained through regression analyses (RMSE 7.7°).
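A minimal sketch of the kind of regression model reported above, using hypothetical synthetic measurements: predict one joint's maximum voluntary flexion from the postures of the adjacent MCP joints and report the RMSE of the fit.

import numpy as np

rng = np.random.default_rng(0)
# Hypothetical measurements: two adjacent-MCP postures (deg) predicting one joint's MVF (deg).
adjacent = rng.uniform(0, 80, size=(60, 2))
noise = rng.normal(0, 5, 60)
mvf = 95 - 0.25 * adjacent[:, 0] - 0.15 * adjacent[:, 1] + noise

X = np.column_stack([np.ones(len(mvf)), adjacent])        # intercept + predictors
beta, *_ = np.linalg.lstsq(X, mvf, rcond=None)
rmse = np.sqrt(np.mean((X @ beta - mvf) ** 2))
print("fitted coefficients:", np.round(beta, 2), " RMSE (deg):", round(rmse, 1))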
ERIC Educational Resources Information Center
Kilickaya, Ferit; Krajka, Jaroslaw
2012-01-01
Both teacher- and learner-made computer visuals are quite extensively reported in Computer-Assisted Language Learning literature, for instance, filming interviews, soap operas or mini-documentaries, creating storyboard projects, authoring podcasts and vodcasts, designing digital stories. Such student-made digital assets are used to present to…
ERIC Educational Resources Information Center
Rendiero, Jane; Linder, William W.
This report summarizes the results of a survey of 29 southern land-grant institutions which elicited information on microcomputer capabilities, programming efforts, and computer awareness education for farmers, homemakers, community organizations, planning agencies, and other end users. Five topics were covered by the survey: (1) degree of…
ERIC Educational Resources Information Center
Guhlin, Miguel
2002-01-01
For technology to impact student achievement, teachers must be empowered via extensive staff development. This paper presents building-level technology initiatives (e.g., peer training, super substitutes, and computer clubs) and district- level initiatives (e.g., establish a district technology committee, allow teachers to take computers home over…
Using Computational Text Classification for Qualitative Research and Evaluation in Extension
ERIC Educational Resources Information Center
Smith, Justin G.; Tissing, Reid
2018-01-01
This article introduces a process for computational text classification that can be used in a variety of qualitative research and evaluation settings. The process leverages supervised machine learning based on an implementation of a multinomial Bayesian classifier. Applied to a community of inquiry framework, the algorithm was used to identify…
Design & Delivery of Training for a State-Wide Data Communication Network.
ERIC Educational Resources Information Center
Zacher, Candace M.
This report describes the process of development of training for agricultural research, teaching, and extension professionals in how to use the Fast Agricultural Communications Terminal (FACTS) computer network at Purdue University (Indiana), which is currently being upgraded in order to utilize the latest computer technology. The FACTS system is…
The Effect of CRT Screen Design on Learning.
ERIC Educational Resources Information Center
Grabinger, R. Scott; Albers, Starleen
Two computer assisted instruction programs tested the effects of plain and enhanced screen designs with or without information about those designs and task-type on time and learning. Subjects were 140 fourth grade students in Lincoln, Nebraska who had extensive prior experience with computers. The enhanced versions used headings, directive cues,…
ERIC Educational Resources Information Center
Halpern, Arthur M.
2006-01-01
The application of computational methods to the isomerization of hydrogen isocyanide to hydrogen cyanide, HNC-HCN, is described. The exercise is then logically extended to the isomerization of the methyl-substituted compounds, methylisocyanide and methylcyanide, CH3NC-CH3CN.
Probing End-User IT Security Practices--Through Homework
ERIC Educational Resources Information Center
Smith, Sean W.
2004-01-01
At Dartmouth College, the author teaches a course called "Security and Privacy." Its early position in the overall computer science curriculum means the course needs to be introductory, and the author can't assume the students possess an extensive computer science background. These constraints leave the author with a challenge: to construct…
Computer-Assisted Bilingual/Bicultural Multiskills Project, 1987-1988. OREA Report.
ERIC Educational Resources Information Center
Berney, Tomi D.; Carey, Cecilia
The Computer-Assisted Bilingual/Bicultural Multiskills Project completed its first year of an extension grant. The program used computerized and non-computerized instruction to help 109 native speakers of Haitian Creole/French and Spanish, most of whom were recent immigrants, develop English-as-a-Second-Language (ESL) native language, and content…
Classroom Talk and Computational Thinking
ERIC Educational Resources Information Center
Jenkins, Craig W.
2017-01-01
This paper is part of a wider action research project taking place at a secondary school in South Wales, UK. The overarching aim of the project is to examine the potential for aspects of literacy and computational thinking to be developed using extensible 'build your own block' programming activities. This paper examines classroom talk at an…
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 4 2010-10-01 2010-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 4 2012-10-01 2012-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 4 2011-10-01 2011-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 4 2013-10-01 2013-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 4 2014-10-01 2014-10-01 false Time. 1630.13 Section 1630.13 Public Welfare... § 1630.13 Time. (a) Computation. Time limits specified in this part shall be computed in accordance with... recipient's written request for good cause, grant an extension of time and shall so notify the recipient in...
ERIC Educational Resources Information Center
Jiang, L. Crystal; Bazarova, Natalie N.; Hancock, Jeffrey T.
2011-01-01
The present research investigated whether the attribution process through which people explain self-disclosures differs in text-based computer-mediated interactions versus face to face, and whether differences in causal attributions account for the increased intimacy frequently observed in mediated communication. In the experiment participants…
The potential of computer-aided process engineering (CAPE) tools to enable process engineers to improve the environmental performance of both their processes and across the life cycle (from cradle-to-grave) has long been proffered. However, this use of CAPE has not been fully ach...
A vectorized Lanczos eigensolver for high-performance computers
NASA Technical Reports Server (NTRS)
Bostic, Susan W.
1990-01-01
The computational strategies used to implement a Lanczos-based-method eigensolver on the latest generation of supercomputers are described. Several examples of structural vibration and buckling problems are presented that show the effects of using optimization techniques to increase the vectorization of the computational steps. The data storage and access schemes and the tools and strategies that best exploit the computer resources are presented. The method is implemented on the Convex C220, the Cray 2, and the Cray Y-MP computers. Results show that very good computation rates are achieved for the most computationally intensive steps of the Lanczos algorithm and that the Lanczos algorithm is many times faster than other methods extensively used in the past.
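For context, the core Lanczos recurrence that such an eigensolver vectorizes can be written compactly as below; this is a plain Python/NumPy sketch without the reorthogonalization, storage and vectorization strategies that the report actually addresses.

import numpy as np

def lanczos(A, m, rng=np.random.default_rng(1)):
    # Build an orthonormal Krylov basis and tridiagonal T whose eigenvalues
    # (Ritz values) approximate extreme eigenvalues of the symmetric matrix A.
    n = A.shape[0]
    Q = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    for j in range(m):
        Q[:, j] = q
        w = A @ q
        alpha[j] = q @ w
        w -= alpha[j] * q + (beta[j - 1] * Q[:, j - 1] if j > 0 else 0)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            q = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return np.linalg.eigvalsh(T)

A = np.random.default_rng(2).standard_normal((200, 200))
A = (A + A.T) / 2
print("largest Ritz value:", lanczos(A, 30)[-1], " exact:", np.linalg.eigvalsh(A)[-1])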
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the second, for the demonstration cases and target HPC systems employed.
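The 'bottom-up' path can be pictured as modernizing one hot kernel at a time and re-verifying it against the original before moving on. The toy example below is written in Python/NumPy purely for illustration (MFIX itself is Fortran) and shows that pattern of kernel replacement plus verification, not MFIX code.

import numpy as np, time

def kernel_reference(phi, coeff):
    # Original loop-style kernel, kept as the verification reference.
    out = np.empty_like(phi)
    for i in range(phi.shape[0]):
        for j in range(phi.shape[1]):
            out[i, j] = coeff * phi[i, j] ** 2 + phi[i, j]
    return out

def kernel_optimized(phi, coeff):
    return coeff * phi * phi + phi           # same arithmetic, array-at-a-time

phi = np.random.default_rng(3).random((400, 400))
t0 = time.perf_counter(); ref = kernel_reference(phi, 0.5); t1 = time.perf_counter()
opt = kernel_optimized(phi, 0.5); t2 = time.perf_counter()
assert np.allclose(ref, opt)                  # re-verify before accepting the change
print(f"speedup ~{(t1 - t0) / (t2 - t1):.0f}x on this toy kernel")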
NASA Astrophysics Data System (ADS)
Folguera, A.; Alasonati Tašárová, Z.; Götze, H.-J.; Rojas Vera, E.; Giménez, M.; Ramos, V. A.
2012-12-01
The Andean retroarc between 35° and 40°S is the locus of debate regarding its Pliocene to Quaternary tectonic setting. Retroarc volcanic eruptions since 6 Ma to the Present are, based on some hypotheses, associated with widespread extension. In these works, geological data point to the existence of normal faults affecting previous (Late Cretaceous to Miocene) contractional structures. In order to evaluate such interpretations we have collected data from various geological and geophysical studies and scales. Based on these data, an existing large-scale 3-D gravity model could be improved and used to investigate the lithospheric structure of this region. Moreover, using the gravity model, an attenuated crust could be localized and quantified throughout the retroarc area. Deep seismic data available from this region are limited to the forearc - arc area, while in general the retroarc zone lacks deep seismic constraints. The only deep seismic profile extending to the retroarc is a receiver function profile at 39°S, showing crustal attenuation. This observation correlates with the extensional activity recognized at the surface. When analysing the gravity field, positive residual anomalies are observed. They correlate with crustal attenuation at the areas of extension. Also, computed elastic thickness in the retroarc shows good correlation between the areas of crustal stretching and low flexural rigidity, explained by thermal processes. The present extensional deformation reflected in positive residual gravity anomalies points to the influence of reactivated Triassic rifting inherited from early phases of Pangea break-up. Finally, the present local uplift and consequent fluvial incision at the retroarc zone are explained by crustal stretching and not by crustal shortening, the common mechanism in Andean orogenesis.
Computer Graphics Research Laboratory Quarterly Progress Report Number 49, July-September 1993
1993-11-22
Contents include: Texture Sampling and Strength Guided Motion (Jeffry S. Nimeroff); Radiosity (Min-Zhi Shao); Blended Shape Primitives (Douglas DeCarlo); … placement; extensions of radiosity rendering; and a discussion of blended shape primitives and their applications in computer vision and computer … Radiosity: an improved version of the radiosity renderer is included; this version uses a fast over-relaxation progressive refinement algorithm.
A survey of computer search service costs in the academic health sciences library.
Shirley, S
1978-01-01
The Norris Medical Library, University of Southern California, has recently completed an extensive survey of costs involved in the provision of computer search services beyond vendor charges for connect time and printing. In this survey costs for such items as terminal depreciation, repair contract, personnel time, and supplies are analyzed. Implications of this cost survey are discussed in relation to planning and price setting for computer search services. PMID:708953
Recent inner ear specialization for high-speed hunting in cheetahs.
Grohé, Camille; Lee, Beatrice; Flynn, John J
2018-02-02
The cheetah, Acinonyx jubatus, is the fastest living land mammal. Because of its specialized hunting strategy, this species evolved a series of specialized morphological and functional body features to increase its exceptional predatory performance during high-speed hunting. Using high-resolution X-ray computed micro-tomography (μCT), we provide the first analyses of the size and shape of the vestibular system of the inner ear in cats, an organ essential for maintaining body balance and adapting head posture and gaze direction during movement in most vertebrates. We demonstrate that the vestibular system of modern cheetahs is extremely different in shape and proportions relative to other cats analysed (12 modern and two fossil felid species), including a closely-related fossil cheetah species. These distinctive attributes (i.e., one of the greatest volumes of the vestibular system, dorsal extension of the anterior and posterior semicircular canals) correlate with a greater afferent sensitivity of the inner ear to head motions, facilitating postural and visual stability during high-speed prey pursuit and capture. These features are not present in the fossil cheetah A. pardinensis, which went extinct about 126,000 years ago, demonstrating that the unique and highly specialized inner ear of the sole living species of cheetah likely evolved extremely recently, possibly later than the middle Pleistocene.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thessen, Anne E.; Bunker, Daniel E.; Buttigieg, Pier Luigi
Understanding the interplay between environmental conditions and phenotypes is a fundamental goal of biology. Unfortunately, data that include observations on phenotype and environment are highly heterogeneous and thus difficult to find and integrate. One approach that is likely to improve the status quo involves the use of ontologies to standardize and link data about phenotypes and environments. Specifying and linking data through ontologies will allow researchers to increase the scope and flexibility of large-scale analyses aided by modern computing methods. Investments in this area would advance diverse fields such as ecology, phylogenetics, and conservation biology. While several biological ontologies are well-developed, using them to link phenotypes and environments is rare because of gaps in ontological coverage and limits to interoperability among ontologies and disciplines. In this manuscript, we present (1) use cases from diverse disciplines to illustrate questions that could be answered more efficiently using a robust linkage between phenotypes and environments, (2) two proof-of-concept analyses that show the value of linking phenotypes to environments in fishes and amphibians, and (3) two proposed example data models for linking phenotypes and environments using the extensible observation ontology (OBOE) and the Biological Collections Ontology (BCO); these provide a starting point for the development of a data model linking phenotypes and environments.
Emerging semantics to link phenotype and environment.
Thessen, Anne E; Bunker, Daniel E; Buttigieg, Pier Luigi; Cooper, Laurel D; Dahdul, Wasila M; Domisch, Sami; Franz, Nico M; Jaiswal, Pankaj; Lawrence-Dill, Carolyn J; Midford, Peter E; Mungall, Christopher J; Ramírez, Martín J; Specht, Chelsea D; Vogt, Lars; Vos, Rutger Aldo; Walls, Ramona L; White, Jeffrey W; Zhang, Guanyang; Deans, Andrew R; Huala, Eva; Lewis, Suzanna E; Mabee, Paula M
2015-01-01
Understanding the interplay between environmental conditions and phenotypes is a fundamental goal of biology. Unfortunately, data that include observations on phenotype and environment are highly heterogeneous and thus difficult to find and integrate. One approach that is likely to improve the status quo involves the use of ontologies to standardize and link data about phenotypes and environments. Specifying and linking data through ontologies will allow researchers to increase the scope and flexibility of large-scale analyses aided by modern computing methods. Investments in this area would advance diverse fields such as ecology, phylogenetics, and conservation biology. While several biological ontologies are well-developed, using them to link phenotypes and environments is rare because of gaps in ontological coverage and limits to interoperability among ontologies and disciplines. In this manuscript, we present (1) use cases from diverse disciplines to illustrate questions that could be answered more efficiently using a robust linkage between phenotypes and environments, (2) two proof-of-concept analyses that show the value of linking phenotypes to environments in fishes and amphibians, and (3) two proposed example data models for linking phenotypes and environments using the extensible observation ontology (OBOE) and the Biological Collections Ontology (BCO); these provide a starting point for the development of a data model linking phenotypes and environments.
Body shape analyses of large persons in South Korea.
Park, Woojin; Park, Sungjoon
2013-01-01
Despite the prevalence of obesity and overweight, anthropometric characteristics of large individuals have not been extensively studied. This study investigated body shapes of large persons (Broca index ≥ 20, BMI ≥ 25 or WHR>1.0) using stature-normalised body dimensions data from the latest South Korean anthropometric survey. For each sex, a factor analysis was performed on the anthropometric data set to identify the key factors that explain the shape variability; and then, a cluster analysis was conducted on the factor scores data to determine a set of representative body types. The body types were labelled in terms of their distinct shape characteristics and their relative frequencies were computed for each of the four age groups considered: the 10s, 20s-30s, 40s-50s and 60s. The study findings may facilitate creating artefacts that anthropometrically accommodate large individuals, developing digital human models of large persons and designing future ergonomics studies on largeness. This study investigated body shapes of large persons using anthropometric data from South Korea. For each sex, multivariate statistical analyses were conducted to identify the key factors of the body shape variability and determine the representative body types. The study findings may facilitate designing artefacts that anthropometrically accommodate large persons.
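A schematic of the two-step analysis described above, on hypothetical stature-normalised data: a factor analysis to extract key shape factors, followed by clustering of the factor scores into representative body types. The data and parameter choices below are illustrative assumptions, not the study's.

import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
dims = rng.normal(size=(300, 12))          # 300 subjects x 12 stature-normalised dimensions
dims += rng.normal(size=(300, 1)) * 0.8    # inject a shared "girth-like" factor for illustration

scores = FactorAnalysis(n_components=3, random_state=0).fit_transform(dims)   # key shape factors
body_type = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print("subjects per body type:", np.bincount(body_type))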
Processes Understanding of Decadal Climate Variability
NASA Astrophysics Data System (ADS)
Prömmel, Kerstin; Cubasch, Ulrich
2016-04-01
The realistic representation of decadal climate variability in the models is essential for the quality of decadal climate predictions. Therefore, the understanding of those processes leading to decadal climate variability needs to be improved. Several of these processes are already included in climate models, but their importance has not yet been completely clarified. The simulation of other processes sometimes requires a higher resolution of the model or an extension by additional subsystems. This is addressed within one module of the German research program "MiKlip II - Decadal Climate Predictions" (http://www.fona-miklip.de/en/) with a focus on the following processes. Stratospheric processes and their impact on the troposphere are analysed regarding the climate response to aerosol perturbations caused by volcanic eruptions and the stratospheric decadal variability due to solar forcing, climate change and ozone recovery. To account for the interaction between changing ozone concentrations and climate, a computationally efficient ozone chemistry module is developed and implemented in the MiKlip prediction system. The ocean variability and air-sea interaction are analysed with a special focus on the reduction of the North Atlantic cold bias. In addition, the predictability of the oceanic carbon uptake, with a special emphasis on the underlying mechanism, is investigated.
An Operational Computational Terminal Area PBL Prediction System
NASA Technical Reports Server (NTRS)
Lin, Yuh-Lang; Kaplan, Michael L.; Weglarz, Ronald P.; Hamilton, David W.
1997-01-01
There are two fundamental goals of this research project. The first and primary goal is to develop a prognostic system which could satisfy the operational weather prediction requirements of the meteorological subsystem within the Aircraft Vortex Spacing System (AVOSS). The secondary goal is to perform in-depth diagnostic analyses of the meteorological conditions affecting the Memphis field experiment held during August 1995. These two goals are interdependent because a thorough understanding of the atmospheric dynamical processes which produced the unique meteorology during the Memphis deployment will help us design a prognostic system for the planetary boundary layer (PBL) which could be utilized to support the meteorological subsystem within AVOSS. The secondary goal occupied much of the first year of the research project. This involved extensive data acquisition and in-depth analyses of a spectrum of atmospheric observational data sets. Concerning the primary goal, the first part of the four-stage prognostic system in support of AVOSS entitled: Terminal Area PBL Prediction System (TAPPS) was also formulated and tested in a research environment during 1996. We describe this system, and the three stages which are planned to follow. This first part of a software system designed to meet the primary goal of this research project is relatively inexpensive to implement and run operationally.
Comprehensive identification and analysis of human accelerated regulatory DNA
Gittelman, Rachel M.; Hun, Enna; Ay, Ferhat; Madeoy, Jennifer; Pennacchio, Len; Noble, William S.; Hawkins, R. David; Akey, Joshua M.
2015-01-01
It has long been hypothesized that changes in gene regulation have played an important role in human evolution, but regulatory DNA has been much more difficult to study compared with protein-coding regions. Recent large-scale studies have created genome-scale catalogs of DNase I hypersensitive sites (DHSs), which demark potentially functional regulatory DNA. To better define regulatory DNA that has been subject to human-specific adaptive evolution, we performed comprehensive evolutionary and population genetics analyses on over 18 million DHSs discovered in 130 cell types. We identified 524 DHSs that are conserved in nonhuman primates but accelerated in the human lineage (haDHS), and estimate that 70% of substitutions in haDHSs are attributable to positive selection. Through extensive computational and experimental analyses, we demonstrate that haDHSs are often active in brain or neuronal cell types; play an important role in regulating the expression of developmentally important genes, including many transcription factors such as SOX6, POU3F2, and HOX genes; and identify striking examples of adaptive regulatory evolution that may have contributed to human-specific phenotypes. More generally, our results reveal new insights into conserved and adaptive regulatory DNA in humans and refine the set of genomic substrates that distinguish humans from their closest living primate relatives. PMID:26104583
MOLSIM: A modular molecular simulation software
Jurij, Reščič
2015-01-01
The modular software MOLSIM for all-atom molecular and coarse-grained simulations is presented with a focus on the underlying concepts used. The software possesses four unique features: (1) it is an integrated software for molecular dynamics, Monte Carlo, and Brownian dynamics simulations; (2) simulated objects are constructed in a hierarchical fashion representing atoms, rigid molecules and colloids, flexible chains, hierarchical polymers, and cross-linked networks; (3) long-range interactions involving charges, dipoles and/or anisotropic dipole polarizabilities are handled either with the standard Ewald sum, the smooth particle mesh Ewald sum, or the reaction-field technique; (4) statistical uncertainties are provided for all calculated observables. In addition, MOLSIM supports various statistical ensembles, and several types of simulation cells and boundary conditions are available. Intermolecular interactions comprise tabulated pairwise potentials for speed and uniformity, and many-body interactions involve anisotropic polarizabilities. Intramolecular interactions include bond, angle, and crosslink potentials. A very large set of analyses of static and dynamic properties is provided. The capability of MOLSIM can be extended by user-provided routines controlling, for example, start conditions, intermolecular potentials, and analyses. An extensive set of case studies in the field of soft matter is presented covering colloids, polymers, and crosslinked networks. © 2015 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:25994597
Neyman, Markov processes and survival analysis.
Yang, Grace
2013-07-01
J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time of a patient living a normal life in the evaluation of clinical trials. The said extension would result in a complicated model and it is unlikely to find analytical closed-form solutions for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.
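For reference, the Kaplan-Meier product-limit estimator that the paper compares with the Fix-Neyman formulation can be computed as in the short sketch below; the recovery and relapse states of the F-N Markov model are not handled in this simple version, and the example data are hypothetical.

import numpy as np

def kaplan_meier(times, event):
    # times: observed times; event: 1 if the event occurred, 0 if right-censored.
    order = np.argsort(times)
    times, event = np.asarray(times)[order], np.asarray(event)[order]
    surv, s = [], 1.0
    for t in np.unique(times[event == 1]):
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (event == 1))
        s *= 1 - deaths / at_risk          # product-limit update at each event time
        surv.append((t, s))
    return surv

print(kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 0, 1, 0]))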
Perpetration of teen dating violence in a networked society.
Korchmaros, Josephine D; Ybarra, Michele L; Langhinrichsen-Rohling, Jennifer; Boyd, Danah; Lenhart, Amanda
2013-08-01
Teen dating violence (TDV) is a serious form of youth violence that youth fairly commonly experience. Although youth extensively use computer-mediated communication (CMC), the epidemiology of CMC-based TDV is largely unknown. This study examined how perpetration of psychological TDV using CMC compares and relates to perpetration using longer-standing modes of communication (LSMC; e.g., face-to-face). Data from the national Growing up with Media study involving adolescents aged 14-19 collected from October 2010 to February 2011 and analyzed May 2012 are reported. Analyses focused on adolescents with a history of dating (n=615). Forty-six percent of youth daters had perpetrated psychological TDV. Of those who perpetrated in the past 12 months, 58% used only LSMC, 17% used only CMC, and 24% used both. Use of both CMC and LSMC was more likely among perpetrators who used CMC than among perpetrators who used LSMC. In addition, communication mode and type of psychological TDV behavior were separately related to frequency of perpetration. Finally, history of sexual intercourse was the only characteristic that discriminated between youth who perpetrated using different communication modes. Results suggest that perpetration of psychological TDV using CMC is prevalent and is an extension of perpetration using LSMC. Prevention should focus on preventing perpetration of LSMC-based TDV as doing so would prevent LSMC as well as CMC-based TDV.
NASA Astrophysics Data System (ADS)
Reynders, Edwin P. B.; Langley, Robin S.
2018-08-01
The hybrid deterministic-statistical energy analysis method has proven to be a versatile framework for modeling built-up vibro-acoustic systems. The stiff system components are modeled deterministically, e.g., using the finite element method, while the wave fields in the flexible components are modeled as diffuse. In the present paper, the hybrid method is extended such that not only the ensemble mean and variance of the harmonic system response can be computed, but also of the band-averaged system response. This variance represents the uncertainty that is due to the assumption of a diffuse field in the flexible components of the hybrid system. The developments start with a cross-frequency generalization of the reciprocity relationship between the total energy in a diffuse field and the cross spectrum of the blocked reverberant loading at the boundaries of that field. By making extensive use of this generalization in a first-order perturbation analysis, explicit expressions are derived for the cross-frequency and band-averaged variance of the vibrational energies in the diffuse components and for the cross-frequency and band-averaged variance of the cross spectrum of the vibro-acoustic field response of the deterministic components. These expressions are extensively validated against detailed Monte Carlo analyses of coupled plate systems in which diffuse fields are simulated by randomly distributing small point masses across the flexible components, and good agreement is found.
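For orientation, the single-frequency diffuse-field reciprocity relationship that the paper generalizes across frequencies is commonly written as follows (the notation here is generic and may differ from the paper's):

S_{ff}(\omega) \;=\; \frac{4\,E(\omega)}{\pi\,\omega\,n(\omega)}\,\mathrm{Im}\!\left\{ D_{\mathrm{dir}}(\omega) \right\}

where E is the vibrational energy of the diffuse subsystem, n(\omega) its modal density, D_{\mathrm{dir}} the direct-field dynamic stiffness at the coupling, and S_{ff} the cross-spectrum of the blocked reverberant force that loads the deterministic components.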
Designing for deeper learning in a blended computer science course for middle school students
NASA Astrophysics Data System (ADS)
Grover, Shuchi; Pea, Roy; Cooper, Stephen
2015-04-01
The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. Prior computing experiences (as measured by a pretest) and math ability were found to be strong predictors of learning outcomes.
SSME main combustion chamber and nozzle flowfield analysis
NASA Technical Reports Server (NTRS)
Farmer, R. C.; Wang, T. S.; Smith, S. D.; Prozan, R. J.
1986-01-01
An investigation is presented of the computational fluid dynamics (CFD) tools which would accurately analyze main combustion chamber and nozzle flow. The importance of combustion phenomena and local variations in mixture ratio are fully appreciated; however, the computational aspects of the gas dynamics involved were the sole issues addressed. The CFD analyses made are first compared with conventional nozzle analyses to determine the accuracy for steady flows, and then transient analyses are discussed.
ERIC Educational Resources Information Center
Carriere, Ronald A.; And Others
This report focuses on a set of supplemental analyses that were performed on portions of the Emergency School Aid Act (ESAA) evaluation data. The goal of these analyses was to explore additional relationships in the data that might help to inform program policy, to confirm and/or further explicate some of the findings reported earlier, and to put…
Smaller external notebook mice have different effects on posture and muscle activity.
Oude Hengel, Karen M; Houwink, Annemieke; Odell, Dan; van Dieën, Jaap H; Dennerlein, Jack T
2008-07-01
Extensive computer mouse use is an identified risk factor for computer work-related musculoskeletal disorders; however, notebook computer mouse designs of varying sizes have not been formally evaluated but may affect biomechanical risk factors. Thirty adults performed a set of mouse tasks with five notebook mice, ranging in length from 75 to 105 mm and in width from 35 to 65 mm, and a reference desktop mouse. An electro-magnetic motion analysis system measured index finger (metacarpophalangeal joint), wrist and forearm postures, and surface electromyography measured muscle activity of three extensor muscles in the forearm and the first dorsal interosseus. The smallest notebook mice were found to promote less neutral postures (up to 3.2 degrees higher metacarpophalangeal joint adduction; 6.5 degrees higher metacarpophalangeal joint flexion, 2.3 degrees higher wrist extension) and higher muscle activity (up to 4.1% of maximum voluntary contraction higher wrist extensor muscle activity). Participants with smaller hands had overall more non-neutral postures than participants with larger hands (up to 5.6 degrees higher wrist extension and 5.9 degrees higher pronation); while participants with larger hands were more influenced by the smallest notebook mice (up to 3.6 degrees higher wrist extension and 5.5% of maximum voluntary contraction higher wrist extensor values). Self-reported ratings showed that while participants preferred smaller mice for portability; larger mice scored higher on comfort and usability. The smallest notebook mice increased the intensity of biomechanical exposures. Longer term mouse use could enhance these differences, having a potential impact on the prevention of work-related musculoskeletal disorders.
The electromagnetic modeling of thin apertures using the finite-difference time-domain technique
NASA Technical Reports Server (NTRS)
Demarest, Kenneth R.
1987-01-01
A technique which computes transient electromagnetic responses of narrow apertures in complex conducting scatterers was implemented as an extension of previously developed Finite-Difference Time-Domain (FDTD) computer codes. Although these apertures are narrow with respect to the wavelengths contained within the power spectrum of excitation, this technique does not require significantly more computer resources to attain the increased resolution at the apertures. In the report, an analytical technique which utilizes Babinet's principle to model the apertures is developed, and an FDTD computer code which utilizes this technique is described.
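As background, the standard leapfrog field update that such an aperture model extends is shown below as a one-dimensional, free-space Python sketch in normalized units; the Babinet-based thin-aperture treatment itself is specialized and is not reproduced here.

import numpy as np

nz, nsteps = 400, 600
ez = np.zeros(nz)            # electric field samples
hy = np.zeros(nz - 1)        # magnetic field samples, staggered half a cell
c = 0.5                      # Courant number (c0*dt/dz), kept below 1 for stability

for n in range(nsteps):
    hy += c * (ez[1:] - ez[:-1])                  # update magnetic field
    ez[1:-1] += c * (hy[1:] - hy[:-1])            # update electric field
    ez[50] += np.exp(-((n - 60) / 20.0) ** 2)     # soft Gaussian-pulse source

print("peak |Ez| after propagation:", np.abs(ez).max())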
NASA Astrophysics Data System (ADS)
Okanoya, Kazuo
2014-09-01
The comparative computational approach of Fitch [1] attempts to renew the classical David Marr paradigm of computation, algorithm, and implementation, by introducing an evolutionary view of the relationship between neural architecture and cognition. This comparative evolutionary view provides constraints useful in narrowing down the problem space for both cognition and neural mechanisms. I will provide two examples from our own studies that reinforce and extend Fitch's proposal.
Computational Analyses of Offset Stream Nozzles for Noise Reduction
NASA Technical Reports Server (NTRS)
Dippold, Vance, III; Foster, Lancert; Wiese, Michael
2007-01-01
The Wind computational fluid dynamics code was used to perform a series of simulations on two offset stream nozzle concepts for jet noise reduction. The first concept used an S-duct to direct the secondary stream to the lower side of the nozzle. The second concept used vanes to turn the secondary flow downward. The analyses were completed in preparation of tests conducted in the NASA Glenn Research Center Aeroacoustic Propulsion Laboratory. The offset stream nozzles demonstrated good performance and reduced the amount of turbulence on the lower side of the jet plume. The computer analyses proved instrumental in guiding the development of the final test configurations and giving insight into the flow mechanics of offset stream nozzles. The computational predictions were compared with flowfield results from the jet rig testing and showed excellent agreement.
NASA Astrophysics Data System (ADS)
Wang, Yijiao; Huang, Peng; Xin, Zheng; Zeng, Lang; Liu, Xiaoyan; Du, Gang; Kang, Jinfeng
2014-01-01
In this work, three-dimensional technology computer-aided design (TCAD) simulations are performed to investigate the impact of random discrete dopants (RDD), including extension-induced fluctuation, in a 14 nm silicon-on-insulator (SOI) gate-source/drain (G-S/D) underlap fin field effect transistor (FinFET). To fully understand the RDD impact in the extension, the RDD effect is evaluated in the channel and extension separately and together. The statistical variability of FinFET performance parameters including threshold voltage (Vth), subthreshold slope (SS), drain induced barrier lowering (DIBL), drive current (Ion), and leakage current (Ioff) is analyzed. The results indicate that RDD in the extension can lead to substantial variability, especially for SS, DIBL, and Ion, and should be taken into account together with that in the channel to obtain an accurate estimate of random dopant fluctuation (RDF) effects. Meanwhile, a higher doping concentration of the extension region is suggested from the perspective of overall variability control.
Besnier, Francois; Glover, Kevin A.
2013-01-01
This software package provides an R-based framework for making use of multi-core computers when running analyses in the population genetics program STRUCTURE. It is especially addressed to users of STRUCTURE dealing with numerous and repeated data analyses who could take advantage of an efficient script to automatically distribute STRUCTURE jobs among multiple processors. It also includes functions to divide analyses among combinations of populations within a single data set without the need to manually produce multiple projects, as is currently the case in STRUCTURE. The package consists of two main functions, MPI_structure() and parallel_structure(), as well as an example data file. We compared computation time for these example data on two computer architectures and showed that use of the present functions can result in several-fold improvements in computation time. ParallelStructure is freely available at https://r-forge.r-project.org/projects/parallstructure/. PMID:23923012
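As a purely conceptual illustration (in Python, not the R package itself) of distributing many independent analysis jobs across processor cores, the sketch below farms a hypothetical run_one_job() function over a parameter grid with a worker pool. The function and parameter names are invented for illustration and are not part of ParallelStructure's API.

```python
from multiprocessing import Pool
from itertools import product

# Conceptual sketch of distributing repeated, independent analysis jobs across
# CPU cores -- the same idea ParallelStructure applies to STRUCTURE runs.
# run_one_job is a hypothetical stand-in for a single analysis.
def run_one_job(args):
    k, replicate = args
    # ... here a real script would launch the external analysis for K = k,
    # replicate index `replicate`, and collect its output files ...
    return (k, replicate, f"finished K={k} rep={replicate}")

if __name__ == "__main__":
    jobs = list(product(range(1, 6), range(10)))   # K = 1..5, 10 replicates each
    with Pool(processes=4) as pool:
        for result in pool.imap_unordered(run_one_job, jobs):
            print(result)
```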
NASA Technical Reports Server (NTRS)
Blakely, R. L.
1973-01-01
A G189A simulation of the shuttle orbiter EC/LSS was prepared and used to study payload support capabilities. Two master program libraries of the G189A computer program were prepared for the NASA/JSC computer system. Several new component subroutines were added to the G189A program library, and many existing subroutines were revised to improve their capabilities. A number of special analyses were performed in support of a NASA/JSC shuttle orbiter EC/LSS payload support capability study.
Thermal Evolution of the North-Central Gulf Coast
NASA Astrophysics Data System (ADS)
Nunn, Jeffrey A.; Scardina, Allan D.; Pilger, Rex H., Jr.
1984-12-01
The subsidence history of the North Louisiana Salt Basin, determined from well data, indicates that the region underwent extension during rifting and has since passively subsided due to conductive cooling of the lithosphere. Timing of the rifting event is consistent with opening of the Gulf of Mexico during Late Triassic to Early Jurassic time. Crustal extension by a factor of 1.5 to 2 was computed from "tectonic" subsidence curves. However, data from the early subsidence history are insufficient to distinguish between uniform and nonuniform extension of the lithosphere. The magnitude of extension is in good agreement with total sediment and crustal thicknesses from seismic refraction data in the adjacent Central Mississippi Salt Basin. The temperature distribution within the sediments is calculated using a simple heat conduction model. Temperature and subsidence effects of thermal insulation by overlying sediments are included. The computed temperature distribution is in good agreement with bottom hole temperatures measured in deep wells. Temperature histories predicted for selected stratigraphic horizons within the North Louisiana Salt Basin suggest that thermal conditions have been favorable for hydrocarbon generation in the older strata. Results from a two-dimensional heat conduction model suggest that a probable cause for the early formation of the adjacent uplifts is lateral heat conduction from the basin. Rapid extension of the lithosphere underneath areas with horizontal dimensions of 50-100 km produces extremely rapid early subsidence due to lateral heat conduction. The moderate subsidence rate observed in the North Louisiana Salt Basin during the Jurassic and Early Cretaceous suggests slow extension over a long period of time.
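The "simple heat conduction model" idea can be illustrated with a minimal one-dimensional explicit finite-difference calculation of conductive cooling after instantaneous lithospheric stretching. The sketch below is not the authors' model; the stretching factor, thermal properties, and initial geotherm are assumed values chosen only for illustration.

```python
import numpy as np

# Minimal 1D explicit finite-difference model of conductive cooling after
# instantaneous lithospheric extension. Purely illustrative; parameters and
# the initial geotherm are assumed, not taken from the study.
kappa = 1e-6                       # thermal diffusivity, m^2/s (assumed)
L = 125e3                          # lithosphere thickness, m (assumed)
nz = 126
dz = L / (nz - 1)
dt = 0.4 * dz**2 / kappa           # satisfies the explicit stability limit
beta = 1.5                         # stretching factor (assumed)
T_surf, T_base = 0.0, 1333.0       # boundary temperatures, deg C (assumed)

z = np.linspace(0.0, L, nz)
# Stretched (steepened) initial geotherm: linear to T_base over L/beta
T = np.minimum(T_base, T_base * z / (L / beta))

t_end = 50e6 * 3.15e7              # 50 Myr expressed in seconds
for _ in range(int(t_end / dt)):
    T[1:-1] += kappa * dt / dz**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[0], T[-1] = T_surf, T_base   # fixed-temperature boundaries

print(f"temperature at 20 km depth after 50 Myr: {np.interp(20e3, z, T):.0f} C")
```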
NASA Technical Reports Server (NTRS)
Taylor, G. R.
1972-01-01
Extensive microbiological analyses that were performed on the Apollo 14 prime and backup crewmembers and ancillary personnel are discussed. The crewmembers were subjected to four separate and quite different environments during the 137-day monitoring period. The relation between each of these environments and observed changes in the microflora of each astronaut is presented.
2006-01-01
GDA (Geologic Data Assistant) is an extension to ArcPad, a mobile mapping software program by Environmental Systems Research Institute (ESRI) designed to run on personal digital assistant (PDA) computers. GDA and ArcPad allow a PDA to replace the paper notebook and field map traditionally used for geologic mapping. GDA allows easy collection of field data.
Computer-Managed Instruction: Theory, Application, and Some Key Implementation Issues.
1984-03-01
…who have endorsed computer technology but fail to adopt it. As one educational consultant claims: "Educators appear to have a deep-set skepticism toward…widespread use." In the mid-1950s, while still in its infancy, computer technology entered the world of education…to utilize the new technology, and to do it most extensively. Implementation of CMI in a standalone configuration using microcomputers has been…
Computation of repetitions and regularities of biologically weighted sequences.
Christodoulakis, M; Iliopoulos, C; Mouchard, L; Perdikuri, K; Tsakalidis, A; Tsichlas, K
2006-01-01
Biological weighted sequences are used extensively in molecular biology as profiles for protein families, in the representation of binding sites and often for the representation of sequences produced by a shotgun sequencing strategy. In this paper, we address three fundamental problems in the area of biologically weighted sequences: (i) computation of repetitions, (ii) pattern matching, and (iii) computation of regularities. Our algorithms can be used as basic building blocks for more sophisticated algorithms applied on weighted sequences.
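In a weighted sequence, each position carries a probability distribution over symbols, and a pattern is usually said to occur at a position when the product of its per-symbol probabilities meets a cutoff. The brute-force scan below illustrates this definition only; it is not one of the paper's algorithms, which are considerably more efficient, and the example data and cutoff are hypothetical.

```python
# Small sketch of the weighted-sequence idea: each position holds a probability
# distribution over symbols, and a pattern "occurs" at a position if the product
# of its per-symbol probabilities meets a cutoff. Brute force, for illustration.
def weighted_occurrences(weighted_seq, pattern, cutoff):
    hits = []
    for i in range(len(weighted_seq) - len(pattern) + 1):
        prob = 1.0
        for j, symbol in enumerate(pattern):
            prob *= weighted_seq[i + j].get(symbol, 0.0)
            if prob < cutoff:
                break
        if prob >= cutoff:
            hits.append((i, prob))
    return hits

# Hypothetical 4-position weighted DNA sequence and query pattern.
seq = [{"A": 0.9, "C": 0.1}, {"A": 0.5, "G": 0.5},
       {"T": 1.0}, {"C": 0.7, "G": 0.3}]
print(weighted_occurrences(seq, "AGT", cutoff=0.25))   # e.g. [(0, ~0.45)]
```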
Sensorimotor Assessment and Rehabilitative Apparatus
2017-10-01
…vestibulo-ocular assessment without measuring eye movements per se. VON uses a head-mounted motion sensor, laptop computer with user…powered laptop computer with extensive processing algorithms. Frequent occlusion of the pupil…The apparatus consists of a laptop computer, mirror galvanometer, back-projected laser target, data acquisition board, rate sensor, and motion-gain…
Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs
NASA Astrophysics Data System (ADS)
Ringenburg, Michael F.
Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
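A toy illustration of the underlying energy-quality tradeoff (not the thesis's OCaml extension or monitoring tools): inject occasional errors into an "approximate" operation and measure output quality against the exact result. The error model, error rate, and workload below are hypothetical.

```python
import random

# Toy illustration of the approximate-computing tradeoff: an "approximate" add
# occasionally returns a perturbed result, and output quality is measured
# against the exact computation. Not the thesis's tools; values are hypothetical.
def approx_add(a, b, error_rate=0.05, rng=random.Random(1)):
    result = a + b
    if rng.random() < error_rate:
        result += rng.uniform(-1.0, 1.0)      # model a low-energy, faulty add
    return result

def quality(exact, approx):
    return 1.0 - abs(exact - approx) / max(abs(exact), 1e-12)

data = [float(i) for i in range(1000)]
exact_sum = sum(data)
approx_sum = 0.0
for x in data:
    approx_sum = approx_add(approx_sum, x)
print(f"quality = {quality(exact_sum, approx_sum):.6f}")
```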
Mair, Grant; von Kummer, Rüdiger; Adami, Alessandro; White, Philip M.; Adams, Matthew E.; Yan, Bernard; Demchuk, Andrew M.; Farrall, Andrew J.; Sellar, Robin J.; Sakka, Eleni; Palmer, Jeb; Perry, David; Lindley, Richard I.; Sandercock, Peter A.G.
2017-01-01
Background and Purpose— Computed tomographic angiography and magnetic resonance angiography are used increasingly to assess arterial patency in patients with ischemic stroke. We determined which baseline angiography features predict response to intravenous thrombolytics in ischemic stroke using randomized controlled trial data. Methods— We analyzed angiograms from the IST-3 (Third International Stroke Trial), an international, multicenter, prospective, randomized controlled trial of intravenous alteplase. Readers, masked to clinical, treatment, and outcome data, assessed prerandomization computed tomographic angiography and magnetic resonance angiography for presence, extent, location, and completeness of obstruction and collaterals. We compared angiography findings to 6-month functional outcome (Oxford Handicap Scale) and tested for interactions with alteplase, using ordinal regression in adjusted analyses. We also meta-analyzed all available angiography data from other randomized controlled trials of intravenous thrombolytics. Results— In IST-3, 300 patients had prerandomization angiography (computed tomographic angiography=271 and magnetic resonance angiography=29). On multivariable analysis, more extensive angiographic obstruction and poor collaterals independently predicted poor outcome (P<0.01). We identified no significant interaction between angiography findings and alteplase effect on Oxford Handicap Scale (P≥0.075) in IST-3. In meta-analysis (5 trials of alteplase or desmoteplase, including IST-3, n=591), there was a significantly increased benefit of thrombolytics on outcome (odds ratio>1 indicates benefit) in patients with (odds ratio, 2.07; 95% confidence interval, 1.18–3.64; P=0.011) versus without (odds ratio, 0.88; 95% confidence interval, 0.58–1.35; P=0.566) arterial obstruction (P for interaction 0.017). Conclusions— Intravenous thrombolytics provide benefit to stroke patients with computed tomographic angiography or magnetic resonance angiography evidence of arterial obstruction, but the sample was underpowered to demonstrate significant treatment benefit or harm among patients with apparently patent arteries. Clinical Trial Registration— URL: http://www.isrctn.com. Unique identifier: ISRCTN25765518. PMID:28008093
Mair, Grant; von Kummer, Rüdiger; Adami, Alessandro; White, Philip M; Adams, Matthew E; Yan, Bernard; Demchuk, Andrew M; Farrall, Andrew J; Sellar, Robin J; Sakka, Eleni; Palmer, Jeb; Perry, David; Lindley, Richard I; Sandercock, Peter A G; Wardlaw, Joanna M
2017-02-01
Computed tomographic angiography and magnetic resonance angiography are used increasingly to assess arterial patency in patients with ischemic stroke. We determined which baseline angiography features predict response to intravenous thrombolytics in ischemic stroke using randomized controlled trial data. We analyzed angiograms from the IST-3 (Third International Stroke Trial), an international, multicenter, prospective, randomized controlled trial of intravenous alteplase. Readers, masked to clinical, treatment, and outcome data, assessed prerandomization computed tomographic angiography and magnetic resonance angiography for presence, extent, location, and completeness of obstruction and collaterals. We compared angiography findings to 6-month functional outcome (Oxford Handicap Scale) and tested for interactions with alteplase, using ordinal regression in adjusted analyses. We also meta-analyzed all available angiography data from other randomized controlled trials of intravenous thrombolytics. In IST-3, 300 patients had prerandomization angiography (computed tomographic angiography=271 and magnetic resonance angiography=29). On multivariable analysis, more extensive angiographic obstruction and poor collaterals independently predicted poor outcome (P<0.01). We identified no significant interaction between angiography findings and alteplase effect on Oxford Handicap Scale (P≥0.075) in IST-3. In meta-analysis (5 trials of alteplase or desmoteplase, including IST-3, n=591), there was a significantly increased benefit of thrombolytics on outcome (odds ratio>1 indicates benefit) in patients with (odds ratio, 2.07; 95% confidence interval, 1.18-3.64; P=0.011) versus without (odds ratio, 0.88; 95% confidence interval, 0.58-1.35; P=0.566) arterial obstruction (P for interaction 0.017). Intravenous thrombolytics provide benefit to stroke patients with computed tomographic angiography or magnetic resonance angiography evidence of arterial obstruction, but the sample was underpowered to demonstrate significant treatment benefit or harm among patients with apparently patent arteries. URL: http://www.isrctn.com. Unique identifier: ISRCTN25765518. © 2016 The Authors.
Modular space station phase B extension, preliminary system design. Volume 4: Subsystems analyses
NASA Technical Reports Server (NTRS)
Antell, R. W.
1972-01-01
The subsystems tradeoffs, analyses, and preliminary design results are summarized. Analyses were made of the structural and mechanical, environmental control and life support, electrical power, guidance and control, reaction control, information, and crew habitability subsystems. For each subsystem a summary description is presented including subsystem requirements, subsystem description, and subsystem characteristics definition (physical, performance, and interface). The major preliminary design data and tradeoffs or analyses are described in detail at each of the assembly levels.
NASA Computational Case Study SAR Data Processing: Ground-Range Projection
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess; Rincon, Rafael
2013-01-01
Radar technology is used extensively by NASA for remote sensing of the Earth and other planetary bodies. In this case study, we learn about different computational concepts for processing radar data. In particular, we learn how to correct a slanted radar image by projecting it onto the surface that was sensed by the radar instrument.
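Under a flat-earth approximation, a slant range r measured from a platform at altitude h corresponds to a ground range sqrt(r^2 - h^2), so a slant-range image can be corrected by resampling each range line onto a uniform ground-range grid. The sketch below illustrates this idea only and is not the case study's processing code; the geometry and image are hypothetical.

```python
import numpy as np

# Sketch of slant-range to ground-range projection under a flat-earth
# approximation: ground_range = sqrt(slant_range^2 - altitude^2), followed by
# resampling each image row onto a uniform ground-range grid. Illustrative only.
def slant_to_ground(image, slant_ranges, altitude):
    ground = np.sqrt(np.maximum(slant_ranges**2 - altitude**2, 0.0))
    uniform = np.linspace(ground[0], ground[-1], len(ground))
    # Resample every azimuth line of the image onto the uniform grid
    return np.array([np.interp(uniform, ground, row) for row in image])

# Hypothetical example: 3 azimuth lines x 100 range bins, 5 km altitude
slant = np.linspace(6e3, 12e3, 100)
img = np.random.default_rng(0).random((3, 100))
ground_img = slant_to_ground(img, slant, altitude=5e3)
```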
Flight instrument and telemetry response and its inversion
NASA Technical Reports Server (NTRS)
Weinberger, M. R.
1971-01-01
Mathematical models of rate gyros, servo accelerometers, pressure transducers, and telemetry systems were derived, and their parameters were obtained from laboratory tests. Analog computer simulations were used extensively to verify model validity for fast and large input signals. An optimal inversion method was derived to reconstruct input signals from noisy output signals, and a computer program was prepared.
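One common way to reconstruct an input from a noisy instrument output is regularized (Tikhonov) least squares applied to a discretized convolution model of the instrument. The sketch below illustrates that inversion idea only and is not the report's optimal method; the first-order-lag instrument model, noise level, and regularization weight are assumed.

```python
import numpy as np

# Minimal sketch of reconstructing an input signal from a noisy instrument
# output by regularized least squares. The instrument is modeled as a discrete
# first-order lag (convolution matrix H); all values are assumed.
rng = np.random.default_rng(0)
n, dt, tau = 200, 0.01, 0.05                      # samples, step (s), time constant (s)
t = np.arange(n) * dt

# Discrete impulse response of a first-order lag and its convolution matrix
h = (dt / tau) * np.exp(-t / tau)
H = np.array([[h[i - j] if i >= j else 0.0 for j in range(n)] for i in range(n)])

u_true = (t > 0.5).astype(float)                  # step input at t = 0.5 s
y = H @ u_true + 0.02 * rng.standard_normal(n)    # noisy measured output

lam = 1e-2                                        # regularization weight
u_est = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)
print(f"max reconstruction error: {np.abs(u_est - u_true).max():.3f}")
```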
NASA Technical Reports Server (NTRS)
1979-01-01
A description and listing is presented of two computer programs: Hybrid Vehicle Design Program (HYVELD) and Hybrid Vehicle Simulation Program (HYVEC). Both of the programs are modifications and extensions of similar programs developed as part of the Electric and Hybrid Vehicle System Research and Development Project.
Survivability Extensions for Dynamic Ultralog Environments
2004-12-07
…discuss survivability as defined in the "bible of computational complexity," namely, the book "Computers and Intractability: A Guide to the Theory of NP-Completeness."
Monte Carlo Simulation Using HyperCard and Lotus 1-2-3.
ERIC Educational Resources Information Center
Oulman, Charles S.; Lee, Motoko Y.
Monte Carlo simulation is a computer modeling procedure for mimicking observations on a random variable. A random number generator is used in generating the outcome for the events that are being modeled. The simulation can be used to obtain results that otherwise require extensive testing or complicated computations. This paper describes how Monte…
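A modern, minimal equivalent of the approach described above: use a random number generator to mimic repeated observations of a random variable and estimate a quantity that would otherwise require extensive testing or computation. The example below estimates a simple dice probability whose exact value (1/6) is known, so the estimate can be checked.

```python
import random

# Small sketch of the Monte Carlo idea: mimic repeated observations of a random
# variable with a random number generator and estimate a probability.
# Here we estimate P(sum of two dice > 9); the exact value is 6/36 = 1/6.
rng = random.Random(42)
trials = 100_000
hits = sum(1 for _ in range(trials)
           if rng.randint(1, 6) + rng.randint(1, 6) > 9)
print(f"estimated P(sum > 9) = {hits / trials:.4f}")   # exact: ~0.1667
```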